Distributed Peer Review (DPR)

Period 111
 

The DPR will be open on October 14th, 2022. The deadline for delivering the reviews (grades and comments) is

 

November 16th, 2022, 12:00 CET

 

Instructions for reviewers are provided in this presentation.

 

Access the Proposal Evaluation Interface (PEI) here. The PEI is active only during the proposal evaluation phase, and is accessible only to PIs of DPR proposals (or their delegates).

Distributed Peer Review Rules and Guidelines

Introduction

ESO introduced Distributed Peer Review (DPR) in P110. More details and background information on DPR at ESO can be found in Patat et al. (2019). In this paradigm, first introduced by Merrifield & Saari (2009) and deployed for the Fast Turnaround channel at Gemini, all PIs of proposals qualifying for DPR accept to review a number of proposals submitted by their peers during the same cycle. Similarly, they accept that their proposal/s is/are reviewed by an equal number of peers who submitted proposals in the same cycle.

This paradigm has been deployed at ALMA as of Cycle 8. The implementation of DPR at ESO is based on the results of the DPR experiment and on the experience gained at ALMA.

The deployment of DPR at ESO stems from the Time Allocation Working Group (TAWG) report, and aims at addressing the following issues:

  • With the growth in the number of proposals submitted to ESO, the load on the panels and the Observing Programmes Committee (OPC) members has become unsustainable;
  • If, on the one hand, the introduction of triage (for the bottom 30%) has reduced the panels' workload, on the other it has significantly degraded the quality of feedback for the triaged proposals;
  • Because of the exceptional workload, it has become progressively harder to find scientists willing to serve in the panels and in the OPC;
  • The community has been steadily reporting a significant level of dissatisfaction with the feedback they receive from the reviewers; although this, to some extent, is certainly inherent to the subjectivity of the process, it is probably also a consequence of the heavy load on the referees;
  • Because of the large number of proposals, the increasing level of specialisation in the time requests, and the limited number of reviewers who can be managed via the classical panel paradigm, it has become progressively more difficult to find optimal proposal-referee matches;
  • The classical panel schema and its implicit logistical/organisational constraints limit the number of reviews one can have for each single proposal, hence reducing the statistical basis of the final evaluation.

 

Although at ESO the OPC and panel members are nominated by the community via the Users Committee, they constitute a small, select set of scientists. Offering a significantly larger fraction of researchers the possibility to shape ESO's scientific output has the potential to provide a fairer and more diverse representation of the community's needs. DPR gives them the chance to participate in this important process and to fully appreciate the difficulties inherent in proposal evaluation and feedback. In addition to the non-negligible educational aspects this entails, it raises awareness of the limitations of peer review and helps put the feedback in context.

Proposals qualifying for Distributed Peer Review

The criteria describing the proposals which qualify for DPR are presented in the Call for Proposals of the cycle for which the telescope time requests are submitted. These may change from cycle to cycle, especially in the first semesters following the DPR deployment.

The criteria for P111 are as follows (note that there is no change with respect to P110):

  1. All proposals requesting a total time (including overheads) of less than 16 hours are assigned to DPR;
  2. Exceptions to this general rule are:
    • Proposals including at least one ToO run;
    • Calibration proposals;
    • Joint VLT-XMM proposals.
  3. All other proposals submitted during a regular cycle will be reviewed in the classical way by the OPC and the panels

The time threshold is initially set to have an approximate 50/50 distribution between DPR and panels, and it is based on the time request statistics compiled in recent cycles. The threshold will likely evolve in the next semesters. Update (30.03.2022): in P110, 436 proposals (50.1% of all submitted proposals) were assigned to the DPR. The time requested by these proposals amounts to about 16% of the total time request in this period.

Important note: In P111, the review channel (DPR vs. panels) is assigned at the time of proposal submission, based on the above rules. The PI (or delegated PI, hereafter dPI) is informed about the assigned review process and prompted to formally accept the conditions at the time of submission. At this stage the PI/dPI can delegate the reviewer's role to one of the co-Is listed in the proposal. The delegation can also occur when the list of co-Is is specified.

 

By submitting a proposal qualifying for DPR, the PI/dPI commits to follow the rules and regulations presented in the next sections.

Rules and guidelines for applicants

The time request of a proposal should be dictated by the goals of the science case driving the request, and should not be tuned to make the proposal fit into a pre-selected review process (DPR vs panels). In P111 the review process is assigned at submission time according to the above rules. However, in future semesters the assignment will probably be moved to after the proposal submission deadline, to match the time request distribution and/or to fulfill other operational or proposal distribution needs. PIs should therefore focus only on the science case and on the objectives they wish to achieve.

  • If a proposal qualifies for DPR, by submitting it the PI/dPI accepts the following terms and conditions:
    1. the PI/dPI (or the delegated reviewer indicated from the list of co-Is) will receive 10 proposals to review. This applies to each proposal submitted by the same PI/dPI (e.g. a PI submitting 2 proposals will have to provide reviews for 20 proposals);
    2. Failing to provide the reviews by the deadline will lead to the automatic rejection of the proposal/s submitted by the given PI/dPI;
    3. The PI/dPI (or the delegated reviewer) will have to sign the non-disclosure agreement before starting the review;
    4. The reviewer is expected to carefully read all the assigned proposals, grade them and write the feedback to the applicants following the rules and guidelines.
  • All PIs/dPI and co-Is are eligible for the reviewer's role. By default, the reviewer's role is assigned to the PI/dPI. However, should the PI/dPI not wish to act as reviewer, the role can be delegated to any of the co-Is listed in the proposal. By submitting the proposal, the PI/dPI takes full responsibility for the implicit commitment by the delegated collaborator to complete the review as specified above;
  • The reviewer role cannot be changed after the proposal submission deadline has expired. Only very exceptional cases (e.g. illness, urgent care for family members) will be considered;
  • No filtering will be applied during the proposal-reviewer assignment. The level of expertise and seniority distribution of the reviewers will reflect that of the underlying PI/dPI population. The algorithm used for the reviewer-proposal matching will optimize the distribution so as to ensure a fair match for all proposals;
  • Each proposal will be assigned to 10 reviewers, chosen among all eligible PIs/dPIs (or delegated reviewers) of the given cycle, and selected based on their expertise as declared in their User Portal profile via the Scientific Keywords. It is therefore very important that the User Portal profile is up to date and the Scientific Keywords properly listed;
  • Note that each prospective co-I of a proposal must have a User Portal profile in order for the PI/dPI to be able to include them in the proposal. Furthermore, it is not possible to be a co-I of a proposal unless the relevant Scientific Keywords have been specified in the User Portal profile;
  • The proposal-referee assignments will be performed using the Scientific Keywords specified in the proposal under the dedicated section (for more details see the p1 user manual). It is therefore very important that the PI/dPI specifies the Scientific Keywords in the most complete way in the proposal;
  • Both in the User Portal and in the proposal, the Scientific Keywords must be listed in descending importance order. The drag function of the p1 and User Portal web interfaces can be used to adjust the order of the specified keywords;
  • In the proposals, a minimum of two and a maximum of five Scientific Keywords can be specified. The keywords must be selected keeping in mind their main purpose, that is to enable an optimised proposal-referee matching. Poorly specified keywords may lead to mismatches between the science case of the proposal and the reviewer's area of expertise;
  • As in the case of the DPR Experiment, a fraction of the reviewers will be selected among the non-experts, to reduce possible systematic effects.
  • As in the case of all ESO proposals, applicants must comply with the dual-anonymisation rules and guidelines;
  • Once the review process is completed, the PI/dPI of proposals reviewed by DPR will receive the overall ranking expressed in terms of quartiles of the grade distribution and the single, unedited and fully anonymous comments from all the reviewers assigned to their proposal/s;
  • The PI/dPI will be requested to provide a simple evaluation (useful/intermediate/not useful) for each single comment provided by the reviewers for their proposal/s.
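The keyword-based matching described above can be illustrated with a toy sketch. Note that this is purely hypothetical: ESO's actual matching algorithm is not public, and all function names, data structures and weights below are illustrative assumptions. The sketch only captures the two ideas stated in the rules: keywords are listed in descending importance order (so earlier keywords should weigh more), and obvious conflicts such as PI-ship are excluded automatically.

```python
# Hypothetical sketch of keyword-based proposal-reviewer matching.
# NOT ESO's actual algorithm; names, weights and the greedy strategy
# are illustrative assumptions.

def match_score(proposal_keywords, reviewer_keywords):
    """Weighted overlap: earlier (more important) proposal keywords count more."""
    score = 0.0
    for i, kw in enumerate(proposal_keywords):
        if kw in reviewer_keywords:
            # weight decays with position in the proposal's ordered list
            score += 1.0 / (i + 1)
    return score

def assign_reviewers(proposals, reviewers, n_per_proposal=10):
    """Greedy toy assignment: for each proposal, pick the best-scoring
    reviewers, skipping the obvious conflict of a PI reviewing their
    own proposal."""
    assignments = {}
    for pid, prop in proposals.items():
        candidates = [
            (match_score(prop["keywords"], r["keywords"]), rid)
            for rid, r in reviewers.items()
            if rid != prop["pi"]          # conflict: own proposal
        ]
        candidates.sort(reverse=True)
        assignments[pid] = [rid for _, rid in candidates[:n_per_proposal]]
    return assignments
```

A real matcher must also balance the load (each reviewer receives exactly 10 proposals), deliberately mix in some non-expert reviewers, and handle affiliation and co-I conflicts, none of which this per-proposal greedy sketch attempts.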

Proposal Evaluation Interface

The proposal navigation and review is performed using the Proposal Evaluation Interface (PEI), which you can access directly from your User Portal account. This is a very simple, self-explanatory, web-based interface, which does not require any installation.

PEI becomes active only if you have submitted a proposal which qualifies for DPR (or if you have been delegated the reviewer role), and only after the Observing Programmes Office has set the cycle to the appropriate state. The PEI will also allow you to digitally sign the non-disclosure agreement and to declare your possible conflicts (see below).

Instructions on the usage of PEI are provided in this presentation.

Rules and guidelines for reviewers

General

  • By submitting a proposal which qualifies for the DPR process (as PI/dPI or as delegated reviewer) you have accepted to review 10 proposals submitted by your peers during the same ESO cycle. As such, you are expected to deliver the evaluations and the comments by the deadline. Failing to do this will lead to the automatic rejection of the DPR proposal/s in which you are PI/dPI and/or for which you have been indicated as delegated reviewer.
  • During the whole review process you are expected to behave ethically. This covers the confidentiality matters (as detailed in the non-disclosure agreement that you have to sign before starting the review; see below) but also the part related to the feedback you will be providing for each of the proposals you will be assigned. Violations of this code of conduct will lead to the rejection of the proposal/s in which you are PI/dPI and/or for which you have been indicated as delegated reviewer.
  • If you are a PhD student and you wish to obtain advice from your supervisor, you can request permission to share the material you have been given access to. You will remain fully responsible for the fulfillment of the non-disclosure conditions. The same procedure applies to the case in which you are a supervisor and you wish to share some of the material with your student/s, for well justified didactic purposes.
  • As a reviewer, you are supposed to provide constructive feedback, using appropriate, factual and non-offensive language. In doing this, keep in mind that your comments will be passed unedited to the PIs. ESO will take possible cases of offensive or inappropriate language very seriously.
  • The algorithm that distributes the proposals takes into account the obvious conflicts (PI-ship, co-I-ship, affiliations, etc.). During the initial phase of the review you will have the possibility of declaring scientific conflicts (e.g. you are a direct competitor) and/or personal conflicts (e.g. despite the anonymisation, you unambiguously identified the proposal as being submitted by a close collaborator of yours). Note that there is a maximum number of conflicts you can flag, and therefore you should restrict yourself to those for which you feel you are not in a position to express an objective opinion. Conflict flagging must not be used to reduce the number of reviews you have to deliver by the given deadline.
  • You will be able to flag your conflicts directly in the Proposal Evaluation Interface (PEI), which will also allow you to digitally sign the non-disclosure agreement.
  • The proposal for which you flag a conflict (be it scientific or personal) will be automatically removed from your list in the PEI, and you will not have access to it. The flagged proposals will not be replaced by other proposals.
  • As of P106, all ESO PIs/dPIs are required to abide by the anonymisation rules and guidelines. In our experience, the vast majority of the users do conform to the anonymisation policy. Nevertheless, should you feel there is an anonymisation issue, the PEI will allow you to signal it to the Observing Programmes Office (you will also have the possibility of entering some explanatory text).
  • The evaluation of the proposals should proceed irrespective of the possible violation/s of the anonymisation rules. Once you have signalled the violation, you should ignore it and proceed with the evaluation. The Observing Programmes Office will take care of the necessary actions, including the possible disqualification of the proposal.
  • Note that ESO will consider proposals for disqualification only if they contain information which leads directly and unambiguously to the identification of the proposing team. Other violations (typically requiring some further and deliberate research on the reviewer's side) will be considered as minor, and the PI/dPI will only be warned and discouraged from introducing similar instances in the future.
  • During the review you should also keep in mind the spirit of the anonymisation: focus on the science and not on the scientists. Do not try to second-guess the team's identity. Also, be aware that in some cases it is practically impossible for certain teams to conceal their identities, even when they have thoroughly followed the anonymisation guidelines. Conversely, experience shows that, in many cases, such guesses are wrong.
  • As a reviewer, you must provide your feedback in a completely anonymous way. The phrasing must be neutral and must not disclose, directly or indirectly, your identity.

 

Important note on confidentiality

As a reviewer you will have access to information which is covered by intellectual property. None of that information can be disseminated, copied or plagiarised. At the end of the review, once the process is completed, the Observing Programmes Office will remove your proposal access rights. Should you have downloaded to your disk or printed the proposals which were assigned to you, you must delete/destroy them. ESO will take violations of the non-disclosure agreement very seriously. For the exact confidentiality terms, please carefully read the non-disclosure agreement before you sign it and start the review.

 

Proposal review and grading

  • For each proposal you will be providing a grade (between 1=outstanding and 5=unsuitable). As in the case of the classical review, the meaning of the scale is as follows:

1.0 – outstanding: breakthrough science

1.5 – excellent: definitely above average

2.0 – very good: no significant weaknesses

2.5 – good: minor deficiencies do not detract from a strong scientific case

3.0 – fair: good scientific case, but with definite weaknesses

3.5 – rather weak: limited science return prospects

4.0 – weak: little scientific value and/or questionable scientific strategy

4.5 – very weak: deficiencies outweigh strengths

5.0 – unsuitable

  • Proposals with grades larger than 3.0 will not be considered for scheduling;
  • The full grade scale (1 to 5) should be used so as to ensure that the resulting ranking of the proposals is as meaningful as possible. Grades can and should be specified with one decimal digit (e.g. 2.7); 
  • While evaluating a proposal, do not try to second-guess what grade you should give it in order to have it scheduled. Keep in mind that you are reviewing only a very small fraction of the proposals, and that the final grade will be computed as the aggregated value resulting from the evaluations of another 7-9 reviewers. Also, the same grade has very different implications at different telescopes, depending on their demand. Finally, other constraints (such as RA distribution, moon restrictions and atmospheric conditions) play a critical role in the final outcome of the scheduling process. You should focus only on the evaluation of the scientific cases assigned to you.
  • While reviewing the proposals you should keep in mind these aspects:
    • Does the proposal clearly indicate which important, outstanding question/s will be addressed?
    • Is there sufficient background/context for the non-expert (i.e., someone not specialized in this particular sub-field)?
    • Are previous results (either by proposers themselves or in the published literature) clearly presented?
    • Are the proposed observations and the Immediate Objectives pertinent to the background description?
    • Is the sample selection clearly described, or, if a single target, is its choice justified?
    • Are the instrument modes, and target location(s) specified clearly?
    • Is the signal-to-noise ratio specified in the proposal sufficient to reach the scientific goals?
    • Will the proposed observations add significantly to the knowledge of this particular field?
  • In general, the scientific merit should be assessed solely on the content of the proposal, according to the above criteria. Proposals may contain references to published papers. Consultation of those references should not, however, be required for a general understanding of the proposal.
  • Please note that ESO encourages reviewers to give full consideration to well-designed, high-risk/high-impact proposals even if there is no guarantee of a positive outcome or definite detection.
  • We have observed a systematic negative attitude towards proposals requesting fairly large amounts of time. Although the DPR review is by construction limited to short proposals, keep this aspect in mind when comparing short requests (e.g. a few hours) to the largest requests in your sample (e.g. 15 hours).
  • After the DPR and classical panel reviews are completed, proposals with high chances of being allocated time are submitted to a technical feasibility assessment by the instrument specialists. Therefore, as a rule, you should not worry about the technical aspects of the proposal. In case of major concerns, please contact the Observing Programmes Office using the ESO ticketing system, describing the potential issue/s and clearly indicating the proposal ID (for this, the last four characters are sufficient; e.g. 23QK).
  • Note that you will be asked to categorize your expertise level with respect to each single proposal you review (expert/intermediate/non expert) via the PEI. This information will not be shared with the PI/dPI, but will be used by ESO to monitor the performance of the proposal distribution tool and for other statistical purposes.
  • Should you not feel sufficiently prepared in the specific sub-field of the proposal, just declare your expertise level and proceed with the review, based on your competencies. Be aware of the fact that the applicants are explicitly told that: "Proposers should keep in mind the need for each OPC panel to cover a broad range of scientific areas. As a result, a particular proposal may not fall within the main area of specialisation of any of the panel members. Proposers should make sure that the context of their project and its relevance for general astrophysics, as well as any recent related results, are emphasised in a way that can be understood by their peers regardless of their expertise." (Call for Proposals, Section 1.2). A failure in achieving this goal may be considered as a weakness of the proposal.
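The grading and aggregation described above can be made concrete with a small illustrative sketch. To be clear about the assumptions: the 1.0-5.0 scale and the reporting of results as quartiles of the grade distribution come from this page, while the use of a plain mean to combine reviewer grades is an assumption of this sketch; the actual aggregation and any calibration ESO applies are not specified here.

```python
# Illustrative sketch (not ESO's actual algorithm) of how per-reviewer
# grades on the 1.0 (outstanding) .. 5.0 (unsuitable) scale might be
# aggregated, and how a proposal's quartile in the final ranking could
# be reported back to the PI. Lower grades are better on this scale.
from statistics import mean

def aggregate_grade(grades):
    """Combine the grades given by the reviewers of one proposal.
    A plain mean is an assumption; real schemes may calibrate or trim."""
    return mean(grades)

def quartile_ranks(final_grades):
    """Map proposal -> quartile (1 = best 25%) of the grade distribution,
    given a dict of {proposal_id: aggregated grade}."""
    ranked = sorted(final_grades, key=final_grades.get)  # best (lowest) first
    n = len(ranked)
    return {pid: min(4, 4 * i // n + 1) for i, pid in enumerate(ranked)}
```

For example, grades of 2.0, 2.5 and 3.0 from three reviewers would aggregate to 2.5 under this assumption, comfortably below the 3.0 cutoff mentioned above; the PI would then see only the quartile of their proposal, not the individual grades.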

 

Feedback to the PI/dPI

One of the most frequent complaints about the review process (if not the only one) concerns the alleged poor quality of the feedback the PIs receive from the reviewers. As the DPR is meant to mitigate this issue, it is particularly important that you, as a reviewer, provide comprehensive and constructive feedback to your peers. It greatly helps to keep in mind that you should provide feedback of the same quality you expect from your peers. Also, consider that the main purpose of the feedback is to describe the weaknesses of the proposal, possibly suggesting ways of improving both quality and clarity.

The following points should help you to complete the final task of the review process:

  • Summarize both the strengths and weaknesses of the proposal.
    • A summary of both the strengths and weaknesses can help PIs understand what aspects of the project are strong, and which aspects need to be improved.
    • Reviews should focus on the major strengths and major weaknesses. Avoid giving the impression that a minor weakness was the cause of a poor ranking. Many proposals do not have obvious weaknesses but are just less compelling than others; in such a case, acknowledge that the considered proposal is good but that there were others that were more compelling.  
    • Take care to ensure that the strengths and weaknesses do not contradict each other.
  • Be objective
    • Be as specific as possible when commenting on the proposal. Avoid generic statements that could apply to most proposals.
    • If necessary, provide references to support your critique.
    • All reviews should be impersonal, critiquing the proposal and not the proposal team. For example, do not write "The PI did not [...]", but instead write "The proposal did not [...]".
    • At the time of writing your feedback, you will not know whether the proposal will be allocated time (ESO will add a statement to this effect). Therefore, the reviews should be phrased in such a way that they are meaningful regardless of the final outcome. 
  • Be concise
    • It is not necessary to write a lengthy review: a review can be just a few sentences long, provided it is concise and informative. But please avoid writing only a single, generic sentence. In addition to being useless, this will almost certainly trigger a reaction on the PI's side, which ESO will have to follow up.
  • Be professional and constructive
    • Do not use inappropriate, offensive, sarcastic and/or insulting language, even if you think a proposal could be greatly improved.
    • Use complete sentences when writing your reviews. Try to use correct grammar, spelling, and punctuation.
    • Keep in mind that your feedback is going to be passed unedited to the applicants, and that you are solely responsible for the content of the comments and their integrity.
  • Other best practices
    • Do not summarize the proposal: the applicants know it very well. If you reckon it useful, start instead with a brief outline of the application.
    • Do not include statements about scheduling or technical feasibility. These will be addressed by the Observatory.
    • Do not include explicit references to other proposals that you are reviewing (e.g. programme IDs).
    • Do not ask questions: this is not an iterative process. If the question stems from a weakness, state the weakness explicitly.
  • Re-read your reviews and scientific rankings
    • Once you have completed your assessments, re-read your comments as if you were the recipient. If they do not sound useful and/or constructive, edit them.
    • Check that strengths and weaknesses are consistent with the scientific merit implied by the grade. Definitely avoid cases in which you do not list any weakness although you gave a poor grade.

 

Example of good feedback

Earth's centrality and immobility are fundamental tenets of Aristotelian cosmology. Their validity has been recently questioned by new experimental evidences, which challenge our understanding of the universe. [short proposal description]

The proposed observations are very well justified and have the potential to disprove the assumptions on which Aristotelian cosmology is based. The targets are well selected, and the observing strategy and data reduction are properly described and well thought out. [strengths]

The proposal, however, lacks a proper explanation about how the data will be interpreted. In particular, it is not clear how they will allow the discrimination between the two chief systems of the world. While on the one hand the confirmation and the characterisation of the phases of Venus will provide a strong argument for it to be orbiting the sun, it is not clear why this should then apply to Earth. The arguments put forward by the proposal are not convincing and not sufficiently quantitative. The proposal should have included a thorough discussion on how the proposed observations will lead to the rejection of the Ptolemaic model, and at which confidence level. [weaknesses and suggestions for improvement]

 

Example of bad feedback

Poorly written proposal, not making a sufficiently compelling case.

 

Unconscious bias

Unconscious bias is an omnipresent ingredient in all processes involving human judgement, and telescope time allocation is no exception. See Patat (2016) for an analysis of the ESO case, and refer to this page and to these references for more information on this subject.

We all have innate biases and/or mental schemas against gender/nationality/ethnicity/seniority and/or teams/techniques/instruments/affiliations. In nature, these biases are very efficient in protecting us from potentially dangerous situations, but they also constitute what can easily turn into an inaccurate way of processing information.

Awareness is the only key to overcome this problem.

Proposal anonymisation has greatly contributed to addressing some of these biases. However, in your reviewer's role, you should not underestimate these systematic effects. When you try to identify the best science, you need to remain fully conscious of other potential biases, mental schemas or factors that may influence your objectivity.

Like all similar organisations, ESO is committed to awarding telescope time purely on the basis of scientific merit. Therefore, ESO strives to raise awareness about the role that unconscious bias can play in the review process. Reviewers should also recognize that English is a second language for many, if not most, ESO users. Although applicants should make every possible effort to improve the quality and clarity of the language they use in their time requests, the reviewers should focus on the science case, and not be distracted and/or influenced by grammatical errors or other language imperfections.

Frequently asked questions on Distributed Peer Review

Why is ESO introducing DPR?

ESO has introduced DPR following the recommendations of the Time Allocation Working Group, after consulting with the Scientific Technical Committee, the Users Committee and the OPC, and after running a pilot study (the DPR Experiment). Council has approved the usage of DPR in its telescope time allocation process at the discretion of the Director General (see the ESO Optical/Infrared Telescopes Science Operations Policies document). DPR offers a number of advantages over the classical panel schema, whose main (but not only) limitation is the necessarily small number of reviewers. In addition to causing a very significant overload on the referees (with measurable consequences on the level of feedback they can provide), it prevents a flexible and more effective proposal-reviewer matching. In addition, ESO plans to introduce a Fast Track Channel which, by construction, will require a DPR process.

How and when will I know whether my proposal/s will be reviewed by the panels or via DPR?

In P111 the review process will be assigned automatically at the time of submission, based on the criteria described above. You will be informed the moment you press the "submit" button. If your proposal qualifies for DPR, you will be prompted to accept the conditions it implies.

Which tool will I use to review the proposals assigned to me?

You will be using the Proposal Evaluation Interface (PEI), which you will access directly from your User Portal account. This is a very simple, self-explanatory web-based tool which does not require any installation. It is a simplified version of the one used by the panels. Instructions for its usage can be found in this presentation. The proposals assigned to you will be visible only when the Observing Programmes Office will open the review. See the top of this page for the relevant information on the review timeline. Once the review is completed, the access to the proposals will be disabled.

How was the DPR time threshold set?

The time threshold was initially set to roughly split the DPR vs. panel distribution in a 50/50 fraction, based on the observed time request distributions of the recent semesters. The rationale is that, with this choice, the load on the panels will be reduced by the same amount, bringing down the number of proposals they have to review to a more reasonable value. In turn, this will free resources for the review and discussion of larger time requests, including the Large Programmes. In P111 the time threshold is set and announced a priori in the Call for Proposals. In the future, ESO will likely postpone this to after the proposal deadline, when the real time distribution is known. This will allow for a more optimised splitting of the DPR vs. panel workload. For reference, the actual DPR/panel distribution in P110 was 50.1/49.9% (436 and 435 proposals, respectively).

When is the deadline for delivering my reviews?

The deadline changes from period to period and is provided at the top of this page.

How much time do I have to complete my reviews?

You typically have between 4 and 5 weeks to complete your reviews. The exact extent of the proposal review phase is provided at the top of this page.

What happens if I do not deliver the reviews by the deadline?

The proposal/s which you have submitted via the DPR channel as PI (or for which you were delegated as reviewer by the PI) will be automatically rejected.

Why are proposals containing at least one Target of Opportunity run assigned to the panel review?

This is because the Science Policy foresees that the list of recommended Target of Opportunity runs is approved by the Observing Programmes Committee.

I am a PhD student, I do not feel comfortable with acting as a reviewer, but I indeed wish to submit a proposal as PI. What can I do?

In this case you should retain the PI role and delegate the reviewer's role. You can do this by indicating the delegated person from the list of co-Is you have specified in your proposal. This can be done at any time during the proposal preparation in the p1 interface. When you submit the proposal, you will be asked to confirm and, by doing so, you will take full responsibility for the selection of the delegated reviewer and their commitment to complete the task by the given deadline. Please note that it will not be possible to change the delegation after the proposal submission deadline has expired.

I am a PhD student, and I would like to get some advice from my supervisor. Would this break the non-disclosure agreement I signed?

As indicated in the non-disclosure agreement, you can request permission to share the confidential information. Once you have obtained a written statement from ESO, you can share the material with your supervisor. In this case you take responsibility for ensuring that your supervisor abides by the terms of the non-disclosure agreement you have signed.

Why is ESO asking me to provide feedback on the quality of the comments I will receive from the DPR process?

One of the reasons that triggered the introduction of DPR at ESO and elsewhere is the significant level of dissatisfaction with the feedback provided to the users. For this reason, we are asking you to evaluate its quality in terms of usefulness for improving your proposal/s. It is very important for us that you keep this in mind when you express your evaluation. This will allow us to monitor a number of aspects, including the performance of individual reviewers, the overall satisfaction level, and possible correlations between the usefulness of the feedback and the declared (or estimated) properties of the reviewer.

Why is ESO asking me to provide a self-evaluation of my expertise level for the DPR proposals I am reviewing? How is this information going to be used and by whom?

This information is collected purely for monitoring and statistical purposes. It will allow a robust comparison between the self-perceived and the estimated levels of expertise with respect to the given science case, in order to validate the performance of the distribution algorithm. It will also provide another indicator for the study of correlations between the quality of feedback and the reviewers' properties. The data will be accessible only to ESO and will not be shared with the PIs.

Who will review my proposal in case it qualifies for DPR?

Your proposal will be assigned to 10 peers, selected from the full list of PIs/dPIs (and/or delegated reviewers) for the given cycle based on their self-declared expertise (extracted from their User Portal profiles) and the scientific keywords you have specified in your proposal.

In case my proposal is assigned to DPR, will I be reviewing exactly 10 proposals?

Not necessarily. Ten is the maximum number of proposals you will be assigned. The actual number may be smaller, depending on a number of factors, including the conflicts you declare.

Will my proposal be reviewed by students? Does ESO plan to weight the reviews based on the seniority level?

Your proposal will be reviewed by 10 peers. The composition of this set of reviewers will reflect the underlying population of PIs/dPIs (and/or delegated reviewers). As, on average, ~10% of ESO proposals are submitted by student PIs, it is expected that the set of reviewers for each proposal will include, on average, one student. At the time of writing it is not clear how many students will prefer to delegate the reviewer's role to a more senior member of the proposing team; if anything, the above estimate is probably an upper limit. And, no, ESO does not plan to weight the evaluations provided by the reviewers based on any of their characteristics (age, gender, nationality, professional seniority, ...).

I understand my proposal will be reviewed by N peers, who will all have to review N proposals each. Will these peers review the same set of N proposals?

No. The proposals will be assigned based on an optimal distribution of matching scores, but in such a way that the intersection between the sets of proposals assigned to any two reviewers is as small as possible. This minimizes the impact of possible systematic effects.
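To illustrate the idea of small pairwise intersections (this toy scheme is not ESO's actual algorithm, which also optimises matching scores), one can assign proposals cyclically using offsets from a perfect difference set, so that any two reviewers share at most one proposal. The numbers below (13 proposals, 4 reviews each) are chosen purely for the example.

```python
# Toy sketch of low-overlap reviewer assignment; not ESO's actual algorithm.
# With P proposals and P reviewers (each PI reviews peers, never themselves),
# reviewer i is assigned proposals (i + d) mod P for offsets d in a perfect
# difference set, so any two reviewers' workloads overlap in <= 1 proposal.
from itertools import combinations

P = 13                    # proposals == reviewers (illustrative size)
OFFSETS = (1, 2, 4, 10)   # pairwise differences are all distinct mod 13

def assignments(p=P, offsets=OFFSETS):
    """Map each reviewer index to the set of proposal indices they review."""
    return {i: {(i + d) % p for d in offsets} for i in range(p)}

a = assignments()
# Every proposal receives exactly len(OFFSETS) reviews, never from its own PI.
counts = [sum(j in s for s in a.values()) for j in range(P)]
assert all(c == len(OFFSETS) for c in counts)
assert all(i not in a[i] for i in a)
# Pairwise overlap between any two reviewers' sets is at most one proposal.
assert max(len(a[i] & a[j]) for i, j in combinations(a, 2)) == 1
```

Keeping these intersections small means that no pair of reviewers sees (and could jointly bias) more than a tiny portion of each other's assigned proposals, which is the systematic effect the actual distribution algorithm guards against.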

What are ESO's plans for DPR in future cycles?

The plan is to progressively increase the fraction of proposals assigned to the DPR process. The fraction and the timing will be decided based on the experience gained and the feedback collected during the first semesters. Also, ESO is actively exploring the possibility of using a machine learning approach for computing the proposal-referee matching scores to be used in the proposal distribution, along the lines of the DeepThought approach used in the DPR Experiment.

How does ESO plan to monitor the performance of DPR?

ESO will monitor the level of satisfaction with the feedback provided by DPR, using the data provided by the users. It is known, from the Users Committee reports, that the classical panel paradigm deployed so far at ESO produces a ~30% level of dissatisfaction. Changes in this fraction will provide an immediate measure of the performance of the new paradigm, at least in this area. A number of other studies will also be possible, including, for instance, assessing the performance of the distribution algorithm (by correlating the level of feedback usefulness with the proposal-referee matching score or the self-declared level of expertise). Other systematic trends (e.g. gender/country/seniority/scientific area/etc.) will also be monitored. The plan is to publish a first report after a few DPR cycles have been completed.

I am concerned about the fact that my proposal will be seen by a random person. I fear this will lead to possible cases of uncontrolled plagiarism. How will ESO ensure this will not happen?

In the classical scheme, a panel member has access to all the proposals assigned to their panel (typically 80). In the DPR case this number is reduced to 8-10. Therefore, under the reasonable hypothesis that the fraction of "malevolent" scientists is the same in both review bodies, one can expect that, on average, the two processes are similarly prone to confidentiality issues. In this respect, ESO plans to proceed in the same way as for the panels. As in that case, it is impossible to prevent information leakage (or, in the worst case, full plagiarism). However, all cases of demonstrated violation of the non-disclosure agreement will be taken very seriously. To reassure you, we have been made aware of very few such cases in the whole history of ESO proposal submission, and we do not expect this to change with the deployment of DPR.

I am at a loss and I do not know what to do. Could you please help me?

You can submit your request via the ESO ticketing system. If you are already logged in to the User Portal, you can access it directly from here. Otherwise you can access it from your User Portal account (under "Ask for Help/Contact us"). Please include all the relevant details in your ticket, so that we can provide you with a prompt and comprehensive reply.

 


 

Last update: Fri Apr 29 12:24:14 CEST 2022