Guidelines for Reviewers

Review criteria

Each proposal contains a cover sheet, a Scientific Justification, and a Technical Justification. Reviewers need to read each of these sections. Note in particular that the Technical Justification often contains a detailed justification of the requested sensitivity, angular resolution, and correlator setup that will be useful in evaluating the proposal.

Reviewers should assess the scientific merit of the proposals to the best of their ability using the following criteria:

The overall scientific merit of the proposed investigation and its potential contribution to the advancement of scientific knowledge.

  • Does the proposal clearly indicate which important, outstanding questions will be addressed?
  • Will the proposed observations have a high scientific impact on this particular field and address the specific science goals of the proposal? ALMA encourages reviewers to give full consideration to well-designed high-risk/high-impact proposals even if there is no guarantee of a positive outcome or definite detection.
  • Does the proposal clearly describe how the data will be analyzed in order to achieve the science goals?

The suitability of the observations to achieve the scientific goals.

  • Is the choice of target (or targets) clearly described and well justified?
  • Are the requested signal-to-noise ratio, angular resolution, largest angular scale, and spectral setup sufficient to achieve the science goals and well justified?
  • Does the proposal justify why new observations are needed to achieve the science goals?
  • For Joint Proposals (see the Proposer’s Guide), does the proposal clearly describe why observations from multiple observatories are required to achieve the science goals?

In general, the scientific merit should be assessed solely on the content of the proposal, according to the above criteria. Proposals may contain references to published papers (including preprints) as per standard practice in the scientific literature. Consultation of those references should not, however, be required for a general understanding of the proposal.

Technical feasibility.  

The ALMA Observing Tool (OT) validates most technical aspects of a proposal; for example, the OT verifies that the requested angular resolution can be achieved, verifies that the correlator setup is feasible, and provides an accurate estimate of the integration time needed to reach the requested sensitivity. Reviewers should assume that the OT technical validation of the proposal is correct. Reviewers should not downgrade a proposal merely because of the amount of time it requests. However, reviewers can and should consider whether the signal-to-noise ratio, angular resolution, largest angular scale, and spectral setup requested by the PI are sufficient to achieve the scientific goals of the proposal and are well justified.

Reviewers can contact the Proposal Handling Team (PHT) through the ALMA helpdesk by opening a ticket with the department called "Proposal Review Support". Reviewers in the distributed peer review process may also note any technical concerns about a proposal in their comments to the JAO in the Reviewer Tool, and panel members of the ALMA Proposal Review Committee (APRC) may do so by adding a comment to the PHT in the APRC Meeting Tool during the panel meeting. ALMA will evaluate these technical concerns if the proposal is accepted.

Scheduling feasibility.

Reviewers should not consider the scheduling feasibility in assigning their rankings. ALMA will assess the scheduling feasibility when building the observing queue and forward this information to the PI when needed.

Resubmissions.

Reviewers should evaluate a resubmission of a previously accepted proposal as they would any other proposal, according to the review criteria. If the proposal is accepted, any science goals that have already been observed will be descoped.

Joint proposals.

Joint proposals should be evaluated like any other proposal, following the review criteria stated above.

Additional review criteria for Large Programs

For Large Programs, in addition to the review criteria above, reviewers should also consider the following criteria:

  • Does the Large Program address a strategic scientific issue and have the potential to lead to a major advance or breakthrough in the field that cannot be achieved by combining regular proposals?
  • Are the data products that will be delivered by the proposal team appropriate given the scope of the proposal and will the products be of value to the community?
  • Is the publication plan appropriate for the scope of the proposal?
  • Is the organization of the team and available computing resources sufficient to complete the project in a timely fashion?

Team expertise

The ALMA Proposal Review Committee (APRC) will evaluate the team expertise statements for Large Programs to assess if the proposal team is prepared to complete the project in a timely fashion. The team expertise statements will be evaluated only after the APRC has completed the scientific rankings of the Large Programs. The evaluation of the team expertise statements will not be used to modify the scientific rankings. Any concerns that the APRC has about the team expertise of a Large Program will be communicated to the ALMA Director, who will make the final decision on whether to accept the proposal.

Technical and scheduling feasibility.  

ALMA will assess the technical feasibility and scheduling feasibility of the Large Programs and report the results to the APRC.

 

Conflict criteria

The goal of the review assignments is to provide informed, unbiased assessments of the proposals. In general, a reviewer has a major conflict of interest when their personal or work interests would benefit if the proposal under review is accepted or rejected.

To allow better identification of conflicts of interest, a reviewer has the option to submit a list of investigators with whom they have a major conflict of interest. If this list is provided, reviewers will not be assigned any proposal on which a person in their list is the PI, a co-PI, or a co-I. Reviewers can set their conflicts-of-interest list through their user preferences on the ALMA Science Portal, where they can search for members of the ALMA user database. Reviewers only need to identify individuals who are registered ALMA users, since everyone on a submitted proposal must be registered. The list of conflicts of interest should include the following:

  • Close collaborators, defined as those with whom the reviewer has had a substantial collaboration on three or more papers within the past three years or has an active, substantial collaboration on a current project. Membership in a large project team on its own does not constitute a conflict of interest.
  • Students and postdocs supervised by the reviewer within the past three years
  • A reviewer’s supervisor (for student and postdoc reviewers)
  • Close personal ties (e.g., a family member or partner) who are ALMA users
  • Any other reason for which a reviewer believes a major conflict of interest exists

Before assigning proposals, the PHT will identify major conflicts of interest based on the following criteria:

  • The PI, reviewer, or mentor of the submitted proposal is a PI or co-I of the proposal to be reviewed.
  • The PI, one of the co-PIs, or one of the co-Is of the proposal to be reviewed is in the conflicts-of-interest list provided by the reviewer or mentor of the submitted proposal.
  • If the reviewer or mentor does not provide such a list of investigators for the conflicts-of-interest check, a conflict of interest will be identified when the reviewer or mentor of the submitted proposal and the PI of the proposal to be reviewed have appeared together in a PI/co-PI and co-I combination on three or more proposals in the past three cycles, including DDT cycles and supplemental calls.

When reviewers receive their proposal assignments, they may identify additional conflicts of interest that were not identified by the above checks. Potential conflicts of interest at this stage include:

  • The reviewer is proposing to observe the same object(s) with similar science objectives.
  • The reviewer provided significant advice to the proposal team on the proposal even though they are not listed as an investigator.
  • Other reasons that the reviewer believes there is a strong conflict of interest.

Reviewers should not declare a conflict of interest solely because they suspect they can identify the proposal team, or because they lack expertise in a given proposal's scientific category and keyword.

Reviewers should inform the PHT of any major conflict of interest in their assignments by rejecting the proposal assignment and indicating why they believe a major conflict of interest exists. This is done through the Reviewer Tool (for distributed peer review) or the Assessor Tool (for the APRC). The PHT will evaluate the reported conflict(s). In distributed peer review, if the conflict of interest is approved, a new proposal will be assigned to the reviewer.

Student reviewers participating in distributed peer review must declare any additional conflict that applies either to themselves or to their mentor. Mentors have read-only access to their mentees' proposal sets, so any conflicts of interest the mentor has with the assigned proposals must be declared by the mentee. Student reviewers should work with their mentor to ensure that conflicts of interest are identified accurately.

 

Writing reviews to the PIs

Clear and thoughtful reviews from reviewers can help PIs improve their proposed project and write stronger proposals in the future. Reviews must be written in English.

Reviewers in the distributed peer review process shall provide a scientific review and a rank for each proposal. The reviews will be sent anonymously to PIs without any editing by the PHT, along with the rank each reviewer provided. If reviewers participate in Stage 2, then the revised reviews and ranks will be sent to the PIs; if not, the Stage 1 reviews and ranks will be sent. The typical length of an individual review is approximately 700 characters, or about 6 sentences. The Reviewer Tool will require reviews to be at least 200 characters.

In the panel review process, Primary Assessors shall prepare consensus reports based on the discussion of the proposal at the panel meeting. Consensus reports will be sent to the PIs.

Here are guidelines to assist reviewers in writing useful reviews.

 Guidelines

  1. Summarize both the strengths and weaknesses of the proposal.
  • A summary of both the strengths and weaknesses can help PIs understand what aspects of the project are strong, and which aspects need to be improved in any future proposal.
  • Reviews should focus on the major strengths and major weaknesses. Whenever possible, state how the proposal could have been better. However, avoid giving the impression that a minor weakness was the cause of a poor ranking. Many proposals do not have obvious weaknesses but are just less compelling than others; in such a case, acknowledge that the considered proposal is good but that there were others that were more compelling.
  • Take care to ensure that the strengths and weaknesses do not contradict each other and reflect the rank given to the proposal.
  2. Be objective.
  • Be as specific as possible when commenting on the proposal. Avoid generic statements that could apply to most proposals.
  • If necessary, provide references to support your statements.
  • All reviews should be impersonal, commenting on the proposal and not on the proposal team. For example, do not write "The PI did not adequately describe recent observations of this source"; instead write "The proposal did not adequately describe recent observations of this source."
  • Reviewers cannot be sure at the time of writing reviews whether the proposed observations will be scheduled for execution. The reviews should be phrased in such a way that they are sensible and meaningful regardless of the final outcome.
  3. Be concise.
  • It is not necessary to write a lengthy review. A meaningful review can be only a few sentences in length if it is concise and informative. However, avoid writing only a single sentence that does not address any specific strengths and weaknesses.
  4. Be professional and constructive.
  • It is never appropriate to write inflammatory or inappropriate comments, even if you think a proposal could be greatly improved.
  • Do not use sarcasm or any insulting language.
  5. Be aware of unconscious bias.
  • We all have biases, and we need to make special efforts to review the proposals objectively. A discussion of unconscious bias is provided in the Unconscious bias section below.
  6. Be anonymous.
  • Do not identify yourself in the reviews to the PIs. In the case of distributed peer review, these reviews will not be examined or edited by the PHT. They will be sent verbatim to the PIs, and they will also be shared with other reviewers during Stage 2.
  • Do not spend time trying to guess who is on the proposal team behind the proposal you are reviewing. Your review should be based solely on the scientific merit of the proposal. The identity of the proposal team is not relevant to your review. For dual-anonymous guidelines, see the Dual-anonymous section below.
  7. Other best practices.
  • A review is not a summary of the proposal. While the reviewer can provide a brief overview, the bulk of the content needs to discuss the strengths and weaknesses of the proposal.
  • Use complete sentences when writing your reviews. We understand that many reviewers are not native English speakers, but please try to use correct grammar, spelling, and punctuation.
  • Do not include statements about scheduling feasibility. ALMA will assess the scheduling feasibility when building the observing queue and forward this information to the PI when needed.
  • Do not include explicit references to other proposals that you are reviewing, such as project codes.
  • Do not ask questions. A question is usually an indirect way to indicate there is a weakness in the proposal, but the weakness should be stated explicitly. For example, instead of "Why was a sample size of 10 chosen?" write "The proposal did not provide a strong justification for why 10 sources need to be observed."
  8. Re-read your reviews and scientific rankings.
  • Once you have completed your assessments, re-read your reviews and ask how you would react if you received them. If you feel that the reviews would upset you, revise them.
  • Check to see if the strengths and weaknesses in the reviews are consistent with the scientific rankings. If not, consider revising the reviews or the rankings.

Here are example reviews that conform with the above guidelines.

 

Example review #1 (image)

Example review #2 (image)

Example review #3 (image)

Unconscious bias  

Unconscious bias in the review process occurs when a reviewer holds a bias (of which they are often unaware) in favor of, or against, a proposal for reasons other than scientific merit. Because these biases are a result of our own culture and experiences, all reviewers are influenced by unconscious bias. Examples include culture, age, prestige, language, gender, and institutional biases.

ALMA has found systematic trends in the proposal rankings that may indicate bias exists in the proposal review process (Carpenter et al. 2022). Similar studies have been published by Reid (2014) in an analysis of Hubble Space Telescope (HST) proposals and by Patat (2016) for ESO proposals.

ALMA is committed to awarding telescope time purely on the basis of scientific merit. As part of this commitment, ALMA implemented dual-anonymous review in Cycle 8 and would like to make reviewers aware of the role that unconscious bias can play in the review process[1]. Reviewers should also recognize that English is a second language for many, if not most, PIs. ALMA reminds reviewers to focus their reviews on the scientific merit of the proposals.

Dual-anonymous

Please refer to the dual-anonymous guidelines for reviewers for guidance on how to approach proposal review under the dual-anonymous policy, and for a description of the procedure to follow if you find a problem with anonymization.

 

Guidelines for Mentors

Reviewers participating in the distributed peer review process who do not have a PhD are required to have a mentor who will assist with the proposal review. Mentors are specified in the OT when the proposal is prepared. In general, the role of the mentor is to provide whatever guidance the reviewer needs during the review process. Mentors have read-only access to the proposals and reviews of their mentees through the Reviewer Tool.

Specific roles of a mentor include:

  1. Work with the reviewer to declare any conflicts of interest on the assigned proposals. The conflicts of interest criteria apply to both the reviewer and the mentor.
  2. Provide advice to the reviewer as needed on the scientific assessment of the proposals.
  3. Provide guidance to the reviewer on providing constructive feedback to the PIs.
  4. Review the comments to the PI before they are submitted.

 

Code of conduct and confidentiality

All participants in the review process are expected to behave in an ethical manner.

  • Reviewers will judge proposals solely on their scientific merit.
  • Reviewers will be mindful of bias in all contexts.
  • Reviewers will declare all major conflicts of interest.
  • The proposal reviews will be constructive and avoid any inappropriate language.

All proposal materials related to the review process are strictly confidential.

  • The assigned proposals may not be distributed or used in any manner not directly related to the review process.
  • Any data, intellectual property, and non-public information shown in the proposals may be used only for the purpose of carrying out the requested proposal review.
  • The assigned proposals and the reviews may not be discussed with anyone other than the Proposal Handling Team, the APRC, or the assigned mentor when applicable.
  • All electronic and paper copies of the proposal materials must be destroyed as soon as a reviewer completes the proposal review process.

 


[1] For more information on unconscious bias, reviewers can read the Unconscious bias training module from the Canada Research Chairs and take online tests at Project Implicit.

 

