Proposal Evaluation Process

The LNLS proposal evaluation process is based on a distributed double-anonymous (DDA) system: a peer-review scheme in which reviewers are unaware of the proponents and their institutes – and vice versa – and in which all proponents and principal investigators (PIs) of the proposals are potential reviewers.

The evaluation process also includes proposals submitted by CNPEM researchers. The allocation of beam time on beamlines and in laboratories is based on the following scientific evaluation process:


Phase 1 (Distributed Double-Anonymous): The proposals, previously anonymized on the SAU Online platform, are distributed and peer-reviewed based on the research areas indicated by the proposer in the submission form (area committee). The evaluation is based on the scientific merit of the proposal, its contribution to the advancement of the scientific area, and the originality of the experiment and of the hypothesis to be tested, following the evaluation criteria described in the next section.

Phase 2 (Analysis of merit grades): The Scientific Evaluation Committee of Proposals (CACIP) analyzes and classifies the competing proposals based on the grades received during Phase 1 and prepares the feedback texts for the proponents. Particular attention is paid to cases of large discrepancies between the evaluations received. The CACIP defines the final grade of each proposal following the same criteria as in Phase 1.

Phase 3 (Classification of proposals): Internal evaluation by the Beamline Allocation Committee, formed by the Directors of the LNLS, which defines the priority order in which the proposals will be scheduled into the available user beamtime.

Phase 4 (Technical Feasibility): The beamline coordinators certify the feasibility of proposals to be performed at Sirius and determine whether a pre-contact with the proponent's team is needed to evaluate any special requirements of the experiments. As with the beamlines, the support laboratories also certify proposals that request the use of these open facilities. The requested experiment may need these laboratories, as they are equipped with instruments for advanced sample preparation. If the user does not select appropriate support laboratories during proposal submission, the beamline coordinators may suggest additional laboratories to include in the proposal, which are then analyzed by the coordinators of those support laboratories.

Phase 5 (Safety and Security): Internal evaluation by the Security team of the best-ranked proposals for compliance with safety and security requirements. In case of a problem, the proponent receives a message via the SAU Online website and must promptly provide any additional information requested.

Phase 6 (Evaluation Result): Proponents receive a message from SAU Online about the outcome of the proposal evaluation, along with the CACIP review (when applicable).

Phase 7 (Communication and Instructions): Proponents receive a message from SAU Online with the scheduled beam-time period and instructions to prepare for their stay at LNLS. If the user has any issues with the scheduling of their experiments, it is advisable to contact the SAU and the beamline managers to reschedule.


Reviewers are instructed to provide a detailed evaluation of the technical and scientific quality of the proposal, paying attention to the following items, which will guide the report and the final grade:

  • Are the scientific motivation, goals/objectives, contribution to the advancement of the scientific area, originality of the experiment and hypothesis of the experiment clearly described? 
  • Are the samples to be studied clearly described and previously characterized? 
  • Are there any preliminary studies that justify measurements with synchrotron light? 
  • Do the expected results using synchrotron light seem to have scientific or technical relevance? Is the experimental technique adequate? 

Grades from 1 (lowest) to 5 (highest) will be assigned according to the guidelines below, and the overall grade will be the average of the four criterion scores.


Score and Criteria for evaluation of proposals:


Criterion 1 – Scientific Motivation, Goals, and Originality 

5 – Extraordinary: Exceptionally clear motivation, well-defined goals, and highly original hypothesis. 

4 – Excellent: Well-presented motivation, clear goals, and notable originality in the hypothesis. 

3 – Good: Adequate motivation, defined goals, and reasonable hypothesis originality. 

2 – Regular: Somewhat unclear motivation, imprecise goals, and limited hypothesis originality. 

1 – Poor: Unclear/absent motivation, poorly defined goals, and minimal hypothesis originality. 


Criterion 2 – Sample Description and Characterization 

5 – Extraordinary: Extensively described samples with thorough previous characterization. 

4 – Excellent: Well-described samples with solid previous characterization. 

3 – Good: Adequate sample description with some previous characterization. 

2 – Regular: Limited sample details and characterization. 

1 – Poor: Unclear/insufficient sample information and characterization. 


Criterion 3 – Synchrotron Justification 

5 – Extraordinary: Comprehensive and compelling justification for the use of synchrotron radiation, including very good preliminary studies and extraordinary alignment with the purpose of the applied facility. 

4 – Excellent: Convincing justification for the use of synchrotron radiation, with some preliminary studies and a good match with the purpose of the applied facility. 

3 – Good: Reasonable justification for the use of synchrotron radiation and a reasonable match with the purpose of the applied facility. 

2 – Regular: Limited justification for the use of synchrotron radiation and poor alignment with the purpose of the applied facility. 

1 – Poor: No justification for the use of synchrotron radiation or for the choice of facility.


Criterion 4 – Expected Results and Experimental Technique 

5 – Extraordinary: Exceptionally relevant results, meticulously chosen technique. 

4 – Excellent: Highly relevant results, well-suited technique. 

3 – Good: Scientifically and technically important results, appropriate technique. 

2 – Regular: Limited relevance in results, questionable technique suitability. 

1 – Poor: Insignificant results, inadequate or poorly chosen technique. 


Weighted score ranking  

A weighting value is applied to a reviewer's score based on their knowledge and expertise in the area of the proposal being evaluated; it communicates the reviewer's confidence in the grade assigned during the review process. Even if the reviewer is not an expert in the scientific area of the proposal, this weighting is applied as a correction factor, and the final grade is adjusted by the CACIP committee accordingly.

A drop-down menu next to the grade field on the proposal review screen offers the values 0.5, 1.0, and 1.5; the reviewer assigns the weighting value at the time of the evaluation. The CACIP then jointly evaluates the grade assigned to the proposal and the weighting value to prepare the final grade and the comments to users. The table below summarizes how the weight reflects the familiarity between the reviewer's area of expertise and the research area of the proposal being evaluated.

Weight  Knowledge in the research area
0.5     I do not have enough knowledge in the area.
1.0     I have enough knowledge in the area to produce a qualified review.
1.5     I am a specialist working in the area and can produce a well-qualified review.
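The aggregation described above can be sketched as a confidence-weighted mean. This is a minimal illustration only: the document does not specify the exact formula, and the final adjustment is made by the CACIP committee.

```python
def weighted_final_grade(reviews):
    """Compute a confidence-weighted mean of reviewer grades.

    `reviews` is a list of (grade, weight) pairs, where `grade` is a
    reviewer's overall score (1-5, itself the average of the four
    criteria) and `weight` is the self-declared expertise value
    (0.5, 1.0, or 1.5). Illustrative sketch only; not the official
    CACIP formula.
    """
    total_weight = sum(w for _, w in reviews)
    return sum(g * w for g, w in reviews) / total_weight

# Example: three reviewers with different expertise levels
reviews = [(4.0, 1.5), (3.0, 1.0), (5.0, 0.5)]
print(round(weighted_final_grade(reviews), 2))  # 3.83
```

In this sketch a specialist's grade (weight 1.5) pulls the final value three times as strongly as that of a reviewer outside the area (weight 0.5).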


The Committee for the Scientific Evaluation of Proposals (CACIP), formed by renowned researchers external to the CNPEM and experienced in the use of synchrotron light, will analyze the scientific merit of the research proposals based on the opinions of anonymous reviewers. The final grade of each proposal will be based exclusively on the distributions of the reviewers’ evaluations and the information provided in the proposals. 

The composition of the CACIP, with the members' affiliations, their areas of activity in the committee, and their mandates, is shown in the table below. If any member is unavailable for any reason, LNLS will appoint a substitute.

Member  Affiliation  Country  CACIP Area  Mandate 
Watson Loh  UNICAMP  Brazil  Chemistry  2022-24 
Marcelo Ceolin  INIFTA  Argentina  Chemistry  2022-24 
Maria Luiza Rocco  UFRJ  Brazil  Chemistry  2022-24 
Regina Cely Cardoso  UFRJ  Brazil  Sustainability and Earth Sciences/Life Sciences  2022-24 
Theogenes Senna de Oliveira  UFV  Brazil  Sustainability and Earth Sciences/Life Sciences  2022-24 
Wania Duleba  USP  Brazil  Sustainability and Earth Sciences/Life Sciences  2022-24 
Antonio Gomes  UFC  Brazil  Physics/Engineering  2022-24 
Altair Soria Ferreira  UFRGS  Brazil  Physics/Engineering  2022-24 
Jonder Morais  UFRGS  Brazil  Physics/Engineering  2022-24 
Paulo de Tarso UFC Brazil Physics/Engineering  2023-25


Criteria evaluated by LNLS (Step 6 of the proposal flowchart): 

  • Is the scientific proposal suited to the beamline requested by the proposer? 
  • Are the proposed experiments aligned with the technical capacity of the beamline equipment available in the experimental stations? 
  • Does the scientific proposal mention the use of support laboratories? Do the support laboratories have the equipment and techniques requested by the user? If the LNLS team judges that the proposal requires the use of a support laboratory, it may add that laboratory to the proposal later. 
  • Does the team have well-trained personnel to carry out the proposal? Are there enough participants for the beam time to be used effectively? Does the team have participants with experience in the technique? Typically, for two- or three-day experiments, the team should contain at least two or three members; a synchrotron light experiment is rarely a solitary endeavor. 

Attention: Given the importance of previous scientific results for the general process of evaluating proposals, it is recommended that Users check and update their publications on the SAU Online portal. 


Every proposal must be written in the third person, so as not to identify the candidates.

Here are some tips to help hide the candidate’s identity and ensure a fairer proposal evaluation process: 

  • Do not include author names or affiliations anywhere in the proposal text fields; 
  • When citing references in the proposal, use neutral and third-person words. This applies especially to self-reference. For example, replace phrases like “as we showed in our previous work (José et al. 2020)” with “as José et al. (2020) showed…”;  
  • Do not reference previous projects using language that reveals the identity of the candidate(s); 
  • Use references to published works, including works citable by DOI, without including information that may reveal the identity of the candidate(s); 
  • Do not include acknowledgments or the source of any funding in the experimental section of the submission; 
  • The experience and history of the team are provided as supplementary information and should not be made available in the experimental section of the submission; 
  • All proposals should contain a Graphic Summary of the Proposal that explains schematically the hypothesis that will be tested in the experiment, with the main length scales and chemical and crystallographic elements involved in the experiment. Avoid as much as possible the identification of affiliations and other information that can identify the authors of the proposal. 

Please contact the beamline teams and the heads of the scientific divisions of LNLS to discuss your ideas (Beamlines). For questions related to the guidelines for the proposal submission process, please contact the User Support Services (SAU – sau@cnpem.br).