Review and submit for first review
Validation of Step C includes three stages: two reviews conducted by the LandScale team and a local review process. These stages collectively ensure that the assessment adheres to the LandScale guidelines and meets the necessary quality standards for data and analysis.
Completing these three stages thoroughly refines and validates the reassessment results. This rigorous process ensures the credibility, reliability, and overall quality of the results, giving stakeholders and users confidence to rely on them for decision-making.
The following materials must be submitted through the LandScale platform for the first review:
Final list of all datasets used in the reassessment, including documentation of any changes from the baseline assessment (e.g., updated sources, new methodologies, or modifications to data collection approaches).
Summary of the methods used to process and analyze data to derive metric results, highlighting any differences from the baseline assessment. If methods remain unchanged, this should be explicitly stated.
If there are differences between baseline and reassessment data sources or methodologies, documentation of how risks of misinterpreting trends were mitigated when generating the reassessment results.
If any human rights indicators within Goal 2.2 are included in the assessment scope, documentation demonstrating how these were assessed in accordance with the . Optionally, this documentation may also be submitted for any user-added human rights-related indicators.
Documentation of how the governance indicators within Goal 3.2 that require the use of the —specifically 3.2.1, 3.2.2, 3.2.3, and optionally 3.2.4—were assessed, as outlined in the .
Metric results for all metrics for which reassessment results were generated.
Tracking progress against targets and milestones (if established):
Identification of achieved targets and milestones, including the date of achievement.
Specification of any new landscape targets and milestones, ensuring they use the same—or clearly convertible—units of measurement as the corresponding metric(s).
Proposed selection of local reviewers, in accordance with the (if this selection was not already validated in a prior step).
Documentation of the proposed local reviewers' expertise. This may include CVs, resumes, course certifications, or other suitable evidence.
The first review will confirm whether the following requirements of Step C have been fulfilled:
Comparability with the baseline assessment:
Metric results use the same—or clearly convertible—units as the baseline assessment to enable direct comparisons.
Any changes to methods, data sources, or the scope of the assessment are clearly documented and justified, with adequate plans in place to mitigate the risk of misinterpretation.
Based on a review of data limitations and analysis methods, the metric results show no evident data quality deficiencies that would render them substantially incorrect or misrepresentative of landscape conditions. This evaluation does not rely on specific local knowledge, but it may include consultation with local reviewers or other experts regarding source data and analysis methods.
Metric results are complete and presented in the required form, as specified in the .
Metric results include statements of limitation for any results where significant limitations may exist, in line with the .
Any human rights indicators within Goal 2.2 included in the assessment scope were reassessed in accordance with the , and adequate documentation of this process has been submitted.
The relevant governance indicators within Goal 3.2 (3.2.1, 3.2.2, 3.2.3) were reassessed using the , and adequate documentation of this process has been submitted.
Local reviewers were selected in accordance with the . If the assessment team is unable to meet the recommended minimum of two reviewers per indicator, this requirement may still be validated if the team demonstrates good-faith efforts to recruit reviewers and shows that the shortfall is due to external limitations (e.g., lack of interest or availability). Such exceptions must be supported by documentation, such as stakeholder mapping results, formal invitations, and other relevant correspondence. At least one local reviewer per indicator is mandatory.
The review will assess the above-mentioned requirements, focusing on the documentation and the quality of data and metric results. It will determine whether any data sources or metric results exhibit one or more of the following three issues, which would require further attention from the assessment team:
Insufficient documentation: Additional information is needed regarding the data selection process, data quality, and/or data analysis methods.
Potential data quality deficiency: Based on the evidence reviewed, the result may be significantly incorrect or misrepresentative of the landscape's conditions. Results with unresolved quality deficiencies will not be validated or published.
Undocumented data quality limitations: While the result appears substantially correct and representative, it may lack sufficient spatial resolution, sampling intensity, disaggregation, full spatial coverage, or other characteristics needed for a reasonably complete and nuanced understanding of landscape performance and trends. Such limitations must be documented. Results with documented quality limitations may still be published, provided these limitations are disclosed alongside the result.
The LandScale team will provide metric-specific comments to the assessment team, outlining any additional documentation requirements, potential quality deficiencies or limitations, and suggestions for addressing these issues. In some cases, the LandScale team may request further documentation or clarification to resolve gaps or determine the presence and nature of potential quality issues before completing its review and issuing findings.
[Insert screenshot of sample comments provided by the LandScale team after the first review]