MRV/Transparency helpdesk



Comments for each discussion article
Technical resources for implementing the measurement, reporting and verification arrangements under the Convention and the enhanced transparency framework under the Paris Agreement.
It is good to use Monte Carlo for both of these scenarios since the conditions for applying it are so flexible. Whether Monte Carlo is preferable to propagation of error does not depend on the number of variables. Applying Monte Carlo will lead to more accurate results when the variables have high uncertainty, distributions are non-normal, data are correlated, models are complex, or uncertainty varies between inventory years.
(Anna McMurray, Winrock International)
Is it still good to use the Monte Carlo approach when we have only one emission factor value (with the activity data varied)? Or is it more applicable for analyses involving more than one value of both EF and AD? 
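To make the single-EF case above concrete, here is a minimal Monte Carlo sketch in Python using NumPy. All numbers are hypothetical and chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo iterations

# Hypothetical inputs: activity data (AD) with a 10% relative standard
# deviation, assumed normal, and a single fixed emission factor (EF).
ad_mean, ad_sd = 5000.0, 500.0   # e.g. ha of deforestation
ef = 120.0                       # e.g. t CO2 per ha (fixed)

ad_samples = rng.normal(ad_mean, ad_sd, n)
emissions = ad_samples * ef

lo, hi = np.percentile(emissions, [2.5, 97.5])
mean = emissions.mean()
half_width_pct = (hi - lo) / 2 / mean * 100
print(f"mean = {mean:,.0f}, 95% CI = [{lo:,.0f}, {hi:,.0f}]")
print(f"uncertainty = +/-{half_width_pct:.1f}%")
```

Because the EF is fixed, the relative uncertainty of the emissions simply equals that of the activity data (here roughly +/-20% at the 95% level); the same script extends naturally to a varying EF by sampling it as well.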
Ideally, you should identify the probability density function (PDF) of each individual variable, run simulations based on each variable's PDF, and then combine the simulation results to obtain the final distribution of the quantity of interest. You should only make assumptions about a distribution when you do not have access to the underlying dataset; in that case, you have to make educated assumptions about the data. There is extensive literature on the distribution of tree sizes, so you could research studies focusing on the geographic region and forest type of interest to come up with a reasonable probability density function. Because it is a natural phenomenon, one could assume that the distribution is normal. This, however, is a broad assumption, and it is better to verify with an actual distribution fit to a PDF using a goodness-of-fit test or through a review of the scientific literature.
(Anna McMurray, Winrock International)
In step 1 of the Monte Carlo approach, which distribution would you use for combined variables, e.g., tree DBHs that are either in forest or non-forest? Individually they may be normal, but their combination may not be. 
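As an illustration of fitting a PDF and checking it with a goodness-of-fit test, here is a sketch in Python using SciPy. The DBH sample is simulated (lognormal) purely for demonstration; in practice you would substitute your own plot measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical tree-DBH sample for illustration only; real DBH
# distributions are often right-skewed rather than normal.
dbh = rng.lognormal(mean=3.0, sigma=0.4, size=500)  # cm

# Fit candidate distributions and compare them with a
# Kolmogorov-Smirnov goodness-of-fit test.
results = {}
for dist in (stats.norm, stats.lognorm):
    params = dist.fit(dbh)
    ks = stats.kstest(dbh, dist.name, args=params)
    results[dist.name] = ks
    print(f"{dist.name:8s} KS statistic = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
```

Here the KS statistic for the lognormal fit should come out smaller than for the normal fit, reflecting the skew in the data. Note that using fitted parameters in a KS test makes the p-values approximate; a literature-based choice of distribution, as suggested above, remains the safer route.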
You can find the guidance document we developed on this approach, in English and in Spanish, here: Each software package will have its own guidance materials on its specific application.
(Anna McMurray, Winrock International)
Do you have any documents/tutorials about Monte Carlo approach? 
While R does require some knowledge of programming and therefore may not be considered as easy to use as Excel, there is a huge amount of information available online on how to use it (search for "R CRAN" or "R statistics"). There are also a number of free courses available that will teach you how to use it. That being said, to our knowledge there is no ready-made R script specifically designed to run Monte Carlo for our purposes.
(Anna McMurray, Winrock International)
R is open source, but we do not know programming. Is there any ready-made R script, or any suggestions for using R? 
While there is no perfect software out there, for Excel-based software, XLSTAT is the most comprehensive (i.e., will allow you to perform all the steps needed to carry out the Monte Carlo approach) from our experience. We cannot claim to have reviewed all possible software available and can only tell you our experience.
(Anna McMurray, Winrock International)
As most software needs to be purchased, what is the most reliable software that we should choose? Excel is easier, if available. 
A section of the Uncertainty Chapter in the 2006 Guidelines addresses this question; see below. Note that Approach 1 refers to propagation of error and Approach 2 to Monte Carlo.

“Where the conditions for applicability are met (relatively low uncertainty, no correlation between sources except those dealt with explicitly by Approach 1), Approach 1 and Approach 2 will give identical results. However, and perhaps paradoxically, these conditions are most likely to be satisfied where Tier 2 and Tier 3 methods are widely used and properly applied in the inventory, because these methods should give the most accurate and perhaps also the most precise results. There is therefore no direct theoretical connection between choice of Approach and choice of Tier. In practice, when Tier 1 methods are applied, Approach 1 will usually be used while the ability to apply Approach 2 is more likely where Tier 2 and 3 methods are being used, moreover for quantifying the uncertainty of emissions/removal estimates of complex systems such as in the AFOLU Sector.

When Approach 2 is selected, as part of QA/QC activities inventory agencies also are encouraged to apply Approach 1 because of the insights it provides and because it will not require a significant amount of additional work. Where Approach 2 is used, its estimates of overall uncertainty are to be preferred when reporting uncertainties (see Section”
(Anna McMurray, Winrock International)
Some say the Monte Carlo approach should be used for Tier 2. Is that correct? 
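The quoted point that Approach 1 and Approach 2 agree when uncertainties are relatively low and uncorrelated can be checked numerically. Here is a sketch in Python with hypothetical 5% uncertainties on both inputs:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical inputs: AD and EF, each with a relatively low (5%)
# 95%-confidence half-width, assumed normal and uncorrelated.
ad_mean, ad_u = 1000.0, 0.05
ef_mean, ef_u = 2.5, 0.05

# Approach 1: error propagation for a product of uncorrelated variables.
u_approach1 = np.sqrt(ad_u**2 + ef_u**2)

# Approach 2: Monte Carlo under the same assumptions
# (convert 95% half-widths to standard deviations via 1.96).
ad = rng.normal(ad_mean, ad_mean * ad_u / 1.96, n)
ef = rng.normal(ef_mean, ef_mean * ef_u / 1.96, n)
em = ad * ef
lo, hi = np.percentile(em, [2.5, 97.5])
u_approach2 = (hi - lo) / 2 / em.mean()

print(f"Approach 1: +/-{u_approach1:.1%}, Approach 2: +/-{u_approach2:.1%}")
```

At low relative uncertainties the two agree to within sampling noise (about +/-7.1% here); with large, skewed, or correlated uncertainties the Monte Carlo interval would diverge from the propagation-of-error result, which is exactly when Approach 2 is preferred.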
There is nothing in the 2006 GL or the UNFCCC reporting guidelines that forbids the use of Monte Carlo to estimate emissions. As encouraged by the IPCC Guidelines, Parties can also use national methodologies where they consider these better able to reflect their national situation, provided that these methodologies are consistent, transparent and well documented. Because of the potential inconsistencies resulting from applying Monte Carlo only to estimate uncertainties and not to estimate the actual emissions, from our experience in the forestry sector we highly recommend using Monte Carlo for both, as resources allow.
(Anna McMurray, Winrock International)
How should compilers prioritize use of the Monte Carlo approach? Should they start by applying it to estimate uncertainty?  
In the Revised 1996 IPCC Guidelines, there was no specific guidance on QA/QC; instead, guidance for QA/QC is provided in the Good Practice Guidance (GPG) reports. The 2006 IPCC Guidelines contain guidance on QA/QC drawn from the GPG, so there is no substantive change: an expert who applied the QA/QC guidance contained in the GPG reports will have no issue switching over to the 2006 IPCC Guidelines. The modalities, procedures and guidelines for the enhanced transparency framework adopted in Katowice in December 2018 encourage developing countries to elaborate an inventory QA/QC plan in accordance with the IPCC guidelines, and to implement and provide information on general inventory QC procedures in accordance with their QA/QC plans. This is a new reporting provision compared to the current UNFCCC guidelines for national communications and biennial update reports.
What are the main differences in terms of QA/QC between the Revised 1996 IPCC Guidelines and the 2006 IPCC Guidelines? 
The 2006 IPCC Guidelines provide guidance for inventory compilation only; reporting requirements, on the other hand, are decided by Parties under the UNFCCC process. As per the guidelines for the preparation of national communications contained in decision 17/CP.8, there is no explicit requirement for developing countries to include comprehensive CTF tables as an annex. However, the situation is slightly different for biennial transparency reports under the enhanced transparency framework: Parties are expected to report a national inventory document and common reporting tables. Currently, Parties are still negotiating the common reporting tables for inventory submissions under the SBSTA. This negotiation should be concluded by COP 26.
If non-Annex I Parties use the 2006 Guidelines to prepare and report their national GHG inventories as part of their national communications, is there a need to include the CTF tables as an annex to the national communication? 
It was so decided by the Conference of the Parties serving as the meeting of the Parties to the Paris Agreement (CMA). Developed country Parties are already doing inventory reporting as such, so relevant practice and examples can be found from their inventory reports on how to disaggregate AFOLU estimates into agriculture and LULUCF sectors.
Agriculture and LULUCF are combined in the 2006 IPCC Guidelines but have to be reported separately. Can you please elaborate on that? 
The 2019 Refinement was produced because the 2006 Guidelines had been produced more than 10 years earlier and the IPCC considered that partial updates (not a full revision) should be made based on recent scientific knowledge and information. The Katowice Climate Package stipulates that any supplement or refinement to the 2006 Guidelines will become mandatory for Parties to use if the CMA so agrees, but the Parties have not yet discussed this matter. Therefore, use of the 2019 Refinement is not mandatory for the Parties, and there is no training currently being offered on it.
What aspects does the 2019 Refinement look at, and will the Parties need additional training to use it? 
QC (quality control) refers to the routine checks that inventory compilers apply and document to ensure data integrity, completeness and consistency. For example, general checks are applied throughout compilation steps to ensure there are no transcription errors with data or documentation and that data references are included; there are also category-specific checks suggested in the 2006 GL that compilers may use.

QA (quality assurance) refers to the application of review procedures to the inventory by experts not directly involved in the compilation of the inventory estimates.

Verification involves checking estimates against independent data, such as comparing estimates using other data sets, or comparing estimates obtained by applying lower or higher tiered methods in the 2006 IPCC Guidelines.

You can refer to Volume 1, Chapter 6, of the 2006 IPCC Guidelines for more complete information, including definitions and descriptions of the types of procedures included under each topic.
Please elaborate on the difference between QA/QC and verification.  
• The IPCC Task Force on National Greenhouse Gas Inventories (TFI) has a Bureau (TFB). At its 26th Meeting, held on 28 - 29 August 2014 in Ottawa, the TFB concluded that the 2006 IPCC Guidelines for National Greenhouse Gas Inventories provide a technically sound methodological basis for national greenhouse gas inventories, and that a fundamental revision is therefore unnecessary. However, to maintain the scientific validity of the 2006 IPCC Guidelines, certain refinements may be required, taking into account scientific and other technical advances that have matured sufficiently since 2006.
• Based on this conclusion, the “2019 Refinement to the 2006 IPCC Guidelines for National Greenhouse Gas Inventories” (2019 Refinement) was adopted/accepted by the IPCC at its 49th Session on 12 May 2019.
• Its overall aim is to provide an updated and sound scientific basis for supporting the preparation and continuous improvement of national greenhouse gas inventories. The 2019 Refinement does not revise or replace the 2006 IPCC Guidelines, but updates, supplements and/or elaborates them where gaps or out-of-date science have been identified. It should be used in conjunction with the 2006 IPCC Guidelines.
• It provides supplementary methodologies for sources and sinks of greenhouse gases where gaps have been identified, where new technologies and production processes have emerged requiring elaborated methodologies, or for sources and sinks that are not well covered by the 2006 IPCC Guidelines.
• It also provides updated default values of emission factors and other parameters based on the latest available science. Wherever possible, it provides additional or alternative up-to-date information and guidance, as clarification or elaboration of existing guidance in the 2006 IPCC Guidelines.
• For example, the 2019 Refinement contains guidance that addresses methane emissions from flooded lands, such as reservoirs.
• Decision 18/CMA.1, paragraph 20 of its annex stipulated that each Party shall use the 2006 IPCC Guidelines, and shall use any subsequent version or refinement of the IPCC guidelines agreed upon by the Conference of the Parties serving as the meeting of the Parties to the Paris Agreement (CMA). The 2019 Refinement to the 2006 IPCC Guidelines is a refinement of the IPCC guidelines, but no agreement has been reached yet by the CMA.
• 2019 Refinement to the 2006 IPCC guidelines can be accessed here:
It has been 14 years since the updated IPCC guidelines (the 2006 IPCC Guidelines) were released. Has there been work to further update these with new findings? For example, in Rwanda, there is pumping of methane from a lake and converting it into electricity. 
• It is important to recognize that the 2006 IPCC Guidelines “are an evolutionary development starting from the Revised 1996 IPCC Guidelines, GPG2000 and GPG-LULUCF. A fundamental shift in methodological approach would pose difficulties with time series consistency in emissions and removals estimation, and incur additional costs, since countries and the international community have made significant investments in inventory systems. An evolutionary approach helps ensure continuity and allows for incorporation of experiences with the existing guidelines, new scientific information, and the results of the UNFCCC review process.” Please refer to the short Overview Chapter of the 2006 IPCC Guidelines, page 8.
• The improvements in the 2006 IPCC guidelines can be characterized as improved accuracy, completeness, clearer guidance, reduced scope for errors and resources-relevant methods. For example, with respect to clearer guidance it would be useful to note that the previous IPCC inventory guidance has been reviewed and, where needed, clarified and expanded to improve its user friendliness.
• In addition, the use of the 2006 IPCC guidelines would greatly facilitate comparability among all Parties’ national greenhouse gas inventories. Annex I Parties to the UNFCCC have been obliged to use the 2006 IPCC Guidelines for estimating and reporting GHG emissions by sources and removals by sinks in their annual GHG inventories under the Convention and the Kyoto Protocol since 2015.
Could you please summarize the advantages and disadvantages of the Revised 1996 IPCC Guidelines and the 2006 IPCC Guidelines, and their reasons? 
• As noted in the 2006 guidelines and the 2006 IPCC guidelines primer, for many categories the methods have not changed, but emission factors may have been updated in the 2006 IPCC guidelines. It is good practice to recalculate the estimates of all previous inventories (the time series) when you introduce a methodological refinement and/or updated or new data are available. Tools like the IPCC Inventory Software, which is based on the 2006 IPCC guidelines, can help facilitate this work.
Considering that estimated emission quantities based on the 2006 IPCC guidelines would differ from those based on the Revised 1996 guidelines due to differences in emission factors, how could non-Annex I countries deal with the challenge of modifying their NDCs, which were prepared based on the Revised 1996 IPCC guidelines, given their capacity constraints? 
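The recalculation point above can be illustrated with a toy Python example; the activity data and both emission factors are made up purely to show the mechanics of recalculating the whole time series rather than only the latest year:

```python
# Hypothetical illustration: when an emission factor is updated (e.g. on
# moving from the Revised 1996 to the 2006 guidelines), good practice is
# to recalculate the entire time series with the new factor, not just
# the most recent year, so the series stays internally consistent.
activity_data = {2015: 980.0, 2016: 1010.0, 2017: 1050.0}  # e.g. kt fuel
ef_1996gl, ef_2006gl = 2.30, 2.45  # hypothetical old vs. updated EF

old_series = {y: ad * ef_1996gl for y, ad in activity_data.items()}
new_series = {y: ad * ef_2006gl for y, ad in activity_data.items()}

for year in activity_data:
    print(year, f"{old_series[year]:.1f} -> {new_series[year]:.1f}")
```

Recalculating every year with the same (new) factor preserves time-series consistency, which is what allows trends, and any NDC baselines derived from them, to be compared meaningfully.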
Key category analysis covers all source and sink categories. By implementing KCA, inventory compilers can prioritise their efforts and improve their overall estimates in the national GHG inventory. An overview of the significance of the KCA is presented in Volume 1 of the 2006 IPCC Guidelines (Chapter 4, Methodological choice and identification of key categories, page 4.5).
What is the importance of key category analysis (KCA) and is it required to have sectoral key categories?  
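For reference, the level assessment behind KCA (Approach 1 in Volume 1, Chapter 4) can be sketched in a few lines of Python. Category names and values here are hypothetical; categories are ranked by absolute contribution and flagged as key until the cumulative share reaches 95% of the total:

```python
# Hypothetical category estimates (Gg CO2-eq); LULUCF removals are
# negative, so absolute values are used in the level assessment.
categories = {
    "1.A.1 Energy industries": 1200.0,
    "3.B LULUCF": -800.0,
    "1.A.3 Transport": 650.0,
    "4.D Wastewater": 90.0,
    "2.C Metal industry": 60.0,
}

total = sum(abs(v) for v in categories.values())
ranked = sorted(categories.items(), key=lambda kv: abs(kv[1]), reverse=True)

key_categories, cumulative = [], 0.0
for name, value in ranked:
    # A category is key if the cumulative share before adding it is
    # still below 95% (i.e. it is needed to reach the 95% threshold).
    if cumulative < 0.95:
        key_categories.append(name)
    cumulative += abs(value) / total

print(key_categories)
```

A full KCA would also include the trend assessment and, at Approach 2, uncertainty weighting, but the ranking logic is the same.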
• The biases in the estimates of emissions by sources and removals by sinks can be minimized by following good practice. The IPCC describes good practice as follows: “In order to promote the development of high-quality national greenhouse gas inventories a collection of methodological principles, actions and procedures were defined in the previous guidelines and collectively referred to as good practice. The 2006 Guidelines retain the concept of good practice including the definition introduced with GPG2000. This has achieved general acceptance amongst countries as the basis for inventory development and says that inventories consistent with good practice are those which contain neither over- nor under-estimates so far as can be judged, and in which uncertainties are reduced as far as practicable.”
• Following the good practice guidance procedures as you apply the 2006 IPCC Guidelines in choosing methods, applying QA/QC, etc. (including for data and other key parameters like emission factors) will help ensure that you reduce uncertainties and biases over time.
• Please see: Introduction Chapter to 2006 IPCC GL:
How can an inventory compiler ensure that the estimates are free of biases?  
• There is no requirement to use a common projection model. However, most countries use models that are widely used internationally.
• In the context of reporting information on tracking progress in implementation and achievement of NDCs under the enhanced transparency framework, the modalities, procedures and guidelines contained in the annex to decision 18/CMA.1 provide definitions of the scenarios that should be used in reporting projections. These include three scenarios: ‘with measures’, ‘with additional measures’ and ‘without measures’.
In the context of NDCs under the Paris Agreement, projections of GHG emissions and removals have been developed using different scenarios. Is there a consensus for the use of a common projections model?