BG2.5
Quality of stable isotope data - Methods and tools for producing high quality data.
Co-organized by HS1.1
Convener: Sergey Assonov | Co-conveners: David Soto, Philip Dunn, Grzegorz Skrzypek
Displays | Attendance Wed, 06 May, 16:15–18:00 (CEST)

This multidisciplinary session invites contributions on methods and tools for obtaining reliable stable isotope data across a range of fields. The number of papers using stable isotopes as a tool has grown enormously in recent years. Although stable isotope analysis has become a common technique in many fields of science (biogeosciences, atmospheric science, environmental science, ecology, forensics, etc.), the resulting datasets are often difficult to compare or combine because their quality is frequently unknown. Different protocols used in different labs, sub-optimal use of Reference Materials (RMs), isotope fractionation during sample preparation and within TC/EA peripherals, exchangeable hydrogen and oxygen, and differing data corrections are a few examples of potential pitfalls. Evaluating data quality may be especially difficult for novel methodologies, such as atmospheric research (e.g. N2O), applications involving matrices with exchangeable hydrogen, and CSIA (e.g. fatty acids, amino acids).

The session calls for papers that search for flaws in analytical methods, compare datasets produced in different labs or with different methods, develop protocols and tools for QA/QC, and investigate fit-for-purpose RMs. This session is a plea for high-quality stable isotope data that can be applied across many sciences and remain usable in the future (important given ongoing efforts in open-access journals, open datasets, etc.), including the creation of large reference datasets built from data produced by different labs in areas such as biological species, soils, atmospheric observations, and forensics. Such reference datasets should not be used without proper QC applied.