Improved decision-making in geochemical sampling based on both frequency and Bayesian frameworks
- 1EarthByte Group, School of Geosciences, University of Sydney, Sydney, Australia (z5218858@zmail.unsw.edu.au)
- 2Earth and Sustainability Science Research Centre, School of Biological, Earth and Environmental Sciences, University of New South Wales, Sydney, Australia (z5218858@zmail.unsw.edu.au)
A common problem in geochemical exploration projects is the limited number of collected samples due to budgetary, time, and other constraints. Therefore, to study spatial mineralisation patterns in both sampled and unsampled areas, interpolation of the available data is essential to assign estimates to unsampled areas. Because interpolation estimates are based only on the data available within the search window, such interpolations using any single method are often the main source of uncertainty in continuous-field geochemical modelling. Error propagation analysis is therefore needed to evaluate the effect of interpolation errors on geochemical anomaly detection. One method for analysing the propagation of errors in models and evaluating their stability is Monte Carlo simulation (MCSIM). In this method, the P50 (median) value (called the ‘return’) and an uncertainty value (called the ‘risk’) are calculated. Here the risk is calculated as 1/(P90 − P10), where P10 (lower decile) and P90 (upper decile) are the 10th and 90th percentiles of the multiple simulated values corresponding to each element. We have applied this method to Swedish till data, collected throughout the country by the Geological Survey of Sweden. The main concern is to evaluate whether the samples are sufficient and representative of the target element concentrations for geochemical studies. To address this concern, the sampling uncertainty per element, in a statistical (not geochemical) sense, was studied using the return-risk matrix. This matrix was applied first to volcanogenic massive sulfide (VMS) target elements and subsequently to the samples per bedrock unit. To this end, a large number of simulated values (e.g., 5,000, which exceeds the number of samples, 2,578) was generated using MCSIM.
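The return and risk quantities described above can be sketched in a few lines of code. This is a minimal illustration only: it assumes simple bootstrap resampling of the observed concentrations as the simulation engine (the abstract does not specify the simulation scheme), and the concentration values shown are hypothetical.

```python
import numpy as np

def return_risk(samples, n_sim=5000, rng=None):
    """Estimate 'return' (P50) and 'risk' (1/(P90 - P10)) for one element
    from n_sim Monte Carlo simulated values, here generated by bootstrap
    resampling of the observed concentrations."""
    rng = np.random.default_rng(rng)
    # Resample with replacement to generate the simulated values
    sims = rng.choice(samples, size=n_sim, replace=True)
    p10, p50, p90 = np.percentile(sims, [10, 50, 90])
    ret = p50                     # 'return' = median of simulated values
    risk = 1.0 / (p90 - p10)      # 'risk' as defined in the text
    return ret, risk

# Hypothetical Cu concentrations (ppm) for illustration only
cu = np.array([12.0, 30.5, 8.2, 55.1, 22.3, 17.9, 40.6, 9.4])
ret, risk = return_risk(cu, rng=42)
```

Computing these two numbers per element (and per geological domain) populates the return-risk matrix used in the analysis.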
Where the quantified return is low or negative, and the quantified risk is high, particularly where it exceeds the corresponding return, additional sampling is required to achieve the minimum spatial continuity in the data and the stability of subsequently applied classification models. This affects the certainty of the models generated for the study area. In the Swedish data, all the elements assessed have relatively high returns and low risks, demonstrating the stability of the parameters. The process was then applied to samples separated into the main lithological categories or geological domains to determine whether the stability of the patterns is affected by rock type (and the associated natural variability in the background). In the Swedish till samples, the statistical sampling quality is acceptable in the bedrocks of the Exotic Terranes, Archean, Baltoscandian, and Idefjorden domains. However, it is not acceptable in the Palaeoproterozoic units and the Eastern Segment, where the risks exceed the returns, which may amplify the effect of error propagation on the interpolated maps and reduce the efficiency of the classes obtained by different classification models.
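The decision rule above reduces to a simple per-element (or per-domain) check. The sketch below is a hypothetical illustration of that rule; the domain names echo those in the text, but the return/risk numbers are invented for the example.

```python
def needs_more_sampling(ret, risk):
    """Flag for additional sampling when the return is non-positive
    or the risk exceeds the return."""
    return ret <= 0 or risk > ret

# Hypothetical (return, risk) pairs per geological domain
domains = {
    "Archean": (2.4, 0.8),          # high return, lower risk -> acceptable
    "Eastern Segment": (0.6, 1.3),  # risk > return -> resample
}
flags = {name: needs_more_sampling(r, k) for name, (r, k) in domains.items()}
# flags -> {"Archean": False, "Eastern Segment": True}
```

In practice the flagged domains are the ones where interpolated maps and downstream classification models would carry the largest propagated error.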
How to cite: Sadeghi, B., Cohen, D., and Müller, D.: Improved decision-making in geochemical sampling based on both frequency and Bayesian frameworks, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-14, https://doi.org/10.5194/egusphere-egu22-14, 2022.