GI1.1

Open session on geoscience instrumentation and methods
Convener: Vira Pronenko | Co-conveners: Lara Pajewski, Francesco Soldovieri
vPICO presentations | Wed, 28 Apr, 11:00–12:30 (CEST)

vPICO presentations: Wed, 28 Apr

Chairperson: Vira Pronenko
11:00–11:02 | EGU21-1107 | ECS
Mizuki Ogasawara, Junichiro Ohta, Mizuki Ishida, Moei Yano, Kazutaka Yasukawa, and Yasuhiro Kato

The Re-Os isotope system is an effective tool in geological studies, especially in radiometric dating. Since both Re and Os are highly siderophile and chalcophile elements, they tend to be concentrated in various sulfide minerals. Re-Os geochronology has therefore been employed for direct age determination of sulfide mineralization [1, 2]. However, conventional analytical methods for Re-Os dating are complex, time-consuming, and costly.
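For context only (not part of the authors' abstract), the standard Re-Os isochron relation underlying such age determinations links the measured ratios to the mineralization age t via the 187Re decay constant λ ≈ 1.666 × 10⁻¹¹ yr⁻¹:

```latex
\left(\frac{^{187}\mathrm{Os}}{^{188}\mathrm{Os}}\right)_{\text{measured}}
= \left(\frac{^{187}\mathrm{Os}}{^{188}\mathrm{Os}}\right)_{\text{initial}}
+ \frac{^{187}\mathrm{Re}}{^{188}\mathrm{Os}}\left(e^{\lambda t}-1\right)
```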

Here we present an improved analytical method for Re-Os in sulfides that combines acid digestion using HClO4 [3] with sparging introduction of Os [4]. In our method, 0.4 g of powdered sulfide was digested with 1 mL of HClO4 in addition to 4 mL of inverse aqua regia in a Carius tube, and the Re and Os isotope ratios were then measured by MC-ICP-MS. We applied this method to the GSJ geochemical reference materials JCu-1 (copper ore from the Kamaishi mine, northeastern Japan) and JZn-1 (zinc ore from the Kamioka mine, central Japan). The Re concentrations of JCu-1 and JZn-1 were 255-280 ppt and 4622-4828 ppt, and the Os concentrations 39.7-41.7 ppt and 21.7-30.0 ppt, respectively. Furthermore, the analytical results (Re-Os concentrations, 187Os/188Os, and 187Re/188Os) for chalcopyrite separated from the Kamaishi mine agreed well with those obtained by the conventional method, in which 0.5 g of sample is digested with 10 mL of inverse aqua regia and measured with N-TIMS.

The new method, which uses a smaller total volume of acid for sample digestion, enables MC-ICP-MS analysis of sulfides with relatively low Re and Os concentrations. In addition, for Os isotopes, the sparging method using MC-ICP-MS [4] can be employed as a simplified analytical procedure. This simplified and improved method may be useful for dating a wider range of sulfide deposits efficiently.

 

1: Nozaki, T. et al. (2013) Sci. Rep. 3, 1889.

2: Kato, Y. et al. (2009) Earth Planet. Sci. Lett., 278, 40-49.

3: Gao, B. et al. (2019) Microchem. J. 150, 104165.

4: Nozaki, T. et al. (2012) Geostand. Geoanal. Res. 36, 131-148.

How to cite: Ogasawara, M., Ohta, J., Ishida, M., Yano, M., Yasukawa, K., and Kato, Y.: An improved analytical method for Re-Os isotope analysis and its application to GSJ geochemical reference materials, JCu-1 and JZn-1, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-1107, https://doi.org/10.5194/egusphere-egu21-1107, 2021.

11:02–11:04 | EGU21-1523 | ECS
Kacper Pluta and Gisela Domej

The process of transforming point cloud data into high-quality meshes or CAD objects is, in general, not a trivial task. Many problems, such as holes, enclosed pockets, or small tunnels, can occur during the surface reconstruction process, even if the point cloud is of excellent quality. These issues are often difficult to resolve automatically and may require detailed manual adjustments. Nevertheless, in this work, we present a semi-automatic pipeline that requires minimal user-provided input and still allows for high-quality surface reconstruction. Moreover, the presented pipeline can be used successfully by non-specialists and relies only on commonly available tools.

Our pipeline consists of the following main steps: First, a normal field over the point cloud is estimated, and Screened Poisson Surface Reconstruction is applied to obtain the initial mesh. At this stage, the reconstructed mesh usually contains holes, small tunnels, and excess parts – i.e., surface parts that do not correspond to the point cloud geometry. In the next step, we apply morphological and geometrical filtering in order to resolve the problems mentioned before. Some fine details are also removed during the filtering step; however, we show how these can be restored – without reintroducing the problems – using a distance-guided projection. In the last step, the filtered mesh is re-meshed to obtain a high-quality triangular mesh, which – if needed – can be converted to a CAD object represented by a small number of quadrangular NURBS patches.
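Purely as an illustration of the first steps of such a pipeline (this is not the authors' implementation; the Open3D library, file names and parameter values are assumptions), a minimal Python sketch of normal estimation, Screened Poisson reconstruction and a crude density-based trim could look like this:

```python
import numpy as np
import open3d as o3d  # assumed library; any Poisson-reconstruction toolkit would do

# Load the scan (hypothetical file name) and estimate an oriented normal field.
pcd = o3d.io.read_point_cloud("cave_scan.ply")
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))
pcd.orient_normals_consistent_tangent_plane(50)

# Screened Poisson Surface Reconstruction; 'densities' indicates how well each
# output vertex is supported by the input points.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=10)
densities = np.asarray(densities)

# Crude stand-in for the morphological/geometrical filtering described above:
# drop the most poorly supported vertices (typically the "excess parts").
mesh.remove_vertices_by_mask(densities < np.quantile(densities, 0.02))

# Light smoothing and export; re-meshing and NURBS fitting would follow in other tools.
mesh = mesh.filter_smooth_taubin(number_of_iterations=10)
mesh.compute_vertex_normals()
o3d.io.write_triangle_mesh("cave_mesh.ply", mesh)
```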

Our workflow is designed for a point cloud recorded by a laser scanner inside one of seven artificially carved caves resembling chapels, with several niches and passages to the outside of a sandstone hill slope in Georgia. We note that we have not tested the approach on other data. Nevertheless, we believe that a similar pipeline can be applied to other types of point cloud data – e.g., natural caves or mining shafts, geotechnical constructions, rock cliffs, geo-archeological sites, etc. This workflow was created independently; it is not part of a funded project and does not advertise particular software. The case study's point cloud data was used by courtesy of the Dipartimento di Scienze dell'Ambiente e della Terra of the Università degli Studi di Milano–Bicocca.

How to cite: Pluta, K. and Domej, G.: From Point Clouds to Surfaces: Overview on a Case Study, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-1523, https://doi.org/10.5194/egusphere-egu21-1523, 2021.

11:04–11:06 | EGU21-2510 | ECS
Alfonso Gonzalez Jimenez, Yakov Pachepsky, José Luis Gómez Flores, Mario Ramos Rodríguez, and Karl Vanderlinden

On-the-go field measurements of soil and plant characteristics, including yield, are commonplace in current Precision Agriculture applications. Yet, such measurements can be affected by positional inaccuracies that result from equipment configuration or operation characteristics (e.g. GPS antenna position with respect to sensor position) and delays in the data transmission, reception or logging. The resulting time and position lags cause a misfit between the measurements and their attributed GPS position.

In order to compensate for this effect, a simple coordinate translation along the measurement direction is proposed, which depends on the local velocity and a field- and measurement-configuration-specific time lag that is estimated by minimizing the average absolute difference between nearest neighbors. The correction procedure is demonstrated using electromagnetic induction data with different spatial configurations and by comparing variograms for corrected and non-corrected data.
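A minimal Python sketch of this procedure (our illustration, not the authors' code; it uses a plain nearest-neighbour search and omits the restriction to adjacent swaths discussed below) might look as follows:

```python
import numpy as np
from scipy.spatial import cKDTree

def shift_along_track(x, y, t, lag):
    """Translate each point back along its local driving direction by v * lag."""
    vx = np.gradient(x) / np.gradient(t)   # local velocity components (m/s)
    vy = np.gradient(y) / np.gradient(t)
    return x - vx * lag, y - vy * lag

def neighbor_misfit(x, y, t, z, lag):
    """Mean absolute difference of the measured value z between nearest neighbours
    after applying a candidate time lag (seconds)."""
    xs, ys = shift_along_track(x, y, t, lag)
    tree = cKDTree(np.column_stack([xs, ys]))
    _, idx = tree.query(np.column_stack([xs, ys]), k=2)  # k=2: first hit is the point itself
    return np.mean(np.abs(z - z[idx[:, 1]]))

def estimate_lag(x, y, t, z, lags=np.arange(0.0, 5.0, 0.1)):
    """Grid-search the time lag that minimizes the nearest-neighbour misfit."""
    misfits = [neighbor_misfit(x, y, t, z, lag) for lag in lags]
    return lags[int(np.argmin(misfits))]
```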


The best results are obtained when overlapping measurements acquired in opposite driving directions are available, while the worst results are found when no overlapping measurements exist or when the only overlaps correspond to headland turns. Further improvements to the nearest-neighbor search algorithm, e.g. by restricting the search to adjacent measurement swaths, are discussed. The results are valid beyond motorized soil sensing applications.

Acknowledgement
This work is funded by the Spanish State Agency for Research through grant PID2019-104136RR-C21 and by IFAPA/FEDER through grant AVA2019.018.

How to cite: Gonzalez Jimenez, A., Pachepsky, Y., Gómez Flores, J. L., Ramos Rodríguez, M., and Vanderlinden, K.: Correcting position of delayed on-the-go field measurements by optimizing nearest neighbor statistics, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-2510, https://doi.org/10.5194/egusphere-egu21-2510, 2021.

11:06–11:08 | EGU21-4112 | ECS
Remi Dallmayr, Johannes Freitag, Maria Hörhold, Thomas Laepple, Johannes Lemburg, Damiano Della Lunga, and Frank Wilhelms

The validity of any glaciological paleo proxy used to interpret climate records depends on how well its transfer from the atmosphere into the ice sheet and its recording in the snowpack are understood. Large spatial noise in snow properties is observed, as the wind constantly redistributes the deposited snow at the surface, guided by the local topography. To increase the signal-to-noise ratio and obtain a representative estimate of snow properties despite this high spatial variability, a large number of snow profiles is needed. However, the classical way of obtaining profiles via snow pits is time- and energy-consuming, and thus unfavourable for large surface sampling programs. In response, we present a dual-tube technique to sample the upper metre of the snowpack at a variable depth resolution with high efficiency. The developed device is robust and avoids contact with the samples: two tubes are attached alongside each other, one to contain the snow core sample and one to access the bottom of the sample. We demonstrate the performance of the technique through two case studies in East Antarctica, where we analysed the variability of water isotopes at 100 m and 5 km spatial scales.

How to cite: Dallmayr, R., Freitag, J., Hörhold, M., Laepple, T., Lemburg, J., Della Lunga, D., and Wilhelms, F.: A dual-tube sampling technique for snowpack studies, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-4112, https://doi.org/10.5194/egusphere-egu21-4112, 2021.

11:08–11:10 | EGU21-6055
Michael Sommer, Christoph von Rohden, Tzvetan Simeonov, and Ruud Dirksen

One of the main goals of the GCOS Reference Upper Air Network (GRUAN) is to perform reference observations of profiles of atmospheric temperature, humidity and wind for monitoring climate change. Two essential criteria for establishing a reference observation are measurement-traceability and the availability of measurement uncertainties. Radiosoundings have proven valuable in providing in-situ profiles of temperature, humidity, pressure and wind at unmatched vertical resolution. Data products from commercial radiosondes often rely on black-box or proprietary algorithms, which are not disclosed to the scientific user. Furthermore, long-term time-series from these products are frequently hampered by changes in the hardware and/or the data processing.

The GRUAN data products (GDPs) comply with the above-mentioned criteria for a reference product. Correction algorithms are open source and well documented, and the data include vertically resolved best estimates of the uncertainties. Another major advantage of a GRUAN data product is that it includes the radiosonde's raw measurement data, which allows for reprocessing when new or improved corrections become available. Currently, GDPs are available for the Vaisala RS92 and Meisei RS-11G radiosondes. Data products for additional radiosonde models, as well as for other measurement techniques, are under development. The GDPs are used to determine trends, constrain and calibrate data from more spatially comprehensive observing systems (including satellites and current radiosonde networks), and provide appropriate data for studying atmospheric processes.

This presentation introduces the GRUAN processing of Vaisala RS41 radiosoundings, the correction algorithms that are applied, and the derivation of the vertically resolved uncertainty estimates. Well-known, dominant error sources for RS41 profiles are related to solar radiation, which causes a temperature error, and to the time lag of the humidity sensor at low temperatures. The corrections for these error sources are based on dedicated experiments performed at Lindenberg observatory to measure the response of the RS41 temperature sensor to solar irradiance and to determine the time lag of the humidity sensor at temperatures down to -70 °C. The RS41-GDP.1 is planned to become available in 2021. The majority of the 30 globally distributed GRUAN sites employ the RS41, and previously its predecessor the RS92, establishing a continuous record of more than 10 years of reference climate observations.
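As background only (a generic first-order sensor model, not the specific GDP algorithm or its uncertainty propagation), a time-lag correction of this kind inverts the sensor response equation using the temperature-dependent response time τ(T):

```latex
\frac{\mathrm{d}U_{\mathrm{m}}}{\mathrm{d}t} = \frac{U_{\mathrm{a}} - U_{\mathrm{m}}}{\tau(T)}
\quad\Longrightarrow\quad
U_{\mathrm{a}}(t) \approx U_{\mathrm{m}}(t) + \tau(T)\,\frac{\mathrm{d}U_{\mathrm{m}}}{\mathrm{d}t},
```

where U_m is the measured and U_a the reconstructed ambient relative humidity.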

How to cite: Sommer, M., von Rohden, C., Simeonov, T., and Dirksen, R.: RS41 GRUAN Data Product Version 1 (RS41-GDP.1) - Reference Radiosonde Data for the Troposphere and Lower Stratosphere, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-6055, https://doi.org/10.5194/egusphere-egu21-6055, 2021.

11:10–11:12 | EGU21-6646 | ECS
Federico Dallo, Daniele Zannoni, Jacopo Gabrieli, Paolo Cristofanelli, Francescopiero Calzolari, Fabrizio de Blasi, Andrea Spolaor, Dario Battistel, Rachele Lodi, Warren R. L. Cairns, Ann Mari Fjæraa, Paolo Bonasoni, Fred Bauman, and Carlo Barbante

We present the results obtained using an original open-source low-cost sensor (LCS) system developed to measure tropospheric O3 at a remote high-altitude alpine site. We conducted our study at the Col Margherita Observatory (2543 m a.s.l.), a World Meteorological Organization Global Atmosphere Watch Regional Station (WIGOS Id: 0-380-0-MRG), located in the Italian Eastern Alps. The sensing system carries three equivalent commercial low-cost sensors that were calibrated against a laboratory standard (Thermo 49iPS), referenced to the WMO Standard Reference Photometer #15 calibration scale, before field deployment. Intra- and inter-comparisons between the sensors and the reference (Thermo 49c) were conducted for six months, from May to December 2018. The sensors' dependence on the environmental meteorological variables has been considered and discussed. The evaluation of the analytical performance of this sensing system gives a limit of detection < 5 ppb, a limit of quantitation < 17 ppb, a linear dynamic range up to 250 ppb, an intra-Pearson correlation coefficient (PCC) up to 0.96, an inter-PCC > 0.8, and a bias > 3.5 ppb and ±8.5 at the 95% confidence level. With this first implementation of an LCS system at an alpine site, we demonstrate that valuable data can be obtained from a low-cost instrument in a remote, harsh environment. This opens new perspectives for the adoption of low-cost sensor networks in the atmospheric sciences.

We further present our recent experience using LoRa to integrate the sensing system into a low-power wide-area network (LPWAN). We developed an end-node and a gateway, designing PCBs derived from the Arduino Mega, optimizing their power consumption and equipping them with batteries and a suitable solar panel or wind turbine to ensure their autonomy while collecting environmental ozone and meteorological (T, RH, WS, WD) data. We drafted the communication software to send compressed data from the end-nodes to the gateways. The gateways are part of an openVPN with the main server located in Venice. The server also provides a PostgreSQL database and an R-Shiny web application for data visualization and manipulation. To enhance redundancy, the local data are also synchronized to a cloud database. In the coming years, thanks to the Marie Skłodowska-Curie grant PIONEER, we will build on this experience to provide a comprehensive low-cost wireless sensor network to characterize the transport of polluted air masses and provide long-term climate data collection in support of state-of-the-art instrumentation and established networks in remote alpine areas.
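To illustrate how comparison figures like those quoted above (Pearson correlation, bias) can be computed from co-located low-cost and reference time series, here is a minimal Python sketch; the exact definitions used in the study may differ, and the 95% half-width assumes Gaussian residuals:

```python
import numpy as np

def comparison_metrics(lcs, ref):
    """Pearson correlation, mean bias and 95% residual half-width between a
    low-cost sensor series and the reference analyser (both in ppb, common time base)."""
    valid = ~(np.isnan(lcs) | np.isnan(ref))
    lcs, ref = lcs[valid], ref[valid]
    pcc = np.corrcoef(lcs, ref)[0, 1]              # Pearson correlation coefficient
    bias = np.mean(lcs - ref)                      # mean offset of the LCS vs. reference
    half_width = 1.96 * np.std(lcs - ref, ddof=1)  # assumes Gaussian residuals
    return pcc, bias, half_width
```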

Bibliography

Dallo F. et al.: Calibration and assessment of electrochemical low-cost sensors in remote alpine harsh environments, Atmospheric Measurement Techniques, amt-2020-483. In review

Acknowledgements. This work was part of the Arctic Field Grant O3NET project, funded by the Research Council of Norway. The work received financial support from the National Project of Interest Next-Data (MIUR). The exploitation of the LoRa technology was performed with the ITIS "Max Planck" through the Remote Observatory SYstem (ROSY) "Alternanza scuola-lavoro" project. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 844526.

How to cite: Dallo, F., Zannoni, D., Gabrieli, J., Cristofanelli, P., Calzolari, F., de Blasi, F., Spolaor, A., Battistel, D., Lodi, R., Cairns, W. R. L., Fjæraa, A. M., Bonasoni, P., Bauman, F., and Barbante, C.: Low-cost sensor network in remote alpine environments, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-6646, https://doi.org/10.5194/egusphere-egu21-6646, 2021.

11:12–11:14 | EGU21-6960 | ECS
Yu Ting Yan, Stephen Chua, Thomas DeCarlo, Philipp Kempf, Kyle Morgan, and Adam Switzer

X-ray computed tomography (CT) is a non-destructive imaging technique that provides three-dimensional (3D) visualisation and high-resolution quantitative data in the form of CT numbers. CT numbers are derived as a function of the X-ray energy, the effective atomic number and the density of the sample. The sensitivity of the CT number to changes in material density allows it to identify facies changes within sediment cores by detecting downcore shifts in sediment properties, and to quantify skeletal linear extension rates and the volume of internal voids from biological erosion of coral cores. Here we present two algorithms for analysing CT scan images specific to geoscience research, packaged within an open-source MATLAB application (Core-CT). The first algorithm facilitates the computation of representative CT numbers from a user-defined region of interest to identify boundaries of density change (e.g. sedimentary facies, laminations, coral growth bands). The second algorithm enables the segmentation of regions with major density contrast (e.g. internal void space or biogenic material) and the geometric measurement of these irregularities. The versatility of Core-CT for geoscience applications is then demonstrated using CT scans from a range of environmental settings comprising both sediment cores (Lake Huelde, Chile, and Kallang River Basin, Singapore) and coral cores (Thuwal region of the Red Sea, Saudi Arabia). Analysis of the sediment cores shows the capability of Core-CT to: 1) locate tsunami deposits in lacustrine sediments, 2) provide rapid and detailed measurement of varved sediments, and 3) identify sedimentary facies in an unsplit shallow marine sediment core. Analysis of the coral cores allows us to measure skeletal linear extension from annual growth bands and to provide volumetric quantification and 3D visualisation of internal bioerosion. Core-CT is an accessible, multi-use MATLAB-based program that is freely available on GitHub (https://github.com/yuting-yan/Core-CT).
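Core-CT itself is a MATLAB application; purely as an illustration of the idea behind the first algorithm (representative CT numbers within a region of interest and boundary detection from downcore changes), a minimal Python analogue might look like this:

```python
import numpy as np

def roi_ct_profile(volume, roi):
    """Mean CT number per slice within a rectangular region of interest.

    volume : 3-D numpy array of CT numbers, ordered (slice, row, column)
    roi    : (row_min, row_max, col_min, col_max) defining the region of interest
    """
    r0, r1, c0, c1 = roi
    return volume[:, r0:r1, c0:c1].reshape(volume.shape[0], -1).mean(axis=1)

def density_boundaries(profile, threshold):
    """Indices of slices where the downcore change in mean CT number exceeds a
    threshold -- a simple proxy for facies or growth-band boundaries."""
    return np.flatnonzero(np.abs(np.diff(profile)) > threshold) + 1
```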

 

How to cite: Yan, Y. T., Chua, S., DeCarlo, T., Kempf, P., Morgan, K., and Switzer, A.: Core-CT:  A MATLAB application for the quantitative analysis of sediment and coral cores from X-ray computed tomography (CT), EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-6960, https://doi.org/10.5194/egusphere-egu21-6960, 2021.

11:14–11:16 | EGU21-7984
Pierre-Yves Morvan and Gary Bagot

Improving operational efficiency is a recurring challenge for subsea operations. Throughout the life of a field, from construction to decommissioning, several subsea vehicles will be deployed to carry out the various tasks required for underwater observations. An ROV or AUV assigned to a specific task will require multiple positioning sensors (LBL, USBL, INS…) to complete its mission. Defining a “good enough” subsea positioning strategy, i.e. one that ensures a minimum accuracy without compromising safety, can be a complex exercise. For instance, overestimating the number of LBL transponders required directly adds vessel time and ultimately makes operations costly. On the other hand, a certain level of positioning redundancy may be required for a vehicle operating close to a subsea asset in production.

To ease the design and monitoring of subsea vehicle navigation, iXblue has developed an integrated solution. Not only has the company broadened its product range with the new intelligent Canopus LBL Transponder and the new-generation Ramses transceiver, but with the Delph Subsea Positioning software, iXblue now provides a complete integrated solution for subsea positioning that goes a step further by bringing significant efficiency gains. Divided into four modules (LBL Array Planning, Navigation Simulation, Operations, DelphINS) with an intuitive user interface, Delph Subsea Positioning (DSP) is an integrated software suite for the preparation, operation and post-processing of iXblue positioning devices (USBL, LBL and INS).

How to cite: Morvan, P.-Y. and Bagot, G.: Innovative software solutions for subsea positionings, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-7984, https://doi.org/10.5194/egusphere-egu21-7984, 2021.

11:16–11:18 | EGU21-8624
Dorothee Rebscher, Yves Guglielmi, Inma Gutierrez, Edi Meier, and Senecio Schefer

In order to enable investigations and a comprehensive understanding of dynamic processes, one has to identify all relevant parameters and aim to record them all under the best possible conditions with respect to, e.g., resolution, spatial coverage, and, in many cases, a multitude of time scales. Obviously, it is difficult to satisfy all of these constraints in full. Scientific long-term observations in particular often suffer from a lack of the necessary lasting commitment: secure funding, continual high-quality maintenance, a protected environment, or sufficient planning stability. Fortunately, the Swiss Mont Terri rock laboratory, with its history of now 25 years of forefront scientific expertise, a long-standing fruitful cooperation among the partners of the consortium and, in consequence, state-of-the-art results obtained through 100 completed individual experiments and 45 additional experiments currently ongoing, ensures the conditions listed above.

Based on this favorable prospect, a growing tiltmeter array is now being established at the underground laboratory. The instruments are embedded in several multidisciplinary experiments dedicated to numerous different scientific questions. Starting in April 2019, the first two platform tiltmeters became operational. Less than two years later, ten biaxial instruments are quasi-continuously monitoring deformation at various locations within the galleries and niches at Mont Terri. The envisioned increasing spatial coverage of the network will facilitate geodetic observations of the underground rock laboratory as a whole as well as of its subregions.

Already in September 2012, a 50 m long hydrostatic levelling system (HLS) was installed along a gallery in the underground laboratory to detect displacements across an active geological fault zone. The combination of the two, i.e. the uniaxial, integral deformation data provided by the HLS together with the array of biaxial point measurements acquired by the tiltmeters, offers a unique opportunity to obtain detailed deformation data in a large underground rock laboratory and to survey the associated dynamic processes occurring on time frames from seconds to decades.

How to cite: Rebscher, D., Guglielmi, Y., Gutierrez, I., Meier, E., and Schefer, S.: Investigations based on biaxial tiltmeter array and uniaxial hydrostatic levelling system at the Mont Terri rock laboratory, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-8624, https://doi.org/10.5194/egusphere-egu21-8624, 2021.

11:18–11:20 | EGU21-9287 | ECS
Nasrine Medjdouba, Zahia Benaissa, and Sabiha Annou

The main hydrocarbon-bearing reservoir in the study area is the lower Triassic Argilo-Gréseux reservoir. The Triassic sand was deposited as fluvial channels and overbank sands with a thickness ranging from 10 to 20 m, lying unconformably on the Paleozoic formations. The lateral and vertical distribution of the sand bodies is challenging, which makes their mapping very difficult and nearly impossible with conventional seismic analysis.

In order to better define the optimum drilling targets, seismic attribute analysis and a reservoir characterization process were performed targeting this thin reservoir level; two available seismic vintages of PSTM cubes as well as post- and pre-stack inversion results were analysed. The combination of various attribute analyses (RMS amplitude, spectral decomposition, variance, etc.) with the seismic inversion results helped to clearly identify the channelized feature and its delineation on various horizon slices and geobodies. The results were reviewed and calibrated against reservoir properties at well locations and showed a remarkable correlation.
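As a minimal illustration of one of the attributes named above (a generic sketch, not the software used in the study; the window indices around the mapped horizon are assumptions), the RMS amplitude can be computed as follows:

```python
import numpy as np

def rms_amplitude(traces, window):
    """RMS amplitude of each seismic trace within a sample window.

    traces : 2-D array (n_traces, n_samples)
    window : (start_sample, end_sample) bracketing the mapped horizon
    """
    w = traces[:, window[0]:window[1]]
    return np.sqrt(np.mean(w ** 2, axis=1))
```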

Ten development wells have been successfully drilled based on this seismic analysis study, confirming the efficiency of seismic attribute analysis in predicting channel body geometry.

Keywords: Channel, Attributes, Amplitude, Inversion, Fluvial reservoir.

How to cite: Medjdouba, N., Benaissa, Z., and Annou, S.: A Novel Approach for a better exploitation of a 3D seismic on a development field, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-9287, https://doi.org/10.5194/egusphere-egu21-9287, 2021.

11:20–11:22 | EGU21-9793 | ECS
Inese Varna, Ansis Zarins, and Augusts Rubans

The portable digital zenith camera (DZC) VESTA (VErtical by STArs) determines the deflection of the vertical (DoV) components as the difference between the direction to the ellipsoidal zenith (calculated from reference star observations, with precise timing and site coordinates) and the direction of the plumb line, which is determined with a sensitive tiltmeter. DZC VESTA was developed at the University of Latvia and has been used extensively for the determination of the Latvian quasi-geoid model since 2016. The typical accuracy of VESTA is ~0.1 arc seconds.
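For reference (standard geodetic relations, not specific to VESTA), the DoV components follow from the astronomical coordinates (Φ, Λ) derived from the star observations and the geodetic coordinates (φ, λ) of the site:

```latex
\xi = \Phi - \varphi, \qquad \eta = (\Lambda - \lambda)\cos\varphi
```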

Unlike levelling, relative gravimetry or GNSS measurements, the zenith-camera method determines the DoV directly, allowing validation of data from other geodetic techniques; for example, DZCs are used for geoid slope validation.

Currently, the focus is on further improving the accuracy of the digital zenith camera. Investigation of the limiting factors to achieve the highest accuracy for applied and scientific applications includes:

  • testing the digital zenith camera in different environments to investigate and mitigate the phenomenon of anomalous refraction at zenith; anomalous refraction is the main limiting factor of the precision of ground-based astrometric observations, as it causes irregular angular displacements of the observed stars. The proposed tests include more thorough long-term observations to search for the anomalous refraction properties at multiple test sites (such as near the seacoast, on a hill slope, in a forest, in an open field, covering a wide range of environmental conditions) under different weather conditions. Simultaneous observations with several adjacent DZCs would be an efficient method to distinguish instrument-related variations from changes in the measured quantity itself and to find the spatial properties of the anomalous refraction effects.
  • performing accuracy analysis at a permanent test site to estimate the spatial and temporal properties of the measured DoV values. Astrogeodetic determination of DoVs is an absolute observational technique, and any undetected systematic errors remain in the data. Various instrumental settings will be tested during the observations.

However, digital zenith cameras are not limited to geodetic applications; there are other ideas for possible fields of application:

  • research focusing on applications in a geological survey;
  • monitoring of changes in mass distribution in the Earth's crust in case of active tectonic movements.

These areas of geoscience would benefit from an additional measurement technique that complements the traditional method of relative gravimetry.

This research is funded by PostDoc Latvia grant contract No. 1.1.1.2/VIAA/4/20/666.

How to cite: Varna, I., Zarins, A., and Rubans, A.: Digital zenith camera VESTA and its applications, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-9793, https://doi.org/10.5194/egusphere-egu21-9793, 2021.

11:22–11:24 | EGU21-9848
Carla Braitenberg, Alberto Pastorutti, Barbara Grillo, and Marco Bartola

Decade-long series of tilt- and strain-meter observations in NE Italy allow monitoring of crustal deformation from short transients to long-term phenomena. These recordings, some of which started in 1960, are generated by sources spanning a wide spectrum of spatial scales, such as sudden underground flooding due to extreme rainfall [1, 2], years-long fluid diffusion transients due to fault behavior [3], and the free oscillations arising from megathrust earthquakes (e.g. Chile 1960, Sumatra 2004, Tohoku 2011).
The instrumental sites lie on karst formations, in an area of continental collision and active seismicity: the northeastern portion of the Adria microplate, where the south-directed thrusts of the Alpine system merge with the NW-SE transpressive regime of the External Dinarides. The measurements include the ongoing interseismic strain accumulation processes, including the peculiar observation of episodic disturbances and southward tilting in the three years preceding the 1976 Mw 6.4 Friuli earthquake [4].

The channel systems of karst hydrology, which undergo complete flooding and overpressure buildup in extremely short time spans (e.g. near-simultaneous flooding over a distance of 30 km), result in observable surface deformation and a change in the gravity field. Tilt time series allow this type of hydrologically forced uplift and the associated deformation to be extracted and modelled [2, 5].

Tilt- and strain-meters allow for accuracy and precision in measuring crustal deformation to a level which space-borne geodesy cannot provide. The main drawback, however, is that only point measurements are obtained, in locations where stations could be set up.
On the other hand, the thousands of surface points that DInSAR can provide are affected by coarser accuracy and influenced by atmospheric effects, resulting in LoS displacements that are uncorrelated with the actual surface deformation. We aim at enabling the transfer of knowledge from tilt- and strain-meter observations to DInSAR-derived data, thus allowing a first assessment of ground-truth-constrained displacement models.

[1] Braitenberg C. (2018). The deforming and rotating Earth - A review of the 18th International Symposium on Geodynamics and Earth Tide, Trieste 2016, Geodesy and Geodynamics, 187-196, doi:10.1016/j.geog.2018.03.003

[2] Braitenberg C., Pivetta T., Barbolla D. F., Gabrovsek F., Devoti R., Nagy I. (2019). Terrain uplift due to natural hydrologic overpressure in karstic conduits. Scientific Reports, 9:3934, 1-10, doi:10.1038/s41598-019-38814-1

[3] Rossi, G., Fabris, P. & Zuliani, D. Overpressure and Fluid Diffusion Causing Non-hydrological Transient GNSS Displacements. Pure Appl. Geophys. 175, 1869–1888 (2018). https://doi.org/10.1007/s00024-017-1712-x

[4] Dragoni M., Bonafede M., and Boschi E. (1985). On the interpretation of slow ground deformation precursory to the 1976 Friuli earthquake. Pure and Applied Geophysics 122, 781–792. doi:10.1007/978-3-0348-6245-5_3

[5] Grillo B., Braitenberg C., Nagy I., Devoti R., Zuliani D., Fabris P. (2018). Cansiglio Karst-Plateau: 10 years of geodetic-hydrological observations in seismically active northeast Italy. Pure and Applied Geophysics, 175, 5, 1765-1781, doi:10.1007/s00024-018-1860-7.

 

How to cite: Braitenberg, C., Pastorutti, A., Grillo, B., and Bartola, M.: The tilt and strainmeter network of NE Italy: multi-decadal observations of crustal deformation as ground truth for DinSAR., EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-9848, https://doi.org/10.5194/egusphere-egu21-9848, 2021.

11:24–11:26 | EGU21-9974 | ECS
Jonathan Sittner, Margarita Merkulova, Jose Ricardo da Assuncao Godinho, Axel Renno, Veerle Cnudde, Marijn Boone, Thomas De Schryver, Denis Van Loo, Antti Roine, Jussi Liipo, Bradley Martin Guy, and Stijn Dewaele

Image-based analytical tools in the geosciences are indispensable for the characterization of minerals, but most of them are limited to the surface of a polished plane through a sample and lack 3D information. X-ray micro computed tomography (micro CT) provides the missing 3D information on the microstructures inside samples. However, a major drawback of micro CT in the characterization of minerals is the lack of chemical information, which makes mineral classification challenging.

Spectral X-ray micro computed tomography (Sp-CT) is a new and evolving tool in different applications such as medicine, security, material science, and geology. This non-destructive method uses a multi-pixel photon-counting detector (PCD), such as cadmium telluride (CdTe), in combination with a conventional CT scanner (TESCAN CoreTOM) to image a sample and detect its transmitted polychromatic X-ray spectrum. Based on the spectrum, elements in a sample can be identified by an increase in attenuation at specific K-edge energies. Therefore, chemically different particles can be distinguished inside a sample from a single CT scan. The method is able to distinguish elements with K-edges in the range from 25 to 160 keV, which applies to elements with Z > 48 (Sittner et al., 2020).
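To illustrate the K-edge identification idea in its simplest form (our sketch, not the workflow used in the study; the tabulated edge energies are rounded standard values and the threshold is an assumption), candidate edges can be picked as jumps in the attenuation spectrum and matched to the nearest tabulated element:

```python
import numpy as np

# Rounded K-edge energies (keV) for a few elements with Z > 48; illustrative values.
K_EDGES = {"Sn": 29.2, "Ba": 37.4, "Ce": 40.4, "W": 69.5, "Au": 80.7, "Pb": 88.0, "U": 115.6}

def detect_k_edges(energy_kev, attenuation, jump_threshold):
    """Flag energies where the attenuation increases sharply (candidate K-edges)
    and assign the nearest tabulated element."""
    energy_kev = np.asarray(energy_kev, dtype=float)
    attenuation = np.asarray(attenuation, dtype=float)
    jumps = np.diff(attenuation)
    candidates = energy_kev[1:][jumps > jump_threshold]
    matches = []
    for e in candidates:
        element = min(K_EDGES, key=lambda el: abs(K_EDGES[el] - e))
        matches.append((float(e), element, K_EDGES[element]))
    return matches
```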

We present results from various sample materials. Different pure elements and element oxides were measured to compare the positions of theoretical and measured K-edge energies. All measured K-edge energies are slightly above the theoretical values, but based on these results a correction algorithm could be developed. Furthermore, different monazite grains were investigated; on the basis of their spectra, they can be divided into two groups with respect to their content of different RE elements: La-Ce-rich and La-Ce-poor. In addition, samples from the Au-U Witwatersrand Supergroup demonstrate the potential applications of Sp-CT for geological samples. We measured several drill core samples from the Kalkoenkrans Reef at the Welkom Gold field. Sp-CT can distinguish gold, uraninite and galena grains in the drill core, based on their K-edge energies, without any sample preparation.

Sittner, J., Godinho, J. R. A., Renno, A. D., Cnudde, V., Boone, M., De Schryver, T., Van Loo, D., Merkulova, M., Roine, A., & Liipo, J. (2020). Spectral X-ray computed micro tomography: 3-dimensional chemical imaging. X-Ray Spectrometry, September, 1–14.

How to cite: Sittner, J., Merkulova, M., Godinho, J. R. D. A., Renno, A., Cnudde, V., Boone, M., De Schryver, T., Van Loo, D., Roine, A., Liipo, J., Guy, B. M., and Dewaele, S.: Spectral X-ray computed micro tomography: a tool for 3-dimensional chemical imaging, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-9974, https://doi.org/10.5194/egusphere-egu21-9974, 2021.

11:26–11:28 | EGU21-11168 | ECS
Kirill Kuznetsov, Kiryukhina Elena, Bulychev Andrey, and Lygin Ivan

Magnetic surveys are commonly used to solve a variety of geotechnical and geological challenges in offshore areas, jointly with a set of other geophysical methods. The most popular technique is hydromagnetic surveying with towed magnetometers. One of the most significant challenges encountered during processing of the magnetic data is related to temporal variations of the Earth's magnetic field. Accounting for diurnal magnetic field variations is often done by carrying out differential hydromagnetic surveys, a technique developed in the 1980s. It is based on simultaneous measurements of the magnetic field using two sensors towed behind the vessel at a given separation. This technique allows the along-course gradient, which is free of magnetic field temporal variations, to be calculated. The measurement system resembles a gradiometer, with the distance between the two sensors referred to as the base of the gradiometer. The anomalous magnetic field can be calculated by integrating the obtained magnetic field gradient. Studies have shown that the accuracy of its reconstruction decreases with increasing base of the gradiometer. This becomes most significant when the distance between the sensors and the sources of the magnetic field anomalies is small, which is the case when the survey area is located in shallow water (i.e. for shallow marine, river or lake surveys).

An approach to deriving magnetic anomalies and accounting for diurnal variations in differential hydromagnetic surveys, based on the frequency (spectral) representation of the measurements, was proposed in 1987 [Melikhov, 1987]. This approach utilizes the fact that the spectrum of the magnetic field anomalies along the vessel course can be reconstructed from the spectra of the signals measured by the first, S1(ω), and second, S2(ω), sensors. Assuming that the sensors are located at the same depth, this can be achieved via the following transform:
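(The transform is reproduced here in the standard two-sensor form consistent with the definitions that follow; the sign of the exponent depends on the ordering of the sensors along the course.)

```latex
T(\omega) \;=\; \frac{S_1(\omega) - S_2(\omega)}{1 - e^{-\,i\,\omega\,l}}\,,
```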

where ω is the spatial frequency, l the base of the gradiometer, and i the imaginary unit. Assuming that at any given moment in time the magnetic field variations affect both sensors equally, the resulting Fourier spectrum T(ω) corresponds to the spectrum of the anomalous magnetic field, free of the magnetic variations. It should be noted that, as with the along-course gradient integration approach, the anomalous magnetic field is restored only to a certain accuracy level.

Estimates made on model examples showed that the accuracy of the field reconstruction using this method is comparable to the accuracy levels of modern marine magnetic surveys (±1-3 nT). For gradiometer bases comparable to or larger than the depths to the magnetic anomaly sources, the errors of the field reconstruction are significantly lower for the spectral-transformation-based approach than for along-course gradient integration.

References:

Melikhov V.R., Bulychev A.A., Shamaro A.M. Spectral method for solving the problem of separating the stationary and variable components of the geomagnetic field in hydromagnetic gradiometric surveys // Electromagnetic research. - Moscow. IZMIRAN, 1987. - P. 97-109. (in Russian)

 

How to cite: Kuznetsov, K., Elena, K., Andrey, B., and Ivan, L.: Spectral method for processing hydromagnetic survey data at shallow depths, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-11168, https://doi.org/10.5194/egusphere-egu21-11168, 2021.

11:28–11:30 | EGU21-12116 | ECS
Eva M. Rückert, Julius Förstel, and Norbert Frank

Palaeoceanographic studies of ocean circulation are crucial for understanding the ocean's impact on the Earth's climate system. Circulation patterns and the provenance of water masses can be detected from temporal variations of the neodymium isotopic composition (εNd) of authigenic neodymium preserved in deep-sea sediment.

Inductively coupled plasma source mass spectrometry allows for the precise and accurate determination of εNd-values of samples and reference material.

Here, we reevaluate the mass spectrometric measurement protocol and instrument setting with respect to precision and accuracy defined by neodymium standards.

The shape of the ion beam plays a crucial role: an optimal adjustment of the beam-shaping quadrupoles can increase the precision by a factor of 4.

In addition, the optimal standard neodymium concentration level is roughly 50 ppb yielding uncertainties of the mean of repeated measurements as low as 0.07 ε units whereas 5 times lower concentrations yield 10 times higher uncertainties.

The statistical nature of the precision is further demonstrated by an uncertainty inversely proportional to the square root of the number of measurements. As a consequence, with an increase from 30 to 80 consecutive measurements, the precision was improved by a factor of 1.22.

Taking all evaluated aspects into account, precision and accuracy of standards and thus sediment samples can be strongly improved, hence contributing to a better comprehension of past ocean circulation, where neodymium isotope gradients are small.

How to cite: Rückert, E. M., Förstel, J., and Frank, N.: Improvement of high resolution measurements of neodymium isotope compositions to reconstruct past ocean circulation, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-12116, https://doi.org/10.5194/egusphere-egu21-12116, 2021.

11:30–11:32 | EGU21-12735 | ECS
Tzvetan Simeonov, Ruud Dirksen, Christoph von Rohden, and Michael Sommer

The GCOS Reference Upper Air Network (GRUAN) consists of 30 globally distributed measurement sites that provide reference observations of essential climate variables such as temperature and water vapour for climate monitoring. At these sites, radiosondes provide in-situ profiles of temperature, humidity and pressure at high vertical resolution. However, data products from commercial radiosondes often rely on black-box or proprietary algorithms, which are not disclosed to the scientific user. Furthermore, long-term time series from these products are frequently hampered by changes in the hardware and/or the data processing. Therefore, GRUAN data products (GDPs) are developed that employ open-source and well-documented corrections to the measured data, thereby complying with the requirements for reference data, which include measurement traceability and the availability of measurement uncertainties. The GRUAN data processing is applied to the raw measurements of temperature, humidity, pressure, altitude, and wind, and includes corrections of errors from known sources, such as the solar radiation error for temperature and the sensor time lag for humidity measurements. The vertically resolved uncertainty estimates include the uncertainty of the applied corrections and the calibration uncertainty of the sensors.

A substantial number of GRUAN sites employ the Vaisala RS41 radiosonde, and its predecessor, the RS92, before that. This large-scale change of instrumentation poses a special challenge to the network, and great care is taken to characterize the differences between these instruments in order to prevent inhomogeneities in the data records. As part of this effort, the GRUAN data products for both radiosonde types are compared. In this study we used data from approximately 1000 RS92+RS41 twin-soundings (two sondes on a rig attached to one balloon) that were performed at 11 GRUAN sites, covering the main climate zones.

The first analysis shows that daytime temperature differences in the stratosphere increase steadily with altitude, with the RS92-GDP up to 0.5 K warmer than the RS41-GDP above 25 km. In addition, at daytime the RS41-GDP is 0.2 K warmer than the manufacturer-processed RS41-EDT product above 15 km. Analysis of the humidity profiles shows a slight moist bias of the RS41 compared to the RS92 for both the GDP and the manufacturer-processed data. Differences between the RS41-EDT and GDP humidity products are most pronounced in the upper-troposphere/lower-stratosphere region and are attributed to the time-lag correction. The analysis of the temperature differences will be refined by investigating the influence of solar radiation in conjunction with sonde orientation and ventilation. Furthermore, the uncertainty of the humidity data will be assessed by comparison with coincident measurements of the water vapour profile by the Cryogenic Frostpoint Hygrometer (CFH).

Key words: Radiosonde, RS41, RS92, humidity, temperature, uncertainty, GRUAN, troposphere, lower stratosphere

How to cite: Simeonov, T., Dirksen, R., von Rohden, C., and Sommer, M.: Intercomparison of the Vaisala RS92 and RS41 Radiosonde GRUAN Data Products (GDP) in the Troposphere and Lower Stratosphere, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-12735, https://doi.org/10.5194/egusphere-egu21-12735, 2021.

11:32–11:34 | EGU21-12850
Nina Kukowski, Ronny Stolz, Theo Scholtes, Cornelius Schwarze, and Andreas Goepel

The remote location of the Geodynamic Observatory Moxa of Friedrich-Schiller University Jena, about 30 km south of Jena in the Thuringian slate mountains, results in very low ambient noise and thus very good conditions for long-term geophysical observations, which are further improved by the installation of many sensors in the subsurface, in galleries or in boreholes.

So far, the focus of Moxa observatory has been on observing transient signals of deformation and fluid movement in the subsurface. This is accomplished by sensors such as the superconducting gravimeter CD-034, three laser strain meters measuring nano-strain along three galleries in north-south, east-west and NW-SE directions, and borehole tiltmeters. Further, information on fluid flow is gained from downhole temperature measurements employing an optical fiber. These sensors are complemented by a climate station and two shallow drill holes, one of which has been fully cored, which, in addition to the temperature time series, provide information on water level and rock physical properties. Near-surface geophysical profiling using, e.g., electrical resistivity tomography has led to a good knowledge of the structurally complex subsurface of the observatory.

Recently, a node of the Global Network of Optical Magnetometers for Exotic physics (GNOME) has been installed in the temperature-stabilized room at Moxa observatory, close to the superconducting gravimeter. GNOME is a world-spanning collaboration employing optically pumped magnetometers (OPM) to search for space-time correlated transient signatures heralding exotic physics beyond the Standard Model. GNOME is sensitive to prominent classes of dark-matter scenarios, e.g., axions or axion-like particles forming macroscopic structures in the Universe. The installation in close vicinity to the superconducting gravimeter ensures well-controlled and well-monitored ambient conditions such as temperature, air pressure and especially vibrations, allowing improved vetoing of false-positive detection events in the Moxa GNOME node.

Here, we focus on introducing Moxa Observatory’s sensor systems, with an emphasis on the actual sensor configurations, and on highlighting how the various information on fluid flow from the individual sensors leads to an improved understanding of the direction and magnitude of subsurface fluid flow.

How to cite: Kukowski, N., Stolz, R., Scholtes, T., Schwarze, C., and Goepel, A.: Long-term recordings at the FSU Jena Geodynamic Observatory Moxa (Thuringia, central Germany), EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-12850, https://doi.org/10.5194/egusphere-egu21-12850, 2021.

11:34–11:36 | EGU21-13053
Robert Green, Michael Rast, Michael Schaepman, Andreas Hueni, and Michael Eastwood

In 2018, a joint ESA and NASA airborne campaign was orchestrated with the University of Zurich to advance cooperation and the harmonization of algorithms and products from imaging spectrometer measurements. This effort was intended to benefit the future candidate European Copernicus Hyperspectral Imaging Mission for the Environment (CHIME) and the NASA Surface Biology and Geology (SBG) mission. For this campaign, the Airborne Visible/Infrared Imaging Spectrometer Next Generation was deployed from May to July 2018. Twenty-four study sites were measured across Germany, Italy, and Switzerland. All measurements were rapidly calibrated, atmospherically corrected, and made available to NASA and ESA investigators. An expanded 2021 campaign is now planned with goals to: 1) further test and evaluate new state-of-the-art science algorithms (atmospheric correction, etc.); 2) grow international science collaboration in support of ESA CHIME and NASA SBG; 3) test and demonstrate calibration, validation, and uncertainty quantification approaches; 4) collect strategic cross-comparison underflights of space missions (DESIS, PRISMA, Sentinels, etc.). In this paper, we present an overview of the key results from the 2018 campaign and plans for the 2021 campaign.

 

How to cite: Green, R., Rast, M., Schaepman, M., Hueni, A., and Eastwood, M.: Joint ESA and NASA Imaging Spectrometer Airborne Campaign to Support CHIME and SBG, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-13053, https://doi.org/10.5194/egusphere-egu21-13053, 2021.

11:36–11:38 | EGU21-13138 | ECS
Dimitrios Angelis, Craig Warren, Nectaria Diamanti, James Martin, and Peter Annan

The most frequently used survey mode for acquiring Ground Penetrating Radar (GPR) data is common offset (CO) – where a single transmitter and receiver pair move along a survey line at a constant (offset) separation distance. This allows rapid and dense data acquisition, and therefore high-resolution large-scale investigations, to be carried out with relative ease, and at relatively low cost. However, it has long been known that multi-offset survey methods, such as common mid-point (CMP) and wide-angle reflection-refraction (WARR), can offer many benefits over CO: detailed subsurface EM wave velocity models; enhanced reflection sections with higher signal-to-noise ratio (SNR); the potential to adapt well-established advanced seismic processing schemes for GPR data [1-2].

Despite the advantages of multi-offset GPR data, these methods have seen limited adoption as, in the past, they required significantly more time, effort, and hence cost, to collect. However, recent advances in GPR hardware, particularly in timing and control technology, have enabled the development of multi-concurrent sampling receiver GPR systems such as the “WARR Machine” manufactured by Sensors & Software Inc. [3-4]. These newly developed GPR systems have the potential to provide all the aforementioned benefits with considerably less effort and therefore reduced survey cost, as they allow for the fast acquisition of multi-offset WARR soundings.

In this work, we look at the challenges and opportunities of collecting and processing multi-offset GPR data. We demonstrate a processing workflow that combines standard GPR processing approaches with methods adapted from seismic processing, as well as our own algorithms. This processing framework has been implemented in a GUI-based software package written in MATLAB [5] and has been tested using both synthetic [6] and real multi-offset GPR data. Some of the specific challenges with multi-offset GPR that we investigate are time-zero misalignments, CMP balancing, velocity analysis, and automated velocity picking. We show how addressing these issues can result in improved velocity analysis, and ultimately in improved subsurface velocity models and stacked sections.
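For reference (the standard CMP relation, not a formula specific to the authors' toolbox), velocity analysis on CMP/WARR gathers conventionally exploits the hyperbolic moveout of a reflection recorded at transmitter-receiver offset x:

```latex
t^{2}(x) \;=\; t_{0}^{2} + \frac{x^{2}}{v_{\mathrm{rms}}^{2}},
```

where t0 is the zero-offset two-way travel time and v_rms the root-mean-square electromagnetic wave velocity above the reflector; fitting this hyperbola over a range of trial velocities (e.g. via semblance) yields the subsurface velocity model.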

References

[1] Ursin, B., 1983. Review of elastic and electromagnetic wave propagation in horizontally layered media. Geophysics, 48(8), pp.1063-1081.

[2] Carcione, J. and Cavallini, F., 1995. On the acoustic-electromagnetic analogy. Wave Motion, 21(2), 149-162.

[3] Annan, A. P., and Jackson, S., 2017. The WARR machine. 2017 9th International Workshop on Advanced Ground Penetrating Radar (IWAGPR).

[4] Diamanti, N., Elliott, J., Jackson, R. and Annan, A. P., 2018, The WARR Machine: System Design, Implementation and Data: Journal of Environmental & Engineering Geophysics, 23, pp.469-487.

[5] Angelis, D., Warren, C. and Diamanti, N., 2020. A software toolset for processing and visualization of single and multi-offset GPR data. 18th International Conference on Ground Penetrating Radar.

[6] Warren, C., Giannopoulos, A. and Giannakis, I., 2016. gprMax: Open source software to simulate electromagnetic wave propagation for Ground Penetrating Radar. Computer Physics Communications, 209, pp.163-170.

How to cite: Angelis, D., Warren, C., Diamanti, N., Martin, J., and Annan, P.: Challenges and opportunities from large volume, multi-offset Ground Penetrating Radar data, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-13138, https://doi.org/10.5194/egusphere-egu21-13138, 2021.

11:38–11:40 | EGU21-13341 | ECS
Dursun Acar, Namık Çagatay, Ş. Can Genç, K. Kadir Eriş, Demet Biltekin, and Nurettin Yakupoğlu

Surface fractures in the filament of an X-ray tube grow with metal fatigue or with improper cooling and heating procedures. A fractured filament continues to operate, intermittently switching between open-circuit states at random times and a fully conductive state shortly afterwards. We explain how these open-circuit flashes at the filament lead to erroneous measurement results. The conduction problems in the filament's low-voltage circuit recur on millisecond time scales, yet the output still gives the impression of healthy measurement values: the measured sample absorbs photon energy and redistributes it to neighbouring elements through continuous electron-scattering processes, so delayed secondary-electron scattering masks the interruptions of the electron supply on the partly broken filament wire, and the detector keeps registering indirect counts from this reflected energy. (The real counts do not originate from the intended beam target on the sample surface during these energy dropouts, and it is difficult to recognize that the measurement is inaccurate because the artefact resembles genuine discrete element counts in sedimentary layers.) Filament voltage arcs do not trigger the instrument's error-reporting system until the fully ruptured filament touches the anode walls or the two broken filament ends become widely displaced. If the lithology is not cross-checked, such erroneous records can enter the scientific literature. We have compiled faulty measurement data from our own experience to indicate when and how such events may occur.

To avoid the failure modes described above, the tubes must be heated and cooled gently. Excessive cooling or heating of the tubes, or oxide build-up and leakage at gasket contacts, reduces the contact area of the insulators as condensing water corrodes the rubber insulation gaskets; this leads to coolant leakage or increasing humidity in the tube housing block and subsequently to cascading failures of the high-voltage unit, including a growing number of broken-tube events. When insulating gaskets are replaced, sufficient care should be taken to coat the gasket contact points with a thin film of silicone grease. Continuous control of the results and correlation against reference measurements on the scientific sample remain essential. In this way, possible erroneous results caused by measurement malfunctions can be avoided.

How to cite: Acar, D., Çagatay, N., Genç, Ş. C., Eriş, K. K., Biltekin, D., and Yakupoğlu, N.: Experiences on production - usage reasoned malfunctions & development of  X-ray tubes used in science and their effects on sediment measurements, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-13341, https://doi.org/10.5194/egusphere-egu21-13341, 2021.

11:40–11:42 | EGU21-14085
Wen-Chau Lee, Jothiram Vivekanandan, Scott Ellis, Kevin Manning, George Bryan, Lou Lussier, Vanda Grubišić, and Bradley Klotz

The proposed airborne phased array radar (APAR) system consists of four removable, dual-polarized, C-band AESAs (Active Electronically Scanned Arrays) strategically located on the fuselage of the NSF/NCAR C-130. Conceptually, the radar system is divided into a front end, a back end, and an aircraft-specific section, with the front end primarily consisting of the AESAs and the signal processor located in the back end. The aircraft-specific section includes a power system and a GPS antenna.

As part of the risk reduction for the APAR development, the APAR Observing Simulator (AOS) has been developed to provide simulated APAR data collection, sampled from a C-130 flying by or through realistic numerical weather simulations of high-impact weather events. Given that APAR is designed to extend beyond the capabilities of existing airborne tail Doppler radars (e.g., the NOAA TDRs and the retired NSF/NCAR ELDORA), verification of the signal processing software and algorithms is needed before the radar is physically built, to ensure that the signal processing software infrastructure can handle the high data rates and the complicated, multiplexed scanning that will be part of normal APAR operations. Furthermore, several algorithms that will need to ingest large amounts of APAR data at very high rates are under development, including dual-Doppler wind synthesis, radar reflectivity attenuation correction, rain rate estimation, and hydrometeor classification. These algorithms need to be tested and verified before implementation.

The AOS will also serve as a planning tool for future Principal Investigators (PIs), who will use it to design and test different flight and scanning strategies on simulated storms to yield the best scientific outcomes before their field deployments take place. This will enable a better understanding of the trade-offs among various sampling regimes and strategies during planning and enhance the efficiency and effectiveness of future field programs.

How to cite: Lee, W.-C., Vivekanandan, J., Ellis, S., Manning, K., Bryan, G., Lussier, L., Grubišić, V., and Klotz, B.: Overview of the Airborne Phased Array Radar Observing Simulator, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-14085, https://doi.org/10.5194/egusphere-egu21-14085, 2021.

11:42–11:44 | EGU21-15640
Juergen Kusche and the CLOck NETwork Services - Design Study (CLONETS DS) Team

Precise measurement of time and frequency has been instrumental in the development of modern geosciences. It has enabled us to quantify many observations, including plate motion, the variations of Earth rotation, and modern-day sea level rise.

Over the past decade, European National Metrology Institutes (NMIs), together with National Research and Education Networks (NRENs) and partners from universities and research institutes have pioneered the dissemination of ultra-stable optical frequency and timing signals via optical fibers. Initially started as proof-of-concept experiments, this technology has matured to aim for a paradigm change: making precise time and frequency signals available to the wider scientific community and thereby enabling new research avenues.

The CLOck NETwork Services Design Study (CLONETS-DS) is a research and innovation action intended to facilitate the vision of a sustainable, pan-European optical fiber network for precise time and frequency reference dissemination.

Here, we will present the envisioned technology and its performance parameters, and discuss potential applications, requirements and limitations in the geosciences, for example in geodesy (chronometric levelling, gravity field observation), seismology, and very-long-baseline interferometry (VLBI).

How to cite: Kusche, J. and the CLOck NETwork Services - Design Study (CLONETS DS) Team: Optical frequency dissemination via fiber networks: The Clock Network Services (CLONETS) project and potential applications in the geosciences, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-15640, https://doi.org/10.5194/egusphere-egu21-15640, 2021.

11:44–11:46 | EGU21-16275
Tarek Arafa-Hamed, Hossam Marzouk, Michael Becken, Ahmed Lethy, and Hatem Odah

Magnetotelluric loggers are key instruments for deep geophysical studies of the crust and mantle. However, conducting a large-scale survey requires a series of magnetotelluric instruments to complete the measurements in a reasonable time. The main efforts and costs of a magnetotelluric survey are devoted to the magnetic recordings. Therefore, using a combination of magnetotelluric stations together with telluric recorders operating in parallel can significantly reduce the time and cost needed to complete a regional survey. Based on this motivation, we present the construction, implementation and case studies of a long-period telluric recorder (LPTR).

The telluric recorder is based on a 24-bit ADC with a multiplexer that provides two differential channels for the Ex and Ey telluric components. The multiplexer is set to deliver 1 sample per second from each channel, corresponding to a 2 Hz sampling rate at the ADC. Multiplexing at this rate reduces the effective ADC resolution to 20 bit. As the full measuring range is ±1.25 V, the least significant bit (LSB) is about 2.4 µV. The output of the ADC is transferred via USB to a mini PC for time stamping and storage. The time of each record is provided by a GPS receiver with an accuracy of 1 ms. The LPTR is connected to the ground using Cu-CuSO4 non-polarizable electrodes, which are specially constructed to provide a good, long-term connection to the ground in arid environments.

The LPTR has been tested in several field deployments in Egypt. Continuous telluric acquisition was set up at Moghra, Dakhla, Farafra and Fayoum; these locations cover a variety of settings in northern and southern Egypt, including the Western Desert and the Nile Valley. During the test deployments, the recorder was first run in parallel with an ADU07-e magnetotelluric system for 1-3 days and then for 2-4 months to be compared and integrated with the magnetic observatories at Fayoum and Abu Simbel. Both observatories run MAGSON fluxgate magnetometers at a sampling rate of 1 Hz. The resulting data show that the LPTR synchronizes with the ADU07-e at periods from 5 s and with the magnetic observatory data at periods from 25 s. This indicates an efficient low-cost system that can be used for deep Earth resistivity investigations. A case study of 2-4 months of continuous telluric recordings processed with magnetic observatory data provided impedances for periods up to 42000 seconds. The results were modeled in 1D for depths of more than 800 km. A comparison between the obtained 1D MT model and global Earth models (LITHO1) based on seismological data shows quite good matching at the deep interfaces such as the upper, middle and lower crust. The seismic discontinuities at 410 km and 680 km are matched by clear changes in resistivity at 410 km and then at 700 km as well.
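The quoted resolution follows directly from the stated measuring range and effective bit depth:

```latex
\mathrm{LSB} \;=\; \frac{2.5\ \mathrm{V}}{2^{20}} \;\approx\; 2.4\ \mu\mathrm{V}.
```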

How to cite: Arafa-Hamed, T., Marzouk, H., Becken, M., Lethy, A., and Odah, H.: Development and implementation of a low-cost long-period telluric recorder for deep Earth electrical investigations, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-16275, https://doi.org/10.5194/egusphere-egu21-16275, 2021.

11:46–12:30