Uncertainty of discharge measurement using salt dilution
- 1Norges vassdrags- og energidirektorat (NVE), Oslo, Norway (ahau@nve.no)
- 2Fathom Scientific Ltd., R&D, Canada (gsentlin@telus.net)
Discharge measurement using salt dilution is an old method, but it has recently seen renewed use thanks to new sensors that make it possible to measure conductivity and compute discharge in real time. Salt dilution is very well suited to turbulent rivers, such as mountain streams. The ISO standard ISO 9555 proposes a normative framework for estimating uncertainty, but it was published in 1994 and is now obsolete given modern sensors and computational capabilities. In this article, we propose a complete framework for computing the uncertainty of a salt dilution gauging following the GUM (Guide to the expression of uncertainty in measurement) method, taking into account the following error sources: (i) the uncertainty in the mass of salt injected, (ii) the uncertainty in the measurement of time, (iii) the uncertainty in the Conductivity to Concentration law, (iv) the uncertainty when a measured conductivity is outside the range of the Conductivity to Concentration law, (v) the uncertainty in the computation of the area under the conductivity curve, (vi) the uncertainty due to imperfect mixing of the tracer if the mixing length between the injection point and the probes is not reached, (vii) the uncertainty due to a loss or gain of tracer between the injection and the probes, for example if tracer is adsorbed, and (viii) the uncertainty due to unsteadiness of the flow, i.e. variation of discharge during the measurement. The method for computing each uncertainty source is presented, and the new framework is applied to a set of real measurements and compared to the expertise of field hydrologists.
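For readers unfamiliar with the GUM machinery, the following minimal sketch (not the authors' code; all component values are hypothetical placeholders) shows how independent relative standard uncertainties of the eight sources could be combined in quadrature and expanded with a coverage factor k = 2:

```python
import math

def combined_discharge_uncertainty(rel_uncertainties, k=2.0):
    """Combine independent relative standard uncertainties in quadrature
    (GUM, first order) and expand with coverage factor k (k=2 ~ 95 % level)."""
    u_c = math.sqrt(sum(u ** 2 for u in rel_uncertainties))
    return k * u_c

# Hypothetical component values (as fractions of Q), for illustration only
components = {
    "salt mass": 0.005,
    "timing": 0.002,
    "conductivity-to-concentration law": 0.010,
    "out-of-range conductivity": 0.000,
    "area under the curve": 0.008,
    "imperfect mixing": 0.020,
    "tracer loss/gain": 0.005,
    "unsteady flow": 0.005,
}

U = combined_discharge_uncertainty(components.values())
print(f"Expanded relative uncertainty of Q (k=2): {U:.1%}")
```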
How to cite: Hauet, A., Florvaag-Dybvik, K., Dahl, M.-P. J., Kvernhaugen, F. T., Møen, K. M., and Sentlinger, G.: Uncertainty of discharge measurement using salt dilution, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-4661, https://doi.org/10.5194/egusphere-egu2020-4661, 2020.
Comments on the display
Alex and co-authors, thanks for this sound and very useful work, with a nice software tool at the end. This is really original and necessary to improve operational practices, in a worldwide context of increasing interest in tracer dilution streamgauging techniques.
Here are a few comments for the sake of discussion.
Your results show that uncertainty due to insufficient mixing can be high, as expected. How could we estimate this uncertainty component when only one probe is used and the injection is not repeated? (I agree that the best practice is to use at least two probes and/or repeat the injection with different probe locations, but many single-probe measurements do exist.)
I'm not sure I agree with the way you estimate the uncertainty due to the noise and resolution of the concentration signal. In your example (slide 16), the uncertainty results look rather severe, and I understand this is because you take the variability of the signal as a measure of the uncertainty in the mean concentration. I rather think that those random errors would average out through the integration of the concentration curve; as a result, I guess the discharge uncertainty would be minimal. I think the standard deviation should be divided by sqrt(N), with N the number of readings, or something like that.
Lastly, I have often observed that, under the same conditions, different operators sometimes produce scattered discharge results... This strong operator effect seems to be related mainly to the calibration, depending on the manual/visual skills of each operator. I don't see such an effect included in your analysis. How could it be estimated and included? I think that interlaboratory comparisons would be helpful for this.
Hi Jerome, thanks for your superb observations. First, let me say that I think Alex has done an excellent and very comprehensive job of the uncertainty analysis, but it's not the final draft. This online forum seems to be a great peer-reviewed path towards finding a consensus on Salt Dilution uncertainty. The silver lining of the online platform, besides a wider audience, is perhaps this permanent record of discussions.
I agree with your comment about the uncertainty in BGECT. In my 2015 post:
I do just that: divide by sqrt(N), where N is the number of points in the sample. This is the central limit theorem: assuming each ECT point in the sample is already a mean of many measurements (it is, >1000), the standard deviation of the distribution about the mean BGECT, when divided by √N, is the new uncertainty in the mean. This does have the possibility of underestimating the sample uncertainty, especially if the sensor resolution is not considered. For that reason, in the QiQuac and Salt Portal, we use
(2) EC.T_unc = max(SE_BGECT, R/2)

where R is the sensor resolution and SE_BGECT is the standard error in the BGECT, given by:

(3) SE_BGECT = s_BGECT / √N
Alex has considered R/(2√3) on page 8. One complication of this is the assumption that the BGECT is steady. For that reason, we now consider the standard error about the sloping regression line between the pre- and post-injection BGECT in equation (3) above. This also has a tendency to underestimate the BGECT uncertainty when divided by √N.
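As a concrete illustration of equations (2) and (3), here is a minimal sketch (my reading of the approach, not the actual QiQuac/Salt Portal code) that estimates the background uncertainty from the standard error of the residuals about a sloping pre/post background trend, floored at half the sensor resolution:

```python
import numpy as np

def bgect_uncertainty(t, bgect, resolution):
    """Background EC.T uncertainty: standard error of the residuals about a
    linear (sloping) trend fitted through the pre- and post-injection
    background, floored at half the sensor resolution R."""
    t = np.asarray(t, dtype=float)
    bgect = np.asarray(bgect, dtype=float)
    slope, intercept = np.polyfit(t, bgect, 1)        # sloping background trend
    residuals = bgect - (slope * t + intercept)
    se = residuals.std(ddof=2) / np.sqrt(len(bgect))  # SE_BGECT = s_BGECT / sqrt(N)
    return max(se, resolution / 2.0)                  # eq. (2): max(SE_BGECT, R/2)

# Example with made-up numbers: 60 pre/post readings, 1 uS/cm sensor resolution
t = np.arange(60.0)
bg = 120.0 + 0.01 * t + np.random.normal(0.0, 0.3, 60)
print(bgect_uncertainty(t, bg, resolution=1.0))
```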
I'd like to see this GUM approach validated. One way to do this would be to compare measurement pairs from the two sensors, assuming complete mixing. Another way to validate this estimate of uncertainty is to compare the derived uncertainty to the deviation from the established rating curve. I'd like to see if there is a correlation between the GUM uncertainty and the deviation from the RC, much like Frode presented in Wales.
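One way to formalize the two-sensor check is a normalized-error statistic per gauging; this is a hypothetical sketch, not part of the presented framework, assuming both discharges carry expanded uncertainties at the same coverage factor:

```python
def normalized_error(q1, u1, q2, u2):
    """En-style statistic for a probe pair: |Q1 - Q2| divided by the combined
    expanded uncertainty (u1 and u2 at the same coverage factor, e.g. k=2).
    Values <= 1 mean the pair agrees within its stated uncertainties."""
    return abs(q1 - q2) / (u1 ** 2 + u2 ** 2) ** 0.5

# Hypothetical pair: 2.10 and 2.04 m3/s, each with a 5 % expanded uncertainty
print(normalized_error(2.10, 0.05 * 2.10, 2.04, 0.05 * 2.04))
```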
I'd also like to move away from field-derived CFT values, as we've determined that the natural variation of the CFT is of the same order of magnitude as the field-sampling uncertainty (http://confluence-jwsm.ca/index.php/jwsm/article/view/1/2). That's why, in our Salt Portal, we keep a database of historical CFT values at each site, with the aim of allowing the user to select the average CFT for a site, or a CFT compensated for BGECT, since we have determined there is a slight (+1.5% over 0–5000 µS/cm) dependence.
So I believe this is a great start, but there is more work to do. Thank you for the comment.
I didn't realize the equations as graphics wouldn't post.
(2) is EC.T_unc = max(SE_BGECT, R/2)
where R is the sensor resolution, and
(3) is SE_BGECT = s_BGECT/√N
where s_BGECT is the standard deviation of the N BGECT samples.
Hi Jérôme,
Thanks for your comments.
When a measurement is made using only one probe, which is not recommended, it is not possible to compute the uncertainty due to possibly poor mixing. In the SUNY approach, for such a situation, we set the mixing uncertainty to 15% (at k=2). It is not perfect, of course, but it acts as a warning. If the field hydrologist knows the stream reach well and knows that the mixing is probably adequate, this value can be reduced.
For the uncertainty due to the noise, we assume that the noise is probably not random. Most of the time, the noise comes from air bubbles passing over the sensor, reducing the conductivity, so the noise is almost always biased below the signal. We therefore decided not to reduce the noise estimate by the number of samples.
For your last point about the operator effect, it is taken into account in our framework: as you can see from the screenshot of the software, there is an uncertainty source called “operator and environmental effect” that is part of the uncertainty of the calibration. It is a systematic error, set to 2% (k=2) by default (this value comes from an interlaboratory experiment conducted by EDF-DTG).
Please feel free to ask more during the chat if needed!
Alex
Interesting work! Slide 10 mentions a web interface for SUNY. Is there a link one could use to play around a bit?
Best,
Till