How to quantify the accuracy of mass anomaly time-series based on GRACE data in the absence of knowledge about the true signal?


Abstract

A novel technique has been developed to assess noise levels in GRACE-based mass anomaly time-series when the true signal is not known. The technique is based on computing an optimal combination of the analyzed time-series in the presence of regularization. To find the optimal weights associated with the individual time-series, variance component estimation is used. In this way, the noise variance (and, therefore, the noise standard deviation) of each time-series is estimated. To validate the developed technique, altimetry-based water level variations in several lakes are used as independent information. These variations are compared with mass anomaly time-series extracted from eight GRACE models of the time-varying Earth’s gravity field produced at different data processing centers. The lake tests demonstrate a good performance of the developed technique, provided that the regularization functional is properly chosen. The best results are obtained with a novel regularization functional, which can be understood as a minimization of year-to-year differences between values of the second time-derivative of the unknown function. Finally, the GRACE models under consideration are analyzed globally. It is found that the models produced at the Institute of Geodesy at Graz University of Technology (ITSG) and at the Center for Space Research of the University of Texas at Austin (CSR) show, in general, the lowest noise levels. The aforementioned lake tests also allow the signal damping in GRACE models to be quantified. It is shown, among other things, that regularized GRACE models may suffer from noticeable signal damping (up to ∼15%).
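The combination-plus-VCE idea summarized above can be sketched in a few lines of code. The following Python sketch is not the authors' implementation: the function names, the discretization of the regularization operator, and the parameter `alpha` are illustrative assumptions. It combines several noisy time-series of one signal under a regularization that penalizes year-to-year differences of the second time-derivative, and iteratively estimates a noise variance for each series.

```python
import numpy as np

def second_difference_matrix(n):
    """(n-2) x n operator approximating the second time-derivative."""
    D2 = np.zeros((n - 2, n))
    for t in range(n - 2):
        D2[t, t:t + 3] = [1.0, -2.0, 1.0]
    return D2

def lag_difference_matrix(n, lag=12):
    """(n-lag) x n operator of year-to-year differences (monthly epochs)."""
    D = np.zeros((n - lag, n))
    for t in range(n - lag):
        D[t, t] = -1.0
        D[t, t + lag] = 1.0
    return D

def vce_combine(series, alpha=0.1, n_iter=20):
    """series: (K, T) array of K time-series of one mass-anomaly signal.
    Returns the regularized combined estimate and per-series noise variances
    obtained by iterative variance component estimation (VCE)."""
    K, T = series.shape
    # Regularization: year-to-year differences of the second time-derivative.
    # Note that this operator annihilates both linear trends and signals that
    # repeat exactly from year to year, so it barely biases such signals.
    R = lag_difference_matrix(T - 2, lag=12) @ second_difference_matrix(T)
    RtR = R.T @ R
    sigma2 = np.ones(K)                           # initial variance components
    for _ in range(n_iter):
        w = 1.0 / sigma2
        N = w.sum() * np.eye(T) + alpha * RtR     # normal matrix
        x = np.linalg.solve(N, (w[:, None] * series).sum(axis=0))
        trace_Ninv = np.trace(np.linalg.inv(N))
        for i in range(K):
            resid = series[i] - x
            r_i = T - w[i] * trace_Ninv           # partial redundancy of series i
            sigma2[i] = resid @ resid / r_i       # VCE variance update
    return x, sigma2

if __name__ == "__main__":
    # Synthetic check: three copies of a seasonal signal with different noise.
    rng = np.random.default_rng(0)
    t = np.arange(120)                            # ten years of monthly epochs
    truth = 10.0 * np.sin(2 * np.pi * t / 12.0)
    series = np.stack([truth + s * rng.standard_normal(t.size)
                       for s in (1.0, 2.0, 4.0)])
    x_hat, sigma2_hat = vce_combine(series)
    print("estimated noise std:", np.sqrt(sigma2_hat))  # roughly (1, 2, 4)
```

In this toy setting the lag-12 differencing of the second derivative leaves a strictly periodic annual signal (and any linear trend) untouched, which is one way to read the advantage of the regularization functional described in the abstract: it suppresses noise without damping the repeating seasonal part of the signal.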