Groundwater Modelling Uncertainty Analysis

By Neil Manewell

Regional groundwater models are simplified representations of complex systems, so any single set of predictions from a groundwater model carries a high level of uncertainty.

Although we do our best to build sufficient complexity into a model (e.g. numerous layers, hydraulic zonation or pilot points), there will always be uncertainty, simply because it is impossible to measure the aquifer and recharge parameters of every square metre of the model domain.

To reduce uncertainty as far as reasonably possible, we calibrate our models against all available observation data, e.g. groundwater levels, sump pumping observations, and hydraulic testing results.

There are an infinite number of parameter combinations that will produce a ‘calibrated’ model, and it is therefore important to explore impact predictions across these parameter interactions.

The uncertainty in the predictions of a model can be explored using a number of methods. A relatively quick approach is predictive linear analysis. This method assumes a normal distribution for each parameter, centred on its initial calibrated value.

The probability and range of predictions are calculated by making very small changes to each parameter during the predictive simulation, then projecting the likely impacts outwards, assuming a normal distribution. One advantage of this method is the quantification of each parameter’s contribution to uncertainty; for example, the hydraulic conductivity of a layer may be contributing ±1 ML/day to predicted mine inflows.
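As an illustration only, the linear propagation step can be sketched in Python with a hypothetical one-line ‘model’ standing in for the full groundwater model; the `inflow` function, parameter values, and standard deviations below are all invented for the example:

```python
import numpy as np

# Hypothetical toy 'model': mine inflow (ML/day) as a function of two
# parameters -- hydraulic conductivity K (m/day) and recharge R (mm/yr).
# A real analysis would run the full groundwater model instead.
def inflow(params):
    K, R = params
    return 2.0 * K + 0.05 * R  # illustrative only

# Calibrated parameter values and their standard deviations
# (assumed normally distributed, as linear predictive analysis requires).
p_cal = np.array([5.0, 100.0])
p_std = np.array([1.0, 20.0])

# Sensitivities obtained by making very small changes to each parameter
# (a finite-difference approximation of the Jacobian).
eps = 1e-6
jac = np.array([
    (inflow(p_cal + eps * np.eye(2)[i]) - inflow(p_cal)) / eps
    for i in range(2)
])

# First-order (linear) propagation: the prediction variance is the sum of
# the squared per-parameter contributions (sensitivity x parameter std).
contrib = jac * p_std                 # per-parameter contribution (ML/day)
pred_std = np.sqrt(np.sum(contrib ** 2))

print(f"Inflow: {inflow(p_cal):.1f} +/- {pred_std:.2f} ML/day")
for name, c in zip(["K", "R"], contrib):
    print(f"  {name} contributes +/- {abs(c):.2f} ML/day")
```

The per-parameter `contrib` values are what allow statements such as “hydraulic conductivity contributes ±2 ML/day to mine inflows”.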

It will also identify which groundwater observations contribute most to the calibration, and therefore most reduce the uncertainty of predictions.


Because parameter interactions are rarely linear, relying on linearly derived results for complex groundwater systems may not be appropriate in some cases. Non-linear uncertainty analysis actually tests parameter interactions across their predicted ranges to derive the likelihood of groundwater impacts.

One method is a null-space Monte Carlo technique, which uses the information from the calibration dataset to constrain the range and frequency of the parameters attributed to each unit in the model. This process creates several hundred ‘realisations’ of a calibrated groundwater model, which can be used to analyse the range in predictive impacts. Similar to linear predictive analysis, the process of assigning the frequency of parameter values usually assumes a normal distribution.
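The Monte Carlo step can be sketched as follows. This toy example shows only the sampling of several hundred normally distributed realisations and the resulting prediction range, not the null-space projection a real NSMC analysis uses to keep each realisation calibrated; the `inflow` function and parameter statistics are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy 'model' of mine inflow (ML/day); a real NSMC analysis
# would run the full groundwater model for every realisation.
def inflow(K, R):
    return 2.0 * K + 0.05 * R

# Several hundred realisations, with parameters drawn from normal
# distributions centred on (assumed) calibrated values.
n = 500
K = rng.normal(5.0, 1.0, n)      # hydraulic conductivity (m/day)
R = rng.normal(100.0, 20.0, n)   # recharge (mm/yr)

Q = inflow(K, R)
lo, hi = np.percentile(Q, [5, 95])
print(f"90% of realisations predict inflows between {lo:.1f} and {hi:.1f} ML/day")
```

The spread between the 5th and 95th percentiles is the predictive range the realisation suite is built to expose.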

In reality, aquifer and recharge parameters rarely follow a normal distribution. Parameters can have skewed, flat, or multi-modal distributions. Neglecting the possibility of a non-normal distribution can have significant implications for the range of groundwater impact predictions. AGE prefers the use of a calibration-constrained GLUE (generalised likelihood uncertainty estimation) approach.

GLUE works by starting with a random suite of models drawn from non-normal parameter distributions. Each model is tested to ensure calibration is maintained, and the parameter distributions are refined by rejecting models that fall outside a measurement threshold, e.g. a 100% increase in the objective function. One drawback is the many thousands of ‘realisations’ required to properly explore all possible parameter combinations.
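A minimal sketch of the GLUE rejection step, assuming a hypothetical head ‘model’, a single invented observation, and an assumed objective function value for the calibrated model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy inflow 'model' (ML/day), as in the text's examples.
def inflow(K, R):
    return 2.0 * K + 0.05 * R

# Non-normal priors: log-uniform conductivity, flat recharge.
n = 10_000
K = 10 ** rng.uniform(0.0, 1.0, n)   # 1-10 m/day, log-uniform
R = rng.uniform(50.0, 200.0, n)      # 50-200 mm/yr, flat

# Hypothetical head 'model' and a single observed head (invented values);
# a real study would use the full objective function over all observations.
head_sim = 50.0 - 1.5 * K + 0.02 * R
head_obs = 45.0
phi = (head_sim - head_obs) ** 2     # objective function per realisation

# Reject realisations whose objective function is more than 100% above
# that of the calibrated model (phi_cal is an assumed calibrated value).
phi_cal = 1.0
keep = phi <= 2.0 * phi_cal

q95 = np.percentile(inflow(K[keep], R[keep]), 95)
print(f"{keep.sum()} behavioural realisations; "
      f"95th percentile inflow {q95:.1f} ML/day")
```

The rejection step is also why so many realisations are needed: only the ‘behavioural’ fraction of the initial suite survives to inform the predictions.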

DREAM (DiffeRential Evolution Adaptive Metropolis) uncertainty analysis is, in our opinion, currently the best method to quantify uncertainty. Similar to GLUE, the process assumes non-normal distributions of parameters. Parameter distributions are generated using a genetic (differential evolution) algorithm, meaning there is no requirement for calibration rejection, as eventually all realisations generated in the analysis are ‘calibrated’.
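The core proposal mechanism can be sketched with a simplified differential-evolution Markov chain sampler (DE-MC, the scheme DREAM builds on, without DREAM’s subspace sampling and adaptation). The likelihood function, observation, and prior bounds below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Log-likelihood of a hypothetical calibration dataset: a single observed
# head of 45 m with 0.5 m measurement error, and uniform prior bounds.
def log_like(p):
    K, R = p
    if not (0.1 < K < 20 and 10 < R < 300):
        return -np.inf
    head_sim = 50.0 - 1.5 * K + 0.02 * R   # toy head 'model'
    return -0.5 * ((head_sim - 45.0) / 0.5) ** 2

# Differential-evolution Metropolis: each chain proposes a jump built from
# the difference of two other chains, so proposals adapt to the posterior.
n_chains, n_steps = 8, 2000
gamma = 2.38 / np.sqrt(2 * 2)              # standard DE-MC jump scale (2 params)
chains = np.column_stack([rng.uniform(1, 10, n_chains),
                          rng.uniform(50, 200, n_chains)])
logp = np.array([log_like(c) for c in chains])
samples = []

for step in range(n_steps):
    for i in range(n_chains):
        a, b = rng.choice([j for j in range(n_chains) if j != i],
                          2, replace=False)
        prop = (chains[i] + gamma * (chains[a] - chains[b])
                + rng.normal(0, 1e-3, 2))  # small noise term
        lp = log_like(prop)
        if np.log(rng.uniform()) < lp - logp[i]:   # Metropolis acceptance
            chains[i], logp[i] = prop, lp
    if step > n_steps // 2:                        # discard burn-in
        samples.append(chains.copy())

samples = np.concatenate(samples)
print(f"Posterior mean K: {samples[:, 0].mean():.2f} m/day")
```

Because every retained sample is drawn from the posterior, there is no separate rejection step: the chains converge so that all retained realisations are, in effect, ‘calibrated’.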

The results from the above analyses can express the probability of drawdown, change in groundwater flux, baseflow, or dewatering predictions as a composite result across all ‘realisations’, e.g. 95th percentile drawdown estimates predict Bore_1a will go dry in 2028, or cumulative 80th percentile dewatering requirements of 120 GL.
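Extracting such composite percentile results from a realisation ensemble is then straightforward; here the 1,000 dewatering volumes are randomly generated stand-ins for real model outputs:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical cumulative dewatering volumes (GL), one per 'realisation';
# values are simulated here purely for illustration.
dewatering = rng.lognormal(mean=np.log(100), sigma=0.2, size=1000)

p80 = np.percentile(dewatering, 80)
print(f"80th percentile cumulative dewatering: {p80:.0f} GL")
```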