Oral Presentation: Australian Freshwater Sciences Society Conference 2024

Making the most of historical datasets: techniques to accurately interpret historical long-term water quality monitoring (113433)

Vaughn Grey 1
  1. University of Melbourne, Burnley, Victoria, Australia

Long-term stream water quality monitoring can reveal much about changes in the condition of the monitored stream and, collectively, of the wider geographical area. However, water quality monitoring is typically expensive, so data collection is often constrained to infrequent sampling at only a small number of locations across a broad catchment. For example, across the more than 20,000 km of waterways in the greater Melbourne region, long-term water quality sampling occurs at just 134 sites, with samples usually collected monthly or every second month. As a result, over a billion litres of water can flow down a river between successive samples, during which time water quality can vary greatly in response to dry and wet weather or other events in the catchment. Given these resource constraints, designing river water quality monitoring programs, and analysing the data they collect, is a challenging problem if the network is to effectively identify trends and management priorities or evaluate the effectiveness of management interventions.
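As a rough, back-of-the-envelope check of the volumes involved (the discharge value below is an assumed illustrative figure, not taken from the abstract), even a modest mean flow sustained over a monthly sampling interval amounts to roughly a billion litres:

# Illustrative arithmetic only: the discharge is an assumed value.
mean_discharge_m3_s = 0.4            # modest mean flow of 0.4 cubic metres per second
seconds_per_month = 30 * 24 * 3600   # about 2.59 million seconds in a 30-day sampling interval
volume_m3 = mean_discharge_m3_s * seconds_per_month
volume_litres = volume_m3 * 1000     # 1 cubic metre = 1,000 litres
print(f"{volume_litres:.2e} litres between monthly samples")  # ~1.0e9 litres, i.e. about a billion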

I will discuss the challenges of selecting locations and monitoring frequencies for long-term river water quality monitoring programs. I will present findings from my research into how sample collection strategies affect the estimation of trends and site means and into what long-term stream water quality data can tell us about the effects of climate change, along with ideas for optimising existing and future sampling networks. The findings from this work are relevant to anyone working with sparse, low-frequency datasets or seeking to spatially optimise a sampling network.
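A minimal sketch of the kind of question the first of these findings addresses, and not the author's method: generate a synthetic daily water quality series with a known trend, subsample it at monthly and bimonthly intervals, and compare the fitted trend slopes. All parameter values, and the use of a simple least-squares fit, are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(42)

# Synthetic 20-year daily "concentration" record with a known upward trend plus
# noise standing in for wet/dry-weather variability (all values are assumptions).
days = np.arange(20 * 365)
true_slope = 0.001                                        # concentration units per day
daily = 10 + true_slope * days + rng.normal(0, 2, size=days.size)

def fitted_slope(step_days):
    """Subsample the daily record every `step_days` days and fit a linear trend."""
    idx = np.arange(0, days.size, step_days)
    return np.polyfit(days[idx], daily[idx], 1)[0]         # leading coefficient = slope

for label, step in [("daily", 1), ("monthly", 30), ("every second month", 60)]:
    print(f"{label:>20}: estimated slope = {fitted_slope(step):.5f} (true = {true_slope})")

# Sparser sampling leaves fewer points to average out event-driven variability,
# so rerunning with different seeds shows the spread of estimated slopes widening
# as the sampling interval grows.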