Ecosystem-based fisheries management (EBFM), and indeed much fisheries management around the world today, depends on predictions obtained from often-complex computer models. We need models to help us understand the present (for example, how large a fish stock currently is) as well as the future (for example, whether a fish stock could shrink due to rising sea temperatures). We can then draw on these predictions to develop management strategies and measures designed to keep fish stocks and their wider ecosystems healthy, as well as to benefit the livelihoods and populations that fish stocks support.
But as with all science, modelling involves uncertainty. A model's design, our understanding of the system it simulates and the quality of the data fed into it all affect how closely its predictions match reality, present or future, and how confident we can be in those predictions. This makes it essential to critically assess our models and everything that goes into them. It also means we need to communicate the associated levels of uncertainty to the users of our predictions, so they can make fully informed decisions. In this report, part of our Communication work theme, SEAwise leads set out best practice guidelines for using our models and communicating their uncertainties.
The report sets out key considerations and guidance on three fronts: model design and inputs; estimating uncertainty in models and their results; and communicating that uncertainty. To develop these, we reviewed the existing literature on ways to measure the “predictive capability” of models (that is, a model's ability to predict future events from past information) and held several workshops with SEAwise project participants and members of the wider scientific and stakeholder communities. Their input was critical in developing our ‘uncertainty guidelines’, which form a key pillar of the project’s output quality assurance process.
We identified a range of sources of uncertainty in modelling that affect a model’s predictive capability. They include “observation uncertainty” (limitations in the data we have), “model uncertainty” (the difference between the model system and the ‘real world’) and “process uncertainty” (natural variation in ecosystem processes, such as year-to-year fluctuations in plankton availability, and in species’ life patterns, such as differences in the number of eggs a population produces from one year to the next).
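How these sources combine can be illustrated with a toy Monte Carlo projection. The sketch below is purely hypothetical (the numbers, the simple growth model and the function names are our own illustration, not taken from any SEAwise model): each uncertainty source adds spread to the projected outcome, and repeating the simulation many times turns that spread into a prediction interval.

```python
import random

random.seed(1)

# Hypothetical toy projection, not a real SEAwise model: each uncertainty
# source widens the spread of a 10-year stock biomass forecast.
def project_stock(n_years=10, n_runs=2000):
    finals = []
    for _ in range(n_runs):
        # observation uncertainty: the starting stock size is only estimated
        biomass = random.gauss(100.0, 5.0)
        # model uncertainty: the assumed growth rate may be slightly wrong
        growth = random.gauss(0.02, 0.01)
        for _ in range(n_years):
            # process uncertainty: natural year-to-year variability
            shock = random.gauss(0.0, 0.05)
            biomass *= 1.0 + growth + shock
        finals.append(biomass)
    finals.sort()
    # 90% prediction interval from the Monte Carlo sample
    return finals[int(0.05 * n_runs)], finals[int(0.95 * n_runs)]

low, high = project_stock()
print(f"90% interval for biomass after 10 years: {low:.0f} to {high:.0f}")
```

Switching any one noise term off (setting its standard deviation to zero) narrows the interval, which is one simple way to see how much each source contributes to overall uncertainty.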
Is more complexity always better, then? Not quite. A more complex model is less likely to overlook important real-world processes and interactions, but more likely to run into computational limits or to demand data we do not have. Does this mean models are useless? Far from it. But it does mean we have to do everything we can to minimise uncertainties and clearly communicate those that remain, and SEAwise's best practice recommendations are designed to achieve exactly that.
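The complexity trade-off can be made concrete with a small held-out-data experiment. The sketch below is our own illustration (the data, model names and numbers are invented, not from the report): three models of increasing complexity are fitted to noisy “survey” observations of a linear trend, and their predictive capability is then scored on years the fitting never saw.

```python
import random

random.seed(0)

# Hypothetical illustration (data and models invented, not from the report):
# an underlying linear trend observed with noise, fitted by models of
# increasing complexity, then scored on held-out "future" years.
def true_signal(x):
    return 50.0 + 10.0 * x

train = [(x, true_signal(x) + random.gauss(0, 5)) for x in range(8)]
test = [(x, true_signal(x) + random.gauss(0, 5)) for x in range(8, 12)]

def fit_mean(data):
    # too simple: ignores the trend entirely
    m = sum(y for _, y in data) / len(data)
    return lambda x: m

def fit_line(data):
    # matches the true process: least-squares straight line
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    slope = (sum((x - mx) * (y - my) for x, y in data)
             / sum((x - mx) ** 2 for x, _ in data))
    return lambda x: my + slope * (x - mx)

def fit_memorize(data):
    # too complex: reproduces past observations exactly, noise included
    pts = dict(data)
    return lambda x: pts[min(pts, key=lambda p: abs(p - x))]

def rmse(model, data):
    return (sum((model(x) - y) ** 2 for x, y in data) / len(data)) ** 0.5

results = {}
for name, fit in [("mean-only", fit_mean), ("linear", fit_line),
                  ("memorize", fit_memorize)]:
    results[name] = rmse(fit(train), test)
    print(f"{name:9s} held-out error: {results[name]:5.1f}")
```

Scoring on held-out years rather than on the fitting data is what distinguishes predictive capability from goodness of fit: the memorising model reproduces the training years perfectly, yet predicts far worse than the straight line.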
The best practice guidelines in this report are a living document that we will add to and refine over time. We are already applying our own recommendations to all the modelling work we undertake, aiming for the “sweet spot” where predictive ability is optimised: models complex enough to be realistic, yet certain enough to be useful to people making decisions. The SEAwise project is committed to a co-creation approach to developing and using models, with active consideration of what fisheries stakeholders think, know and need. Doing so will ensure that our research outputs can effectively inform the implementation of EBFM in Europe.
Read the full report here.