I just returned from the Physics of Estuaries and Coastal Seas conference, a small (~150 people), very science-focused event for the niche academic world I primarily live in. I thought it was a great conference, or at least, I liked the people and the science a whole lot.
(Note: since last writing, I accepted a postdoc in the Environmental Systems Dynamics Lab at the University of California, where I will continue to study wetlands, but now mostly using satellite imagery! So I will take a step away from the fluid mechanics-heavy world, though I hope to return some day...)
This realm of science that seeks to understand coastal waters leans heavily on numerical models of environmental fluid mechanics: in other words, simulations of the physics that govern how water moves through estuaries and coastal seas. Virtually all of this means using the Navier-Stokes equations, a set of partial differential equations that describe the movement of fluids. These equations are then simplified somewhat for environmental flows: we can assume the water is essentially incompressible, and that horizontal scales are much longer than vertical scales (which justifies the hydrostatic approximation). And we add a variety of processes like friction (as water flows over the ground), waves made by winds, and tides. All of our numerical methods for solving partial differential equations require discretizing them: creating a grid of cells and solving the equations within each one.
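For concreteness, here is the usual starting point, written for constant density (real coastal models layer on stratification, rotation, and turbulence closures): momentum and mass conservation for an incompressible fluid, with the vertical momentum equation collapsing to hydrostatic balance when horizontal scales dominate.

```latex
% Incompressible Navier-Stokes: momentum and continuity
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\nabla^{2}\mathbf{u} + \mathbf{g},
\qquad
\nabla\cdot\mathbf{u} = 0

% When horizontal scales are much larger than vertical scales,
% vertical momentum reduces to hydrostatic balance:
\frac{\partial p}{\partial z} = -\rho g
```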
Taken all together, these models are generally very good. (Their errors do add up if you try to use them to predict other things layered on top, though, like sediment transport or phytoplankton growth.) They are constantly being refined with new numerical techniques, better empirical data, and more powerful computing resources. Environmental flows are governed by nonlinear processes (the Navier-Stokes equations have nonlinear terms), and this renders them susceptible to chaos in the classical sense: small differences between the starting points of two scenarios can lead to large differences in their outcomes. Still, the hydrodynamics are not absurdly nonlinear, and many systems dissipate energy via friction and turbulence, keeping "runaway" scenarios from happening.
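A toy illustration of that sensitivity (nothing estuarine about it; the Lorenz system is just the classic minimal chaotic example): two simulations that start one part in a million apart end up on entirely different trajectories.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system, the classic chaotic ODE."""
    x, y, z = state
    return state + dt * np.array([
        sigma * (y - x),    # dx/dt
        x * (rho - z) - y,  # dy/dt
        x * y - beta * z,   # dz/dt
    ])

# Two runs whose initial conditions differ by one part in a million.
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])

for step in range(1, 2501):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 0:
        print(f"t = {step * 0.01:4.0f}   separation = {np.linalg.norm(a - b):.2e}")
```

The separation grows roughly exponentially until it saturates at the size of the attractor itself, which is exactly why forecasters run ensembles rather than single simulations.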
Broadly, chaos and nonlinearity make prediction difficult, because we never have perfect knowledge of the initial conditions (in any empirical science, at least). Additionally, when we make our grid of cells, we are simplifying our system down to some scale, like the width of a cell. These compromises introduce error, and the dance of good modeling is in trying to quantify and minimize those errors. You can always make your grid cells smaller, but the computation time balloons; most folks work to find a happy medium between speed and accuracy based on the needs of their project, as in the back-of-the-envelope sketch below.
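How badly does it balloon? Here is a rough sketch, assuming an explicit 2D depth-averaged scheme where the CFL condition ties the stable time step to the cell size (3D models and implicit solvers scale differently):

```python
def relative_cost(refinement):
    """Rough compute cost of refining a 2D grid by some factor,
    relative to the original grid."""
    cells = refinement ** 2   # halving dx in 2D quadruples the cell count
    timesteps = refinement    # CFL: halving dx roughly halves the stable dt
    return cells * timesteps

for r in (1, 2, 4, 8):
    print(f"{r}x finer grid -> ~{relative_cost(r):>3}x the compute")
# 1x -> 1x, 2x -> 8x, 4x -> 64x, 8x -> 512x
```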
(I always think of Jorge Luis Borges' short story, On Exactitude in Science / Del Rigor en la ciencia, where a map of the world is made at a scale of 1:1; the map is the size of the territory. Such a map might be perfectly faithful, but a model is useless if it is not scaled down and simplified.)
What we mean by the word "model" itself is pretty messy. Some people call a linear regression a model (predict Y from X!); we can call sophisticated, multi-level statistical relationships models; we can make physical mock-ups; we can write some equations and call it a model. But the fluid-mechanics models I've been talking about are more like simulations, in that they aspire to reflect the full physics of a place, a form of model that I think could be called a "digital twin." Lizzie Yarina's essay, "This River is a Model," articulates well the tightness between the actual estuary and river systems of the Netherlands and their simulated hydrodynamic counterparts, and how both are interwoven with the technocracy of place. She insists (and I agree) that the simulated water conditions are imaginaries, upheld by networks of engineers, policy-makers, landowners, and various other stakeholders. This remains true even for operational or real-time models, which constantly incorporate actual measurements. While I would maintain there is some kind of indelible truth to the physical world, our methods of measuring it, our interpretations of those measurements, and our responses to the world (or to its simulated digital twins) are co-created with our internal conceptual models of how the world works according to physics, chemistry, biology, and more: these conceptual models undergird the epistemologies at play.
To the engineer, the quest for better and better environmental models, or even to make the world behave more like a model, is the quest for better predictability and, ostensibly, better control. It is interwoven with risk reduction, power politics, state influence, and resource development and extraction. Modeling, when used comparatively, also helps expose how well we understand our systems: we work to attribute discrepancies between models and "the real world" to processes we have not yet incorporated (or correctly calibrated). I'm excited about that kind of scientific development. I also want to help reduce things like flood risk. But I believe the logical conclusion of ever-higher-resolution model refinement is, inevitably, bumping into how much knowability, predictability, and controllability is possible. My skepticisms are due to, respectively, the limits and subjective nature of observation; chaotic dynamics; and finite resources for control (a sort of "technical debt" of civil and geo-engineering). And even in writing this, I am retaining a sort of technocratic view; we should always be asking why do this modeling at all, for whom, and with what politics?
Written by my digital twin,
Lukas
p.s. I'm reading Feyerabend's Against Method with my friend Adam right now, so of course I'm thinking about the unavoidable epistemological blinders we wear when we follow any specific methodology for knowledge production. I should have read some Kuhn or Popper beforehand, and a lot of scientific work has changed in the 50 years since its publication. The nugget of the book seems to still stand... more on that later.