The wrench slipped, a dull metallic thwack against the wet concrete floor that echoed through the gallery of station forty-one. I wasn’t even looking at the bolt. I was looking at the new digital display mounted on the wall, a glowing rectangle of cold blue light that told me the dissolved oxygen was at seven-point-one milligrams per liter. The screen was steady, unwavering, and utterly terrifying to the man standing next to me. Miller has been here for twenty-one years, and he treats that screen like a trespasser in his own home. He doesn’t distrust the silicon; he distrusts the hand that bought it.
We’ve entered an era where the primary friction in water treatment isn’t the chemistry of the influent or the age of the cast-iron mains. It’s the silence that happens when a sensor takes over a task that used to require a human touch. Management sees a dashboard; operators see a curtain being drawn between them and the reality of the pipes. There’s a specific kind of anxiety that comes when you realize your value is being shifted from your ability to ‘feel’ a pump vibration to your ability to interpret a line graph on a tablet you aren’t allowed to calibrate.
The Unseen Fear of the Machine
Phoenix G. knows this feeling well, though he doesn’t work in water. He’s a watch movement assembler, one of the few left who can tune a hairspring by the sound it makes against the escapement. He sits at a bench that has likely been in use since 1951, using tweezers that cost eighty-one dollars and a loupe that shows him a world of gears no larger than a grain of sand. Phoenix once told me that the hardest part of his job isn’t the precision; it’s the knowledge that a quartz crystal can do his job’s primary function for a fraction of the cost. But the quartz doesn’t know *why* a watch is running fast. It only knows that it is. Phoenix, like Miller at the water plant, is the keeper of the ‘why.’
I fell into a Wikipedia rabbit hole last night, originally looking up the history of the Great Stink of 1858, but I ended up reading about the Antikythera mechanism. It’s an ancient Greek analog computer used to predict astronomical positions. It was a masterpiece of gears, 31 of them at least, and it represents our eternal obsession with automating the complex. But here’s the thing: the people who used the Antikythera mechanism still had to understand the stars. They didn’t just look at the gear and stop looking at the sky. In modern water utility management, there is a dangerous tendency to think that once you have the ‘gear,’ you can fire the astronomer.
Management wants automation because it promises a world without human error. They want 101 sensors to tell them exactly when a pH level drops to six-point-one or when turbidity spikes. They want the efficiency of a machine that never sleeps, never asks for a raise, and never gets a divorce. But what they often forget is that data is not information, and information is not wisdom. A sensor can tell you the pH is dropping, but it might not tell you it’s because a delivery truck spilled two-hundred-and-one gallons of cleaning solvent three miles upstream. Miller would know that because he recognizes the specific chemical ‘sweetness’ in the air near the intake.
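To make that concrete, here is what a bare threshold alarm amounts to, stripped down to a few lines of Python. Every name and setpoint below is my own invention for illustration, not any vendor’s API; only the six-point-one pH figure comes from the paragraph above.

```python
# A hypothetical threshold alarm, reduced to its essentials. It can report
# WHAT crossed a limit, but it has no slot for WHY. The names and the
# turbidity setpoint are assumptions; only the 6.1 pH figure is from the text.

PH_LOW_LIMIT = 6.1         # the drop the dashboard is watching for
TURBIDITY_LIMIT_NTU = 5.0  # assumed spike threshold, for illustration

def check_reading(ph: float, turbidity_ntu: float) -> list[str]:
    """Return alarm strings for any limit violations. Nothing in here can
    say 'a truck spilled solvent three miles upstream.' That context lives
    in the operator, not the sensor."""
    alarms = []
    if ph < PH_LOW_LIMIT:
        alarms.append(f"LOW pH: {ph:.2f} < {PH_LOW_LIMIT}")
    if turbidity_ntu > TURBIDITY_LIMIT_NTU:
        alarms.append(f"HIGH turbidity: {turbidity_ntu:.1f} NTU")
    return alarms

print(check_reading(ph=6.05, turbidity_ntu=2.3))
# -> ['LOW pH: 6.05 < 6.1']  (the what, never the why)
```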
This tension creates a paradox. We buy the best hardware, like a high-end pH sensor for real-time water monitoring, and then we fail to integrate it because we haven’t answered the human question: What does the operator do now? If the machine is doing the monitoring, the operator’s job shifts from manual labor to data science, often without the training or the authority to actually act on what they see. We are essentially giving people the cockpit of a fighter jet but telling them they aren’t allowed to touch the stick unless the plane is already crashing.
I’ve made mistakes before. I once ignored a high-pressure alarm for 31 minutes because I thought the sensor was fouled by algae. It wasn’t. A valve had seized, and we nearly blew a seal that would have cost the city forty-one thousand dollars to replace. That mistake was mine, and I own it. But that failure taught me more about the limits of my own intuition than any training manual ever could. When we automate away the possibility of failure, we also automate away the possibility of expertise. Expertise is just the sum of survived mistakes.
In the watch factory, Phoenix G. handles a tiny screw, only zero-point-one millimeters in diameter. If he drops it, it’s gone. He calls it ‘the sacrifice to the floor gods.’ In water treatment, we can’t afford sacrifices. We need the precision of the digital and the instinct of the analog. Yet, we see utilities spending sixty-one percent of their technology budget on hardware and less than one percent on the psychological transition of the workforce. We are building Ferraris and giving the keys to people we’ve spent years telling to keep their hands off the engine.
The Human Element as a Fail-Safe
There is a contrarian view here that most people miss: The operators who resist automation are often the ones who care the most about the water. If they didn’t care, they would embrace the automation because it makes their jobs easier. They could sit in the breakroom for 11 hours and just wait for an alarm. The resistance comes from a place of stewardship. They feel a physical weight of responsibility for the 201,001 people who drink the water they process. When a machine steps between them and that responsibility, it feels like a divorce.
Let’s look at the numbers. A typical mid-sized utility might have 41 remote pump stations. If you automate them fully, you save maybe thirty-one percent on labor costs over a decade. But if a single automated decision goes wrong because the algorithm wasn’t tuned for a once-in-a-century flood event, the resulting fine or infrastructure damage could reach $1,000,001 in a single afternoon. The human is the fail-safe. The human is the one who notices the birds have stopped landing on the settling pond.
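Run the back-of-the-envelope version of that trade-off and you see why the argument is so hard to win on a spreadsheet. Only the thirty-one percent and the $1,000,001 come from the paragraph above; the baseline payroll and the event odds are assumptions I’m inventing for the sketch.

```python
# Back-of-the-envelope version of the trade-off above. Only the thirty-one
# percent savings rate and the $1,000,001 loss come from the text; the
# baseline labor cost and the event odds are illustrative assumptions.

annual_labor_cost = 2_000_000          # assumed payroll across 41 stations
labor_savings_rate = 0.31              # thirty-one percent, per the text
decade_savings = annual_labor_cost * labor_savings_rate * 10

tail_event_cost = 1_000_001            # the single-afternoon loss above
tail_event_prob = 0.10                 # assumed: once-in-a-century flood,
                                       # so roughly a 1-in-10 chance per decade

expected_tail_loss = tail_event_cost * tail_event_prob

print(f"Decade labor savings: ${decade_savings:,.0f}")      # $6,200,000
print(f"Expected tail loss:   ${expected_tail_loss:,.0f}")  # $100,000
# On averages, the machine wins. But resilience is not an average; it is
# what happens in the one afternoon the algorithm wasn't tuned for.
```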
Transparency and Trust in Data
I’m not arguing for a return to the dark ages. I’m arguing for transparency of data. Miller’s distrust stems from the fact that the sensor data goes to a server in a different building before it ever shows up on his screen. He feels like he’s watching a replay of a game rather than playing in it. If we want utilities to thrive, we need to bring the data back to the floor. We need to make the sensors tools for the operators, not just report-generators for the executives.
Consider the way Phoenix G. uses his tools. He doesn’t let the machine assemble the watch. He uses a machine to measure the beat error, and then *he* makes the adjustment. The machine is a magnifying glass for his skill, not a replacement for it. In water, we should be using monitoring technology to magnify the ‘feel’ of the operator. Instead of a screen that just says ‘pH 7.1,’ we should have systems that explain the trend, compare it to historical data from 1991, and ask the operator for their input.
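As a sketch of the difference, not any real SCADA interface: the readings, the five-sample window, and the baseline below are all invented, but they show how little code separates a bare readout from a magnifying glass that asks for the operator’s judgment.

```python
# Hypothetical operator-facing view: instead of printing a bare 'pH 7.1',
# show the trend and the historical context, then hand the decision back to
# the human. The record shape and the baseline figure are assumptions.

recent_ph = [7.32, 7.28, 7.21, 7.15, 7.10]  # last five readings, newest last
historical_baseline = 7.25                   # e.g., long-run average since 1991

def operator_view(readings: list[float], baseline: float) -> str:
    current = readings[-1]
    slope = readings[-1] - readings[0]       # crude trend across the window
    drift = current - baseline
    trend = "falling" if slope < 0 else "rising" if slope > 0 else "flat"
    return (f"pH {current:.2f} ({trend}, {slope:+.2f} over window; "
            f"{drift:+.2f} vs. historical baseline). Confirm or annotate?")

print(operator_view(recent_ph, historical_baseline))
# -> pH 7.10 (falling, -0.22 over window; -0.15 vs. historical baseline).
#    Confirm or annotate?
```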
We often talk about ‘smart cities’ as if the intelligence is baked into the concrete and the sensors. But a city is only as smart as the people who maintain it. If we automate the humans out of the loop, the city isn’t smart; it’s just programmed. And programs are brittle. They can’t handle the weird, the edge cases, or the ‘smell’ of a truck spill three miles away. We are currently designing systems that are 91 percent efficient but 0 percent resilient.
The ‘Why’ Behind the Data
I remember reading about the ‘Flash Crash’ in the stock market (another Wikipedia deep dive), where automated algorithms started selling because other algorithms were selling, and within minutes, nearly a trillion dollars vanished. That happened because there was no ‘Miller’ at the terminal to say, ‘Wait, this doesn’t make sense.’ In water treatment, a ‘flash crash’ means people get sick. It means lead leaching into the pipes because the chemistry went sideways for 41 minutes while the computer waited for a human to click ‘OK’ on an update prompt.
The real work of the next decade isn’t going to be building better sensors. We already have incredible sensors. The work will be rebuilding the trust between the person and the device. We need to stop treating operators like they are the ‘human error’ factor and start treating them like the only reason the system actually works. We need to give them the same visibility that the supervisors have. If I can see the same data on my phone that the plant manager sees on his dashboard, I’m not a cog anymore; I’m a partner.
Seeing the Gears, Not Just the Face
Phoenix G. once told me that the most beautiful part of a watch isn’t the face; it’s the movement inside that no one sees. ‘Most people just want to know what time it is,’ he said, picking up a tiny gear with a pair of 11-centimeter tweezers. ‘But I want to know how the time is made.’
We need to let our water operators know how the data is made. We need to let them see the gears. When Miller understands that the sensor isn’t there to replace his eyes but to give him X-ray vision, his fear will evaporate. Until then, he will continue to drop his wrench, look at the blue screen with suspicion, and wait for the day the machine finally asks him for help, a day he knows is coming, likely at two-thirty-one in the morning during a thunderstorm.
What happens to the ‘why’ when we stop asking the humans to find it? If we automate the observation, we eventually automate the curiosity right out of the building. And a utility without curiosity is just a disaster waiting for a catalyst. We have 11 million miles of water pipe in this country, and not a single foot of it was laid by a computer. It was laid by people who knew the soil, the tilt of the land, and the weight of the water. We would do well to remember that as we install the next 101 sensors in the dark.