Alum helping improve wastewater treatment with ‘explainable AI’


Most of us never think about what happens after water spirals down a drain. But for the people who run wastewater treatment plants, keeping the promise of clean water is a race against time – and data.

Treatment plant operators face a major challenge: critical water-quality tests take days to complete, while operators often need to make decisions within hours.

A growing field of artificial intelligence – explainable AI, or XAI – is helping bridge that gap, said Fuad Nasir, a water supply specialist with the Wisconsin Department of Natural Resources.

During his graduate studies, Nasir (’25 PhD Civil Engineering) delved into research involving data collection from a local treatment plant. And he noticed something important happening in the broader world of technology: “AI and machine learning were becoming used in many sectors,” he said, “but they were not using AI in wastewater treatment very much in the United States.”

A matter of trust

The problem wasn’t lack of interest – it was lack of trust. Traditional machine learning has often been compared to a black box: data goes in, predictions come out, and the reasoning in between stays hidden.

“Operators were hesitant to use it because they couldn’t see the process for how it arrived at its prediction,” Nasir said.

For wastewater operators making high-stakes decisions about chemical dosing and treatment timelines, that opacity was a deal-breaker.

That led him to explainable AI. XAI doesn’t just predict an outcome; it highlights the variables that shaped that prediction. “It reveals what’s going on in the background,” Nasir said. “You can literally visualize it when you apply XAI.”

A real-world example

A key example illustrates why this matters. Wastewater plants rely on a lab test called biochemical oxygen demand (BOD) to gauge how much organic material remains in treated water. High BOD can harm aquatic life, so operators need to adjust treatment chemicals to keep it low. But BOD testing takes five days – too long to guide day-to-day operating decisions.

XAI-guided models can predict BOD from historical plant data, showing which factors – such as temperature, flow rate or ammonia – are driving the levels up or down.
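The idea of surfacing which variables drive a prediction can be sketched with a simple permutation-importance check: shuffle one input at a time and see how much the model’s accuracy drops. The sketch below is illustrative only – the data is synthetic, the variable names (temperature, flow rate, ammonia) are borrowed from the article, and the assumed relationship between them and BOD is invented for the example; it is not Nasir’s model or any real plant’s data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic "plant data" (illustrative only) for three routinely measured variables.
temperature = rng.normal(15, 5, n)     # degrees C
flow_rate = rng.normal(100, 20, n)     # influent flow
ammonia = rng.normal(2.0, 0.5, n)      # mg/L

# Hypothetical relationship: ammonia dominates BOD in this made-up data.
bod = 5 + 0.2 * temperature + 0.01 * flow_rate + 8 * ammonia + rng.normal(0, 1, n)

X = np.column_stack([temperature, flow_rate, ammonia])
names = ["temperature", "flow_rate", "ammonia"]

# Fit an ordinary least-squares model as a stand-in predictor.
Xb = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(Xb, bod, rcond=None)
pred = Xb @ coef
ss_tot = np.sum((bod - bod.mean()) ** 2)
baseline = 1 - np.sum((bod - pred) ** 2) / ss_tot  # baseline R^2

def importance(j):
    """Permutation importance: drop in R^2 after shuffling feature j."""
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    p = np.column_stack([np.ones(n), Xp]) @ coef
    return baseline - (1 - np.sum((bod - p) ** 2) / ss_tot)

for j, name in enumerate(names):
    print(f"{name}: R^2 drop = {importance(j):.3f}")
```

Shuffling ammonia destroys most of the model’s accuracy here, while shuffling flow rate barely matters – the kind of ranking an operator could use to see which measurement is driving a prediction, rather than taking the number on faith.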

XAI has grown rapidly. The medical field adopted explainable AI earlier, but wastewater research is now catching up and citing Nasir’s work in the process.

“I’ve seen a surge of people using it in wastewater in the last few years,” he said.

Real-world deployment will take time, he admitted, because utilities need instruments, training and funding, but the direction is clear.

Drawn by UWM research environment

Nasir arrived at UWM in 2019 as a master’s student from Bangladesh, attracted by UWM’s strong research culture and environmental engineering faculty, including Professor Jin Li, his advisor.

“UWM was ranked as an R1, a top research institution, which definitely caught my attention,” Nasir said.

After a year, he switched to the PhD track. Today, Nasir focuses on aspects of public water systems tied to health and regulation. With AI advancing quickly, he believes technical expertise will be increasingly valuable in regulatory spaces.

His advice for undergraduates? Pay attention. “The thing about AI or machine learning is it changes so fast.”
