November 2, 2025
Just send it all to the data lake, right? That’s where the magic happens: BI, AI, ML, all of it.
But… who actually understands the data once it gets there?
The Real Problem?
⇨ IT owns the data lake, but they don’t understand plant-level process data.
⇨ OT understands the data, but they’re not modelling it at the edge.
The disconnect is that raw data enters the lake without the operational context needed to build trustworthy insights.
So what is the solution?
You build operational models at the edge, where the data originates and where the expertise exists.
✅ Let engineers define how metrics are cleansed, calculated, and expressed
✅ Publish those models to the UNS and persist them in the data lake
✅ Use that model to power both real-time ops and BI/AI dashboards
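As a sketch of what that edge model might look like in practice (the topic path, tag names, and cleansing thresholds below are illustrative assumptions, not any specific product's API):

```python
import json
import statistics

# Hypothetical edge model: an engineer defines how a raw sensor stream
# is cleansed and expressed as a KPI before it ever reaches the lake.

def cleanse(raw_readings, low=0.0, high=500.0):
    """Drop out-of-range readings (sensor glitches) before calculating."""
    return [r for r in raw_readings if low <= r <= high]

def build_uns_payload(raw_readings, target_rate=120.0):
    """Express the metric with its operational context attached."""
    good = cleanse(raw_readings)
    actual_rate = statistics.mean(good) if good else 0.0
    return {
        # Illustrative ISA-95-style topic path for a UNS
        "topic": "acme/plant1/packaging/line4/kpi/throughput",
        "value": round(actual_rate, 1),
        "units": "units/min",
        "target": target_rate,
        "pct_of_target": round(100 * actual_rate / target_rate, 1),
        "samples_used": len(good),
        "samples_dropped": len(raw_readings) - len(good),
    }

payload = build_uns_payload([118.0, 121.5, 9999.0, 119.2])  # one glitch reading
print(json.dumps(payload, indent=2))
```

The same payload can feed an MQTT broker for real-time consumers and land in the lake unchanged, so the control room and the BI dashboard share one definition of the metric.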
But here's the key:
Don't ask engineers to model data just for fun.
Make it necessary for the jobs they already care about: improving quality, reducing downtime, hitting production targets.
The Future Architecture Looks Like This:
✅ Engineers model KPIs at the edge
✅ That logic feeds the UNS and the data lake
✅ BI dashboards align with what’s actually happening in operations
✅ AI/ML runs at the edge, where decisions need to be made in real time
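For the last point, a minimal sketch of a real-time check running where the data originates (the window size and tolerance are assumptions for illustration, not a recommended model):

```python
from collections import deque

class RollingAnomalyDetector:
    """Flag KPI values that drift far from the recent rolling mean.

    Runs at the edge, so the decision is available in real time rather
    than after a round trip through the data lake.
    """

    def __init__(self, window=20, tolerance=0.15):
        self.history = deque(maxlen=window)  # recent KPI values
        self.tolerance = tolerance           # allowed fractional deviation

    def check(self, value):
        if len(self.history) >= 5:  # need a short baseline first
            baseline = sum(self.history) / len(self.history)
            anomalous = abs(value - baseline) > self.tolerance * baseline
        else:
            anomalous = False
        self.history.append(value)
        return anomalous

detector = RollingAnomalyDetector()
readings = [120, 119, 121, 118, 120, 122, 60]  # last reading: a line slowdown
flags = [detector.check(v) for v in readings]
print(flags)  # only the slowdown is flagged
```

Because it consumes the same modeled KPI published to the UNS, the alert and the boardroom dashboard can never disagree about what "throughput" means.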
Now everyone, from the control room to the boardroom, is working off the same trusted metrics.