November 7, 2025
For manufacturing leaders pursuing AI transformation, there's a hard truth: your AI strategy will fail without solving connectivity first. This isn't a technology limitation—it's a fundamental data access problem that most organizations underestimate.
Prof Dr Bernd Hafenrichter, CTO of Soffico and professor at the Technical University of Ingolstadt, has spent decades connecting heterogeneous manufacturing systems. His experience reveals why connectivity isn't just a technical prerequisite—it's the difference between AI success and expensive failure.
Manufacturing environments are a patchwork of disconnected systems. Your production data sits trapped in legacy ERPs, sensor data streams from machines using proprietary protocols, and quality metrics hide in standalone databases. Each system speaks its own language—literally.
When you try to implement AI for predictive maintenance or process optimization, you quickly discover the real problem: you can't analyze data you can't access. And you can't access data that lives in isolated silos.
The core challenges include proprietary machine protocols, data trapped in legacy ERPs and standalone databases, and inconsistent semantics between equipment from different vendors.
This isn't a problem you can solve by buying better AI models. The best algorithm in the world can't extract insights from data it can't see.
One of the most valuable insights from manufacturing data experts is counterintuitive: start gathering data now, even if you don't yet know what questions you'll ask.
Think of it like the healthcare sector's approach to scalable databases. Hospitals began collecting vast amounts of patient data years before they had specific AI applications in mind. Why? Because they understood that future innovations would require historical data—and you can't retroactively collect information from the past.
In manufacturing, this means instrumenting your processes and capturing machine, sensor, and operator data now, before any specific AI use case demands it.
Consider experienced operators who can't articulate why they adjust settings a certain way—it's intuition built from years of observation. With comprehensive data collection, AI can learn these subtle patterns and make that expertise transferable to new employees.
Solving connectivity isn't about implementing one adapter and calling it done. It requires a systematic approach that acknowledges the complexity of manufacturing environments.
The technical journey typically follows this progression:
First, establish basic connectivity across your heterogeneous systems. This means building or deploying adapters that can communicate with every system generating valuable data—from modern IoT sensors to decades-old PLCs.
Next, normalize the data so different systems' outputs can be understood together. A "cycle complete" signal from one machine needs to mean the same thing as a similar signal from another manufacturer's equipment.
Then, create a unified namespace where all manufacturing information converges. This isn't just a fancy term—it's an architectural pattern that gives every system and person in your organization a consistent view of operational data.
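As a rough sketch of the normalization step, the code below maps two vendors' differently shaped "cycle complete" signals onto one unified-namespace topic and payload. All names, topics, and field layouts here are illustrative assumptions, not taken from any specific product:

```python
# Illustrative sketch: normalizing vendor-specific "cycle complete" signals
# into one unified-namespace event. Topic hierarchy, vendor payload shapes,
# and field names are hypothetical.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class UnsEvent:
    topic: str       # e.g. "acme/plant1/line2/press3/cycle_complete"
    machine_id: str
    timestamp: str   # ISO 8601, UTC
    cycle_count: int

def normalize_vendor_a(raw: dict) -> UnsEvent:
    # Hypothetical vendor A reports {"evt": "CYC_END", "mid": ..., "ts": epoch, "count": ...}
    ts = datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat()
    return UnsEvent(
        topic=f"acme/plant1/line2/{raw['mid']}/cycle_complete",
        machine_id=raw["mid"],
        timestamp=ts,
        cycle_count=raw["count"],
    )

def normalize_vendor_b(raw: dict) -> UnsEvent:
    # Hypothetical vendor B reports {"event_name": "CycleComplete", "machine": ...,
    # "utc_time": ISO string, "parts_made": ...}
    return UnsEvent(
        topic=f"acme/plant1/line2/{raw['machine']}/cycle_complete",
        machine_id=raw["machine"],
        timestamp=raw["utc_time"],
        cycle_count=raw["parts_made"],
    )
```

Once both feeds emit the same `UnsEvent` shape on the same topic convention, downstream consumers no longer need to know which vendor produced the signal.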
Key implementation considerations include supporting decades-old protocols alongside modern ones, agreeing on shared naming conventions up front, and keeping the normalization layer maintainable as equipment changes.
This infrastructure becomes the foundation for every AI initiative that follows. Without it, you're building on sand.
The operational technology (OT) and information technology (IT) gap isn't just organizational—it's architectural. Manufacturing floor systems operate in real-time with deterministic behavior requirements that enterprise IT systems never had to consider.
Your connectivity strategy must account for these fundamental differences. Real-time sensor data from production equipment needs different handling than transactional data from your ERP. Latency tolerances differ. Reliability requirements are stricter. And the people managing these systems often speak different technical languages.
Successful OT-IT integration requires a connectivity layer that honors the real-time and reliability constraints of the plant floor while exposing data in forms enterprise systems can consume—and collaboration between teams that often speak different technical languages.
The goal isn't to make OT systems behave like IT systems or vice versa. It's to create a connectivity layer that respects their differences while enabling data to flow where it's needed.
For organizations ready to tackle connectivity as their AI foundation, here's where to start:
Conduct a comprehensive systems inventory: Document every data source in your manufacturing environment, including legacy systems you might be tempted to ignore. These often contain historical patterns crucial for training AI models.
Map your current data flows: Understand where data moves today—and more importantly, where it should move but doesn't. These gaps reveal your connectivity priorities.
Define your unified namespace early: Even if implementation takes time, establish the data model and naming conventions that will govern your connected environment. This prevents rework later.
Start with high-value, low-complexity connections: Pick systems that provide valuable data and are relatively straightforward to connect. These quick wins build momentum and demonstrate ROI while you tackle harder integration challenges.
Invest in data quality governance from the beginning: Connectivity amplifies data quality issues. Garbage data flows just as easily as good data through your infrastructure, so establish quality standards and monitoring before scaling up.
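Two of the steps above—defining the namespace convention early and gating data quality before scaling—can be sketched together. The ISA-95-style hierarchy and the field names below are assumptions for illustration, not a standard schema:

```python
# Illustrative sketch: an assumed unified-namespace naming convention
# (enterprise/site/area/line/cell/metric, lowercase with underscores)
# plus a quality gate that flags malformed records before they spread.

import re

TOPIC_PATTERN = re.compile(r"^([a-z0-9_]+/){5}[a-z0-9_]+$")

def valid_topic(topic: str) -> bool:
    """Check a topic against the assumed six-level naming convention."""
    return bool(TOPIC_PATTERN.match(topic))

def quality_gate(record: dict) -> list:
    """Return a list of quality violations; an empty list means the record passes."""
    errors = []
    if not valid_topic(record.get("topic", "")):
        errors.append("topic does not follow naming convention")
    if record.get("timestamp") is None:
        errors.append("missing timestamp")
    if not isinstance(record.get("value"), (int, float)):
        errors.append("value is not numeric")
    return errors
```

Running every inbound record through a gate like this turns "garbage flows as easily as good data" from a risk into a measurable, monitorable quantity.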
AI models are powerful, but they're not magic. They need comprehensive, accessible, clean data to deliver value. In manufacturing environments filled with legacy systems, proprietary protocols, and disconnected data silos, establishing connectivity is your true first challenge.
The good news? Solving connectivity doesn't just enable AI—it improves operations immediately. When production managers can see real-time data from multiple systems in one place, they make better decisions. When quality teams can correlate sensor data with defect rates, they identify issues faster. These benefits justify the connectivity investment before your first AI model goes into production.
For manufacturing leaders, the message is clear: treat connectivity as a strategic priority, not a technical afterthought. Build the data infrastructure that brings your siloed systems together. Only then can you unlock AI's full potential to transform your operations.
The manufacturers who succeed with AI won't be those with the best algorithms—they'll be those who solved the data access problem first.
Kudzai Manditereza is an Industry 4.0 technology evangelist and creator of Industry40.tv, an independent media and education platform focused on industrial data and AI for smart manufacturing. He specializes in Industrial AI, IIoT, Unified Namespace, Digital Twins, and Industrial DataOps, helping digital manufacturing leaders implement and scale AI initiatives.
Kudzai hosts the AI in Manufacturing podcast and writes the Smart Factory Playbook newsletter, where he shares practical guidance on building the data backbone that makes industrial AI work in real-world manufacturing environments. He currently serves as Senior Industry Solutions Advocate at HiveMQ.