November 8, 2025

If you're leading data and analytics initiatives at a manufacturing company, you've probably noticed a troubling pattern: despite the hype around AI and machine learning, many organizations still struggle with basic data integration. The problem isn't a lack of sophisticated algorithms or cloud computing power. The real bottleneck is getting clean, standardized data from your plant floor systems in the first place.
Martin Thunman, CEO of Crosser, has spent years working with manufacturers on this exact challenge. His perspective cuts through the noise around Industry 4.0 to reveal what actually works when integrating legacy systems with modern analytics platforms. The answer involves rethinking where and how you process industrial data, and it starts with a hard truth about the automation industry.
One of the biggest challenges in manufacturing data integration rarely gets discussed openly: major automation vendors continue to maintain proprietary protocols despite public commitments to open standards like OPC UA. While these companies champion interoperability in their marketing, they often guard legacy protocol specifications in ways that create barriers for their own customers.
This creates unnecessary friction for manufacturing data teams. You're already dealing with the inherent complexity of integrating dozens of different machines, sensors, and control systems. Adding artificial barriers from vendors who want to protect their installed base just makes the problem worse.
The good news is that modern bridge technologies and open-source tools are creating workarounds. But you shouldn't need workarounds. The automation industry needs to follow the example of the networking sector, where American vendors have long embraced the principle that shared standards benefit everyone. Until that cultural shift happens, expect to invest significant engineering time just getting data out of your existing systems.
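To give a flavor of what those open-source workarounds look like, here is a minimal sketch that reads a single tag from a PLC over OPC UA using the open-source asyncua Python library. The endpoint URL and node ID are placeholders, not values from any real system; in practice you would pull both from your server's address space.

```python
import asyncio
from asyncua import Client  # open-source OPC UA client: pip install asyncua

# Hypothetical endpoint and node ID -- substitute values from your own server.
ENDPOINT = "opc.tcp://192.168.1.50:4840"    # placeholder PLC gateway address
NODE_ID = "ns=2;s=Line1.Temperature"        # placeholder tag in the address space

async def read_tag() -> None:
    # Connect, read one value, and disconnect cleanly.
    async with Client(url=ENDPOINT) as client:
        node = client.get_node(NODE_ID)
        value = await node.read_value()
        print(f"{NODE_ID} = {value}")

if __name__ == "__main__":
    asyncio.run(read_tag())
```

A dozen lines against an open standard, versus weeks of reverse-engineering a closed protocol: that contrast is exactly why the standards question matters.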
There's significant interest in machine learning and AI applications for manufacturing. However, there's a critical prerequisite that gets far less attention: none of these advanced analytics work without properly structured data.
When you pull data from industrial systems, you're getting information in wildly different formats depending on the protocol, vendor, and sensor type. Before you can run any meaningful analytics, you need to transform this data into a common structure. This isn't glamorous work, but it's foundational.
Think of it like trying to analyze financial data when each department uses different accounting systems, currencies, and reporting periods. You'd never attempt sophisticated financial modeling without first normalizing that data. The same principle applies to manufacturing data, but with even more complexity because you're dealing with real-time information from physical systems.
The ability to do this transformation in real time, distributed across your operations, provides significant advantages. You can standardize data as it's generated rather than moving massive volumes of raw data to a central location for processing. This approach reduces latency, lowers bandwidth costs, and enables faster decision-making at the point where it matters most.
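To make the transformation step concrete, here is a minimal sketch of the kind of normalization involved. Both incoming payload shapes are invented for illustration; real formats vary by protocol and vendor, which is exactly the point.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    """Common structure every downstream consumer can rely on."""
    source: str
    signal: str
    value: float
    unit: str
    timestamp: datetime

def from_vendor_a(msg: dict) -> Reading:
    # Hypothetical vendor A: flat JSON with epoch milliseconds and Fahrenheit.
    return Reading(
        source=msg["deviceId"],
        signal=msg["tag"],
        value=(msg["val"] - 32) * 5 / 9,  # normalize to Celsius
        unit="degC",
        timestamp=datetime.fromtimestamp(msg["ts"] / 1000, tz=timezone.utc),
    )

def from_vendor_b(msg: dict) -> Reading:
    # Hypothetical vendor B: nested payload, ISO 8601 timestamps, already metric.
    point = msg["payload"]["point"]
    return Reading(
        source=msg["meta"]["asset"],
        signal=point["name"],
        value=float(point["value"]),
        unit=point["unit"],
        timestamp=datetime.fromisoformat(msg["meta"]["time"]),
    )

# Example: a vendor A message normalized into the common structure.
raw_a = {"deviceId": "press-07", "tag": "bearing_temp", "val": 176.0, "ts": 1731062400000}
print(from_vendor_a(raw_a))  # value comes out as 80.0 degC
```

Once every message is a Reading, the analytics layer never needs to know which vendor produced it.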
Five years ago, the prevailing wisdom was simple: move everything to the cloud. Major industrial vendors like GE and Siemens pushed this message hard. But that vision never fully materialized, and for good reason.
Manufacturing operations have specific requirements that make pure cloud architectures impractical. You need deterministic response times for control systems. You can't always rely on internet connectivity at remote facilities. Regulatory and security concerns often prevent sending certain data outside your network. And the costs of moving huge volumes of sensor data to the cloud add up quickly.
Edge analytics provides a middle ground. You process data close to where it's generated, using modern cloud-native technologies but on your own infrastructure. This could be a gateway near a remote asset, a server on the factory floor, or compute resources in your data center. The key is creating an intelligent layer between your data sources and any downstream systems, whether those are on-premise applications, cloud platforms, or other machines.
This architecture gives you several practical advantages; a short sketch after the list shows how a few of them combine in practice:
Reduced latency for time-sensitive decisions: When you're detecting quality issues on a production line or identifying equipment anomalies, milliseconds matter. Processing data at the edge lets you trigger immediate responses without round-tripping to the cloud.
Lower infrastructure costs: You only send relevant, processed data to the cloud rather than streaming everything. This dramatically reduces bandwidth and storage expenses, especially when you're dealing with high-frequency sensor data or video feeds.
Improved reliability: Your operations continue even if cloud connectivity goes down. Critical analytics and automation can run locally while synchronized with central systems when connectivity is available.
Better compliance and security: Sensitive data can be processed and anonymized locally before anything leaves your facility, helping you meet data protection requirements.
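Here is a minimal sketch of an edge-side processing step that puts several of these advantages together, assuming an invented reading format and a simple threshold rule: it reacts locally, forwards only a small aggregate to the cloud, and hashes the machine identifier before anything leaves the facility.

```python
import hashlib
import statistics
from typing import Iterable

TEMP_LIMIT_C = 95.0  # hypothetical alarm threshold

def anonymize(machine_id: str) -> str:
    # Better compliance: only a hash of the identifier leaves the facility.
    return hashlib.sha256(machine_id.encode()).hexdigest()[:12]

def trigger_local_alarm(machine_id: str) -> None:
    # Placeholder: would drive a local output, e.g. via the PLC or an MQTT topic.
    print(f"ALARM: {machine_id} over temperature")

def process_window(machine_id: str, temps_c: Iterable[float]) -> dict:
    """Handle one window of readings at the edge; return a small cloud summary."""
    temps = list(temps_c)
    # Reduced latency: react immediately, no round trip to the cloud.
    if max(temps) > TEMP_LIMIT_C:
        trigger_local_alarm(machine_id)
    # Lower cost: forward one compact aggregate instead of every raw sample.
    return {
        "machine": anonymize(machine_id),
        "mean_c": round(statistics.fmean(temps), 2),
        "max_c": max(temps),
        "samples": len(temps),
    }
```

If connectivity drops, summaries like these can be queued locally and synced later, which is how the reliability advantage plays out in practice.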
The current software development landscape presents a significant challenge for manufacturing organizations: there are approximately 1.5 million open developer positions worldwide, and this number continues to grow. For manufacturing companies competing for software talent with technology firms offering competitive compensation packages and remote work options, recruiting qualified developers remains difficult.
This talent shortage has major implications for your data strategy. Traditional approaches that require writing custom code for every integration and analytics workflow simply don't scale. You'll spend months recruiting, then more months getting new developers up to speed on your specific systems and processes. Even hot tech startups struggle with this problem.
Low-code platforms offer a practical solution. By hiding the complexity of coding behind visual interfaces and pre-built modules, these tools let your existing team members become more productive. Your automation engineers, maintenance teams, and data scientists can build analytics workflows using drag-and-drop interfaces rather than waiting for developers to translate their requirements into code.
This isn't about replacing developers entirely. It's about letting different team members contribute according to their expertise without everyone needing to be a software engineer. The automation engineer who understands your machines can configure the data inputs. The data scientist can add analytical models. The IT team can set up the integrations with enterprise systems. All working on the same visual workflow, seeing what others have contributed.
One of the most persistent challenges in manufacturing analytics is the divide between operational technology teams and information technology teams. They speak different languages, have different priorities, and often use incompatible tools.
Visual, low-code platforms help bridge this gap naturally. When both OT and IT team members can see and understand the data flows in a shared interface, collaboration becomes much easier. The OT engineer who knows that a particular sensor reading indicates a lubrication problem can work directly with the IT analyst who understands how to route alerts to the maintenance system.
This collaborative approach is essential because modern manufacturing use cases inevitably cross organizational boundaries. Predictive maintenance requires input from automation specialists, data scientists, maintenance teams, and IT staff. Quality analytics needs expertise from process engineers, data analysts, and business analysts. Trying to force all that knowledge through a single developer who codes everything creates a bottleneck and loses crucial domain expertise.
Based on the insights from this conversation, here are concrete actions you can take:
Start with data standardization: Before investing in advanced analytics, ensure you have a robust approach for transforming data from diverse sources into common formats. This foundational work will accelerate everything that comes after.
Evaluate your edge-to-cloud architecture: Don't assume all processing needs to happen in the cloud. Map out which analytics and decisions need to happen in real time at the edge versus what can be centralized. Design a layered architecture that puts processing where it makes the most sense.
Prioritize tools that enable collaboration: Look for platforms that let your OT teams, data scientists, and IT staff work together effectively. Visual, low-code approaches can dramatically improve cross-functional collaboration while addressing talent constraints.
Build partnerships across your technology stack: No single vendor does everything well. Focus your team's energy on the specific challenges you need to solve, and partner with specialists for different layers of your architecture.
Challenge vendor lock-in: When evaluating automation systems, push back on proprietary protocols and closed specifications. Support vendors who embrace open standards, and build relationships with integration partners who can help you work around artificial barriers.
The path to effective manufacturing analytics isn't about implementing the latest AI model or moving everything to the cloud. It starts with the unglamorous work of data integration and transformation. It requires building the right architecture that balances edge and cloud processing. And it depends on empowering your existing teams with tools that enable collaboration rather than creating new bottlenecks.
The manufacturers who succeed won't be those who jump on every technology trend. They'll be the ones who methodically build data infrastructure that works with their operational realities, enables their teams to collaborate effectively, and positions them to take advantage of advanced analytics when the time is right. Edge analytics isn't just another buzzword to add to your strategy deck. It's a practical approach to solving real problems that are blocking progress in manufacturing data initiatives today.