November 8, 2025

Manufacturing data integration has evolved significantly over the past three decades. What began as a Windows-specific protocol for connecting automation equipment has transformed into a comprehensive framework that supports modern cloud architectures and enterprise-scale analytics. Understanding this evolution helps data and analytics leaders make informed decisions about infrastructure investments and integration strategies.
Praveen Kumar Singh, Chief OPC Solutions Architect at Utthunga with over 21 years of experience implementing OPC solutions across industry verticals, provides valuable perspective on how OPC UA PubSub addresses contemporary manufacturing data challenges. This article examines the architectural shift from client-server to publish-subscribe patterns and explores practical applications for moving operational data to analytics platforms.
The original OPC specification emerged in the late 1990s when automation companies needed a standard way to exchange data between different vendors' equipment. This first generation, built on Microsoft's COM/DCOM technology, solved a critical interoperability problem and achieved widespread industry adoption.
However, this success revealed limitations. The Windows dependency restricted deployment options. Multiple separate specifications existed for different data types—historical data, alarms and events, and real-time data access—each requiring separate implementations. As organizations in different verticals wanted to adopt the technology, these architectural constraints became increasingly problematic.
OPC UA, whose specifications were first released in the late 2000s and stabilized around 2011-2012, represented a fundamental architectural shift. Rather than simply updating the protocol, the OPC Foundation created a comprehensive framework with three key layers. The communication protocol became just one component. An information modeling layer allowed definition of custom object types and hierarchies. A type system provided extensibility, enabling other standards bodies to build companion specifications on top of the base framework.
The framework approach proved successful. Organizations could use TCP/IP for plant-floor real-time communication while the same information models remained accessible through other protocols. This flexibility enabled OPC UA to span different automation layers—from field devices and PLCs to SCADA systems and manufacturing execution systems.
Around 2014-2015, cloud platforms like Azure and AWS gained significant traction in enterprise IT. Manufacturing organizations recognized opportunities to leverage cloud infrastructure for advanced analytics, pattern recognition, and machine learning on operational data.
This created a new integration challenge. Cloud platforms operate primarily on messaging architectures—publish-subscribe patterns where data producers publish to topics and consumers subscribe to relevant data streams. OPC UA's existing TCP/IP-based client-server model, while effective for plant-floor communication, didn't align well with these messaging patterns.
The framework architecture of OPC UA enabled a solution. Rather than redesigning the entire system, the OPC Foundation added a new protocol layer supporting messaging patterns. OPC UA PubSub introduced support for MQTT, AMQP, and UDP transport protocols while maintaining the existing information models. Data that had been accessible via client-server connections could now be published to message brokers and consumed by cloud applications.
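To make the messaging pattern concrete, here is a minimal Python sketch of a JSON NetworkMessage along the lines of the OPC UA PubSub JSON encoding. The top-level field names follow the published encoding, but the publisher ID, writer ID, and tag values are illustrative, and the MQTT topic mentioned in the comment is a deployment choice rather than part of the standard.

```python
import json
import uuid
from datetime import datetime, timezone

def build_network_message(publisher_id, writer_id, values):
    """Assemble a JSON NetworkMessage in the spirit of the OPC UA PubSub
    JSON encoding (OPC 10000-14). A simplified sketch, not a full
    implementation of the specification."""
    return {
        "MessageId": str(uuid.uuid4()),
        "MessageType": "ua-data",
        "PublisherId": publisher_id,
        "Messages": [
            {
                "DataSetWriterId": writer_id,
                "Timestamp": datetime.now(timezone.utc).isoformat(),
                "Payload": values,  # field name -> current value
            }
        ],
    }

# Hypothetical device and tag names for illustration.
msg = build_network_message("plant-a-plc-07", 1001,
                            {"Temperature": 71.3, "Pressure": 2.4})
payload = json.dumps(msg)
# A real publisher would now hand `payload` to an MQTT client, e.g.
# paho-mqtt's client.publish("opcua/plant-a/line-1", payload).
print(payload)
```

Because the message is self-describing JSON, any broker subscriber, including cloud ingestion services, can consume it without an OPC UA client stack.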
This architectural approach provided a standardized path for moving operational technology data to cloud analytics platforms. Organizations could use the same information models across both real-time plant-floor systems and cloud-based analytics, ensuring semantic consistency throughout the data pipeline.
Modern manufacturing environments often require both real-time control communication and analytics data pipelines operating simultaneously. OPC UA PubSub enables hybrid architectures that address both requirements.
Consider a PLC or edge device managing production equipment. For real-time control purposes, it communicates with HMI systems and controllers using OPC UA's TCP/IP client-server model, which provides the deterministic behavior and low latency that control systems require. Simultaneously, the same device can publish selected data streams via OPC UA PubSub to an MQTT broker, making that data available to cloud analytics platforms.
This dual-protocol approach, typically deployed on edge computing hardware, enables data to flow from field devices directly to cloud infrastructure while maintaining the real-time control communication necessary for operations. The key advantage is architectural simplicity. The same information model serves both purposes, eliminating the need for translation layers or custom integration code.
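The hybrid pattern can be sketched in plain Python, independent of any OPC UA stack: a single tag table backs both paths, with the client-server side able to read every tag on demand and the PubSub side exporting only a selected subset. Class, method, and tag names here are hypothetical.

```python
class HybridDevice:
    """Conceptual sketch of a device serving both communication patterns
    from one tag table (its information model)."""

    def __init__(self, published_tags):
        self._tags = {}                        # full tag table
        self._published = set(published_tags)  # subset exported via PubSub

    def write(self, name, value):
        self._tags[name] = value

    def read(self, name):
        # Client-server path: an HMI or controller reads any tag on demand.
        return self._tags[name]

    def pubsub_payload(self):
        # PubSub path: only the selected tags are published to the broker.
        return {k: v for k, v in self._tags.items() if k in self._published}

dev = HybridDevice(published_tags=["Temperature", "VibrationRMS"])
dev.write("Temperature", 71.3)
dev.write("SetPoint", 70.0)        # control-only tag, not published
dev.write("VibrationRMS", 0.12)
print(dev.pubsub_payload())        # only the selected analytics tags
```

The point of the sketch is that no translation layer sits between the two paths: both read from the same table, so semantics stay consistent by construction.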
From a data architecture perspective, this pattern provides several benefits. Data governance becomes more straightforward when the same semantic model spans from source systems to analytics platforms. Data quality issues are easier to trace when there's semantic consistency throughout the pipeline. Development teams can work more efficiently when they don't need to manage different data models for operational systems versus analytics systems.
Manufacturing facilities typically implement network segmentation for security purposes. Different subnets isolate various functional areas—production zones, quality systems, logistics networks—to limit the impact of security incidents. However, analytics use cases often require data from multiple network segments.
Traditional approaches to moving data across network boundaries involve complex integration gateways with custom code for each data source. OPC UA PubSub offers an alternative architectural pattern through protocol bridging.
A publisher instance in one subnet publishes data to a broker located in a DMZ (demilitarized zone) using secure TLS connections with OPC UA security features. A subscriber instance in another subnet connects to the same broker and receives the data, mapping it to appropriate endpoints in that subnet. Both instances use the same software and configuration approach, differing only in their role as publisher or subscriber.
This architecture provides data movement across network boundaries while maintaining security controls. The broker in the DMZ serves as a controlled crossing point between networks. Security teams can monitor and audit all data flows through this central point. The approach scales well—adding new data sources or consumers involves configuring additional publisher or subscriber instances rather than developing custom integration code.
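The subscriber side of such a bridge is, at its core, a mapping from broker topics to local endpoints. A minimal sketch, with hypothetical topic names and endpoint URLs:

```python
# Mapping table for a subscriber-side bridge instance. Topics published
# into the DMZ broker are routed to endpoints inside the local subnet.
# All names and addresses are illustrative.
BRIDGE_MAP = {
    "dmz/plant-a/line-1": "opc.tcp://10.20.0.5:4840/line1",
    "dmz/plant-a/line-2": "opc.tcp://10.20.0.6:4840/line2",
}

def route(topic):
    """Return the local endpoint for a broker topic, or None for topics
    no rule covers, so unknown traffic is simply dropped."""
    return BRIDGE_MAP.get(topic)

print(route("dmz/plant-a/line-1"))
```

Unmapped topics are dropped, which gives the boundary a default-deny posture; adding a new data flow means adding a mapping entry rather than writing custom integration code.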
For data and analytics leaders managing multi-facility operations, this pattern becomes particularly relevant. Data from geographically distributed plants can flow to centralized analytics platforms using the same architectural approach, regardless of how each facility's internal networks are structured.
Network reliability presents ongoing challenges in manufacturing environments. Temporary connectivity interruptions—whether due to network maintenance, infrastructure issues, or mobile equipment moving through coverage gaps—shouldn't result in data loss for analytics pipelines.
OPC UA PubSub implementations can include store-and-forward functionality. When a publisher loses connection to its broker, it stores messages locally in a database. Once connectivity is restored, it transmits the buffered data to the broker. Different transmission modes allow prioritization—sending live data first while historical data catches up, or maintaining strict temporal sequence regardless of when data actually transmits.
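Store-and-forward can be sketched with a local SQLite outbox. In this simplified version, delivery is just a callable that hands messages to the broker, and the "live data first" mode is modeled by letting a current message jump ahead of the backlog; a real implementation would offer richer policies.

```python
import sqlite3
import json

class StoreAndForwardBuffer:
    """Sketch of a store-and-forward buffer: undeliverable messages are
    persisted locally, then flushed once the broker is reachable again."""

    def __init__(self, path=":memory:"):
        self._db = sqlite3.connect(path)
        self._db.execute(
            "CREATE TABLE IF NOT EXISTS outbox "
            "(seq INTEGER PRIMARY KEY AUTOINCREMENT, payload TEXT)"
        )

    def enqueue(self, message):
        # Called while the broker is unreachable.
        self._db.execute("INSERT INTO outbox (payload) VALUES (?)",
                         (json.dumps(message),))
        self._db.commit()

    def flush(self, send, live_first=None):
        """Drain the outbox through `send`. If `live_first` is given, it
        is delivered before the backlog ('live data first' mode);
        otherwise strict temporal order is preserved."""
        if live_first is not None:
            send(live_first)
        rows = self._db.execute(
            "SELECT seq, payload FROM outbox ORDER BY seq").fetchall()
        for seq, payload in rows:
            send(json.loads(payload))
            self._db.execute("DELETE FROM outbox WHERE seq = ?", (seq,))
        self._db.commit()

buf = StoreAndForwardBuffer()
buf.enqueue({"t": 1, "temp": 70.9})   # broker down: buffer locally
buf.enqueue({"t": 2, "temp": 71.1})
delivered = []
buf.flush(delivered.append, live_first={"t": 3, "temp": 71.4})
print([m["t"] for m in delivered])    # live message first, then backlog
```

Because the outbox survives process restarts when backed by a file rather than `:memory:`, no samples are lost across outages, which is exactly the completeness guarantee that model training pipelines need.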
This capability addresses a practical concern for data and analytics teams. Machine learning models often depend on complete datasets for training. Missing data creates gaps that require special handling—interpolation, exclusion of time periods, or more complex imputation strategies. Store-and-forward ensures complete data capture even when network issues occur.
The functionality also supports use cases involving mobile equipment or remote operations. Agricultural equipment operating in areas with intermittent connectivity, autonomous vehicles moving through varying coverage zones, or remote monitoring systems with unreliable connections can all leverage store-and-forward to ensure data completeness.
Cloud platforms provide managed services for ingesting and processing IoT data. Microsoft Azure IoT Hub, AWS IoT Core, and similar services handle device management, message routing, and integration with other cloud services. Understanding how OPC UA PubSub integrates with these platforms helps when planning deployment architectures.
Azure IoT Edge, for example, provides a runtime environment for deploying containerized modules at the edge. OPC UA PubSub implementations can deploy as IoT Edge modules, running alongside other edge processing components. This enables architectures where data flows from OPC UA devices to an edge gateway running IoT Edge, where it can be filtered, aggregated, or processed before flowing to cloud services.
The containerized deployment model provides operational benefits. Infrastructure teams can use standard container orchestration tools for deployment and updates. Monitoring integrates with existing container and Kubernetes tooling. Configuration management follows standard practices for containerized applications.
For organizations already investing in cloud platforms and edge computing infrastructure, OPC UA PubSub modules that integrate natively with these platforms reduce integration complexity. Rather than building custom bridges between OPC UA systems and cloud services, teams can use supported modules that handle the translation and transport.
Several factors warrant consideration when evaluating OPC UA PubSub for manufacturing data integration:
Information model consistency. The primary value proposition of OPC UA is semantic consistency from source to consumption. Ensure that information models used in plant-floor systems align with how data is represented in analytics platforms. Inconsistencies here negate much of the benefit.
Security architecture. PubSub introduces message brokers as new components in the security architecture. Consider how brokers fit within existing security zones, what authentication and authorization mechanisms apply, and how to monitor data flows for security purposes.
Broker infrastructure. Organizations need to decide whether to operate their own message brokers or use managed services. This decision affects operational complexity, cost structure, and deployment flexibility. Consider requirements for reliability, throughput, and geographic distribution.
Network architecture. Understanding existing network segmentation and data flow requirements helps design appropriate publisher and subscriber topologies. Map which subnets need to exchange data and where brokers should be positioned to enable those flows while respecting security boundaries.
Operational monitoring. PubSub architectures introduce new failure modes—broker unavailability, publisher disconnections, subscriber processing failures. Establish monitoring and alerting for these scenarios to maintain data pipeline reliability.
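A starting point for such monitoring is a simple liveness probe against the broker's listening port, sketched here in Python. Host, port, and timeout values are illustrative, and production monitoring would additionally track publisher sessions and subscriber lag.

```python
import socket

def broker_reachable(host, port, timeout=0.5):
    """Probe broker liveness: can a TCP connection be opened to its
    port within the timeout? Catches broker outages, though not
    application-level failures such as a stalled subscriber."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 192.0.2.1 is a documentation-reserved address, so this probe fails.
print(broker_reachable("192.0.2.1", 1883))
```

Wiring a check like this into an existing alerting system covers the most visible PubSub failure mode; publisher disconnects and subscriber backlogs need their own metrics.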
OPC UA PubSub represents an architectural evolution in manufacturing data integration, shifting from point-to-point client-server patterns to publish-subscribe messaging. This evolution aligns operational technology data integration with the architectural patterns prevalent in modern cloud platforms and enterprise data systems.
For data and analytics leaders, the key consideration is whether this architectural pattern fits your organization's data integration strategy. Organizations heavily invested in cloud analytics, requiring data from multiple facilities or network segments, or building real-time analytics capabilities will find the publish-subscribe model well-suited to their requirements.
The framework approach of OPC UA—where new protocol layers can be added while maintaining existing information models—provides a path for adopting modern integration patterns without abandoning existing investments in OPC UA infrastructure. This continuity reduces migration risk and enables incremental adoption.
Success with OPC UA PubSub depends on understanding how the publish-subscribe pattern differs from traditional client-server integration, planning appropriate network and security architectures, and ensuring information model consistency throughout the data pipeline. Organizations that address these foundational elements can build reliable, scalable manufacturing data integration platforms that support advanced analytics and machine learning initiatives.