November 8, 2025

AutomationML, OPC UA, and Asset Administration Shell: Standards for Manufacturing Data Integration

Manufacturers face a common challenge: engineering data exists separately from operational data, which complicates any attempt to build unified data infrastructure. These disconnected data environments make it difficult to implement comprehensive analytics and AI initiatives that span the entire asset lifecycle.

Dr. Miriam Schleipen, Chief Research Officer at EKS InTec and head of the joint working group between the OPC Foundation and AutomationML, specializes in semantic interoperability in automation systems. In a recent conversation, she explained how three key standards—AutomationML, OPC UA, and the Asset Administration Shell—provide frameworks for connecting engineering and operational data.

This article explores these standards and their applications in manufacturing data management.

Understanding the Engineering and Operations Data Gap in Manufacturing

Manufacturing organizations typically manage engineering data and operational data through separate systems. Engineering teams use specialized tools during the planning and commissioning phases, while operations teams rely on different systems during production. This separation creates data silos that prevent organizations from establishing the unified data infrastructure needed for advanced analytics and AI applications.

AutomationML addresses engineering data management. It is an open, standardized file format for exchanging engineering information across companies and disciplines. The format provides a common structure for planning data, including plant layouts, equipment configurations, and virtual commissioning models.

The format operates independently of specific tools and vendors, which helps organizations avoid proprietary data formats that create vendor lock-in. AutomationML is standardized by the International Electrotechnical Commission as IEC 62714, giving it a stable, vendor-neutral basis for long-term data infrastructure.
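
AutomationML serializes to XML, with its top-level structure (instance hierarchies, internal elements, attributes) inherited from the CAEX format. The following is a minimal sketch of walking such a structure with Python's standard library; the XML fragment and every name in it are illustrative, not a complete, schema-valid AutomationML file.

    # A minimal sketch: walking a simplified AutomationML/CAEX instance
    # hierarchy. The fragment below is illustrative, not schema-valid.
    import xml.etree.ElementTree as ET

    AML_SNIPPET = """
    <CAEXFile FileName="plant.aml">
      <InstanceHierarchy Name="PlantLayout">
        <InternalElement Name="Robot1" ID="guid-0001">
          <Attribute Name="MaxPayloadKg"><Value>12.5</Value></Attribute>
          <Attribute Name="Vendor"><Value>ExampleCorp</Value></Attribute>
        </InternalElement>
      </InstanceHierarchy>
    </CAEXFile>
    """

    def walk_instance_hierarchy(xml_text: str) -> None:
        root = ET.fromstring(xml_text)
        for hierarchy in root.iter("InstanceHierarchy"):
            print(f"Hierarchy: {hierarchy.get('Name')}")
            # InternalElements are the plant objects (cells, robots, devices).
            for element in hierarchy.iter("InternalElement"):
                print(f"  Element: {element.get('Name')} (ID={element.get('ID')})")
                # Attributes carry the engineering parameters.
                for attr in element.findall("Attribute"):
                    print(f"    {attr.get('Name')} = {attr.findtext('Value')}")

    walk_instance_hierarchy(AML_SNIPPET)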

AutomationML and OPC UA Integration: Connecting Engineering and Operations Data

AutomationML focuses on the planning and engineering phases, handling data before plant operations begin. OPC UA manages communication and data exchange during the operational phase, integrating data vertically across the levels of the automation hierarchy and horizontally between systems at the same level.
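
To make the operational side concrete before turning to the integration, here is a minimal sketch of reading a single process value over OPC UA with the open-source asyncua library. The endpoint URL and NodeId are placeholders for whatever a real server exposes.

    # A minimal sketch of reading one operational value over OPC UA.
    import asyncio
    from asyncua import Client  # pip install asyncua

    ENDPOINT = "opc.tcp://localhost:4840/freeopcua/server/"  # assumed endpoint
    NODE_ID = "ns=2;i=2"  # assumed NodeId of a process variable

    async def read_one_value() -> None:
        async with Client(url=ENDPOINT) as client:
            node = client.get_node(NODE_ID)
            value = await node.read_value()
            print(f"{NODE_ID} = {value}")

    asyncio.run(read_one_value())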

The integration between these standards creates a connection between engineering and operational data systems. Schleipen explained two primary use cases for this integration:

Moving engineering data into operations: Some data created during engineering—such as equipment specifications or configuration parameters—remains valuable throughout a plant's operational life. The AutomationML-OPC UA integration enables mapping of engineering data into operational OPC UA information models, preventing information loss after commissioning.

Feeding operational insights back to engineering: Data from operations—performance metrics, calculated KPIs, actual equipment behavior—can inform subsequent engineering or re-engineering cycles. This closed-loop approach enables engineering decisions based on real operational data rather than assumptions.

The OPC Foundation published a companion specification for this integration in 2017, and the AutomationML organization provides a best practice recommendation for describing OPC UA system configuration data. Organizations building data architecture that spans the full asset lifecycle can reference these specifications for proven implementation approaches.
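
As a rough illustration of the first use case, the sketch below publishes engineering parameters, as they might come out of an AutomationML parser, into an OPC UA server's address space using asyncua. The object and variable names are invented for the example; the companion specification defines the normative mapping rules.

    # A minimal sketch: exposing engineering data as OPC UA nodes.
    import asyncio
    from asyncua import Server  # pip install asyncua

    # Engineering data as it might come out of an AutomationML parser.
    ROBOT1_PARAMS = {"MaxPayloadKg": 12.5, "Vendor": "ExampleCorp"}

    async def serve_engineering_data() -> None:
        server = Server()
        await server.init()
        server.set_endpoint("opc.tcp://0.0.0.0:4840/")  # assumed endpoint
        idx = await server.register_namespace("http://example.org/plant")

        # One object per InternalElement, one variable per Attribute.
        robot = await server.nodes.objects.add_object(idx, "Robot1")
        for name, value in ROBOT1_PARAMS.items():
            await robot.add_variable(idx, name, value)

        async with server:
            await asyncio.sleep(60)  # keep the server up briefly for a demo

    asyncio.run(serve_engineering_data())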

Asset Administration Shell: Standardized Interface for Asset Information Management

The Asset Administration Shell (AAS) is the third component; Schleipen describes it as a fundamental building block of Industry 4.0. The AAS functions as a standardized digital container that accompanies assets throughout their lifecycle.

The AAS provides three key capabilities for data management:

Unique asset identification: Each asset receives a unique identifier, enabling consistent asset tracking across the organization.

Generic interface access: Standardized interfaces allow information access regardless of underlying technology or vendor, reducing integration complexity when working with diverse data sources.

Lifecycle evolution: The AAS expands as assets move through different phases. Documentation from vendors is added during procurement, simulation models during virtual commissioning, and operational KPIs during production, with all information remaining linked.

Asset Administration Shell Types: Passive, Reactive, and Proactive

Schleipen highlighted three types of AAS that represent different levels of sophistication:

Type 1 (Passive): A simple serialized file—a structured document describing an asset. It contains static information that can be read but not interacted with dynamically.

Type 2 (Reactive): This type adds interfaces that enable active information access. Organizations can query data, retrieve submodels, and integrate AAS data into analytics platforms.

Type 3 (Proactive): The future vision—asset administration shells that communicate directly with each other without requiring human intervention or a central system. Assets communicate through a common "Industry 4.0 language" and can coordinate autonomously.

For data leaders implementing infrastructure today, Type 2 offers the most practical functionality; a sketch of this kind of access follows below. Type 3 represents the direction of future development and should inform long-term architecture planning.
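
As an illustration of Type 2 access, the sketch below queries one submodel from an AAS repository over the standardized REST API (Part 2 of the AAS specification series). The base URL and submodel identifier are placeholders, and exact paths can vary with the server and API version deployed.

    # A minimal sketch of Type 2 access via the AAS REST API.
    import base64
    import requests  # pip install requests

    BASE_URL = "http://localhost:8081"  # assumed AAS repository endpoint
    SUBMODEL_ID = "https://example.org/submodels/robot1/operation"  # assumed

    def encode_identifier(identifier: str) -> str:
        # The API expects identifiers as base64url-encoded path segments
        # (padding handling can vary by server).
        return base64.urlsafe_b64encode(identifier.encode()).decode().rstrip("=")

    response = requests.get(
        f"{BASE_URL}/submodels/{encode_identifier(SUBMODEL_ID)}", timeout=10
    )
    response.raise_for_status()
    submodel = response.json()
    for element in submodel.get("submodelElements", []):
        print(element.get("idShort"), element.get("value"))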

AAS Submodels: Modular Structure for Asset Data Organization

The AAS's modular submodel structure provides particular value for data management. Rather than consolidating all asset information into a single monolithic structure, the AAS organizes information into logical submodels:

  • Identification submodel: Basic identifying information
  • Documentation submodel: Technical documentation, PDFs, manuals
  • Simulation submodel: Simulation data and models
  • Operation submodel: Operational parameters and KPIs
  • Energy submodel: Energy consumption data

This modular structure provides flexibility in data management. Organizations can access specific submodels relevant to particular analytics use cases without retrieving irrelevant information. Data governance teams can apply different access controls and quality requirements to different submodels based on sensitivity and business importance.
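
The following sketch captures the modular idea in plain Python dataclasses (deliberately not an official SDK): one shell, several independently addressable submodels, so a consumer pulls only what its use case needs. All identifiers and values are invented for illustration.

    # An illustrative, simplified model of a shell with submodels.
    from dataclasses import dataclass, field

    @dataclass
    class Submodel:
        id_short: str
        elements: dict[str, object] = field(default_factory=dict)

    @dataclass
    class AssetAdministrationShell:
        asset_id: str  # the globally unique asset identifier
        submodels: dict[str, Submodel] = field(default_factory=dict)

        def get_submodel(self, id_short: str) -> Submodel:
            return self.submodels[id_short]

    shell = AssetAdministrationShell(asset_id="https://example.org/assets/robot1")
    shell.submodels["Operation"] = Submodel("Operation", {"OEE": 0.87})
    shell.submodels["Energy"] = Submodel("Energy", {"kWhToday": 42.0})

    # An energy analytics job touches only the Energy submodel.
    print(shell.get_submodel("Energy").elements)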

Distributed Digital Twins: Modular Approach to Digital Twin Architecture

Schleipen discussed distributed digital twins, a concept with implications for data system architecture.

Traditional digital twin approaches often centralize all components within a single framework from one vendor. Distributed digital twins use a different structure where different aspects of the twin—the information model, geometric model, and behavioral simulation—can exist in separate locations and integrate when needed.

This approach aligns with several data architecture principles:

Vendor independence: Organizations are not dependent on a single vendor's framework. Best-of-breed solutions can be selected for different aspects of digital twins.

Scalable execution: Simulation components can be distributed and executed at the optimal location: at the edge, in the cloud, or on premises. Containerization technologies make this practical to implement.

Modular integration: Separate simulation models for energy consumption, equipment behavior, and production flow can integrate through standardized interfaces rather than requiring monolithic tool implementation.

The Functional Mock-up Interface (FMI) standard packages behavior models as Functional Mock-up Units (FMUs) that co-simulation tools can execute. Combined with AAS for information management, OPC UA for operational data, and AutomationML for engineering data, organizations can implement complete, open, and scalable architectures.
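
As a minimal sketch of executing an encapsulated behavior model, the example below runs an FMU with the open-source FMPy library. The file name energy_model.fmu and the output variable power_kW are assumptions for illustration; substitute the unit and variables your own simulation tool exports.

    # A minimal sketch of running an FMU with FMPy.
    from fmpy import simulate_fmu  # pip install fmpy

    result = simulate_fmu(
        "energy_model.fmu",   # assumed FMU exported from a simulation tool
        start_time=0.0,
        stop_time=3600.0,     # simulate one hour of operation
        output=["power_kW"],  # assumed output variable of the model
    )

    # The result is a structured array: 'time' plus the requested outputs.
    for row in result[:5]:
        print(row["time"], row["power_kW"])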

Implementation Guidelines for Manufacturing Data Leaders

Organizations implementing these standards can follow several approaches:

Assess current engineering-operations data gaps: Organizations should map where engineering data and operational data reside. Identify specific use cases where integration drives value—predictive maintenance models requiring as-built documentation, or re-engineering projects needing operational performance data.

Establish OPC UA foundation: OPC UA provides the foundation for operational data integration. AutomationML and AAS integrations build on top of OPC UA infrastructure.

Pilot Asset Administration Shells for critical asset types: Begin with targeted implementation rather than comprehensive rollout. Select one asset type that moves through multiple lifecycle phases and implement Type 2 AAS to evaluate lifecycle data management improvements.

Plan for distributed architectures: When evaluating digital twin platforms or simulation tools, prioritize solutions supporting open standards and API integration rather than monolithic environments.

Build cross-functional governance: These standards require collaboration across engineering, operations, IT, and OT. Establish data governance structures that span these domains rather than covering operational data alone.

Summary: Building Interoperable Data Infrastructure with Open Standards

AutomationML, OPC UA, and Asset Administration Shell provide standardized frameworks for creating data infrastructure that connects engineering and operational systems. These standards enable organizations to implement data integration across the asset lifecycle and develop flexible, vendor-independent data architectures.

For organizations managing complex global operations or coordinating across international facilities, standardized interoperability addresses critical data management challenges. The standards have matured over several years—the AutomationML-OPC UA integration specifications were published in 2017—and are increasingly supported by major automation and software vendors.

Implementation can begin by identifying specific integration pain points: engineering data needed during operations, operational insights that could inform engineering cycles, or assets managed across disconnected systems. These standards provide systematic approaches to connecting these data sources.

Organizations can start with targeted pilots focused on specific asset types or data integration use cases, then expand implementation as they validate benefits and build organizational capabilities around these standards.