May 2, 2026
# What Is I3X and Why Could It Become the Web Browser of Manufacturing Data?
The manufacturing industry is roughly a decade late in standardizing the most basic interface for getting data out of software systems—and the fix is almost embarrassingly simple.
That's the core argument Matthew Paris, Director of Quality Test Systems at GE Appliances and a leading contributor to the I3X specification, made during my conversation with him on the AI in Manufacturing podcast. Paris, who sits at a unique intersection of end user and system builder, makes a compelling case that the biggest barrier to scaling industrial intelligence isn't the absence of sophisticated technology—it's the absence of a common, lightweight way for software to talk to other software at the MES and cloud level. I3X, the Industrial Information Interoperability Exchange Common API, is designed to be that missing piece.
Every time a manufacturing organization onboards a new software product, the same exhausting cycle repeats. What interfaces does it provide? What encoding does it use? What methods are available? What format is the data in? Paris describes this as navigating the "data access stack"—a layered set of decisions analogous to the TCP/IP model that most teams don't even realize they're making.
The trap is that "REST interface" has become a checkbox answer that satisfies procurement but solves almost nothing. As Paris put it, REST has no bearing on what capability that interface gives you, what information it produces, or what format that information is in. You could have a REST interface that only tells you the time. An OpenAPI spec just documents whatever arbitrary interface a vendor chose to build. None of this guarantees that a consumer of the data can actually use it without significant custom integration work.
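A tiny sketch makes the point concrete. Both payloads below arrive over "a REST interface" and carry the same oven temperature, yet the consumer still needs bespoke extraction logic for each. The field names and shapes are invented for illustration, not taken from any real product:

```python
# Vendor A: flat tag-style payload (invented shape)
vendor_a = {"tagPath": "Line1/Oven/Temp", "val": 187.2, "ts": 1714608000}

# Vendor B: nested, stringly-typed payload (also invented)
vendor_b = {
    "data": {
        "sensor": {"id": "OVEN-TEMP-01"},
        "reading": {"value": "187.2", "unit": "C"},
        "timestamp": "2026-05-02T00:00:00Z",
    }
}

def extract_celsius(payload: dict) -> float:
    """Per-vendor extraction logic -- the bespoke work 'REST' never removes."""
    if "tagPath" in payload:  # vendor A's shape
        return float(payload["val"])
    return float(payload["data"]["reading"]["value"])  # vendor B's shape

# Same reading, same transport, two code paths -- multiplied by every vendor.
assert extract_celsius(vendor_a) == extract_celsius(vendor_b) == 187.2
```

Multiply that `if`/`else` by twenty or thirty products and you have the integration tax the next section describes.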
The result is that every new software addition to the manufacturing stack requires its own bespoke integration effort. Multiply that across twenty or thirty software products in a modern manufacturing environment, and you begin to understand why so many organizations are stuck in perpetual integration mode rather than extracting value from their data.
Twenty years ago, the vendor landscape was simpler. A manufacturer might choose between three to five platforms, pick one ecosystem, and let that vendor handle everything. It was a big decision, but a simpler one. Technology has since shattered those monoliths.
As problems become more complex, they demand more specialization. No single vendor can solve the entire problem space, which is why large platform companies resort to acquisitions—buying the companies that solved pieces they couldn't. This proliferation of specialized vendors, each adding their own value, has created enormous churn in how manufacturers onboard new capabilities. Paris frames it bluntly: you're almost required to be nimble now, but if your whole architecture is nimble, it's too disruptive. You need a stable foundation to be nimble around.
And then there's 2026's accelerant: AI-assisted software development. When Claude or similar tools enable nearly anyone to become a software developer adding capability to the manufacturing stack, the number of applications trying to plug into your architecture explodes. Without a standardized interface, you'd have to teach every AI coding assistant about twenty different proprietary APIs across your ecosystem. With a common interface like I3X, you tell it once: develop against this spec. That's the difference between weeks of integration work and hours.
The compounding cost isn't just the engineering hours spent learning each vendor's proprietary interface. It's the opportunity cost of intelligence that never gets built because the foundation is too fragile to support it.
Paris describes a progression that most manufacturers recognize. Step one is visibility—dashboards, reports, seeing what's happening. That alone is valuable. Micro stops in discrete manufacturing, for instance, can fly under the radar as noise until you measure them and discover they're costing real money. But dashboards are a polling pattern. You have to go look at them. The next step is subscription-based intelligence—systems that tell you when something requires attention, with escalation paths and contextual filtering. That requires AI. And AI requires not just data access, but data that's typed, related, and contextually rich.
Here's where the implication chain gets painful. If your AI agent has to spend its cycles navigating the same quagmire of inconsistent interfaces, undocumented formats, and unrelated data objects that your human integrators struggle with, you've essentially hired a new employee and given them no onboarding. Paris uses exactly this analogy: think of AI agents as new employees, and your job is to make that onboarding as short as possible. Every layer of unnecessary friction between the agent and the data it needs delays the time to value.
Without a common interface, manufacturers get stuck in what you might call the "integration treadmill"—constantly running just to maintain connectivity rather than advancing toward intelligence.
I3X is deliberately, almost aggressively, simple. It sits on HTTP, uses JSON encoding, and provides a small set of capabilities that answer four fundamental questions: What information do you have? What is the latest value? What were the past values? And can you tell me when it changes?
That's it. Paris estimates it covers twenty percent of the capability you could theoretically build, but solves eighty percent of the use cases manufacturers actually face. The design philosophy is to reuse existing, battle-hardened IT technologies rather than invent new ones. HTTP/1.1 was standardized in 1999. JSON overtook XML-based SOAP around 2013-2014. The building blocks have been ready for a decade. What was missing was the decision to assemble them into a common specification for manufacturing.
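The four questions can be sketched as a small in-memory Python class. To be clear, the method and field names below are my own illustrative assumptions, not taken from the I3X specification; the point is only how little surface area those four capabilities require:

```python
from collections import defaultdict

class I3XStyleSource:
    """Illustrative sketch of the four capabilities the article lists.
    All names here are invented, not from the I3X spec."""

    def __init__(self):
        self._history = defaultdict(list)      # element -> [(ts, value), ...]
        self._subscribers = defaultdict(list)  # element -> [callback, ...]

    def browse(self):
        # 1. "What information do you have?"
        return sorted(self._history)

    def current_value(self, element):
        # 2. "What is the latest value?"
        ts, value = self._history[element][-1]
        return {"timestamp": ts, "value": value}

    def history(self, element):
        # 3. "What were the past values?"
        return [{"timestamp": ts, "value": v} for ts, v in self._history[element]]

    def subscribe(self, element, callback):
        # 4. "Can you tell me when it changes?"
        self._subscribers[element].append(callback)

    def update(self, element, ts, value):
        # A publisher-side write: record it and notify subscribers.
        self._history[element].append((ts, value))
        for cb in self._subscribers[element]:
            cb(element, ts, value)
```

Wrap those four methods in HTTP endpoints with JSON bodies and you have the shape of the thing: a surface small enough that a vendor who already speaks HTTP and JSON can map their internals onto it in days, not quarters.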
This is where the contrast with OPC UA becomes sharp. OPC UA has spent twenty years trying to solve multiple problems simultaneously—information modeling, transport mapping across UA TCP, HTTP, MQTT, AMQP, Ethernet, and UDP—to be relevant across every conceivable use case. The result, at the application-to-application level in the cloud, is that even OPC UA's own reference architecture draws a box labeled "custom application" with an arrow labeled simply "HTTP REST." That's the actual level of standardization the industry has achieved for software-to-software communication at the MES and cloud layer after two decades.
I3X takes the opposite approach. Rather than boiling the ocean, it makes an opinionated, minimal set of decisions and publishes them. Vendors who've seen the spec report they can prototype an implementation within a week, because they already have HTTP endpoints and JSON encoding. It's just a matter of mapping their internal processes to the standardized methods.
Paris draws an analogy to the USB mouse. Every operating system understands left click, right click, scroll wheel, and cursor movement out of the box. A gaming mouse with fifty programmable buttons needs a special driver, but the basics just work. I3X is that baseline interoperability—the part that should have been standardized years ago.
This is where Paris challenges some popular assumptions. An MQTT broker, he argues, is woefully insufficient to achieve what a Unified Namespace should be. MQTT is excellent at transporting changing data in a scalable, fan-in/fan-out pattern. But it can't natively answer "what is the current value?"—you'd have to shoehorn retained messages to approximate that. It has only one dimension of relationship: the topic hierarchy. It can't advertise data types. It has nothing to say about data governance.
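A toy broker makes the asymmetry visible. This is a deliberate simplification of MQTT semantics, not real broker code: a late-joining subscriber learns current state only for topics where the publisher happened to set the retain flag, and even then the payload is opaque bytes with no type metadata attached:

```python
class TinyBroker:
    """Toy stand-in for an MQTT broker, simplified for illustration."""

    def __init__(self):
        self._retained = {}  # topic -> last retained payload (opaque bytes)
        self._subs = []      # subscriber callbacks

    def publish(self, topic, payload, retain=False):
        if retain:
            self._retained[topic] = payload
        for cb in self._subs:
            cb(topic, payload)

    def subscribe(self, callback):
        self._subs.append(callback)
        # Only retained messages reach a new subscriber immediately --
        # the broker has no other answer to "what is the current value?"
        for topic, payload in self._retained.items():
            callback(topic, payload)

broker = TinyBroker()
broker.publish("plant/line1/temp", b"187.2")  # published, but not retained

late = []
broker.subscribe(lambda t, p: late.append((t, p)))
assert late == []  # the late joiner knows nothing about line1's temperature

broker.publish("plant/line1/temp", b"188.0", retain=True)
later = []
broker.subscribe(lambda t, p: later.append((t, p)))
assert later == [("plant/line1/temp", b"188.0")]  # retain shoehorns an answer
```

Note that even the retained answer is just bytes under a topic string: one dimension of relationship, no declared type, no governance. That is the gap the layers above the broker have to fill.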
Paris points to a demo at the ProveIt! conference where an application connected to an MQTT broker and tried to infer data structure from what it observed. For every topic, it had to create a new data type from scratch, because the broker provides no metadata about what it's carrying. This is why companies like HiveMQ are building products like Pulse on top of the broker—because the broker itself is now a commodity, and the value lies in the governance, validation, and type awareness layers above it.
I3X is designed to wrap around any information source, including an MQTT broker enhanced with a product like Pulse. It becomes the standardized query layer that lets any application—including an I3X Explorer tool playing the role the Netscape browser once played for the web—connect and ask: what types do you know about? What namespaces are available? Give me everything tagged as an energy monitor across the enterprise. That kind of query is nearly impossible against a vanilla broker but straightforward against an I3X-compliant endpoint.
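Sketched in Python, with invented field names rather than anything from the spec, the enterprise-wide query becomes a one-liner once every element carries a type—exactly the property an untyped topic tree cannot offer:

```python
# Hypothetical typed elements as an I3X-style endpoint might expose them.
# Field names ("id", "type", "value") are illustrative assumptions.
elements = [
    {"id": "plantA/line1/meter1", "type": "EnergyMonitor", "value": 42.1},
    {"id": "plantA/line1/oven1",  "type": "Oven",          "value": 187.2},
    {"id": "plantB/line3/meter7", "type": "EnergyMonitor", "value": 13.9},
]

def query_by_type(elements, type_name):
    """'Give me everything tagged as an energy monitor across the enterprise.'
    Trivial when elements declare a type; guesswork against raw topics."""
    return [e for e in elements if e["type"] == type_name]

monitors = query_by_type(elements, "EnergyMonitor")
assert [e["id"] for e in monitors] == ["plantA/line1/meter1",
                                       "plantB/line3/meter7"]
```

The work is in the modeling, not the query—which is exactly the point of the "brutal mirror" observation that follows.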
The specification is also deliberately silent on how you should model your information. It will pass through whatever types you define—custom namespaces specific to your company, OPC UA companion spec types, VDMA namespaces, or CESMII definitions. It doesn't prescribe; it transports. But in doing so, it acts as what Paris calls a "brutal mirror" on the quality of your data modeling, exposing exactly how well or poorly your information is structured.
Paris pushes back on the chicken-and-egg framing that plagues most standards discussions. His argument is structural: because I3X operates at the software level rather than the device level, adoption doesn't require firmware changes or hardware swaps. It arrives in a software update. If a vendor implements it, the end user receives it for free on the next release.
The vendor momentum is already building. HighByte, Microsoft Azure, AWS, Flow Software, and Ignition are among the platforms implementing or demonstrating I3X. Paris singles out Ignition as particularly significant—its open platform architecture naturally invites these kinds of standards, and its massive installed base means broad exposure. The specification moved from alpha to beta ahead of Hannover Messe 2026, where multiple vendors are demonstrating live implementations.
The real strategic question for manufacturing leaders isn't "should we adopt I3X?" It's broader: how do we build a stable architectural foundation that lets us plug in and remove software capabilities as they emerge—including AI agents that need to reason over our data? I3X is one Lego block in that foundation, but it's the block that eliminates the repetitive, wasteful integration tax you pay every time you add a new tool to your stack. If your current approach to software integration requires custom work for every new connection, you're not building a platform. You're building technical debt.
Kudzai Manditereza is an industrial data and AI educator and strategist. He specializes in Industrial AI, IIoT, Unified Namespace, Digital Twins, and Industrial DataOps, helping manufacturing leaders implement and scale Smart Manufacturing initiatives.
Kudzai shares this thinking through Industry40.tv, his independent media and education platform; the AI in Manufacturing podcast; and the Smart Factory Playbook newsletter, where he shares practical guidance on building the data backbone that makes industrial AI work in real-world manufacturing environments. Recognized as a Top 15 Industry 4.0 influencer, he currently serves as Senior Industry Solutions Advocate at HiveMQ.