April 28, 2017 By Plamen Kiradjiev 2 min read

Announcing the IoT Industrie 4.0 reference architecture

Whether you call it Industrie 4.0, Industrial IoT, or simply Digital Manufacturing, innovation on the shop floor is of heightened interest to organizations. By seizing opportunities to integrate vertically from machines to cloud, horizontally among supply networks, or along the lifecycle of the product, organizations can improve processes, quality, and interoperability to meet changing business needs and achieve highly customized manufacturing.

Industrie 4.0 doesn’t replace the typical business requirements in the manufacturing space (productivity, error prevention, and flexibility). Rather, the completeness of interoperability better supports overall equipment efficiency, predictive maintenance, and product and process quality. Lean manufacturing is better supported by avoiding technology gaps and eliminating complexity.

Functional scenarios for disruptive innovation in manufacturing are data-driven and, above all, insight-oriented. These expectations and requirements make Industrie 4.0 a special case of the general Internet of Things reference architecture, because the operational technology (OT) layer must be integrated with the IT layer in a manufacturing context. What makes it special are the closed environment and its unique availability and compliance needs, the three layers (Edge, Plant, and Cloud/Enterprise), and the high importance of flexible functional deployment among those layers.

The main challenge remains accessing the data of the automation layer in a secure way, while bridging the timing discrepancies between the OT layer (low milliseconds down to nanoseconds) and the IT layer (mid-range milliseconds and higher). Most manufacturers have specific requirements for the autonomous operation of each factory, such as continuing to operate even when connectivity to a central IT or cloud infrastructure is unavailable, and low-latency reaction to events. In addition, there are normally strict requirements for data locality, privacy, and security, often expressed as “production data is not allowed to leave the factory”.
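As a purely illustrative sketch (not part of the reference architecture itself), the code below shows one way an edge gateway might keep reacting to events locally and buffer readings while the plant or cloud link is down, forwarding them once connectivity returns. The `publish_upstream` callable, the field names, and the alarm threshold are hypothetical placeholders.

```python
from collections import deque
from typing import Callable

class EdgeBuffer:
    """Store-and-forward buffer for an edge gateway (illustrative sketch only).

    Readings are kept in a bounded local queue; when the upstream (plant or
    cloud) link is reachable, buffered readings are flushed in order.
    """

    def __init__(self, publish_upstream: Callable[[dict], bool], max_items: int = 10_000):
        # publish_upstream is a hypothetical callable that returns True on success.
        self._publish = publish_upstream
        self._queue: deque = deque(maxlen=max_items)

    def handle_reading(self, reading: dict) -> None:
        # Local, low-latency reaction happens here regardless of connectivity,
        # e.g. raising an alarm when a (hypothetical) threshold is exceeded.
        if reading.get("temperature_c", 0) > 90:
            print(f"local alarm: overtemperature on {reading.get('machine_id')}")
        self._queue.append(reading)
        self.flush()

    def flush(self) -> None:
        # Forward buffered readings while the upstream link accepts them;
        # on failure, keep the reading and retry on the next cycle.
        while self._queue:
            if not self._publish(self._queue[0]):
                break
            self._queue.popleft()


if __name__ == "__main__":
    # Simulated upstream link that is initially unreachable, then recovers.
    link_up = {"value": False}

    def publish_upstream(reading: dict) -> bool:
        if link_up["value"]:
            print(f"forwarded to plant/cloud: {reading}")
            return True
        return False

    edge = EdgeBuffer(publish_upstream)
    edge.handle_reading({"machine_id": "press-01", "temperature_c": 95})  # buffered, alarm raised locally
    link_up["value"] = True
    edge.handle_reading({"machine_id": "press-01", "temperature_c": 72})  # both readings forwarded
```

In a real deployment the publish step would typically be a messaging client (for example MQTT) and the buffer would be persisted to disk, but the pattern of local reaction plus store-and-forward is the same.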

It is likely that this attitude will change over time, driven by the speed of innovation, improving network latency, high security standards, and the cost advantages of cloud-based systems. But Industrie 4.0 projects that are starting today need to address both present and future requirements.

IBM’s Industrie 4.0 reference architecture addresses the current challenge and future needs through the clear definition of the three layers – Edge, Plant, and Enterprise – and the flexibility to deploy and move similar functionality across all three layers. Additionally, the architecture assumes that functionality today likely needs to be deployed on premises, but that this will evolve toward dedicated or even public clouds over time. The Industrie 4.0 reference architecture is based on the IBM Internet of Things reference architecture, the Industrial Internet Consortium (IIC) reference architecture, the Purdue model of ISA-95, and best practices observed in real-world projects.
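To make the idea of moving the same functionality between layers concrete, here is a small hypothetical sketch (not taken from the reference architecture documentation): an analytics function is written once, and the layer that hosts it is chosen purely by configuration, so it can start at the Edge today and move to the Plant or Enterprise layer later.

```python
from enum import Enum

class Layer(Enum):
    EDGE = "edge"
    PLANT = "plant"
    ENTERPRISE = "enterprise"

def anomaly_score(vibration_mm_s: float) -> float:
    # Layer-agnostic analytics: the same logic can run at any of the three layers.
    # The 4.5 mm/s baseline is a made-up illustrative value.
    return max(0.0, (vibration_mm_s - 4.5) / 4.5)

# Hypothetical deployment configuration: which layer hosts which function today.
DEPLOYMENT = {
    "anomaly_score": Layer.EDGE,  # low latency, keeps working without cloud connectivity
}

if __name__ == "__main__":
    target = DEPLOYMENT["anomaly_score"]
    print(f"anomaly_score runs at the {target.value} layer: score={anomaly_score(6.3):.2f}")
```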

Learn more about IBM’s Industrie 4.0 Reference Architecture

Co-author: David Noller
