IoT Basics: Smart Production with Big Data
Industrial Internet systems based on cloud-based IoT platforms will significantly advance the use of Big Data in industry. They unlock existing data sources and make them available for efficient centralized evaluation, which enables better-informed decisions.
Currently, the most concrete approach for a widespread use of Big Data technologies in industry is the Internet of Things (IoT). The IoT is a network of physical devices, vehicles, buildings and other objects that collect and exchange data via built-in software, sensors and network connectivity. Concepts such as Smart Homes, Smart Cities and Smart Mobility are all based on the Internet of Things.
Big Data or Bigger Data?
The more things are networked, the more data is generated and the more complex the relationships between things and their management become. The Internet of Things therefore requires Big Data technologies and is at the same time a concrete application of Big Data; some analysts even call the IoT "Bigger Data" in view of the large number of devices involved.
In the production environment, the term Industrial Internet is frequently used. The objects are people, machines, controls, field devices and sensors, as well as the software services in the cloud that connect and support them. An industrial Internet system supports intelligent production processes with advanced data analysis technology in order to fundamentally transform the business processes built on them. It is based on the idea of a global industrial ecosystem of advanced computing and manufacturing technology that is equipped with comprehensive measurement and sensor technology and features end-to-end network connectivity. The control systems are to be interlinked in order to enable more flexible and faster adjustments during ongoing production. External data from company-wide or even public information sources should be incorporated more strongly into the control of production processes in order to enable better-informed decisions.
IoT Reference Architectures
Standardized services, components and interfaces are required to make this vision a reality. Organizations like the Industrial Internet Consortium (IIC) have set themselves the goal of turning this vision of an Industrial Internet into reality. As a first step, the IIC presented a general reference architecture for industrial Internet systems [1]. Similar to the Reference Architecture Model for Industry 4.0 (RAMI 4.0), the IIC reference architecture uses different views to describe the properties of industrial Internet systems.
For security reasons, today's production systems are usually hermetically sealed off from the outside world. This 'isolated island' situation must be resolved in order to make the implementation of industrial Internet systems possible, and a number of questions still need to be answered for this purpose, especially in the area of security. The IIC architecture therefore treats security as an essential vertical aspect that must always be taken into account.
Various architectural patterns are used to describe the implementation view. One of these patterns is a three-tier architecture consisting of three logical layers (tiers) and the networks connecting them. The layer model shown in the figure can be found in many commercial IoT platforms on the market today. The Edge Tier collects data from edge nodes over a local network (the proximity network).
This local network connects machines, sensors, actuators, devices, control systems and other intelligent assets. It combines these edge nodes into one or more clusters, which are then connected to the access network via Edge Gateways. The Edge Gateways also serve as access points for device management and can pre-aggregate or otherwise pre-process selected data.
The access network provides connectivity for data and control flows between the Edge Tier and the Platform Tier. It can be a closed corporate network, a Virtual Private Network (VPN) or the public Internet. For data transmission, specialized IoT protocols such as MQTT or AMQP, but also specific REST APIs, can be used.
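How an edge gateway might pre-aggregate sensor samples and package them as a JSON payload for an MQTT-style topic can be sketched in a few lines. This is a minimal illustration, not a vendor API: the topic layout and field names are made-up conventions.

```python
import json
import statistics

def aggregate_readings(readings):
    """Pre-aggregate raw sensor samples at the edge gateway so that
    only a compact summary has to cross the access network."""
    return {
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.mean(readings),
        "count": len(readings),
    }

def build_telemetry_message(device_id, sensor, readings, ts):
    """Package an aggregated reading as a JSON payload for an
    MQTT-style topic (the topic hierarchy is a hypothetical choice)."""
    topic = f"factory/edge/{device_id}/{sensor}"
    payload = {
        "deviceId": device_id,
        "sensor": sensor,
        "timestamp": ts,
        "values": aggregate_readings(readings),
    }
    return topic, json.dumps(payload)

# Four raw samples leave the gateway as one compact message.
topic, payload = build_telemetry_message(
    "plc-07", "temperature", [21.4, 21.9, 22.1, 21.7], ts=1494316800)
```

The same payload could equally be sent over AMQP or posted to a REST endpoint; only the transport changes, not the message-building logic.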
The Platform Tier essentially consists of a non-domain-specific service platform that provides services for transforming, storing and analyzing the data, as well as supporting functions for operating the platform, including device management. In addition, the Platform Tier is also the central data hub. It processes and consolidates data flows from the Edge Tier and forwards them to the Enterprise Tier as needed. Conversely, it receives and processes control commands from the Enterprise Tier and forwards them to the Edge Tier. The service network connects the services of the Platform Tier and the Enterprise Tier. This network can also be a closed corporate network, a Virtual Private Network (VPN) or the public Internet.
The Enterprise Tier includes all higher-level business rules and control mechanisms. In practice, these are domain-specific business applications such as ERP or CRM systems, decision support systems, and also user interfaces for operating the platform. The Enterprise Tier receives data from the Platform and Edge Tiers and sends control commands to them.
Fog Computing Brings Intelligence to the Field
The reference architecture of the IIC already shows that analysis functions do not always have to run centrally in the Platform Tier; preprocessing can also take place in the Edge Gateways. A large part of data storage and analysis takes place centrally in the Platform Tier, because this centralized approach offers clear operational advantages. However, the approach reaches limits that are difficult to overcome given the bandwidth, latency and availability of current access networks. In addition, data may not be allowed to leave the Edge Tier at all for regulatory or security reasons.
To circumvent these problems, the concept of Fog Computing offers a more decentralized approach for the IoT. Fog Computing attempts to bring analysis intelligence to the point where it can be executed most efficiently. It is intended to provide a kind of abstract operating system for analysis functions that makes their execution independent of the physical environment. Things on site thus become even smarter and can solve many tasks autonomously on site, while drawing on additional central services of the Platform Tier when required. This enables faster response times and reduces the strain on the networks.
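The idea of keeping analysis local and escalating only what matters can be sketched as a fog node that maintains a rolling window of measurements and forwards only statistical outliers to the platform tier. The window size and z-score threshold here are illustrative assumptions, and the uplink is simulated by a plain list.

```python
from collections import deque
import statistics

class FogNode:
    """Keeps a rolling window of local measurements and forwards only
    anomalies (values far from the rolling mean) upstream, so normal
    readings never load the access network."""

    def __init__(self, window=20, z_threshold=3.0):
        self.window = deque(maxlen=window)
        self.z_threshold = z_threshold
        self.uplink = []  # stand-in for messages sent to the platform tier

    def ingest(self, value):
        # Only judge new values once a minimal history exists.
        if len(self.window) >= 5:
            mean = statistics.mean(self.window)
            stdev = statistics.pstdev(self.window) or 1e-9
            if abs(value - mean) / stdev > self.z_threshold:
                # Escalate the anomaly; normal values stay local.
                self.uplink.append({"value": value, "local_mean": mean})
        self.window.append(value)

node = FogNode()
for v in [20.0, 20.1, 19.9, 20.0, 20.2, 20.1, 35.0, 20.0]:
    node.ingest(v)
# Only the outlier 35.0 is forwarded upstream.
```

In a real deployment the anomaly message would go to a central service for diagnosis, while the bulk of the raw data remains in the Edge Tier.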
This is also supported by new technology trends such as container computing (e.g. using Docker containers), which makes it much easier to distribute applications as so-called microservices to local devices without having to adapt the firmware. This will make it easier in the future to transfer the app concepts familiar from smartphones to controllers and gateways. Leading providers of IoT platforms are already adding concepts to their architectures to support Fog Computing.
The PLC as IoT Controller
An IoT controller is a programmable logic controller (PLC) which is particularly easy to integrate into a cloud-based IoT application. The IoT controller combines the functionalities of an Edge node and an Edge gateway in one device. Thanks to its modularly expandable I/O bus, a wide variety of I/O signals can be captured directly, and additional subnodes such as other controllers or bus-capable field devices can be integrated via additional bus couplers.
As a typical PLC, the IoT controller sends one or more configurable data sets, consisting of I/O variables and/or program variables, cyclically or event-driven to an IoT hub in the cloud. To this end, the IoT controller uses an integrated Ethernet or cellular interface. In return, the controller also receives individual control commands from the cloud via this interface and forwards them to the PLC application running on the controller.
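The combination of cyclic and event-driven publication described above can be sketched as follows. Variable names, deadbands and the cycle time are hypothetical configuration values, and `send` stands in for the uplink to the IoT hub.

```python
class DataSetPublisher:
    """Publishes a configured set of PLC variables either cyclically
    (every `cycle_s` seconds) or event-driven (when a value moves by
    more than its configured deadband since the last transmission)."""

    def __init__(self, variables, cycle_s, send):
        self.variables = variables      # name -> deadband
        self.cycle_s = cycle_s
        self.send = send                # uplink stand-in
        self.last_sent = {}
        self.last_cycle = 0.0

    def poll(self, values, now):
        due = now - self.last_cycle >= self.cycle_s          # cyclic trigger
        changed = any(                                        # event trigger
            abs(values[n] - self.last_sent.get(n, float("inf"))) > db
            for n, db in self.variables.items()
        )
        if due or changed:
            self.send({"ts": now, **{n: values[n] for n in self.variables}})
            self.last_sent = dict(values)
            self.last_cycle = now

sent = []
pub = DataSetPublisher({"temperature": 0.5, "pressure": 0.1},
                       cycle_s=60.0, send=sent.append)
pub.poll({"temperature": 21.0, "pressure": 1.01}, now=60.0)  # cyclic send
pub.poll({"temperature": 21.1, "pressure": 1.02}, now=61.0)  # within deadband
pub.poll({"temperature": 22.0, "pressure": 1.02}, now=62.0)  # event: temp jump
```

The deadband filter keeps chatter off the network, while the cycle guarantees a heartbeat even when nothing changes.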
Besides the usual IEC 61131 applications, such a controller can also be used to implement custom applications in C, C++ or other languages. Communication with the cloud application takes place via IoT protocols. Before the IoT controller can send data to the cloud and receive control commands, it must be registered and activated through device management supported by the cloud platform.
In the event of a temporary failure of the Internet connection, the controller can buffer the data on a local storage medium for a configurable maximum period and deliver it as soon as the connection is available again, so that the loss of data remains negligible.
IoT Platforms as Cloud-Based Building Blocks
IoT platforms are intended to help implement and operate specific IoT applications with reasonable effort. They offer a wide range of integrated services and components that can be used as the basis for implementing concrete industrial Internet solutions. Apart from the actual data processing services, efficient tools for the development and operation of the solutions are also included. Since control over such a platform and its ecosystem is regarded as the key to establishing future business models, many companies are trying to establish their own IoT platform on the market.
In addition to the large cloud providers such as Amazon, Microsoft and Google, many smaller, specialized providers are also trying to secure a piece of the pie by operating IoT platforms either in their own data centers or on the cloud platforms of the large providers. Large automation technology providers such as GE and Siemens also offer their own IoT platforms, as do large ERP providers such as SAP and Oracle. Figure 2 shows typical building blocks of such an IoT platform using the concrete example of Azure, Microsoft's cloud platform [2]. The overview is greatly simplified and shows only a subset of the cloud services available in Azure.
Edge devices are linked to the cloud directly or via Edge gateways. The protocol variants MQTT, AMQP and REST API are supported. To simplify matters, Microsoft provides free Software Development Kits (SDKs) for devices and gateways for major operating system platforms and programming languages.
For thousands or even millions of devices to deliver their data to an IoT application simultaneously, an infrastructure is required that can receive the associated data volume securely and without interruption while enabling parallel processing. In Azure, this role is filled by Event Hubs and the IoT Hubs built on them, which add device management functions. API Apps can be used to implement additional, individual yet highly scalable REST APIs with .NET, Java, PHP, Node.js or Python, which can also be made available to other developers for integration into their own applications.
For the transformation, storage and analysis of data, there is a huge range of services that can be selected and linked as the application requires. For simple prototypes, this can be done without any programming through web-based configuration. For more complex applications, however, the basic modules must be supplemented by additional custom software, and this is where the tight integration of Visual Studio and Azure pays off. With Azure Machine Learning, predictive analytics models can be created and tested in an integrated, web-based environment, and the trained models can be made available as Web APIs.
It is also noteworthy that, in addition to Microsoft's proprietary solutions in Azure, HDInsight provides a complete Hadoop stack as a preconfigured and managed cloud service. HDInsight includes MapReduce, Pig, Hive, HBase, Storm, Mahout and Spark, as well as R Server, an execution environment for R integrated with HDInsight. Besides Java, C# and .NET are supported for the development of MapReduce jobs.
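The MapReduce programming model mentioned above can be illustrated in a few lines. HDInsight jobs would normally be written in Java, C# or .NET; this Python version is only a paradigm sketch showing the map, shuffle/sort and reduce phases of the classic word-count example.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Shuffle/sort by key, then Reduce: sum the counts per word.
    In a real cluster, sorting and grouping happen across nodes."""
    for word, group in groupby(sorted(pairs, key=itemgetter(0)),
                               key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

counts = dict(reduce_phase(map_phase(["big data", "big iron", "data lake"])))
```

The framework's value lies precisely in distributing these two phases across many machines; the local version only demonstrates the contract each phase must fulfill.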
Summary and Outlook
The continuing digitalization and the associated spread of data-driven algorithms will lead to some of the Big Data technologies presented here being used in almost every company. There is great potential for Big Data technologies in industrial applications such as data analytics and machine learning. Today, however, there is still a shortage of trained data analysts who combine command of the necessary data science tools with the application knowledge needed to derive valuable insights from existing data. This creates a market for corresponding consulting and implementation services, but also for highly interactive, visual tools that make data analysis accessible to the average domain expert.
The successful use of Big Data always requires suitable input data. Its suitability, however, depends on the individual application and is therefore difficult to predict. Ideally, a company would proactively start storing all data immediately in order to have it available later for extensive analyses. Of course, this comes at considerable cost, especially if the compute cluster is to remain permanently active so that pending requests can be executed at any time with adequate performance. Industry, but also the technology providers, still have to prove that this effort pays off beyond very specific individual cases.
With Big Data, as with the Internet of Things and Industry 4.0, many critical questions regarding data and plant security have not yet been conclusively answered, but more and more solutions are emerging that directly address data security and data governance. Big Data itself will also be able to contribute its own solutions to mastering this challenge.
The Internet of Things has the potential to radically disrupt the familiar automation pyramid and replace it with a new system landscape in which things are networked in many ways. Big Data supplies numerous solution modules for this purpose. Industrial Internet systems based on cloud-based IoT platforms will also significantly advance the use of Big Data, especially in industry, as they open up existing data sources and make them available for efficient central evaluation.
Control over successful IoT platforms and their associated ecosystems promises great business potential. This is why countless providers of IoT platforms are already trying to establish themselves, although only a few of them will prevail in the long term. The best prospects in this future market belong to the already dominant providers of IT technology and cloud computing. On these leading IoT platforms and their marketplaces, however, industry-specific solution providers with independent service offerings will in turn be able to succeed.
Dipl.-Inform. (FH) Klaus Hübschle studied computer science at the University of Applied Sciences Furtwangen and started working as a software developer at M&M Software GmbH during his studies. In his professional career he has since held leading roles in numerous consulting and software development projects in various areas of automation technology. As managing partner responsible for technology, he is today driving the company's focus on the new challenges of Industry 4.0 and digitalization, concentrating on Cloud Computing, Big Data, the Internet of Things and assistance systems.
1) The Industrial Internet of Things. Volume G1: Reference Architecture. Needham: Industrial Internet Consortium (IIC), 2017. https://www.iiconsortium.org/IIRA.htm (accessed on 09.05.2017).
2) Microsoft Azure IoT Reference Architecture. Redmond: Microsoft Corporation, 2016. http://download.microsoft.com/download/A/4/D/A4DAD253-BC21-41D3-B9D9-87D2AE6F0719/Microsoft_Azure_IoT_Reference_Architecture.pdf (accessed on 09.05.2017).
This article was first published by Industry of Things.
Original by Jürgen Schreier / Translation by Alexander Stark