CPS, Data Fusion and AI – Advancements in Industry 4.0
Published by : Industrial Automation
Multi-modal sensor data fusion and integration is vitally important, especially in the context of cyber-physical systems, says Indrajit Kar.
The Fourth Industrial Revolution, which encompasses cyber-physical systems (CPS), multi-modal sensor fusion, artificial intelligence and cloud computing, plays a crucial role in the advancement of smart industry. Multi-modal sensor data fusion already plays a crucial role in environmental remote sensing: the monitoring of Earth’s environment has entered an entirely new era thanks to the unparalleled advantages of remote sensing. Sensors ranging from space-borne satellites to air-borne platforms to various ground-based instruments have given rise to new types of data, which we term Big Data. Industrial big data coming from a wide range of industrial devices is likewise managed and processed by big data platforms and cloud storage systems.
On the other hand, computerised mechanisation combined with highly active sensory devices forms the basic building block of the Industrial IoT (IIoT). IIoT together with big data represents the paradigm shift we call Industry 4.0.
So what is a cyber-physical system?
We are not new to digital twins (DT); like a DT, a cyber-physical system embeds computing, communication and control capabilities into physical devices (e.g., manufacturing equipment) to monitor, control and coordinate physical activities. It couples cyber capabilities tightly with the physics through technologies such as IoT sensors and actuators, embedded systems, computing, communication and networking. Together, these devices accelerate the interaction and coordination processes of cyber-physical systems.
To take a step back, cyber-physical systems and digital twins share the same goal, i.e., achieving seamless integration between physical and cyber space. These two key players of Industry 4.0, along with artificial intelligence, are what make smart factories and form the building blocks of the manufacturing shop floor. Mathematically, the physical systems are governed by differential equations, and they are endowed with IoT sensors, actuators and embedded systems, which help capture various types of data. These data are converted from analogue to digital form: an ADC (analogue-to-digital converter) converts an analogue signal such as voltage, temperature or humidity into digital form so that it can be read and processed by a microcontroller. The digitised data are then stored either in cloud storage or in an in-house storage system.
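The ADC step above can be sketched in a few lines. This is a minimal illustration, not the firmware of any particular device: the 10-bit resolution, 3.3 V reference and the linear 10 mV/°C temperature sensor are all assumptions made for the example.

```python
# Minimal sketch: scaling a raw ADC count into an engineering value.
# Resolution, reference voltage and sensor characteristics are illustrative.

def adc_to_voltage(raw_count: int, resolution_bits: int = 10, v_ref: float = 3.3) -> float:
    """Scale a raw ADC count to a voltage reading."""
    max_count = (1 << resolution_bits) - 1        # 1023 for a 10-bit ADC
    return (raw_count / max_count) * v_ref

def voltage_to_celsius(voltage: float) -> float:
    """Hypothetical linear temperature sensor: 10 mV per degree Celsius."""
    return voltage / 0.010

raw = 512                                         # roughly mid-scale reading
volts = adc_to_voltage(raw)
print(f"{volts:.3f} V -> {voltage_to_celsius(volts):.1f} C")
```

In a real microcontroller the raw count would come from a hardware register; everything downstream of that read, however, looks much like this scaling step.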
The cyber system is built with algorithms, models and rules that describe the states and behaviours of its physical counterpart. The interaction can be bidirectional or unidirectional, depending on the needs of the business, and the CPS itself can be as simple as a single device on the shop floor. A manufacturing shop floor does not have just one type of sensor; it has various kinds of smart sensors, from image-based to numeric to distance-based (e.g., data from LiDAR, thermal cameras, temperature sensors, etc). Sensors with similar characteristics provide a harmonious opportunity to improve spatial and temporal coverage. With the advent of software libraries, the fusion of multi-modal sensor data from various sources has gained momentum, which in turn gives rise to rich hidden features. These features are then used to train AI and machine learning algorithms. This kind of fusion can also help mitigate data gaps caused by sensor failure, resulting in a cyber part that better mimics its physical counterpart.
From systems to data management
The data management layer deals with extracting, loading and transforming the data into SQL or NoSQL databases. These databases are either linked or not. Linked Data (LD) is a well-established standard for publishing and managing structured information, gathering and bridging together knowledge from different sensors and physical devices. Linked data is the first step towards analytics and helps the AI and advanced analytics teams in countless ways. The cyber systems operate synchronously with the physical system, record real-time data, and generate a large amount of simulated data on geometric and physical properties, behaviours and rules, which, with the advent of big data technologies such as Hadoop and Spark, has become easy to manage.
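The load-and-transform step can be sketched with an in-memory SQL database. SQLite stands in here purely for illustration; the table and column names are assumptions for the example, not a prescribed schema.

```python
# Sketch of landing raw sensor readings in a SQL database and running a
# simple transform (average temperature per device) over them.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sensor_readings (
        device_id TEXT,
        metric    TEXT,
        value     REAL,
        ts        TEXT
    )
""")
rows = [
    ("press-01", "temperature", 71.9, "2024-05-01T10:00:00Z"),
    ("press-01", "vibration",   0.42, "2024-05-01T10:00:00Z"),
    ("press-02", "temperature", 69.3, "2024-05-01T10:00:00Z"),
]
conn.executemany("INSERT INTO sensor_readings VALUES (?, ?, ?, ?)", rows)

# Transform: average temperature per device, ready for the analytics team.
for device, avg in conn.execute(
        "SELECT device_id, AVG(value) FROM sensor_readings "
        "WHERE metric = 'temperature' GROUP BY device_id"):
    print(device, avg)
```

In production the same pattern scales out to a NoSQL store or a Hadoop/Spark cluster; the extract-load-transform shape stays the same.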
From data to visualisation
The data is then used for visualisation, analytics and a range of risk mitigations. Post-visualisation, the same data can be used for decision making as well as for automating industrial processes across the organisation. This automation and intelligent decision making is enabled by today’s expert machine learning techniques and artificial intelligence. So you see, cyber-physical systems and multi-modal data not only provide technical assistance to humans in tackling adverse conditions and risk factors, they also enable advanced analytics and automation.
The cyber-physical system: a complex design
All this might sound very straightforward; however, there are many aspects to building an effective cyber-physical system that brings IoT, big data, cloud computing and AI together and caters to design, production, maintenance and end of life. CPS can be classified mainly into three levels: the single machine, the production line and the shop floor. Depending on the level of the CPS, many devices play a crucial role, viz., remote terminal units (RTUs), embedded controllers, programmable logic controllers (PLCs), distributed control systems (DCS), Supervisory Control and Data Acquisition (SCADA) and the IIoT hub, which is both the sender of data and the manager of devices. The hub checks security and dispatches data to storage, analytics, or a queue. Normally, these types of devices need a multi-protocol gateway supporting protocols such as AMQP, HTTPS or MQTTS, and a message broker. By now you will have understood the necessity of a powerful network, comprising the information network, control network and field network, around all the components, so that information can be exchanged between physical, cyber-physical and cyber space. Having mentioned networks, we certainly cannot forget cyber security, which is a key aspect of IIoT.
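What the hub does with a message can be sketched without a real broker: validate the device, then dispatch the payload by topic to storage, analytics, or a queue. The topic names, message shape and shared-token check below are assumptions made for the example; a production hub would use a proper broker (e.g., over MQTTS or AMQP) and real device credentials.

```python
# Toy sketch of IIoT hub behaviour: security check, then topic-based dispatch.
import json

ROUTES = {"telemetry/storage": [], "telemetry/analytics": [], "telemetry/queue": []}
SHARED_TOKEN = "example-token"       # stand-in for real device authentication

def hub_dispatch(raw_message: str) -> bool:
    """Accept a JSON message, reject bad tokens, route the payload by topic."""
    msg = json.loads(raw_message)
    if msg.get("token") != SHARED_TOKEN:   # the hub's security check
        return False
    topic = msg.get("topic")
    if topic in ROUTES:
        ROUTES[topic].append(msg["payload"])
        return True
    return False

ok = hub_dispatch(json.dumps({
    "token": "example-token",
    "topic": "telemetry/analytics",
    "payload": {"device": "plc-7", "temp_c": 68.5},
}))
print(ok, ROUTES["telemetry/analytics"])
```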
Shifting gears to the data and analytics
Now let’s focus a little on multimodal sensor data fusion and feature extraction for machine learning, which I have already briefly touched upon.
Multimodal analytics aims at processing multimodal data from various input devices, in the form of audio, visual and text, and then extracting affective knowledge (features) for decision making and automation. Like humans, machine learning algorithms make better predictions with combined data, and such fused data have been instrumental in mitigating the data challenges explained above.
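At its simplest, feature-level fusion just concatenates the per-modality feature vectors into one long vector before training a single model. The feature values below are made-up numbers for illustration.

```python
# Minimal sketch of feature-level fusion: features extracted independently
# from audio, visual and text modalities are concatenated into one vector.

def fuse_features(*modality_features: list[float]) -> list[float]:
    """Concatenate per-modality feature vectors into one fused vector."""
    fused: list[float] = []
    for features in modality_features:
        fused.extend(features)
    return fused

audio_feats = [0.12, 0.80]            # e.g. spectral features
visual_feats = [0.55, 0.31, 0.07]     # e.g. a slice of an image embedding
text_feats = [1.0, 0.0]               # e.g. keyword indicators

fused = fuse_features(audio_feats, visual_feats, text_feats)
print(len(fused), fused)
```

The fused vector then feeds a single downstream model, which can exploit cross-modal patterns that no single modality exposes on its own.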
Challenges with feature fusion and extraction of knowledge
There are two types of fusion – feature-level and decision-level fusion. Extracting effective knowledge from a level-three or level-two CPS is very difficult, because the data volume increases by many orders of magnitude, making it even harder to convert images, video and text into actionable information and knowledge through conventional manual interpretation for further decision making. Some features from multi-modal data are highly overlapping, while others are complementary to each other. In machine learning and AI, redundant features have to be discarded, and attention must be given to feature extraction, which has become the most critical step prior to decision making. The final performance of a machine learning model depends on the quality of the extracted and engineered features. Filter and wrapper approaches have become go-to techniques in multi-modal data fusion and feature extraction.
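A filter approach scores each feature independently of any model and keeps the top-scoring ones; a wrapper approach would instead evaluate feature subsets by retraining a model. The sketch below uses variance as the filter score, with toy data chosen for illustration.

```python
# Sketch of the filter approach to feature selection: score each feature by
# its variance (model-free) and keep the k highest-scoring columns.
from statistics import pvariance

def filter_select(samples: list[list[float]], k: int) -> list[int]:
    """Return the indices of the k highest-variance feature columns."""
    n_features = len(samples[0])
    scores = [pvariance(row[j] for row in samples) for j in range(n_features)]
    return sorted(range(n_features), key=lambda j: scores[j], reverse=True)[:k]

# Four samples with three features; feature 1 is nearly constant, so a
# variance filter ranks it last and it gets discarded.
X = [[0.1, 5.0, 9.0],
     [0.9, 5.1, 1.0],
     [0.2, 5.0, 8.0],
     [0.8, 5.1, 2.0]]
print(filter_select(X, k=2))
```

In practice the filter score might be mutual information or a correlation with the target rather than raw variance; the model-free, score-and-rank shape is what defines the approach.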
The other challenge is noisy or incorrect data: data from one or more modalities, for example from a faulty sensor that has not been calibrated properly, can greatly degrade the performance of a machine learning algorithm. In feature-level fusion, after extracting features from all modalities, we merge them to form one long feature vector. So what is decision-level fusion? A very good example is ensemble models, where each modality is modelled independently and the results are fused as a decision vector to obtain the final decision, either by a voting mechanism or by averaging. It is like many humans coming together to take a single decision.
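The voting mechanism can be sketched directly. The per-modality decisions below are hard-coded stand-ins for real classifiers, and the model and class names are assumptions for the example.

```python
# Decision-level fusion by majority vote: each modality's model produces its
# own class decision, and the most common vote wins.
from collections import Counter

def vote(decisions: list[str]) -> str:
    """Majority vote over per-modality decisions (ties go to the first seen)."""
    return Counter(decisions).most_common(1)[0][0]

per_modality = {
    "image_model": "defect",
    "vibration_model": "defect",
    "audio_model": "normal",
}
print(vote(list(per_modality.values())))   # two of three modalities agree
```

This is exactly the "many humans, one decision" picture: a noisy or miscalibrated modality is simply outvoted by the others.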
Multi-modal sensor data fusion and integration is very important, especially for cyber-physical systems. Although the fundamental model behind data fusion and integration has been well established through years of investigation, it remains technically very challenging, and in a distributed cloud environment such multi-modal fusion becomes more challenging still. With proper fusion of multi-modal sensor data from cyber-physical systems, however, we gain not only better decision making but also better product lifecycle management.
Indrajit Kar heads the Artificial Intelligence and Advanced Analytics Solution engineering portfolio. He has six patents in the areas of Deep Learning, Adversarial Machine Learning, Computer Vision and Natural Language Processing. He has been honoured with the 40 Under 40 Data Scientist Award and has delivered tech talks at international conferences. He has enabled clients from various industries in their data-driven decision-making journeys for their businesses, customers, products and platforms, and has played a crucial role in defining AI & ML roadmaps for top Fortune 500 companies. As his clients’ innovation partner, Kar has been instrumental in leading advanced analytics and AI initiatives from solution design through implementation.