Seamless Connectivity – Open Automation
Published on: Friday, 12-08-2022
Experts debate the need for opening up proprietary solutions to talk to each other in the industrial environment.
All major manufacturing industries are pushing for seamless, open communication with real-time capability. Will a new generation replace the disparate multitude of proprietary protocols with an open standard for seamless connectivity? One strategy is an interface that exchanges standardised data between all the nodes in a network running different protocols; another is an open standard supported by all vendors. Which strategy will win? These questions are now being answered in many different ways, which are gradually finding common ground. But why would production entities need autonomous communication?
“Seamless connectivity is a topic that arouses interest as it offers a continuum of connectivity that can drive the development of new products and services or improve the efficiency of operating models,” says G Ganapathiraman, Vice President & General Manager, ARC Advisory Group, India. “Increasingly, digital transformation through data- and networking-dependent technologies such as cognitive computing, IoT, blockchain, and advanced analytics is fuelling adoption of connectivity advances. However, when the network or system becomes large and complex, production entities need autonomous communication networks to decentralise control,” he adds.
Dharmender Singhal, Director – Project Management EU/Americas at Haldor Topsoe A/S, attributes this to the Internet, which has changed the way we perform our activities, personal or official. The functioning of IoT devices depends on the Internet, and in the event of a natural calamity the Internet can be affected. “The main purpose of autonomous communication is the development of a communication system capable of operating independently of the Internet. Without autonomous systems, production can face potentially hazardous scenarios. Seamless connectivity ensures that products from one vendor not only work together seamlessly but also work seamlessly with the world’s most popular networks and architectures,” he clarifies.
“While there is Industry 4.0 adoption by the user industry, there is always concern about vendor lock-in. Users prefer open protocols, as long as proper support is available for the products. For the product developers, however, it is a different story. They make investments to build up the market and they want to have their lock-in. However, this same lock-in acts as an entry barrier to new entrants and works against future innovation,” explains Sudhanshu Mittal, Head – CoE Gurugram & Director – Technical Solutions, NASSCOM Centre of Excellence – IoT & AI.
Apoorva Dawalbhakta, Associate Director – Research, Quadrant Knowledge Solutions, believes that to actively implement smart manufacturing, successive stages within the industry must be digitally connected and readily accessible. “Precise machine-to-machine and autonomous communication, along with closed-loop and real-time communication, would aid effective task completion with the help of advanced robotics or robotic process automation,” he suggests.
Is the need for real-time communication also experienced across enterprise boundaries? For example, with vendor subsystems or delivery subsystems? And across multiple plants?
For Nimish Danani, Director Consulting Services, Hitachi Vantara, the answer is yes. “In the past, communication within enterprise boundaries was always given priority. As Industry 4.0 adoption increases, the benefits of going digital go beyond connecting single plants and corporate offices. As visibility improves across plant locations, vendors and delivery organisations are in a better position to take real-time decisions based on market needs. They could re-plan supplier deliveries based on factory status and, where required, increase production of certain goods due to a demand spike,” he elaborates.
“Real-time communication both within and outside the boundaries of an enterprise is very important to address the transactional needs of the different systems and subsystems involved in delivering a business outcome. The same goes for communicating with the vendor ecosystem and across multiple plants,” says Utpal Chakraborty, Chief Digital Officer, Allied Digital Services Ltd.
According to Sunil David, Advisor to IoT and AI Startups, manufacturing applications today are increasingly distributed and integrated, not only inside the production sites but also between the various remote sites of production, management, product development and product distribution. “The need for manufacturers to connect with their suppliers and partners is also extremely important to ensure real-time visibility of the supply chain. Hence, there is a need not only for local communication within the production sites but also for remote communication, carried out through Wide Area Networks leveraging cellular, terrestrial or satellite technology. The communication requirements of these wide-area distributed applications are very similar to those of the higher-level applications within the factory. Exchanges of long streams of data as well as of multimedia video streams are typically part of these applications. However, the wide-area dimension introduces new technical problems and strongly restricts the available solutions,” says Sunil, who has almost three decades of experience in the IT and Telecom industry.
“To leverage the true potential of a complete value chain and to gain new optimisations, real-time communication over enterprise boundaries is what the industry needs. Ultimately, the whole system, from raw materials to the consumption of goods, has to function efficiently and in a coordinated manner. This is only possible with real-time communication over enterprise boundaries,” says Dharmender Singhal.
Historically, major vendors have developed their own communication interfaces and protocols, each embraced by that vendor's partners. Equally, major buyers have long called for open protocols. What actually defines an open protocol? When can a system be said to support an open protocol?
“An open protocol could be defined as one that is not owned by any particular company and not limited to a particular company's products. The protocols in the Internet stack are open so that any computing device can follow the protocol to join the global network,” explains G Ganapathiraman. “Companies that provide open protocol systems disclose the relevant technical data required for manufacturers to produce compatible devices. These companies can work with any number of different manufacturers, providing customers with a range of different suppliers to choose from. Hence, open protocols offer the customer a greater degree of choice and flexibility and no vendor lock-in,” he adds.
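The point about published technical data can be made concrete. Because a protocol such as Modbus TCP is fully specified in public documents, any vendor can implement a compatible device or client from the specification alone. As a minimal sketch, the following builds a standard Modbus TCP "Read Holding Registers" request frame (the transaction, unit and register values are arbitrary example inputs):

```python
import struct

def modbus_read_holding_registers(transaction_id, unit_id, start_addr, count):
    """Build a Modbus TCP 'Read Holding Registers' (function 0x03) request.

    The frame layout is fully published, so any vendor's device or
    client library can produce and parse it -- the essence of an
    open protocol.
    """
    function_code = 0x03
    # PDU: function code, starting register address, register count
    pdu = struct.pack(">BHH", function_code, start_addr, count)
    # MBAP header: transaction id, protocol id (0 = Modbus), length, unit id
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

frame = modbus_read_holding_registers(1, 17, 0x006B, 3)
print(frame.hex())  # → 0001000000061103006b0003
```

Any conforming server, regardless of manufacturer, will interpret these bytes identically, which is what makes a multi-vendor device market possible.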
In Dharmender Singhal’s opinion, if a system supports seamless connectivity with the world's most popular networks and is designed for autonomous communication then it can be said to be an open protocol.
“In my view an open protocol is defined as when anybody can develop the products using that protocol and after going through a standard certification process, those can be seamlessly used by user enterprises. Ideally there should be no patents in the protocol, however if there is need for incorporation of a feature which is necessary and covered under a patent, the decision to include that feature would be taken by a team which does not benefit from that patent,” explains Sudhanshu Mittal. Such an open protocol, according to him, would be published by an association which would consist of various product development enterprises as well as user enterprises as its members.
“Some of the major benefits of the open protocol ecosystem are that it provides great flexibility in establishing communication among devices and systems from various manufacturers, each with their proprietary protocols, within an ecosystem where all such devices and systems are expected to communicate seamlessly. It also saves huge costs, eliminates vendor lock-in and brings agility to the entire value chain,” says Utpal Chakraborty. “However, one problem could be security if it is not considered seriously; but having said that, today we have quite comprehensive security arrangements to deal with such scenarios,” he cautions.
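One way to picture how an agreed open convention lets heterogeneous devices interoperate is a publish/subscribe pattern over shared topics and a common payload schema. The sketch below is purely illustrative (an in-process stand-in; a real plant would use an open protocol such as MQTT or OPC UA, and the topic and field names here are hypothetical):

```python
import json
from collections import defaultdict

class Broker:
    """Minimal in-process publish/subscribe broker, for illustration only."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Payloads follow one agreed JSON schema, so devices from
        # different vendors can interoperate without custom adapters.
        message = json.dumps(payload)
        for callback in self.subscribers[topic]:
            callback(json.loads(message))

broker = Broker()
readings = []
broker.subscribe("plant1/line2/temperature", readings.append)

# Two hypothetical devices from different vendors publish the same schema
broker.publish("plant1/line2/temperature", {"vendor": "A", "celsius": 71.5})
broker.publish("plant1/line2/temperature", {"vendor": "B", "celsius": 72.1})
print(len(readings))  # → 2
```

The consumer never needs vendor-specific code: it is the shared topic structure and payload schema, not the device's origin, that determine interoperability.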
In the automation industry, which is a competent body to provide accreditation and registration for protocols? Which is the entity who would define an open protocol? What commercial incentive would such an entity hope for? How would upgrades and revisions be handled?
“Associations and organisations like the IEEE and ISA, among others, are obviously some of the key bodies providing accreditation and registration for the protocols. Collectives like the OPC Foundation and others are actually responsible for defining and upholding the standards and quality of these open protocols,” says Apoorva Dawalbhakta. However, he also points out that any and all such accreditation bodies are merely the symbolic torch bearers and custodians responsible for ensuring and upgrading the various standards as required. “The real legwork needs to be done by the participating industries and organisations, who contribute freely towards the technology and other research for enhancing the quality and standards of these open protocols,” he opines.
Nimish Danani draws attention to the International Society of Automation (ISA) and the International Electrotechnical Commission (IEC), industry bodies that have a set of recommended standards for control systems, and many companies do work towards complying with them. According to him, something like ‘Interoperable AI’ now seems to be picking up. The Government of India has taken up this initiative, which encourages interoperable AI standards (Interoperability – indiaai.gov.in). “Standardisation should help accelerate the adoption of automation and help the industry. Ideally this should be a ‘not for profit’ kind of organisation so that it remains neutral across product vendors and works for the larger benefit. As the industry evolves, this agency can keep updating the standards and allow a fair marketplace,” he suggests.
To this, Sunil David adds ODVA (Open DeviceNet Vendors Association), a standards development organisation and membership association whose members comprise the world's leading companies in the industrial automation space. ODVA's objective is to advance open, seamless and interoperable information and communication technologies in industrial automation. “ODVA recognises its media-independent network protocol, the Common Industrial Protocol or ‘CIP’ — and the different network adaptations of CIP, viz., EtherNet/IP, CompoNet, ControlNet and DeviceNet — as its core technology and the primary common interest of its membership. For future interoperability and integration of different production systems with other systems, ODVA embraces the use of commercial-off-the-shelf (COTS) and standard, unmodified Internet and Ethernet technologies as a guiding principle wherever possible. This principle is clearly exemplified by EtherNet/IP — the world's number one industrial Ethernet network,” he elaborates.
How do upcoming technologies, such as cloud systems and big data systems, propose to deal with this topic?
“Multiple research studies and analyses have pointed to the fact that investments in big data by major industrial automation vendors are increasing exponentially with every passing year,” says Apoorva Dawalbhakta. The fundamentals of big data and the resultant visualisation techniques are enabling ‘visual discovery’ by presenting current industrial automation information graphically. “As we are aware, optimisation of business processes in real time is made possible by the IoT value chain – from the enterprise applications to the devices – which can be collectively described as the union of the Internet of Things and big data. This is achieved by collecting data from cloud systems and various points such as machine-generated data, enterprise applications and human-generated data, and processing it using big-data analytics. Industry performance goals can be achieved efficiently by using big-data analytics, which promote the creation of appropriate decision-making models,” he explains.
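The idea of fusing machine-generated and enterprise data into a decision-making model can be sketched on a toy scale. The machine names, throughput figures and backlog numbers below are entirely hypothetical; a real big-data pipeline would compute the same kind of signal over millions of records:

```python
# Hypothetical sketch: fuse machine-generated data (sensor throughput)
# with enterprise data (order backlog) into a simple decision signal.
machine_data = [  # measured throughput in units per hour
    {"machine": "M1", "throughput": 120},
    {"machine": "M2", "throughput": 80},
]
enterprise_data = {  # open orders from the ERP system
    "M1": {"order_backlog": 1200},
    "M2": {"order_backlog": 90},
}

def hours_to_clear(machine):
    """Hours needed to work off the current backlog at measured throughput."""
    backlog = enterprise_data[machine["machine"]]["order_backlog"]
    return backlog / machine["throughput"]

# Decision model: flag machines whose backlog cannot be cleared in one
# 8-hour shift, so planners can reroute work or add capacity.
at_risk = [m["machine"] for m in machine_data if hours_to_clear(m) > 8]
print(at_risk)  # → ['M1']
```

The point is the combination: neither the sensor stream nor the ERP backlog alone yields the decision; joining the two data sources does.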
Utpal Chakraborty believes cognitive technologies like artificial intelligence can play a major role here. In fact, it is already happening in many solutions, wherein intelligent integration mechanisms powered by machine learning have been implemented. “In a hybrid-cloud and multi-cloud kind of environment, efficient integration architecture is of paramount importance. Of course, big data, data lakes and data fabrics can contribute hugely to building an ecosystem with a seamless, decentralised, open-protocol communication landscape,” he points out.
“The latest additions to Industry 4.0 are the advancements in cloud computing. We are increasingly seeing industries phase out on-premises systems and migrate toward industry cloud service providers such as Amazon Web Services and Microsoft Azure. Companies are opting to have cloud service providers supply the required infrastructure, platforms and software, run on large server farms, to operate the required applications,” says Sunil David.
Nimish Danani concurs with the above, and adds that with technologies like cloud or big data, whenever an enterprise is choosing a vendor and a system integrator, it definitely checks interoperability. Many niche companies have their own protocols to create data lakes, build AI, or offer data management solutions, which could be industry-specific or function-specific. “Enterprises will need to ensure that they are able to get data from these systems in a format that can be integrated with other systems and brought onto a single platform to make sense of the data. While there may be similarities in the features offered by various brands, each brings its own value proposition to the table. This makes it necessary for the user enterprise to build the necessary skills and also ensure compatibility,” he suggests.
Does the use of open communication compromise cybersecurity?
“With open communication protocols the fear of compromised cybersecurity always lurks in the background,” says G Ganapathiraman, pointing out how according to the ARC Advisory Group, sophisticated attacks on manufacturers and critical infrastructure operators have changed security requirements. “Yesterday, most industrial facilities could get by with basic OT cybersecurity programs designed to protect operations from general hackers and malware floating around the internet. Today, every facility needs an OT cybersecurity program that can deal with ransomware and targeted attacks by sophisticated adversaries,” he adds.
Sudhanshu Mittal too is of the view that the concept of security by obscurity is completely outdated in today's world. “An open protocol doesn't compromise security; on the contrary, the openness of the protocol and its constant review by a large community give it the robustness to withstand cyber attacks,” he opines.
This is a widely shared view, as Apoorva Dawalbhakta elaborates: contrary to popular belief, open protocols and the resultant open communication are considered far safer, thanks to top-notch technological features made possible by the contributions of a large number of people and organisations. “The more people and organisations that participate in providing their inputs regarding the data and processes they undertake, the stronger the database of threats and the subsequent protection plans that can be created. The stronger the standards and protocols that are put in place through such practices, the harder it is for hackers or anyone with malicious intent to execute successful cyber attacks,” he explains.
Utpal Chakraborty also believes there are security concerns around Open communication but that doesn’t mean that it can’t be secured. “There are various ways with which security challenges can be minimised or nullified in such an ecosystem. Nevertheless, security concerns if any should be dealt with utmost care while designing such architecture,” he asserts.
Sunil David is in agreement with the views of the other panellists. “Yes, with open communication there could be potential cyber threats that an organisation is exposed to, but it calls for a defense-in-depth strategy that addresses security at multiple layers – device, network, data and application,” he says. A ‘defense in depth’ approach gives manufacturing plants both all-round and in-depth protection, as recommended by the international standard IEC 62443. The standard is aimed at factory operators, integrators and component manufacturers alike, and covers all security-related aspects of industrial automation and control systems.
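A small sketch can illustrate one control at the data layer of such a defense-in-depth stack: authenticating each device payload with a keyed hash so that tampering in transit is detectable. The key, device name and payload fields below are hypothetical; in practice keys would be provisioned per device and rotated, with further controls at the network and application layers:

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret; real deployments provision and rotate keys.
SHARED_KEY = b"per-device-provisioned-secret"

def sign(payload: dict) -> str:
    """Compute an HMAC-SHA256 signature over a canonicalised JSON payload."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()

def verify(payload: dict, signature: str) -> bool:
    """Constant-time check that the payload matches its signature."""
    return hmac.compare_digest(sign(payload), signature)

msg = {"device": "pump-07", "pressure_bar": 4.2}
sig = sign(msg)
print(verify(msg, sig))    # → True
msg["pressure_bar"] = 9.9  # simulated tampering in transit
print(verify(msg, sig))    # → False
```

This is only one layer; defense in depth means the same message would also travel over an authenticated, segmented network and be validated again by the receiving application.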
Nimish Danani is of the opinion that open communication can be more vulnerable to cybersecurity threats if not implemented well. Like any technology tool, it can bring tremendous value to an enterprise but, if not secured, can be a big hazard. When communication protocols were fixed and proprietary, the threat of hacking was lower, as most devices were hardened around a single proprietary protocol. “As we allow interoperable devices, and with open communication becoming the norm, there will be a need to secure enterprises against possible cyber threats,” he concludes.
(Note: The responses of the various experts featured in this story are their personal views and not necessarily those of the companies or organisations they represent. The full interviews are hosted online at https://www.iedcommunications.com/interviews)