Human Robot Collaboration: The Next Leap of Robotics
Published on: Sunday 03-01-2021
HRC is not just about the humans and robots working together safely but also about different ways of humans talking to and programming robots, says Pawankumar Gurav.
Humans and robots are working more closely together as technology improves. This is increasing the productivity of companies and the quality of products, leading to efficiency and growth. In many cases robots increase output so much that more jobs are created in complementary roles.
Researchers and companies are improving the safety of robot systems so humans can work close beside robots, which become co-workers rather than mere tools.
The next step for robot makers, software companies and engineers is to refine human-robot job augmentation further to achieve a greater shift in productivity, freeing people to do higher value and less dangerous work.
For decades, robots have served one of two broad applications:
1. Large industrial robots, programmed offline, work in defined, linear pathways to typically move, assemble or weld a manufactured component on a production line, and
2. Domestic and service robots. Lighter and softer applications, these help in more nuanced settings, from inspection of hostile environments, to helping humans in the home; from vacuum cleaning to healthcare and even social care applications.
The development of smaller “desktop” robots, known as collaborative robots or “cobots”, was driven by a new application of robots that is rapidly taking hold globally: Human Robot Collaboration (or HRC).
There are new business and domestic scenarios where robots are designed to work alongside humans – rather than as a distinctly separate and binary tool behind a cage. It’s like an extension of human activity, a human working but with added capabilities.
Until recently, robots have been constrained in their utility. They perform a limited range of tasks in a linear and literal way. They are programmed offline and cannot respond to new stimuli mid-task. Industrial robots are heavy, fast and potentially dangerous to humans working in proximity.
But many industrial applications demand a robot solution to increase throughput where the human worker is doing tasks that can be replaced, for example where:
a. The task is simple and repeated identically multiple times but needs to be accurate.
b. The task requires more than one worker but not as many as two (e.g., 1.2 people).
c. The task could cause stress or even injury to the operator.
The challenge is to modify a robot’s parameters to make it sensitive to human presence, to slow its speed and power in proximity to humans and to develop technology that allows the collaborative robot or cobot, to assist the human like an intuitive co-worker, not a heavy and dangerous threat.
Cobots have very sophisticated force-torque sensors, so they sense when people are near them – a person can now interact in the same space, knowing that the robot will stop if they intrude within set parameters and then carry on with what it was doing.
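This stop-and-resume behaviour can be sketched in a few lines. The sketch below is purely illustrative: the force threshold and the monitor class are assumptions, not any vendor's real API.

```python
# Illustrative sketch of a cobot's force-based safety stop. The threshold
# value and the class are hypothetical, not a real vendor SDK.

FORCE_LIMIT_N = 25.0  # example contact-force threshold, in newtons

class CobotSafetyMonitor:
    def __init__(self, force_limit=FORCE_LIMIT_N):
        self.force_limit = force_limit
        self.stopped = False

    def update(self, external_force_n):
        """Halt on unexpected contact; resume once the force clears."""
        if external_force_n > self.force_limit:
            self.stopped = True   # intrusion detected: stop motion
        elif self.stopped and external_force_n < 0.1 * self.force_limit:
            self.stopped = False  # contact cleared: carry on with the task
        return self.stopped
```

In a real controller this check would run at the servo rate, and the resume condition would also verify that the workspace is clear.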
The business case for HRC is greater productivity, enabling more output and therefore growth, plus repeatability, redeployment of labour into better jobs, higher profits, and reduced exposure to repetitive or physically hard work with its sometimes low but ever-present safety risk.
The main advantages of using cobots over human-only operations:
i. Increased productivity – frees up humans to work on other operations
ii. Reduced repetitive strain injuries and lifting
iii. Productivity gains without needing extra work space, and
iv. Humans can focus on higher value tasks.
It would be pertinent to examine the features of HRC in this context.
Sensitivity sets cobots apart
Together, human and robot can achieve something greater than either could alone.
The challenge is to change ‘binary’ or very linear programming to a more intuitive level where robot behaviour adapts to humans during a process, i.e., human robot augmentation (HRA).
For example, KUKA iiwa (Cobot) has been engineered with greater sensitivity, to work with and augment human operations. The lightweight iiwa’s high-performance servo control is able to detect contours quickly under force control. It establishes the correct installation position and mounts components quickly with high precision with an axis-specific torque accuracy of ±2% of the maximum torque.
As cobots augment industrial robots, the number of industrial applications for them is growing. At the moment the field is dominated by the automotive industry, but retail, food, inspection and laboratory applications are rising.
Large industrial robots are not designed to work next to humans; most are caged off to protect workers, and operate under a hierarchy of regulations covering safety.
How do you get large industrial robots that normally operate at high speed to work closely with humans? This involves changing the robot programming software to make the robot more intelligent and more aware of humans, and meshing these capabilities together: when the robot needs to move fast and handle high payloads it can, but when it needs to enter a human co-worker paradigm it can do so without dropping into a default “safe” mode that forces a reset.
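One hedged way to picture this meshing of fast and collaborative operation is speed-and-separation monitoring, where commanded speed scales with the distance to the nearest human. The numbers and function below are illustrative assumptions, not values from any standard or controller.

```python
# Illustrative speed-and-separation sketch: scale robot speed with the
# distance to the nearest human instead of faulting into a safe mode.
# All values are example assumptions.

COLLAB_DISTANCE_M = 1.0  # inside this range, reduce speed
FULL_SPEED_MS = 2.0      # full industrial speed (m/s)
COLLAB_SPEED_MS = 0.25   # reduced collaborative speed floor (m/s)

def select_speed(human_distance_m):
    """Degrade speed gracefully near humans; no hard stop, no reset."""
    if human_distance_m >= COLLAB_DISTANCE_M:
        return FULL_SPEED_MS
    # Scale linearly with distance down to the collaborative floor, so
    # motion continues at reduced speed rather than halting outright.
    scaled = FULL_SPEED_MS * human_distance_m / COLLAB_DISTANCE_M
    return max(COLLAB_SPEED_MS, scaled)
```

The design point is the smooth degradation: the robot never enters a fault state that needs manual recovery, it simply slows and then speeds back up as the human leaves.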
Real-time monitoring of 3D printing
HRC shows real-time adaptation to the process. Another application that displays this adaptation is 3D printing: as the robot deposits material, it monitors the process and automatically updates and corrects itself depending on the measurements and the inspection.
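This deposit-measure-correct loop can be sketched as a simple proportional controller. The target height, gain and function below are toy assumptions for illustration, not any printing vendor's algorithm.

```python
# Toy closed-loop deposition sketch: the measured bead height feeds back
# into the material flow rate. All values are illustrative assumptions.

TARGET_HEIGHT_MM = 0.4
GAIN = 0.5  # proportional correction gain

def adjust_flow(flow_rate, measured_height_mm):
    """Under-deposited layers get more flow, over-deposited get less."""
    error = TARGET_HEIGHT_MM - measured_height_mm
    return flow_rate * (1.0 + GAIN * error / TARGET_HEIGHT_MM)
```

A layer measured at the target height leaves the flow unchanged; a thin layer raises it and a thick layer lowers it, which is the self-updating behaviour described above.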
Machine learning and artificial intelligence
Tasks that are simple to a human can be complex for a robot. For example, working out what an object looks like among many identical or similar objects, and how to grab that object is very complicated for a robot.
The challenge becomes more complex still when multiple objects sit at odd angles, when other objects obscure the robot’s view, and when it must approach from a novel angle. Machine learning programs can help the robot see the wood for the trees. With today’s computing power, you can generate many computations of how an object might look in different scenarios, and the robot can ‘learn’ them almost instantly; once it has learned, it can pass that knowledge on immediately to any other device in the network. A human might take many years to acquire a skill and pass it on, but robots can do this far quicker – within certain limitations.
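The “many computations of how the object might look” idea can be illustrated with a toy example: pre-compute an object’s outline at many rotations, then match a new observation against the stored views. Real systems use learned vision models; everything here (the square outline, the 5-degree step, the distance metric) is an assumption for illustration only.

```python
import math

SQUARE = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # toy object outline

def rotate(points, angle_rad):
    """Rotate an ordered 2-D outline about the origin."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def view_distance(a, b):
    """Sum of point-to-point distances between two ordered outlines."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

# "Learn" the object by storing candidate views every 5 degrees.
views = {deg: rotate(SQUARE, math.radians(deg)) for deg in range(0, 360, 5)}

def estimate_rotation(observed):
    """Return the stored rotation that best matches the observation."""
    return min(views, key=lambda deg: view_distance(views[deg], observed))
```

Once such a lookup table exists it can be copied to every robot on the network, which is the “pass it on immediately” point made above.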
Software development is key to collaborative robot applications. Programming is moving from ‘command code’ to simulation and in the future, artificial intelligence that will enable task learning.
There are three levels of software for robot programming.
Industrial grade robot programming software – this is dedicated to offline programming of robots and often uses a digital representation of a robot cell. The operator programs what they want the robot to do, then simulates it to check that there are no collisions, singularities or axis-limit violations and that the cell is safe. Once satisfied, they generate the code that drives the robot.
At the mid-level – software is more experimental and is often used by research institutions.
This software experiments with digital fabrication and computational building information modelling: a product is taken from the design environment straight to manufacture using a robot. The test cell is connected to sensors, monitors and peripheral equipment, and often lacks the robustness of an industrial-grade application, because researchers want to change parameters quickly.
Future stage – software that uses machine learning algorithms so robots can change their operation to adapt to a changing task, without resetting. The learning could involve risk assessment: in this paradigm, if there is a robot-human collision, the robot calculates the force that would be applied to the human and ensures it is within ISO/TS 15066, the technical specification that defines how hard a body region may be struck.
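A hedged sketch of such a pre-collision check follows. The force limits and the simplified contact-force model (F = v·sqrt(m·k), a standard spring-mass estimate) are illustrative placeholders: the actual body-region limits must be taken from ISO/TS 15066 itself.

```python
import math

# Example quasi-static force limits in newtons per body region.
# These numbers are illustrative placeholders, NOT values from ISO/TS 15066.
FORCE_LIMITS_N = {
    "hand": 140.0,
    "forearm": 160.0,
    "chest": 140.0,
}

def collision_force_n(effective_mass_kg, relative_speed_ms, stiffness_n_per_m):
    """Simplified spring-mass contact-force estimate: F = v * sqrt(m * k)."""
    return relative_speed_ms * math.sqrt(effective_mass_kg * stiffness_n_per_m)

def within_limit(body_region, effective_mass_kg, relative_speed_ms,
                 stiffness_n_per_m):
    """Check a predicted collision force against the region's limit."""
    force = collision_force_n(effective_mass_kg, relative_speed_ms,
                              stiffness_n_per_m)
    return force <= FORCE_LIMITS_N[body_region]
```

In the paradigm described above, the controller would run such a check continuously and reduce speed whenever a predicted collision would exceed the permissible force for the exposed body region.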
HRC and the future of manufacturing
Further adaptive programming – OEMs are investigating platform-based utilities where different functionalities are plugged in and connected with sensors and peripheral equipment, so that programming and operating a robot becomes a completely different experience from the normal offline “program, press and go” workflow.
Here the code operating the robot is updated in real time – monitoring the environment and what the robot is doing, and responding to those actions.
Augmented reality and novel programming – The use of augmented reality (AR) headsets provides a completely different way of communicating with robots, which is leading to new ways of programming robots. AR will accelerate the speed of robot movement adaptation.
HRC is not just about humans and robots working together safely but also about different ways of humans talking to and programming robots, and of robots communicating back to humans. When we start to incorporate vision systems, speech recognition systems, and different ways of programming using hand gestures or signal-based methods, you will see a shift in how robots are programmed.
Programming will become more intuitive for non-engineers. Robots will become more commoditised and democratised, accessible to a wider range of people. Robots will by then be incredibly flexible devices: they could perform an industrial job one day and more service-oriented tasks the next.
Applications of HRC
The application areas for human robot collaboration are many. These include automotive assembly tasks, machine tending, medical applications, machining and trimming, hybrid manufacturing, retail, service robots, space applications, autonomous applications – and endless applications where humans need assistance!
Pawankumar Gurav is a graduate in Mechanical Engineering from Shivaji University, Kolhapur. He has over 3.5 years of experience in KUKA Robotics for Product Support, Solutions/Projects, Application Engineering, Certified Programmer of KUKA Mobile Platform, Sunrise OS, LBR iiwa Cobot.