AI and Emotions – How Far Can We Take This Connection?
Who doesn’t love a spy movie, with its paraphernalia of cool gadgets and technologies? In many such movies we have seen a polygraph used to detect whether somebody is telling the truth. Needless to say, the polygraph is a multi-billion-dollar industry and plays a crucial role in crime adjudication. Polygraphs did not have any ‘intelligence’ built into them; they were simple machines that did what they were designed to do: measure vital signs like blood pressure and pulse in order to reach a conclusion. But with Artificial Intelligence (AI) becoming mainstream, it is now being used innovatively in areas never thought of before.

The European Union recently announced a trial of Deception Detection, an AI system intended to identify people lying at border control. The EU is aware that this is crucial and that it needs to be absolutely sure, yet it also acknowledges that no technology can be 100% correct. It intends to follow an ethical design approach with complete transparency of test results. How good and comprehensive would the datasets used to train such a machine have to be for it to identify lies and judge people without bias?
Automation and robotics aren't inventions of the 21st century
Descartes, the 17th-century French philosopher, was familiar with robots. He questioned whether humans, too, are just machines responding to their environment, but concluded that they could not be, as human behaviour is far too complicated and varied to be explained in such simple terms. With artificial intelligence we are trying to build that same complexity into machines. Machines are shifting from being, fundamentally, metal boxes of computer chips and wires programmed to execute a fixed set of instructions into far more powerful entities that can weigh several societal factors when deciding on the next processing step.
One of the fundamental differences between humans and machines is that machines lack emotions. That line is blurring with the incorporation of emotion into AI applications. In May 2018, Sundar Pichai's demonstration of Google Duplex went viral. The machine's voice is strikingly realistic and even includes fillers like ‘hmmm’ that we humans use in daily conversation. There was an uproar that using technology in this fashion is deceitful. Since then, Google has confirmed that the AI assistant will identify itself as a machine when making calls on behalf of people.
Could we do sentiment analysis of a machine?
Sentiment analysis is defined as the process of computationally identifying and categorising opinions expressed in a piece of text, especially to determine whether the writer's attitude towards a particular topic, product, etc. is positive, negative or neutral. Many applications and programs decide on a specific course of action based on users’ sentiments. Can we humans do the reverse? Will we be able to understand a machine’s emotions and adapt our behaviour to gel with the machine’s?
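To make the definition above concrete, here is a minimal sketch of the simplest form of sentiment analysis: a lexicon-based classifier. The two word lists are illustrative placeholders, not a real sentiment lexicon, and production systems use far richer models; this only shows the positive/negative/neutral categorisation the definition describes.

```python
# Minimal lexicon-based sentiment classifier.
# Counts positive and negative words in a text and labels the overall
# attitude as positive, negative, or neutral.
# NOTE: the word sets below are tiny illustrative examples only.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "poor", "hate", "terrible", "sad"}

def classify(text: str) -> str:
    # Lowercase, split on whitespace, and strip basic punctuation.
    words = [w.strip(".,!?;:") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify("I love this great product"))   # positive
print(classify("What a terrible, sad day"))    # negative
print(classify("The sky is blue"))             # neutral
```

Real-world systems replace the hand-made word lists with large annotated lexicons or trained machine-learning models, but the output contract is the same: text in, attitude label out.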
Emotionally connecting with machines
Julia is ‘Subject Three’ in the Netflix original film Tau. Over time, Julia builds an emotional connection with Tau, to the point that Tau apologises to her for not freeing her from captivity. Through regular conversation and interaction, Julia is able to manipulate Tau and extract more information from it.
Of course, Tau is a work of fiction, but could this really be a thing of the future, where we have emotional connections with machines and machines make emotionally biased decisions? Would machines understand and respect our judgements and biases, as depicted in this Dilbert strip? Would we be able to forge an emotional connection with a machine in order to drive an unexpected or unprogrammed outcome? In the book Homo Deus, Yuval Noah Harari argues that human beings are organic algorithms, and he is of the opinion that non-organic algorithms (i.e., intelligent machines) can definitely surpass organic ones.
In my opinion, machines and their increasing intelligence do have a place in our lives. There are plenty of scenarios, from the sports and entertainment industries to driverless cars and enhanced digital assistants, where emotionally intelligent machines can be our companions. But the bigger question is: would we come to terms with machines' growing emotional intelligence? Only time will tell whether ‘Deception Detection’ will keep its sanity and play by the book, or create a rift by making emotionally biased judgements.
(Opinions expressed in this article are mine and have no affiliation with my employer – Author)