To have empathy or not to have empathy, to be empathetic or not to be empathetic. The often-renewed discussion about being and having has once again been rekindled. This time, the focus is on the concept of empathy, which fills the pages of online and offline media around the world. There seems to be a clear purpose: artificial intelligence experts from several technical and scientific areas are determined to generate, invent and build what has been shown in countless series, movies and books: empathetic machines. But why do we want these artificial entities to understand empathy?
Projected empathy: you’re not understanding me
Phrases such as “put yourself in their place” or “put yourself in somebody’s shoes” are frequently used in a variety of social contexts. The scientific community defines empathy as the capacity to understand the feelings and emotions of others, on the assumption that the other is somebody similar to ourselves.
So, what is the meaning of a child’s smile, a worker’s anxiety or a newlywed’s happiness? All of these situations may seem familiar to us but how do we really understand what other people wish to convey to us in a specific situation?
If the rule is that human beings are naturally empathetic, why have there been such cruel events as wars, genocides, terrorist attacks and other smaller-scale situations such as conflicts in personal relationships where no consensus is reached?
One of the current hypotheses was explained in the article “Ponte en mi lugar,” which indicated that this possible empathy malfunction could be clearly linked to projected empathy. This type of empathy involves assuming that the other person is experiencing the same thing we would feel if we were in their position. Don’t many of our conflicts develop this way?
Machines programmed to capture emotions
Human beings clearly face a dual challenge in terms of empathy: first, they must always capture the emotional expressions or the message other people wish to convey; then, they must know how to give an even-tempered response.
The thing is, there are machines able to interpret non-verbal communication (postures and facial movements) with precision, whereas human beings have been shown to face some difficulties at this cognitive level.
Scientists are decrypting this type of neuronal process and have discovered that human beings tend to react automatically when faced with certain stimuli, i.e. their response is not fully conscious. For this reason, a machine programmed to solely analyze this level of subtle communication could take on a key role in certain communication settings where almost 100% precision and reliability are needed.
As such, the idea is to teach these artificial entities to distinguish this level of communication, classify it and, based on their configuration, determine the exact cognitive state of individuals using an algorithmic model. This would bring clarity and precision when needed.
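To make the idea concrete, here is a minimal sketch of what “classify non-verbal signals with an algorithmic model” could look like in its simplest form. The feature names, numbers and emotion labels are invented for illustration; real systems extract such features from video and use far richer models.

```python
import math

# Hypothetical reference profiles: each emotion is described by average
# intensities of a few facial cues (e.g. brow raise, lip-corner pull,
# jaw drop) on a 0-1 scale. These numbers are made up for illustration.
CENTROIDS = {
    "joy":      [0.1, 0.9, 0.3],
    "sadness":  [0.7, 0.1, 0.1],
    "surprise": [0.9, 0.2, 0.8],
}

def classify(features):
    """Return the emotion whose reference profile is nearest
    (Euclidean distance) to the observed feature vector."""
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify([0.2, 0.8, 0.4]))  # prints "joy" (closest centroid)
```

A nearest-centroid rule like this is only the skeleton of the approach; production systems replace it with trained statistical models, but the pipeline (measure cues, compare against learned patterns, output a label) is the same.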
Robots that aid people with autistic spectrum disorders
However, talking about machine empathy at the moment would be risky and speculative. Nevertheless, we can mention success stories of interaction between human beings and machines at a level that, in analogous terms, could be described as “a simulation of empathy.” These machines are called “social assistance robots” or “social robots.” They have already left the labs and research centers to aid children on the autistic spectrum, with noteworthy results.
For example, one of the best-known projects is DREAM (Development of Robot-Enhanced therapy for children with Autism), where children on the autistic spectrum have already interacted with robots, with highly satisfactory results. The project’s findings include the fact that these children find robots simpler and more predictable, and that they do not feel judged by these artificial machines.
To give you a bit of background, people with autistic spectrum disorders have been shown to have issues with both cognitive empathy (theory of mind) and emotional (also called affective) empathy. Cognitive empathy is the ability to spontaneously understand complex information about the intentions, thoughts and emotions of other people.
Consequently, cognitive empathy (the type of empathy that artificial intelligence experts are trying to implement in robots) should not be mistaken for emotional (affective) empathy, which is linked to the concept of compassion. People on the autistic spectrum usually have problems at the first level of empathy: recognizing facial expressions that convey emotions such as joy, sadness, upset, anger, happiness or stress. They have other associated difficulties because they struggle to understand anything other than purely logical and literal language. As such, it is hard for them to develop compassion, since they do not recognize this emotion in others in the first place. However, it has been shown that if they can recognize the emotions of others, they are able to develop compassion (Dziobek et al. 2008).
The importance of empathy for survival
However, in this case, the somewhat rhetorical question at the beginning (to be empathetic or to have empathy) should not be a problem. After all, having found the evolutionary advantages of empathy, the really intelligent thing is for human beings, with their imaginative potential, to seek to create machines equipped with certain skills that enhance human as well as natural well-being. The first steps along this path have already been taken with social assistance robots for children with autistic spectrum disorders, or aid for the elderly in Japan, for example. Also, to paraphrase one of the fathers of artificial intelligence, Marvin Minsky, in The Society of Mind: “emotions are so indispensable that a machine would not be intelligent without them.”
Space for everyone
If we were asked for a constant characteristic throughout human history, we would have to mention the human ability to leverage imagination with the sole aim of finding what it is missing, what it longs for. In short, what allows human beings to live better and/or longer, what brings more moments of pleasure and prevents the consequences of suffering and pain.
And when machines do not mean war for human beings, or vice versa, there is a possible path of mutual understanding. Consequently, both machines and human beings could share and coexist on the same space-time axis, with the advantage that artificial entities could benefit and learn from natural beings through a feedback process, even without sharing the same genetic code. Does this sound like respect, and maybe even empathy?
Rosae Martín Peña