If your therapist were a robot, would you be more or less likely to trust it with your emotional wellbeing? If one of your colleagues were a machine, would you find it hard to work with, or would you relish the lack of small talk and workplace politics? And would you buy a microwave that could press your emotional buttons?

Robot and girl. Photo credit: Andy Kelly

TV series like Maniac and films like Ex Machina fill our culture with visions of a future in which technology can adapt to and understand human emotion.

We’re still a long way from ‘the singularity’ – a hypothetical point in the future when robots and humans merge and become indistinguishable – but we already relate to our technology as though it has emotions, even when it doesn’t. How many times have you shouted at your computer for not saving a document, or at your SatNav for taking you the wrong way?

“At the core of our understanding of the world is that everything has intentionality and behaves like a human,” said Rob Wortham, Teaching Fellow in Robotics and Autonomous Systems at the University of Bath, when asked why we form emotional connections to our technology.

A case in point is a recent study in the peer-reviewed open-access journal PLOS ONE, which found that when a robot begs not to be switched off, participants are reluctant to do so.

In the experiment, the robot asked simple questions like “Do you prefer pizza or pasta?”, which was enough to make the participants like it. At the end of the experiment, when given the option to turn the robot off, they experienced stress as it begged to be left on with phrases like, “No! Please do not switch me off!”

“You will hear the term ‘anthropomorphism’,” said Dr Wortham, referring to the way we ascribe human characteristics to ideas, animals or objects. “It’s the way we understand our pets as though they are tiny, furry people and we shout at computers because they’re being mean to us.”   

This isn’t necessarily a bad thing, and groups like the US Navy’s Laboratory for Autonomous Systems Research (LASR) are using our tendency to relate to technology emotionally to build robots that work better as part of a team.

Visitors interact with the mobile, dexterous, social robot Octavia at the Office of Naval Research. 

LASR has developed a firefighting human-like robot called Octavia, who is ‘Mobile, Dexterous and Social’ and is designed to make the crew relate to her as though she were another human teammate.

Octavia can respond to commands and physical gestures, speak and show confusion with her facial expressions.

There are two cameras built into her eyes that can analyse characteristics like facial features, complexion, and clothing; Octavia can detect voices using four microphones and a voice-recognition programme called Sphinx; she can identify 25 different objects by touch; and she also has theory of mind – the ability to reason about what the humans around her might be thinking in order to understand their commands (a trait that, if we were to anthropomorphise her, we might call empathy).

Most importantly, Octavia can withstand high temperatures and smoke fumes, which her crewmates can’t, so they can point to the fire and she can actually go in and extinguish it.

While there’s a lot of talk about robots stealing our jobs in the future, the reality – as Octavia demonstrates – could be quite different.

The places where humans can’t go – because it’s too dangerous – are a major area of development for artificial intelligence (AI) and emotionally intelligent robots.

"One of the main limitations facing war fighters and emergency responders in subterranean environments is a lack of situational awareness; we often don't know what lies beneath us," said Timothy Chung from the US military’s Defense Advanced Research Projects Agency, which announced earlier this year that it was developing subterranean robots to help during national emergencies.

There’s a paranoid side to us that fears this technology – just put ‘AI’ into Google and a list of videos and articles appears with ‘unnerving’ or ‘creepy’ in the title. Science fiction has a lot to answer for.

We are constantly offered dystopian visions of the potential pitfalls of AI by films like Blade Runner, The Terminator and Her, with fewer of the positive outcomes of robots that ‘feel’ – like Disney’s WALL-E and its forerunners in Short Circuit and Silent Running.

Joanna Bryson, Senior Research Fellow in the Department of Computer Science at Bath, who specialises in AI ethics, is sceptical of dystopian scenarios in which robots have too much power.

She said at a Royal Society panel last year: “There’s this assumption that if something was as smart as us it would try to take over the world like we do. But firstly, most people don’t try to take over the world, and secondly, your smartphone can play chess better than you and do arithmetic better than you, but your phone hasn’t even taken over your pocket, let alone the world. That’s not something we build into AI.”

There are also more everyday applications for robots that can read your emotions. What if your fridge could detect that you were low on zinc and suggest a healthy meal plan? Or even order you a takeaway so it arrived just as you got home?

Or a robot therapist that could understand not just what you were telling it, but also the emotional tone of your speech, so it knew when to probe and when to hold back? Some of this technology isn’t that far off.

“If you see robots as tools used by trained qualified therapists, they can be very useful,” explained Dr Wortham, who said that therapy robots are already being developed.

“With autistic children, for example, the therapist will use a robot as a kind of proxy; robots are simpler and more predictable than humans... children preferred to interact with the plain robot rather than the human-looking one.”

There are also social therapy robots being developed to teach children through games, and robots to help older people with dementia.

We might still be decades away from seeing truly empathic devices in our homes, but when that does happen, we’ll already have plenty of practice at using them from all that time spent shouting at our laptops.