As technologies such as robotics and artificial intelligence have progressed, considerable time and effort has been devoted to the ethics of their development.
These efforts have been driven by a strong desire to ensure that the technologies function in a way that benefits society rather than harms it.
Worthwhile though these efforts undoubtedly are, a group from the MIT Media Lab believe that we also need to be doing far more work on how man and machine interact. Such an effort would require the field to break free from the constraints of computer science and engineering, taking in such disparate disciplines as economics, biology, psychology and various other social sciences.
The team make their case in a paper published recently in Nature, in which they argue for a multi-disciplinary approach to future technology development that takes into account the active role technology plays in our lives. It demands that technology move on from being seen as a passive participant in our lives towards being an active player with its own behavioural characteristics.
The movement is not operating from a standing start however, as there has already been some fascinating work done to explore how humans interact with industrial robots. It's a field of research that's driven by the huge expenditure on such tools in recent years, with the latest estimates suggesting it will reach $210 billion per year by 2022.
German research from a few years ago found people were generally pretty happy to have a robot colleague. The study found that 60% of workers would be happy to work with a robot, with 21% believing doing so would represent a considerable improvement from their human colleagues. This was largely due to people believing that a machine would be far less error-prone than their carbon-based peers, with less emotive and therefore predictable behaviour.
This general acceptance only goes so far, however, especially when the robots are performing the kind of tasks that make up a significant portion of the individual worker's role.
Research from Cornell University found that when robots outperformed us at work, it had a distinctly negative impact on workers, making them feel negative about their abilities on the job and even about themselves as people. Unsurprisingly, this rapidly germinates into resentment of the technology.
Of course, it's debatable whether the feelings of the human workforce will hold much sway with managers considering whether to implement the most efficient technology possible, but it does nonetheless underline some of the complexities inherent in such decisions.
Changing our thinking
Another fascinating exploration of the way machines influence our behaviours comes via researchers at Aix-Marseille University, whose recently published study examines how working with a robot affects our brain.
It found that when we work alongside human colleagues, the area of the brain that looks after social rewards is frequently activated, which underlines the rewards we gain from doing so. Humans are, if nothing else, social animals after all. When we work with robots however, these areas of the brain sit largely silent. Indeed, the basal ganglia, hypothalamus and amygdala show none of the activity present when we engage with humans.
It should perhaps come as no surprise to learn that humans have traditionally had a difficult time in bonding with robots, with those robots falling in the 'uncanny valley' especially difficult to bond with. The participants in the German research mentioned previously said that they would find robots that displayed emotions particularly uncomfortable, which can be especially important as robots cross over from performing routine and standalone tasks towards more human endeavours.
For instance, many companies are following the lead of the likes of Amazon and Uber in implementing artificial intelligence in their management of employees. It's a transition not without risk, however, as a study from Penn State aptly illustrated. It found that whilst Uber drivers were generally okay with their AI bosses, this only extended so long as they perceived the technology as being fair. If the 'social contract' between worker and manager broke down, then feelings rapidly soured towards the technology.
This is important to understand, as the technology is largely programmed to think purely rationally, with productivity and optimisation the most important criteria in any decision. It has no empathy for an employee's family situation or any sense of whether employees are engaged at work. It can't serve as a role model or do any of the more human things workers look to their leaders for.
Forming a bond
Indeed, research from the University of Lincoln found that people are much more likely to bond with robotic technology if it's designed to have some of the same kind of flaws and foibles as human beings. This has even extended to feeling a degree of empathy towards robots, with a study published in Nature showing that there is clear brain activity in the areas associated with empathy when robots were 'hurt' in some way.
As the MIT Media Lab team highlight in their proposal for a dedicated field of research in this intersection between man and machine, it's important that we understand the numerous issues involved in that interaction, a handful of which have been illustrated above.
“We’re seeing the rise of machines with agency, machines that are actors making decisions and taking actions autonomously,” the team explain in a blog. “This calls for a new field of scientific study that looks at them not solely as products of engineering and computer science, but additionally as a new class of actors with their own behavioural patterns and ecology.”
There has been a huge amount of time and energy devoted to understanding how humans can work effectively together. Now, as autonomous technology enters the workplace, the time has come for this to be expanded to include how man and machine can combine forces productively.