Here, Leroy Spence, head of sales development at EU Automation, explains the new standard and the impact on industry.
The British Standards Institution (BSI) has devised a new guideline for the ethical design and application of robots and robotic systems. It recognises the ethical issues that can arise as growing numbers of automated and autonomous systems are introduced into industrial and consumer environments, and it stresses that it must always be clear who is responsible for a robot's behaviour, even when the robot acts autonomously. The standard is relevant to all robots and robotic systems, including autonomous cars, medical robots, industrial robots and those used for personal care.
A committee of scientists, academics, philosophers, ethicists and users developed the standard, which is intended for use by the designers and managers of robots and robotic devices. The standard, BS 8611:2016, was presented in September 2016 at a conference in Oxford, UK, and is available for purchase on the BSI website.
The new standard begins in a similar vein to Isaac Asimov's three laws of robotics, first proposed in his 1942 science fiction short story Runaround. Asimov's first law states that a robot may not injure a human being or, through inaction, allow a human to come to harm. The second law requires a robot to obey instructions given by humans, except those that conflict with the first law. Finally, the third law dictates that a robot must protect its own existence, as long as doing so does not conflict with the first two laws. Robots should therefore always be safe, secure and fit for purpose.
The BSI guidelines for manufacturers cover previously uncommon hazards, including robot deception, robot addiction and the potential for a learning system to exceed its remit. The standard also considers whether forming an emotional bond with a robot is desirable, a particularly contentious subject if the robot interacts with the elderly or children. It also discusses the risk of a robot becoming sexist or racist, an issue that surfaced prominently when Twitter users influenced Microsoft's AI chatbot, Tay, to spew out offensive messages.
According to Alan Winfield, professor of robotics at the University of the West of England, this is the first published standard on robot ethics. However, the EU is also working on the issue, having issued a draft report in May 2016 that covers the ethical implications of an automated workforce and is intended to lay the groundwork for the ethical development and design of robots.
If approved, the EU's proposals would become the first legal framework on robot ethics. The introduction of the new BSI standard could provide the impetus for the EU, and bodies further afield, to consider legislation to safeguard humans from the ethical issues associated with the growing number of industrial and commercial robots.
Industrial robots
Standards on the ethical use of robots are particularly valuable in industry. Traditionally, industrial machines were guarded and caged to keep them safely away from humans. Newer generations of robots can work alongside, and even in collaboration with, human workers, thanks to sensors, the ability to learn and other safety features.
Examples of collaborative industrial robots include ABB's YuMi and Rethink Robotics' Baxter. These collaborative robots can work alongside humans and make it easy to integrate automation into an industrial process.
Although collaborative robots are becoming more popular, it is still common for manufacturers to operate legacy industrial automation systems, which offer the benefits of industrial automation without the ethical concerns. For manufacturers worried about the wellbeing of their industrial automation systems but not yet ready to upgrade to the latest generation of cobots, sourcing legacy industrial parts doesn't have to be difficult. A supplier of new and obsolete industrial automation parts, such as EU Automation, can provide replacement parts to safeguard the system's future until the manufacturer is sure an upgrade is necessary.
The BS 8611:2016 standard is one of the first signs that industry is starting to concern itself with ensuring robot behaviour is accountable, truthful and unprejudiced. The dystopian future of The Matrix remains highly unlikely, but if we want to introduce robotics into industry and consumer environments on a wider scale, the ethical question should be at the forefront of our minds.