Germany's federal transport minister, Alexander Dobrindt, has presented a report on automated driving to the country's cabinet. The report is the work of an Ethics Commission on Automated Driving, a panel of scientists and legal experts.
The report notes the technological advances being made to increase automation in cars to make them safer and reduce accidents, but it adds: “Nevertheless, at the level of what is technologically possible today […] it will not be possible to prevent accidents completely. This makes it essential that decisions be taken when programming the software of conditionally and highly automated driving systems.”
The report lists 20 guidelines for the motor industry to consider in the development of any automated driving systems. Dobrindt says the cabinet has adopted the guidelines, making Germany the first government in the world to do so.
The moral foundation of the report is that, because self-driving vehicles will cause fewer deaths and injuries than human drivers, and because governments have a duty of care for their citizens, there is a moral imperative to use such systems.
If an accident cannot be avoided, the report says human safety must take precedence over animals and property. The software must try to avoid a collision altogether, but if that’s not possible, it should take the action that does least harm to people.
The report also recognises that some decisions could be too morally ambiguous for the software to resolve. In such cases, the ultimate decision and responsibility must, for now, rest with the human in the driver's seat, to whom control is swiftly handed over. If they fail to act, the vehicle will try to stop.
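The report does not prescribe how such a handover should work in software, but the rule it describes (alert the human, wait for them to take over, otherwise stop) might look something like the sketch below. The timeout value and the vehicle methods are purely illustrative assumptions, not anything specified by the commission:

```python
import time

HANDOVER_TIMEOUT_S = 10  # assumed grace period; the report sets no figure

def resolve_ambiguous_situation(vehicle) -> str:
    """Hand control to the human; if they do not respond in time, stop the vehicle."""
    vehicle.request_driver_takeover()            # alert the person in the driver's seat
    deadline = time.monotonic() + HANDOVER_TIMEOUT_S
    while time.monotonic() < deadline:
        if vehicle.driver_has_taken_control():   # hypothetical status check
            return "human in control"
        time.sleep(0.1)                          # poll until the deadline passes
    vehicle.come_to_safe_stop()                  # fallback: the vehicle tries to stop
    return "vehicle stopped"
```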
However, it is acknowledged that no system is perfect. If harmful outcomes cannot be reduced to zero, they will at least be lower than the level of harm on the roads today.
If a collision is unavoidable, the report says systems must aim for harm minimisation. There must be no discrimination on the basis of age, gender, race, physical attributes or anything else.
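Purely as an illustration of that priority order, and not as anything the commission specifies, a harm-minimising choice could be modelled as below. The outcome fields and the least_harmful() helper are assumptions; note that no personal attributes of potential victims appear at all:

```python
from dataclasses import dataclass

@dataclass
class PredictedOutcome:
    manoeuvre: str
    people_harmed: int       # expected number of people injured
    animals_harmed: int      # expected number of animals harmed
    property_damage: float   # estimated damage, e.g. in euros
    # Deliberately no fields for age, gender, race or other personal
    # attributes: the guidelines forbid weighing victims against one another.

def least_harmful(options: list[PredictedOutcome]) -> PredictedOutcome:
    """Pick the manoeuvre that harms the fewest people; only then weigh
    animals, and only then property."""
    return min(options, key=lambda o: (o.people_harmed,
                                       o.animals_harmed,
                                       o.property_damage))

# Illustrative use: braking harms an animal but no people, so it is preferred.
options = [
    PredictedOutcome("brake hard", people_harmed=0, animals_harmed=1, property_damage=2000.0),
    PredictedOutcome("swerve left", people_harmed=1, animals_harmed=0, property_damage=0.0),
]
assert least_harmful(options).manoeuvre == "brake hard"
```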
The report mentions the possibility of fully autonomous systems, but recognises that the technology is not yet capable of resolving tricky ‘dilemma situations’ in which the vehicle has to decide between the lesser of two evils. Once the technology is sufficiently mature, full autonomy may become possible.
According to the report, at all times it must be known who is driving – human or computer. Everyone who drives a vehicle must first be validated as legally qualified to drive that class of vehicle, perhaps by scanning their licence.
The vehicle should have an aviation-style ‘Black Box’ that continuously records events, including who or what is in control at any given time. In the event of an accident involving an autonomous vehicle, an investigation should be carried out by an independent federal agency to determine liability.
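As a rough sketch only, with field names that are assumptions rather than anything taken from the report, such a black-box record might pair each moment in time with whoever, or whatever, is in control and, where a human is driving, the validated licence holder:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum

class Controller(Enum):
    HUMAN = "human"
    AUTOMATED_SYSTEM = "automated system"

@dataclass(frozen=True)
class BlackBoxEntry:
    timestamp: datetime        # when the record was made
    controller: Controller     # who or what was in control at that moment
    driver_id: str | None      # validated licence holder, if a human is driving

def record_control_state(log: list[BlackBoxEntry],
                         controller: Controller,
                         driver_id: str | None = None) -> None:
    """Append an immutable record of who or what is currently in control."""
    log.append(BlackBoxEntry(datetime.now(timezone.utc), controller, driver_id))
```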
The driver of a vehicle retains their rights over the personal information collected from that vehicle. Use of this data by third parties must be with the owner's informed consent and with no harm resulting.
The threat of malicious hacking of any autonomous driving system must be mitigated by effective safeguards. Software should be designed with a level of security that makes such attacks exceedingly unlikely.
A vehicle's controls must remain ergonomically suited to human use, as they are in a conventional car.
The report also says the public must be made aware of the principles upon which autonomous vehicles operate, including the rationale behind them, and that these principles should be incorporated into school curriculums so that people understand how autonomous vehicles behave.
The guidelines will be reviewed and fine-tuned after two years of use, and regularly thereafter.