Out-Law / Your Daily Need-To-Know

EU laws should recognise liability of robots, says MEP

Out-Law News | 07 Jul 2016 | 10:56 am | 2 min. read

Robots could be held liable, at least in part, for damages caused when they malfunction or in accidents under future EU legislation, an MEP has said.

In a draft report submitted to a European Parliament committee, Mady Delvaux of Luxembourg said future laws should recognise "robots' civil liability" (22-page / 331KB PDF). In future the liability of manufacturers, programmers, owners and users of robots for actions by robots should reflect the self-learning capacity of machines and their ability to operate independently.

"In principle, once the ultimately responsible parties have been identified, their liability would be proportionate to the actual level of instructions given to the robot and of its autonomy, so that the greater a robot's learning capability or autonomy is, the lower other parties' responsibility should be, and the longer a robot's 'education' has lasted, the greater the responsibility of its 'teacher' should be," Delvaux said.

"Skills resulting from 'education' given to a robot should be not confused with skills depending strictly on its self-learning abilities when seeking to identify the person to whom the robot's harmful behaviour is actually due," she said.

The MEP said that "an obligatory insurance scheme" could be imposed to provide "a possible solution to the complexity of allocating responsibility for damage caused by increasingly autonomous robots".

"An insurance system for robotics could be based on the obligation of the producer to take out an insurance for the autonomous robots it produces," Delvaux said in her draft report.

The insurance scheme could be "supplemented by a fund in order to ensure that reparation can be made for damage in cases where no insurance cover exists", she said.

Delvaux said: "The more autonomous robots are, the less they can be considered simple tools in the hands of other actors (such as the manufacturer, the owner, the user, etc.)… This, in turn, makes the ordinary rules on liability insufficient and calls for new rules which focus on how a machine can be held – partly or entirely – responsible for its acts or omissions."

A "system of registration of advanced robots should be introduced", the MEP said, and this should be link in with the supplementary robot fund to "allow anyone interacting with the robot to be informed about the nature of the fund, the limits of its liability in case of damage to property, the names and the functions of the contributors and all other relevant details".

Technology law expert Stephan Appt of Pinsent Masons, the law firm behind Out-Law.com, said earlier this year that academics in Germany are debating whether a new type of liability might be established as increasingly self-learning software gives rise to robots that operate near-autonomously. He said the debate could apply in the context of driverless cars.

"In a futuristic world where artificial intelligence meets autonomous driving, self-operating, self-learning vehicles of the future could, as the discussion goes, be held liable for accidents they are involved in," Appt said. "If a machine can drive itself and the owner has little or no influence over how it operates, and if no technical fault arises in the event of an accident that can be pinned on the manufacturer, then you can see why there might need to be a paradigm shift in consideration of liability."

"To pursue a machine for damages there would need to be an underlying pot of money that those seeking redress could access. It would be likely that funds would be provided for by manufacturers of these new-style intelligent cars, and it is here that insurers might have a new market. Manufacturers would be likely to sell these vehicles to consumers with insurance coverage attached to underwrite the risk of machine liability," he said.
