Liability, Public Trust and Policy Around the AV Future

In 2018, the European Parliamentary Research Service published findings stating that “[a]ccelerating the adoption curve for driverless or autonomous vehicles by five years has the economic potential to generate European added value worth approximately €148 billion.” With so much value at stake, it is incumbent on us to identify and tackle any issues slowing the introduction of autonomous vehicles.


Traditional motor insurance, regulated at EU level, rests on the presumption of driver autonomy and ultimate driver responsibility. Current law requires that every registered car in an EU member state be insured for third-party liability, covering damage to property or injury to anyone other than the driver, and it assigns ultimate responsibility to the person behind the steering wheel. Another strand of EU law relating to motor vehicles addresses producers’ liability, but only to the extent that harm was caused by a particular defect in the product.

Both concepts appear ill-equipped to deal with the future adoption of autonomous vehicles and the diminished autonomy of the driver in favour of AI-based computer programmes operating within specific ethical frameworks. Legislators need to re-evaluate the correct apportionment of liability, likely shifting the balance from consumers to producers. A proactive stance can help avoid the legal and administrative costs associated with uncertainty and reverse the trend of declining public trust in the new technology following recent scandals, such as the fatal crash in March 2018 in which a self-driving Uber struck and killed a woman in Arizona.

In addition, a variety of risks are likely to become more relevant with the mass adoption of autonomous vehicles. Amongst these are (i) the potential for failure of the computer programme (the “driver”), (ii) the potential for network failure (the “IT infrastructure”), (iii) the already present risk of malicious cyber attacks and hacks (“third-party perpetrators”), and (iv) risks around regulatory framework decisions (“biases, ethical and legislative decisions”). Current legislation does not sufficiently address these key risks, requiring a step-change not only in our legal framework but also in the public discourse around the difficult ethical questions posed by the adoption of autonomous vehicles.

Research conducted by the EU Parliament found that 55% of respondents consider EU regulatory action on autonomous vehicles very important, ahead of technologies such as robots, drones and even human enhancement. Amongst the most frequently cited concerns are data protection, the values and principles guiding AI decision-making, and liability rules, all areas that comprehensive legislation will need to address.

Instead of the simple and crude approach of assigning liability to the driver, a new framework will need to consider how to identify the ultimately responsible party: was the crash the fault of the manufacturer of the lidar system, the developer of the artificial intelligence, the OEM that assembled the individual components and sold the car, or the government that set the ethical decision-making framework? The answer will of course change from accident to accident, and often no single party can be assigned full responsibility. We will need to design a robust system that can deal with these issues while retaining the trust of the citizens it is designed to protect.

It is more critical than ever that we have an honest and open debate about the ethical and legislative issues that need to be solved to ensure the autonomous future will not be delayed by slow-moving political processes, adverse public opinion, or legal uncertainties.


( Press Release Image: https://photos.webwire.com/prmedia/5/237881/237881-1.png )

