


SYNOPSIS

With inventions in artificial intelligence and robotics proliferating across the automobile sector worldwide, it is safe to say that the future of driving is here. But the one question that crops up in the minds of the public is "at what cost?"

This article throws light on a recent incident of 1 June 2020 in which an autonomous car developed within the AI industry sparked a potential legal issue: who is liable for accidents that occur while a car is operating on the Autopilot feature?

INTRODUCTION

By now the world has likely heard enough to know that this is our future. One day we will all be driven around entirely by computerized cars that are more efficient, safer, more capable and completely unflappable in the face of conflict, whether inter-auto or interpersonal.

Tesla, under its CEO Elon Musk, is one such name that has been making news in the technology-driven market for developing this vision of future driving. Recently, Elon Musk's SpaceX made history by becoming the first private company to launch astronauts into space, showing the world the capabilities that AI-based industries hold.

The on-and-off commitments made by the CEO to bring fully self-driving cars to the roads seem to be seeing the light of day, with more and more autonomous cars being manufactured, tested and put on the roads. When it comes to safety and reliability, however, the technology is far from mature.

The "Full Self-Driving" package, to be clear, does not yet make Teslas capable of driving without human intervention. Right now, it gives customers access to a series of incremental improvements to Autopilot. That may not seem like much, precisely because Tesla's cars are not currently capable of actual "full self-driving." Autopilot can center a Tesla in a lane, even around curves, and adjust the car's speed based on the vehicle ahead. The "Navigate on Autopilot" feature can suggest and perform lane changes to get around slower vehicles, and steer a Tesla toward highway interchanges and exits.

THE INCIDENT

The most recent incident raising questions of safety and liability was captured by traffic cameras: on the early morning of 1 June 2020, a Tesla slammed into an overturned cargo truck on a highway in Taiwan. The white Model 3 is seen in the fast lane approaching a white truck lying across the highway, as another vehicle slows and drives around it. The driver was uninjured and told emergency responders that the car was in Autopilot mode but did not have the Full Self-Driving Capability feature engaged. It is noteworthy that Autopilot is advertised as being able to steer the car within a lane and brake for vehicles and other obstacles, while Full Self-Driving can execute lane changes and other advanced maneuvers.

Tesla, which has the ability to collect operational data from its customer vehicles, has not yet commented on the incident. It is not clear at what point Autopilot should have engaged the brakes itself. Previously, Autopilot was involved in two fatal accidents in Florida, when Model S sedans failed to recognize and stop in time for tractor-trailers crossing highways.

As per Tesla's official Vehicle Safety Report, the accident data for the first quarter of 2020 show one accident for every 4.68 million miles driven with the Autopilot feature engaged, whereas for those driving without Autopilot but with active safety features enabled, one accident was recorded for every 1.99 million miles driven.
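As a quick arithmetic check of the figures above (the mileage numbers come from the paragraph itself; the comparison is only illustrative, since the two groups may drive very different road mixes):

```python
# Figures from Tesla's Q1 2020 Vehicle Safety Report, as cited above
miles_per_accident_autopilot = 4.68e6      # with Autopilot engaged
miles_per_accident_active_safety = 1.99e6  # without Autopilot, active safety features on

# Convert to accidents per million miles (lower is safer)
rate_autopilot = 1e6 / miles_per_accident_autopilot
rate_active_safety = 1e6 / miles_per_accident_active_safety

# How many times farther Autopilot-engaged cars went between accidents
ratio = miles_per_accident_autopilot / miles_per_accident_active_safety

print(f"Autopilot engaged: {rate_autopilot:.3f} accidents per million miles")
print(f"Active safety only: {rate_active_safety:.3f} accidents per million miles")
print(f"Autopilot cars went about {ratio:.2f}x farther between accidents")
```

On these reported numbers, Autopilot-engaged cars travelled roughly 2.35 times farther between accidents, though the raw figures say nothing about highway versus city driving or driver demographics.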

LEGAL THEORIES OF LIABILITY

The following are some of the legal theories of criminal liability that could apply to an entity controlled by artificial intelligence:-

  1. Perpetrator theory:- the programmer, software designer or user could be held liable for directly instructing the AI entity to commit the crime. This theory is used in common law when a person instructs, or directly causes, an animal or a person incapable of criminal responsibility (such as a young child or a person with a severe mental disability) to commit a crime.
  2. Natural consequence:- the programmer or the user could be held liable for causing the AI entity to commit a crime as a consequence of its natural operation. For example, if a human obstructs the work of a factory robot and the AI decides that squashing the human is the easiest way to clear the obstruction and continue working, then, if this outcome was likely and the programmer knew or should have known it, the programmer could be held criminally liable.
  3. Direct liability:- the AI system itself has demonstrated the elements of a recognized theory of liability in criminal law. Strict-liability offences like speeding require only an action, i.e. actus reus, but "conventional" offences like murder also require an intention, i.e. mens rea. Criminal negligence involves non-performance of a duty in the face of evidence of possible harm. Courts may be capable under existing laws of assigning criminal liability for speeding to the AI system of an existing self-driving car; however, it is not clear that this would be a useful thing for a court to do.
  4. Traditional negligence:- the driver is held liable for harms caused when reasonable care was not taken while operating the vehicle.
  5. Crash-victim compensation:- crash victims are not permitted to sue the driver of the vehicle unless the injuries resulting from the crash are of a certain severity; victims are instead compensated through their own insurance.

Driver, manufacturer or the State: who is liable for accidents involving autonomous driving?

One of the most crucial challenges affecting liability in such crashes is the complexity of the technology. The software and hardware are supplied by dozens of different companies, meaning that placing liability on the manufacturer alone may not be so straightforward. With fully autonomous vehicles, the software and vehicle manufacturers are expected to be liable for any at-fault collisions (under existing automobile products-liability laws), rather than the human occupants, the owner, or the owner's insurance company.

  • The other important liability-determining factor is the expectations set by the company. Does the manufacturer advertise and sell its product in a way that gives drivers the expectation that they can put the car on Autopilot and safely doze off or zone out?
  • If Tesla conveys, overtly or by implication, to its customers that its vehicles are safe to use on Autopilot, and it turns out they are not, then Tesla can be held liable for the injuries or deaths that result.
  • In cases where the accident was due to a lack of appropriate maintenance, and that maintenance was the responsibility of the owner, the owner of the vehicle could be liable.

If the self-driving vehicle's occupant fails to follow proper operating instructions, as was the case in the first Tesla Autopilot fatality in May 2016, it may be difficult to hold the manufacturer liable for the accident and the resulting injuries or deaths.

The State cannot be held liable for allowing developments in artificial intelligence to progress; its duty lies in maintaining the roads in order to provide for safe driving. For now, the State can be sued only where it has failed to properly repair or maintain the roads.

At present, insurance companies are treating self-driving cars in much the same way as they would treat any other vehicle, but it seems inevitable that the insurance industry will have to change its rules of liability as self-driving cars proliferate. It will still be some time before we can expect specific rules and legislation that exclusively provide for the liability of the parties involved in such matters. The future only seems to be here!










By Nihal Thareja


