The Legalities of Accidents Involving Autonomous Vehicles

For nearly a decade, what used to be science fiction has become a world of seemingly limitless possibilities. Artificial intelligence has become almost scarily real as software developers have found ways to make everything from the most mundane chores to extremely difficult tasks simple and autonomous. But with this complex simplicity comes a legal nightmare when things go wrong.

Who is to blame when a software glitch creates an opening for hackers to access thousands of individuals’ sensitive information? Who bears the liability when a robot surgeon makes a dangerous mistake on a patient? And – the big legal battle playing out in courts across the world today thanks to the recent autonomous Uber accident – who is held liable when autonomous cars go wild?

The attorneys at Hershey Law are excited to be part of this historic time, as these new questions are creating case law that will shape how attorneys present their cases and how judges determine verdicts in the future.

It’s a turbulent time, but one that must be addressed: AI isn’t going away, and it’s only going to become more widespread. Self-driving vehicles are paving the way for the cars of today to become the hovercraft of tomorrow, and the battles are currently being played out in the courtrooms over the legalities.

So What’s the Story on Autonomous Vehicles, Anyway?

Self-driving vehicles almost seem like a myth, having been part of science fiction for so long. But, like much of today’s technology, once the seeds were planted in the minds of certain scientists and researchers, all it took was a few years of planning, financial backing from some of the biggest companies in the world, and a lot of testing for them to become a reality.

Along the way, consumers were slowly prepared for the jump to fully autonomous cars by being provided with little “safety features” in their vehicles. Conveniences like sensors that warn drivers of cars or obstacles close by, Bluetooth defaults that allow Siri and other AI helpers to tell us where to go and even when speed traps are up ahead, and other benefits eased us into a world where self-driving vehicles could be accepted and possibly even become the norm.

But one of the most common traits among humans is the need to be in control of their lives, so of course, letting go of that control when on the road wouldn’t be an easy feat.

Because there were so many potential issues with the new technology, it was followed closely by opponents and proponents alike. One report showed that in California alone, there were 38 accidents involving autonomous cars over a four-year period – but of those 38 incidents, 37 were caused by human error.

Other studies have shown that although autonomous vehicles have a higher rate of crashes than conventional cars, the resulting injuries tend to be less severe, and the vehicle itself is not usually the cause of the accident.

It may sound like a win for the self-driving team, but it makes it a heck of a web to untangle in the courtroom when the judge and jury are trying to decide who to hold liable for an accident caused by these autonomous cars.

The Big Events in Autonomous Accidents

While the majority of these self-driving accidents were minor, some have made headlines and created an outpouring of concern against letting a computer take control of something as sensitive and dangerous as a vehicle.

First introduced widely in 2014 with Google’s self-driving prototype, autonomous cars have come a long way in a short period of time. Observers of that car and those that came after it watched closely, but it wasn’t until two years later, in May 2016, that they had a reason to complain.

At that time, a driver of an autonomous vehicle in Florida became the first fatality of its kind on record when his vehicle failed to brake for a large truck turning in front of it. But was it the car’s fault, the driver’s, or the software developer’s? With no precedent to guide them, courts found this question challenging to sort out.

Then came the more recent fatality in March 2018, again the first of its kind, when a self-driving Uber hit and killed a pedestrian. With such controversial topics as rideshares and autonomous vehicles, this incident made headlines across the world.

But these two significant events were not the only times a self-driving vehicle caused an accident; hundreds of these cases were playing out behind the scenes. In the process, conventional driving case law was merging with new precedent.

How the Court is Handling Self-Driving Vehicle Accidents

Attorneys and judges who are currently dealing with incidents involving autonomous cars understand the significance of even the most minor of these cases. The ruling in each of these incidents will create case law that will be used to determine final judgments in hundreds or thousands of future cases.

To make this as fair as possible in a system that was not entirely prepared for this change, existing law regarding tort liability is being used. That means that the liability of the driver and the insurance company is determined based on traditional negligence, no-fault, or strict liability mixed with a bit of adjustment to take into consideration the autonomous nature of the vehicle.

When it comes to the most common standard, traditional negligence, the determination is based on the degree of driver responsibility in the accident. With conventional cars, it’s easier to determine the level of driver error: distracted driving, driving under the influence, and other forms of negligence are the cases typically seen.

However, in a self-driving vehicle, there are different degrees of potential for human error. Most of these autonomous cars are not actually fully self-driving, so there is still room for driver responsibility. That means the plaintiff’s attorney has the burden of proving exactly how much autonomy was involved versus how much responsibility the driver had for preventing the accident.

If the traditional negligence theory of law applies, and the driver is found not to have taken the reasonable precautions he or she should have, regardless of the vehicle’s autonomy, then the driver is considered liable for the damage to person and property in the accident.

But in a no-fault state, the at-fault driver can’t be sued unless the victim has sustained a certain level of injuries. Instead, the injured driver’s own insurance company steps in to cover medical bills and any necessary compensation. Since the majority of autonomous vehicle incidents to date have involved minor injuries, insurance companies in no-fault states have been working to determine how their policies will handle these cases in the future.

Still, the general consensus seems to be that the more autonomous the vehicle, the less room for driver error and fewer accidents, meaning lower insurance premiums.

The third type, strict liability, is less commonly used, particularly in the case of self-driving vehicles. This form of tort liability applies when abnormally dangerous activities are involved in the matter in question. Most autonomous-vehicle accidents happen at low speed or simply due to driver error, so unless there is a major outlier, a strict liability determination would be rare.

Self-driving cars are not actually fully self-driving yet, making the final decision of who was at fault a complex issue. Some vehicles will warn you that a collision is about to occur, but you have to avoid it yourself. Others kick in only after you press the brake pedal. In some cases, a software defect prevents the emergency system from activating, but in others, the driver could have avoided the accident.

Because of this confusion, both sides must cover the gamut from potential driver error to product malfunctions to a software defect that would implicate the developer’s liability. The plaintiff’s attorney has to weed through the legalities to find the right entity to accuse, and the defendant’s lawyer has to work through all of it to mount the proper defense.

Don’t Hire an Attorney on Autopilot

If you were involved in an accident caused by an autonomous vehicle, your ability to win your case depends on the laws of the state where the accident happened, as well as the expertise of your attorney.

Since autonomous vehicle accident lawsuits are relatively new, not every attorney has the same level of experience handling these cases. Our attorneys at Hershey Law are always at the forefront of legal issues, so we know how to address problems before they arise.

Don’t use autopilot on this decision and turn to just any lawyer to handle your lawsuit. Instead, call us at Hershey Law for your free consultation to see how we can help you handle your important case and fight for your rights.