The Legal Challenge With Self-Driving Cars: Who’s Ultimately Liable?

Sponsored content

Two decades ago, self-driving cars were a fantasy possible only in sci-fi movies. Today, they are just a piece of legislation away.

Because this is new technology, there is little legal precedent for deciding cases involving accidents related to self-driving features. Legal precedent refers to prior court decisions, which form the basis for how the law is applied in future cases.

The laws vary widely among the few jurisdictions with autonomous or semi-autonomous driving legislation. A driver who gets into an accident involving a semi-autonomous vehicle in Texas may face different consequences from a driver involved in an accident under the same circumstances in Washington, D.C.

Are Self-Driving Cars Legal?

The answer is yes and no. The federal government has yet to clear fully driverless vehicles for the road, meaning operating a car with no one behind the wheel is not legal. However, some states have passed legislation governing the use of self-driving vehicles.

Owning a vehicle with self-driving capabilities is perfectly legal, and you can legally drive one in any state as long as someone is in the driver’s seat. The law is unclear on how much driving the human must do versus how much the technology may handle, which creates a gray area.

Because of this ambiguity in federal and state laws, determining liability in accidents involving semi-autonomous cars can be a major challenge. Choosing a lawyer to represent you in a self-driving-related car accident claim therefore requires careful consideration.

Tort Liability

Both tort and product liability may apply to autonomous vehicle-related accidents. If a driver intentionally fails to follow traffic rules, hoping the autonomous system will do it for them, they may be liable for the resulting accident.

In a case stemming from a 2019 crash, a Los Angeles court convicted Kevin George Aziz Riad of vehicular manslaughter with gross negligence in the deaths of a couple at an intersection while his vehicle, a Tesla Model S, was on Autopilot. According to prosecutors, Riad was responsible for controlling the car as it approached the intersection instead of trusting Autopilot to do it.

On its website, Tesla makes clear that its Autopilot feature does not make its vehicles autonomous; the feature is meant to assist, not replace, the driver. The case marked the first time a driver in the United States was held criminally responsible for a fatal accident that occurred while a vehicle’s semi-autonomous feature was engaged.

Product Liability

Product liability may also apply when a software failure beyond the driver’s control causes an accident. There are ongoing lawsuits against Tesla under product liability law over failed systems that resulted in accidents. One example is the case of Jeremy Banner, who died in 2019 after his car, a Tesla Model 3, crashed into a semitrailer roughly 10 seconds after he activated Autopilot.

The lawsuit claims that Banner trusted what the car manufacturer said would work. It also claims that Tesla knew the technology was defective and would not work, based on previous accidents involving feature failures under similar circumstances. “Drivers drop their guard because they trust what the automakers say, and they should take responsibility for it,” says car accident lawyer Bert McDowell of Bert McDowell Injury Law.

For a carmaker that prides itself on having the best autopilot features, these accidents raise the question of why it should not be partly at fault for accidents resulting from a feature it claims works. The outcomes of lawsuits such as Banner’s will set precedent for future accidents related to autonomous feature failures.

[Image via Pexels]
