The National Highway Traffic Safety Administration (NHTSA) defines
an "autonomous," or "automated," vehicle to be "…
those in which at least some aspects of a safety-critical control function
(e.g., steering, acceleration, or braking) occur without direct driver
input."
The NHTSA and the Society of Automotive Engineers (SAE) classify automated
driving systems (ADSs) into one of six "levels" (where Level
0 is "warning only" and Level 5 is "fully automatic/no
driver required"). Since only ADSs classified as Levels 0 - 2 are
currently available, we will limit our discussion to vehicles equipped
with such systems.
Level 0 - No automation: The driver has complete control of the vehicle and the ADS only "warns"
the driver if it detects a hazard, as is the case with forward collision
and lane departure warning systems.
Level 1 - Driver assistance: The driver is in control, but the ADS can modify the speed and steering
of the vehicle as required by systems such as "adaptive cruise control" or "automatic
emergency braking."
Level 2 - Partial automation: The ADS controls the vehicle's speed and
steering, but the driver must be able to take control if corrections
are required. "Parking assistance," Tesla's "Autopilot,"
and Mercedes-Benz's "Distronic Plus" (among others) are
considered to be Level 2 ADSs.
Liability and Automated Vehicles
Prior to the development of ADS-equipped vehicles, establishing liability
in an accident was relatively straightforward: If
your vehicle was the direct cause of an accident, or set into motion a chain of events
that led to an accident,
you could be held liable for damages. Unfortunately, the arrival of "driverless"
vehicles has muddied the once-clear legal waters surrounding the issue
of liability in motor vehicle accidents.
In the following section we present two accident cases that will illustrate
some of the legal issues that could rewrite personal injury and product
liability law in the "Age of 'Smart' Machines." We note
that the following cases were chosen simply because Tesla vehicles
constitute the majority of ADS-capable vehicles on American highways.
Tesla Model S
7 May, 2016
On the morning of May 7, 2016, a Tesla Model S, with its "Autopilot" ADS engaged, was
traveling down a highway near Williston, FL, at a speed of 74 mph when a
tractor-trailer rig emerged from a side road while making a left turn.
The driver of the Tesla, who was apparently "distracted" and
was not monitoring highway conditions, was killed when his vehicle struck
the side of the trailer in an "underride" collision.
In its investigation of the accident, the
NTSB determined that, despite the fact that the vehicle's ADS failed to detect something
the size of a "big rig" tractor-trailer, the ADS had functioned
normally since the ADS "…
was not designed to, and
could not, identify the truck crossing the Tesla's path or recognize the impending
crash. Therefore, the system did not slow the car, the forward collision
warning system did not provide an alert, and the automatic emergency braking
did not activate."
Tesla Model S
12 October, 2018
Shawn Hudson was traveling along the Florida Turnpike at 70 to 80 mph on
the morning of October 12, 2018, in his 2017 Tesla Model S with its Autopilot ADS engaged. According
to Hudson, he was "looking up and down" at his phone but was
monitoring traffic conditions when the Autopilot failed to detect a disabled
vehicle blocking a lane. Hudson failed to notice the vehicle and, since
the Autopilot failed to sound a warning of the disabled vehicle's
presence, Hudson sustained head and neck injuries in the ensuing crash.
Hudson subsequently filed a lawsuit against Tesla. In that lawsuit he alleges
that Tesla knew that its Autopilot could not consistently detect stationary objects directly
ahead if the Tesla was moving at more than 50 mph. Despite including that
warning in every vehicle's owner's manual, Hudson alleges that
the vehicle salesman (a Tesla employee, since Tesla does not sell its products
through dealerships) did not mention this fact and that he (Hudson) was
unaware of the warning until after his accident.
The NTSB declined to investigate the accident. The lawsuit is now in its
earliest stages and is slowly moving through the Florida courts.
In the cases mentioned above, the
immediate and proximate cause (a legal term meaning "directly responsible for") of each accident
was a failure of the ADS to detect an object that would easily have
been noticed by a human driver who was aware of the road conditions. These
failures seem to indicate two troubling issues that could lead to a major
legal headache for Tesla Motors, Inc.
The first of these issues arises from the fact that, in the first accident,
the NTSB noted that Tesla's Autopilot ADS
was not designed to detect objects moving at a 90° angle to the vehicle. Since
"T-Bone" accidents (those which occur when two vehicles collide
while moving at right angles to each other) account for a significant
percentage of fatal accidents, why wasn't this shortcoming addressed
when the ADS was in development?
The second issue extends our previous concern: Since high-speed collisions
with stationary objects also account for a large number of traffic accident
fatalities, why wasn't this problem addressed during the ADS design
stage? More importantly, was the problem
discovered during design or testing but
allowed to persist into the production stage?
Simply stated, our positions are as follows:
If either (or both) of these issues were known to Tesla, and Tesla knowingly
allowed these defects to be incorporated into its products, then a jury
could decide that Tesla is liable for any damages that arose from the use of its
products.
Given the fact that Tesla's latest "Quarterly Safety Report" has been
shown to contain "inconsistencies," we suspect that Tesla was well aware that its ADS contained dangerous
shortcomings in its design.
Why you need a car accident lawyer from The Doan Law Firm
We have not addressed other issues that affect the safety of ADS-enabled
vehicles, such as:
- What measures have been taken to protect ADS software from malicious, but
amateur, "hackers" or from deliberate "cyber-attack"
by a manufacturer's competitors or even a foreign government-backed agency?
- Will existing traffic laws still apply if an ADS has total control of a
vehicle? As an example, could a driver be convicted of DUI/DWI if the
vehicle is in its "driverless" mode?
- Could the data routinely recorded by an ADS be used by law enforcement to
infer a suspect's involvement in a crime based on where a vehicle was at
a particular time, or to track someone's location?
As a national personal injury law practice, The Doan Law Firm
constantly monitors both the popular and professional legal media for
the latest developments regarding liability involving ADS-enabled vehicles.
We invite you to
visit our website often for our impressions on the latest developments in this rapidly-changing
area of the law.