The National Highway Traffic Safety Administration (NHTSA) defines “autonomous,” or “automated,” vehicles as “… those in which at least some aspects of a safety-critical control function (e.g., steering, acceleration, or braking) occur without direct driver input.”
The NHTSA and the Society of Automotive Engineers (SAE) classify automated driving systems (ADSs) into one of six “levels,” ranging from Level 0 (“no automation/warnings only”) to Level 5 (“full automation/no driver required”). Since only ADSs classified as Level 0 – 2 are currently available, we will limit our discussion to vehicles equipped with such systems.
Prior to the development of ADS-equipped vehicles, establishing liability in an accident was relatively straightforward: If your vehicle was the direct cause of an accident, or set into motion a chain of events that led to an accident, you could be held liable for damages. Unfortunately, the arrival of “driverless” vehicles has muddied the once-clear legal waters surrounding the issue of liability in motor vehicle accidents.
In the following section we present two accident cases that illustrate some of the legal issues that could rewrite personal injury and product liability law in the “Age of ‘Smart’ Machines.” We note that the following cases were chosen for purposes of example, and simply because Tesla vehicles constitute the majority of ADS-capable vehicles on American highways.
On the morning of May 7, 2016, a Tesla Model S, with its “Autopilot” ADS engaged, was traveling down a highway near Williston, Florida, at a speed of 74 mph when a tractor-trailer rig emerged from a side road while making a left turn. The driver of the Tesla, who was apparently “distracted” and was not monitoring highway conditions, was killed when his vehicle struck the side of the trailer in an “underride” collision.
In its investigation of the accident, the NTSB determined that, despite the fact that the vehicle’s ADS failed to detect something the size of a “big rig” tractor-trailer, the ADS had functioned normally, since the ADS “…was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.”
Shawn Hudson was traveling along the Florida Turnpike at 70 to 80 mph on the morning of October 12, 2018, in his 2017 Tesla Model S with its Autopilot ADS engaged. According to Hudson, he was “looking up and down” at his phone but was monitoring traffic conditions when the Autopilot failed to detect a disabled vehicle blocking a lane. Hudson failed to notice the vehicle and, since the Autopilot did not sound a warning of the disabled vehicle’s presence, Hudson sustained head and neck injuries in the ensuing crash.
Hudson subsequently filed a lawsuit against Tesla. In that lawsuit he alleges that Tesla knew that its Autopilot could not consistently detect stationary objects directly ahead if the Tesla was moving at more than 50 mph. Despite Tesla including that warning in every vehicle’s owner’s manual, Hudson alleges that the vehicle salesman (a Tesla employee, since Tesla does not sell its products through dealerships) did not mention this fact and that he (Hudson) was unaware of the warning until after his accident.
The NTSB declined to investigate the accident. The lawsuit is now in its earliest stages and is slowly moving through the Florida courts.
In the cases mentioned above, the immediate and proximate cause (a legal term meaning “directly responsible for”) of each accident was a failure of the ADS to detect an object that would have been easily noticed by a human driver who was aware of the road conditions. These failures seem to indicate two troubling issues that could lead to a major legal headache for Tesla Motors, Inc.
The first of these issues arises from the fact that, in the first accident, the NTSB noted that Tesla’s Autopilot ADS was not designed to detect objects moving at a 90° angle to the vehicle. Since “T-bone” accidents (those which occur when two vehicles collide while moving at right angles to each other) account for a significant percentage of fatal accidents, why wasn’t this shortcoming addressed when the ADS was in development?
The second issue extends our previous concern: Since high-speed collisions with stationary objects also account for a large number of traffic accident fatalities, why wasn’t this problem addressed during the ADS design stage? More importantly, was the problem discovered during design or testing but allowed to persist into the production stage?
Simply stated, our positions are as follows:
If either (or both) of these issues were known to Tesla, and Tesla knowingly allowed these defects to be incorporated into its products, then a jury could decide that Tesla is liable for any damages that arose from the use of its defective product.
Given the fact that Tesla’s latest “Quarterly Safety Report” has been shown to contain “inconsistencies,” we suspect that Tesla was well aware that its ADS contained dangerous shortcomings in its design.
We have not addressed other issues that affect the safety of ADS-enabled vehicles, such as:
As a national personal injury law practice, we at The Doan Law Firm constantly monitor both the popular and professional legal media for the latest developments regarding liability involving ADS-enabled vehicles. We invite you to visit our website often for our impressions on the latest developments in this rapidly changing area of the law.
"*" indicates required fields