Autopilot Misuse Went Unchecked by Tesla, Expert Claims

Posted on July 29, 2025 by Expert Witness Profiler

The Plaintiff, Neima Benavides, sued Tesla, Inc. after Naibel Benavides Leon was killed and Dillon Angulo was seriously injured when a Tesla Model S ran through a T-intersection in Key Largo, left the pavement, and struck their parked Chevrolet Tahoe as they stood beside it.

George McGee, the driver of the Model S, had engaged the driver-assistance system, but had dropped his mobile phone and wasn’t watching the road while reaching for the device on the floorboard.

Mary “Missy” Cummings, an engineering professor at George Mason University, is of the opinion that Tesla, Inc. has fallen short in safeguarding against dangerous misuse of its Autopilot system, raising serious concerns about safety and accountability on the road.

She also noted that even before the crash, Tesla was grappling with drivers routinely ignoring its system’s warnings. Yet, unlike other automakers, the company had resisted adopting geo-fencing technology that would prevent drivers from using Autopilot on roads where it wasn’t meant to be activated.

Systems Engineering Expert Witness

Mary Louise “Missy” Cummings was one of the U.S. Navy’s first female fighter pilots. She is now the director of Mason’s Autonomy and Robotics Center (MARC) and a professor at George Mason University. Cummings also holds faculty appointments in the Mechanical Engineering, Electrical and Computer Engineering, and Computer Science departments. She is an American Institute of Aeronautics and Astronautics (AIAA) Fellow and recently served as the senior safety advisor to the National Highway Traffic Safety Administration.

Cummings received her BS in Mathematics from the U.S. Naval Academy in 1988, her MS in Space Systems Engineering from the Naval Postgraduate School in 1994, and her PhD in Systems Engineering from the University of Virginia in 2004.


Discussion by the Court

In a letter to NHTSA, Tesla asserted that “Autopilot has the most robust set of warnings against driver misuse and abuse of any feature ever deployed in the automotive industry.”

Cummings told the jury, “I saw no evidence that would back up this claim that they have the most robust set of warnings.”

Cummings has served as an expert witness in at least two other lawsuits against Tesla related to the Autopilot system, according to court filings.

But when Cummings was appointed as a safety adviser to NHTSA in 2021, Elon Musk dismissed her as “extremely biased against Tesla,” sparking backlash from Tesla loyalists, some of whom even launched a petition to block her appointment.

The professor recalled that McGee was adamant: he believed the car was his copilot and would automatically stop for any obstacles. Like many Tesla drivers, she said, McGee placed his trust in Autopilot to take over when he dropped his phone, expecting the system to steer and react safely on its own.

Key Takeaway:

Safety expert Mary “Missy” Cummings took the stand with a sharp critique of Tesla’s Autopilot strategy, testifying that the company has failed to do enough to prevent drivers from misusing the technology. She pointed out that Tesla’s owner’s manual is difficult for drivers to access, and that even before the crash, the company struggled with drivers ignoring critical computer-generated warnings. According to Cummings, Tesla made a calculated decision in 2019 not to geofence its technology, a move she believes was driven by the desire to boost sales rather than to ensure safety.

Case Details:

Case Caption: Benavides v. Tesla, Inc.
Docket Number: 1:21-cv-21940
Court Name: United States District Court, Southern District of Florida