A Tesla on Autopilot slammed into a police car, according to a new report — and now the driver is facing criminal charges
- A Tesla slammed into a police cruiser during a traffic stop in Massachusetts in December 2019.
- The driver told the state trooper at the time that the car was on Autopilot, according to an incident report first reported by NBC 10 Boston.
- The state police say he's now facing criminal negligence charges related to the wreck.
- Tesla's Autopilot instructions say a driver should always be ready to take over from the computer at any time.
A Massachusetts state trooper was in the middle of a traffic stop in December when a Tesla slammed into his patrol car, pinning him between it and the stopped vehicle. Now, the driver of the Model 3, which was apparently on Autopilot at the time of the crash, is facing criminal charges of negligent driving, NBC 10 Boston reports, citing court documents.

Maria Smith, a college student, recounted the scary crash to the local news channel this week. "Before I knew it, my car was flying forward," she said. "I looked behind me, and my whole back windshield was blown out. There was glass in my hair."

The driver, Nicholas Ciarlone, told the trooper at the scene of the crash that the car was on Autopilot and he "must not have been paying attention," NBC 10 Boston reported, citing the incident report. He could not be reached for comment, and is set to be arraigned in September. The Massachusetts State Police confirmed the charges to Business Insider. Tesla did not immediately respond to a request for comment.

Tesla's Autopilot has been front and center in several high-profile incidents. Under normal circumstances, the software can maintain speed and direction while monitoring for obstacles, so long as the driver is paying attention. But while Tesla's instructions tell drivers to constantly monitor the program and remain ready to take over at a moment's notice, there have been plenty of instances caught on video of drivers sleeping, leaving the driver's seat, or even filming an adult movie.
The National Transportation Safety Board says the automaker should restrict use of its Autopilot feature and better detect when drivers are paying attention.
The National Transportation Safety Board said both Tesla's Autopilot system and an inattentive driver played a role in a fatal 2018 crash, The Verge reported. The NTSB held a hearing about the incident on Tuesday after a 23-month investigation. The 2018 incident raised questions about how Tesla has marketed Autopilot, and whether drivers are capable of using it responsibly.

The National Transportation Safety Board has determined that Tesla's Autopilot advanced driver-assistance system and the inattention of driver Walter Huang were likely factors in Huang's fatal 2018 crash in Mountain View, California, The Verge reported. Huang had too much confidence in Autopilot, which was activated at the time of the crash, and had been playing a game on his phone before his Model X SUV hit a broken crash attenuator, the NTSB concluded, according to The Verge's report. The agency reportedly said that if the attenuator had been replaced, Huang would likely have survived.

The NTSB held a hearing about the accident on Tuesday following a 23-month investigation. The agency clashed with Tesla in 2018 over the electric-car maker's decision to reveal information about the crash on its blog.

Autopilot can control steering, acceleration, and braking in some situations, but requires the driver to keep their hands on the wheel and pay attention to the road at all times. The 2018 incident highlighted questions that have been raised about whether Tesla has been too aggressive in marketing Autopilot, and whether drivers are capable of paying sufficient attention to the road while using the feature. Tesla has argued that, overall, Autopilot makes drivers safer, pointing to data that shows a lower rate of crashes in Tesla vehicles using Autopilot than in all vehicles in the US.
But that data doesn't account for the fact that Autopilot is designed for use only during highway driving, something that by itself could result in fewer accidents. Over 90% of respondents in a 2019 Bloomberg survey of 5,000 Model 3 sedan owners said they believed Autopilot made them safer.

Tesla did not immediately respond to a request for comment. An NTSB representative directed Business Insider to the agency's Twitter account. At the time of publication, the account had not yet published an update regarding the agency's conclusions about the 2018 crash.