Driver Says Tesla Full Self-Driving Mode Caused Model Y To Crash

A Tesla Model Y allegedly entered the wrong lane while making a turn, sparking another NHTSA probe into the company’s autonomous vehicle software.


A Tesla Model Y driver has filed a complaint claiming their car was in Full Self-Driving (FSD) mode when it crashed into another vehicle. Despite its name, FSD is not a truly autonomous driving system. Instead, it is Tesla’s latest iteration of its premium assisted driving technology, building upon its Autopilot and Enhanced Autopilot offerings. Currently available as a beta product, FSD includes all the features of the lower tiers while also promising additional functionality, such as automated steering on city streets and the ability to stop at traffic lights.

The US National Highway Traffic Safety Administration (NHTSA) uses a six-point scale to describe driverless vehicles. Level 0 means no autonomous features, while Level 5 refers to full automation, whereby no human driver is necessary for a vehicle to travel safely on public roads. Based on its present features, FSD is classified as Level 2, or partial automation. Indeed, Tesla requires that all users of its assisted driving systems keep their hands on the steering wheel while driving.

Earlier this month, in Brea, California, a 2021 Model Y was hit by another vehicle as the driver was turning left. As noted by The Next Web, a statement on the NHTSA’s website explains that the unnamed driver claims their car was in FSD mode and that it steered into the wrong lane during the turn. The individual says the Model Y provided a warning halfway through the maneuver, but their attempt to correct its trajectory failed and the vehicle was struck on the driver’s side. Fortunately, there were no reported injuries, but the NHTSA says it is now investigating the incident.

Tesla Recalled Vehicles Last Month

The incident comes after Tesla recalled almost 12,000 cars last month due to an FSD issue. The company released an FSD beta 10.3 update on October 23rd, but some drivers soon discovered issues relating to their vehicles’ forward collision warning and emergency braking features. The company temporarily rolled users back to an earlier build before issuing a patch. Of course, Tesla’s software is explicitly being offered as a beta, so it’s unclear whether the company can or will be held liable.

Whatever standards Tesla is ultimately held to for its nascent technology, it’s worth noting that regulators are paying attention. This latest action by the NHTSA follows a probe into Tesla cars reportedly crashing into emergency vehicles while in Autopilot mode. It’s also worth noting that this is not even the first investigation relating to FSD: the California Department of Motor Vehicles is investigating Tesla over its use of the phrase “full self-driving” and whether it amounts to false advertising.
