
Amid Tesla Recall, Maps Show Crashes Involving Autopilot


More than 100 Tesla crashes in the Bay Area drew attention from federal investigators looking into the driver-assistance system’s potential role before Tesla recalled more than 2 million vehicles for a software update to its Autopilot system.


Tesla Autopilot Oversight: NHTSA Investigation and Vehicle Recall

Since July 2021, there have been 113 Tesla crashes in the nine-county Bay Area in which Autopilot had been activated before the incident, according to a National Highway Traffic Safety Administration database. Nationwide, there were 970 such Tesla crashes, with California recording the most by a wide margin (353).
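The tallies above come from the NHTSA’s public crash-report database, which is published as a downloadable spreadsheet. As a rough illustration of how such counts could be produced, here is a minimal Python sketch using pandas; the file name and the “Make,” “State” and “County” column names are assumptions about the export’s layout, not a reproduction of The Chronicle’s actual methodology, so check the real schema before relying on it.

# Hypothetical sketch: tally Tesla driver-assistance crash reports nationwide,
# by state, and for the nine-county Bay Area. File and column names are assumed.
import pandas as pd

BAY_AREA_COUNTIES = {
    "ALAMEDA", "CONTRA COSTA", "MARIN", "NAPA", "SAN FRANCISCO",
    "SAN MATEO", "SANTA CLARA", "SOLANO", "SONOMA",
}

reports = pd.read_csv("nhtsa_adas_incident_reports.csv")  # assumed export of the NHTSA data
tesla = reports[reports["Make"].str.upper() == "TESLA"]

by_state = tesla["State"].value_counts()
bay_area = tesla[tesla["State"].eq("CA") & tesla["County"].str.upper().isin(BAY_AREA_COUNTIES)]

print("Tesla crash reports nationwide:", len(tesla))
print("California reports:", by_state.get("CA", 0))
print("Nine-county Bay Area reports:", len(bay_area))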

A map depicts every reported autonomous vehicle crash in San Francisco.

The statistics aren’t necessarily shocking: California has the highest per capita density of Teslas of any state. But they show how much of an impact the Autopilot problems, and the recall meant to address them, could have here.

Several Tesla crashes in Northern California may have involved Autopilot. Just over a year ago, a Tesla Model S braked abruptly on the Bay Bridge, causing an eight-car pileup that injured nine people. In February, a Pittsburg man died after his Tesla Model S collided with a fire truck on Interstate 680 in Contra Costa County, near Walnut Creek. And in July, a Tesla Model 3 and a Subaru crashed in South Lake Tahoe, killing two people.

The Chronicle identified those crashes through media reports and law enforcement records; each corresponds to an accident listed in the NHTSA database.

Tesla is recalling more than 2 million vehicles after a two-year NHTSA investigation revealed flaws in the vehicles’ Autopilot system. According to NHTSA officials, the agency examined 956 crashes in which Autopilot was initially thought to be in use before concentrating on 322 crashes involving the system, studying frontal impacts and impacts from potential unintentional disengagement of the system.

NHTSA Recall: Addressing Autopilot Misuse in Tesla Vehicles

The NHTSA stated in the recall report that “in some cases, when Autosteer is engaged, the prominence and scope of the feature’s controls may not be enough to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature.” The agency went on to say that if a driver was unaware that the system was “operating in situations where its functionality may be limited” and was not prepared to step in, there might be an increased risk of collision.

According to the agency, the recall affects four Tesla models: the 2012–2023 Model S, the 2016–2023 Model X, the 2017–2023 Model 3 and the 2020–2023 Model Y. The NHTSA said owners of affected vehicles will receive the fix at no cost through an over-the-air software update.

According to an NHTSA spokesperson, the investigation found that Tesla’s Autopilot system “can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse of the system.”

Autopilot, introduced in 2015, is a “suite of driver assistance features that comes standard with a new car or can be purchased after delivery,” according to Tesla’s website. Full Self-Driving, which must be purchased separately and is currently in beta testing, offers the same features as Autopilot plus others, including autosteer on city streets and automatic stopping at traffic lights and stop signs.

This is not Tesla’s first driver-assistance recall of the year. In February, Tesla recalled 362,758 vehicles equipped with the Full Self-Driving beta after the NHTSA found the system could cause cars to travel straight through intersections from turn-only lanes, fail to come to a complete stop at stop signs and proceed into intersections without due caution.

Tesla Defends Autopilot Amid NHTSA Scrutiny

Since July 2021, the agency has required automakers to report crashes in which driver-assistance systems like Autopilot were engaged, under a standing general order. Companies covered by the order, including Tesla, must file a crash report if the system was in use within 30 seconds of the crash and the crash resulted in property damage or injury.
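In plain terms, the order as described here boils down to two conditions: the system was active within 30 seconds of impact, and the crash caused property damage or injury. The short Python sketch below encodes only that simplified rule; the field names and the Crash structure are invented for illustration and do not reflect the NHTSA’s actual reporting form or the order’s full legal criteria.

# Illustrative only: the two reporting conditions described above, with assumed field names.
from dataclasses import dataclass

REPORTING_WINDOW_SECONDS = 30  # system must have been in use within 30 seconds of the crash

@dataclass
class Crash:
    seconds_since_system_active: float  # time between last system engagement and impact
    property_damage: bool
    injury: bool

def must_report(crash: Crash) -> bool:
    system_recently_active = crash.seconds_since_system_active <= REPORTING_WINDOW_SECONDS
    harm_occurred = crash.property_damage or crash.injury
    return system_recently_active and harm_occurred

# Example: Autopilot disengaged 12 seconds before a crash that injured someone -> report required.
print(must_report(Crash(seconds_since_system_active=12, property_damage=False, injury=True)))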

Tesla said in a social media post on Monday that it would be “morally unjustifiable” to stop the rollout of driver-assistance technology given “incontrovertible data that shows it is saving lives and preventing injury.” It added that its “best-in-class safety systems” must continue to be improved.

Missy Cummings, a former senior safety adviser to the NHTSA and a professor in George Mason University’s College of Engineering and Computing, praised the agency’s decision as a step forward but expressed skepticism about its long-term effects.

What matters most, in her view, is that Tesla consented to the voluntary recall. “I believe you’ll notice some advancements. A little is preferable to none. According to how I read this, Tesla is only inclined to take the bare minimum action.”

Tesla has claimed its vehicles are safer when Autopilot is engaged, citing data from the third quarter of 2022 showing one crash for every 4.85 million miles driven with the feature activated. That data, however, is not available to the general public. Cummings said that if the numbers cannot be independently verified, the public has no reason to believe Tesla’s claims.
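For scale, the figure Tesla cited can be restated as a rate. The one-line Python conversion below assumes nothing beyond the single number the company reported: one crash per 4.85 million miles works out to roughly 21 crashes per 100 million miles driven with Autopilot engaged.

# Back-of-the-envelope conversion of Tesla's Q3 2022 claim into crashes per 100 million miles.
miles_per_crash = 4.85e6
print(f"Implied rate: {100e6 / miles_per_crash:.1f} crashes per 100 million miles")  # ~20.6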

Tesla Recall: Maps Show Crashes Involving Autopilot in California

There have been other Bay Area crashes in which Autopilot might have played a role.

In March 2018, Walter Huang was killed when his Tesla Model X slammed head-on into a freeway divider at high speed. In 2019, Huang’s family sued Tesla for wrongful death, alleging that the Autopilot system was flawed and to blame for his death.

As it looks into complaints about sudden braking, the NHTSA is also weighing a petition from Minnesota engineer Ronald Belt, who contends that the vehicles can suddenly accelerate without the driver pressing the accelerator and without the vehicle’s data recorder registering a problem. The agency has not yet ruled on Belt’s petition.

Viktor Musil

Viktor Musil, pen name for Edouard Py, advocates for inclusive, people-centered city development. His work underscores the importance of ethical considerations and equitable access, shaping the discourse on urban innovation worldwide.
