The Hidden Dangers of Tesla’s Autonomous Driving Technology

In January of this year, Elon Musk had an opening to put Tesla's best foot forward on the promise of fully autonomous driving. Instead, recent crashes involving Tesla vehicles have raised alarming questions about the safety and dependability of the company's Autopilot system. Those concerns are grounded in heart-wrenching experiences that have left families grieving. Significant questions also remain about what data Tesla collects, and what that data implies.

Rita Meier lost her husband in a recent crash. According to Tesla employees, the company had no data to review when looking into the accident. That is a glaring gap for a system purportedly designed to acquire and store every byte of customer information: Tesla touts the fact that its cars benefit from the experience of billions of miles driven by other vehicles. Families affected by these crashes face a lack of clear information, often receiving little more than an Excel spreadsheet marked "Incident Review" from the company.

Meier shared her story and those of others who lost loved ones in similar circumstances, creating a network for support and information sharing. In one case, matters deteriorated to the point that prosecutors threatened a subpoena to obtain tracking data from another crash involving a different driver; even then, Tesla took over two weeks to provide the information, deepening the alarm. The delay stands in stark contrast to how Autopilot is advertised: a system capable of assuming control of steering, braking, and acceleration for extended periods.

Elon Musk has addressed the investigations publicly, asserting that "the whole Tesla senior team is investigating this matter right now." Yet the company struggles to track its fleet in real time because vehicle data takes too long to report back, which deepens questions about the effectiveness of its oversight. In Meier's dispute, as in a parallel case concerning Anke Schuster, Tesla claimed that the vehicle's final complete data set had been transmitted almost two weeks before the crash.

Even more concerning, reports of erratic driving behavior have begun to emerge. One passenger recalled asking whether it was normal for the car to drive erratically by itself, only to be told, "Yeah, that happens every now and then." Other Tesla drivers describe similar experiences: sudden lurching while in motion, phantom braking on highways, and abrupt, forceful stops. One driver described the fear of the car spontaneously surging forward just after letting a child out at school.

Germany's largest automobile club, ADAC, has advised Tesla drivers to keep emergency window hammers on hand so occupants can escape the vehicle after a crash. The recommendation reflects a broader fear that autonomous technologies may underperform precisely when they are needed most, raising critical safety concerns.

Former NASA engineer and viral YouTuber Mark Rober added fuel to the fire by recreating dangerous Tesla behavior in a controlled test. His findings sent shockwaves through the public and raised further questions about Tesla's safety protocols and the lack of accountability in its technology.

Musk responded with a short, now-viral clip of his own that amassed over 14 million views. But despite his assurances that the vehicles are safe, families who have lost loved ones in crashes continue to voice their pain and disappointment to him.

"My husband died in an unexplained accident. And no one cared," Meier lamented. Schuster echoed her sentiments: "I lost my husband. His four daughters lost their father. And no one ever cared."

Tesla's internal investigations, and the public responses to them, acknowledge that something has gone terribly wrong. As underfunded official inquiries drive families toward private experts in search of closure and answers, the gap between corporate promises and lived experience remains hard to fathom.

As investigations move forward and more drivers come forward with their experiences, calls for the transparency and accountability drivers deserve will only intensify. In recent weeks, Tesla's Autopilot system has faced mounting scrutiny and condemnation.
