A Tesla Cybertruck running Full Self-Driving crashes into a lamp post!

This page is translated from the original post "Un Tesla Cybertruck autonome s’écrase contre un lampadaire !" in French.


A Florida engineer's Tesla Cybertruck crashed into a lamppost while FSD was activated.

Tesla’s Full Self-Driving (FSD) technology, often praised as a groundbreaking feature, is once again in the spotlight after a Florida-based software engineer, Jonathan Challinger, reported an accident involving his Cybertruck. The incident, which occurred while FSD version 13.2.4 was activated, reignites the debate over the limits and responsibilities associated with autonomous driving systems.

Challinger shared his experience on the social network X: “So… my Tesla Cybertruck crashed into a curb and then a lamppost with version 13.2.4 of FSD. Big failure on my part, obviously. Don’t make the same mistake as me. Be cautious.” According to him, the vehicle did not react to the lane ending, neither steering away nor slowing down, before hitting the curb and then the lamppost. Although the accident damaged the Cybertruck, no injuries were reported.

https://twitter.com/MrChallinger/status/1888546351572726230

Despite this incident, Challinger showed little resentment, calling FSD “the best passive safety system in the world” while urging drivers to avoid complacency. He added that although he closely follows Tesla’s progress, he had not heard of similar accidents with the latest version of FSD before this incident. He also stated that he has dashcam footage, which could be essential for understanding what happened.

Tesla under scrutiny

The National Highway Traffic Safety Administration (NHTSA) has expressed concerns about Tesla’s communication regarding FSD. In a letter last year, the regulator criticized Tesla for social media posts that appeared to contradict the company’s official position, which states that active driver supervision is necessary. The NHTSA warned that these messages could mislead drivers into perceiving FSD as a fully autonomous system rather than a driver-assistance tool.

This incident comes amid increased scrutiny of Tesla’s autonomous driving capabilities. In October, the NHTSA opened an investigation into 2.4 million Tesla vehicles following reports of accidents involving FSD. Four of these incidents were related to reduced visibility caused by glare, fog, or airborne dust; in one of them, the vehicle struck and killed a pedestrian.

This high-profile incident underscores an essential truth: regardless of technological advancement, driver vigilance remains paramount. Tesla’s FSD offers remarkable features, such as lane changing and automatic parking, but it does not yet replace human supervision. As this case demonstrates, over-reliance on automation can have costly consequences.
