Tesla’s “Full Self-Driving” mode has attempted to drive under a railroad crossing arm as a speeding train passed. According to videos posted on social media, it has also nearly hit a concrete wall in a parking garage, made ill-advised left turns, and clipped at least one curb, and at least one driver was able to set a top speed of 90 mph on a street with a posted limit of 35 mph.
These drivers knew they weren’t using a foolproof system, and that there would be flaws, when they signed up to test early prototypes of Tesla’s “Full Self-Driving” software, which is updated regularly. The company warned them of the system’s limitations and the importance of staying alert.
Experts are concerned that the feature’s name implies more capability than Tesla currently delivers. But the risks of “Full Self-Driving” don’t appear to be deterring Tesla from releasing a wider public beta. Even as some Tesla loyalists testing the feature express doubts about what comes next, Tesla is planning a big release.
Even before two people were killed over the weekend when a Tesla crashed into trees, some Tesla fans had spoken out. Police said one occupant was found in the front passenger seat and the other in a back seat; no one was in the driver’s seat. The National Highway Traffic Safety Administration said Monday that it is investigating the crash.
Autopilot, the widely available predecessor to “Full Self-Driving,” may have been in use and, if so, was being used improperly, according to the police statement.
Data logs recovered so far show that Autopilot was not enabled, Tesla CEO Elon Musk said. But Musk did not rule out the possibility that future findings could show Autopilot was in use, and he offered no alternative explanation for the crash. Tesla did not respond to repeated requests for comment and generally does not engage with professional news organizations.
Tesla says the “Full Self-Driving” system will change lanes, navigate highways, and stop for traffic lights. The company has promised the feature since 2016 but only last fall began allowing a select group of drivers to test an early version. Approximately 2,000 Tesla owners were testing “Full Self-Driving” as of March, according to Musk. The company is planning a wider rollout of a “significantly improved system” over the one seen in the videos, with Musk tweeting that he would be “surprised” if a broad beta update wasn’t available by June.
Tesla says that even though the name implies a high level of autonomy, drivers must remain alert, keep their hands on the wheel, and retain control of their vehicles while using the feature. The initial launch last October was rough, but beta testers have described the system as improving in social media posts, and Musk has said on Twitter that it is “getting mature.”
However, some of Tesla’s ardent supporters are worried about the system’s shortcomings. In YouTube videos of the “Full Self-Driving” beta, the steering wheel jerks back and forth unpredictably.
Teslas running a version of the “Full Self-Driving” beta have made apparently risky left turns, such as pulling out in front of oncoming high-speed traffic, or turned so slowly that nervous drivers slammed on the brakes to get out of harm’s way.
Because Tesla’s “Full Self-Driving” program, or FSD, is technically a driver-assist system, it can be tested on public roads in the United States. In Europe, where there are stricter constraints on driver-assist systems, Tesla offers a more limited suite of autonomous driving features.
Even when the system seems to be functioning well, Tesla advises drivers to remain alert and ready to take control at any moment. But some worry that these guidelines won’t be followed.
AI DRIVR, a YouTuber who makes Tesla videos and is already testing “Full Self-Driving,” has expressed concern on social media that the feature will be abused once a large population receives it.
AI DRIVR, like other social media users who often post about Tesla’s “Full Self-Driving” program, said when approached that he was bound by an NDA and unable to speak with CNN directly.
“Please let’s not screw this up and make Tesla regret their decision and the freedom that they are giving people,” AI DRIVR said.
He cited a video in which a young man in a Tesla using Autopilot, the predecessor to “Full Self-Driving,” climbs out of the driver’s seat and lies down in the back as the car appears to roll down a highway. Tesla has safeguards in place to deter misuse of Autopilot, such as requiring a fastened seatbelt and detecting torque on the steering wheel, but a driver could circumvent them. The video’s poster, known on YouTube as Mr Hub, did not respond to a request for comment.