An Autopilot-enabled Tesla Model 3 narrowly avoided a highway collision by automatically decelerating when it detected another vehicle.
A Tesla Model 3 equipped with Autopilot narrowly escaped colliding with another vehicle on the highway, and the incident was captured on video. While fully self-driving cars are still years away, automakers already provide customers with intelligent driver assistance systems. Tesla currently offers three tiers: Basic Autopilot, Enhanced Autopilot, and Full Self-Driving. The first is included as a standard feature on all Teslas, while drivers must pay $6,000 and $15,000, respectively, to access the other two. FSD is considered Tesla's most advanced driver assistance system, as it not only includes Autopilot's features but offers additional functions as well.
In a video posted on Twitter by Model 3 owner William Sellers, the Autopilot-enabled Tesla can be seen decelerating immediately after detecting a vehicle that pulled out onto the highway in front of it. The footage was captured by the Tesla's built-in dashcam. Sellers initially reported that FSD Beta performed the maneuver, but later confirmed it was his car's Autopilot system. The specific feature that helped avoid a potential crash is called Navigate on Autopilot. Available with Enhanced Autopilot, it builds on the Automatic Lane Change feature by guiding the car from a highway's on-ramp to its off-ramp, and it also helps drivers navigate interchanges.
While Tesla prides itself on being one of the most technologically advanced automakers, there are concerns about its suite of advanced driver assistance and safety systems. As more vehicles incorporate driving assistance aids, there's bound to be increased interest in their reliability, especially on busy roads. While Tesla has reiterated that Autopilot and FSD are designed to improve driver and passenger protection, some regulatory bodies disagree and have launched separate investigations into the systems' capabilities.
For example, the NHTSA recently announced that it was working to reach a decision on an Autopilot investigation it opened in 2021. Similarly, the U.S. Department of Justice revealed that it was scrutinizing Tesla's claims that Autopilot has self-driving capabilities. Meanwhile, Tesla has presented data of its own suggesting that Autopilot lowers the chances of an accident when engaged. In its latest Vehicle Safety Report, the automaker claimed it recorded just one crash for every 6.26 million miles covered by Autopilot-equipped cars. However, Tesla does state that using Autopilot or FSD in inclement weather may negatively impact their abilities.
There's also an ongoing debate over who should be held responsible when Autopilot-enabled vehicles are involved in crashes. While some say Tesla should be liable, others argue that many users misuse Autopilot, jeopardizing their own safety and that of other road users. Some Tesla drivers have been found sleeping while using Autopilot, or using defeat devices to trick the system into thinking their hands are on the steering wheel, which can have potentially fatal repercussions.
More: Video Shows FSD Tesla's Phantom Braking Causing Eight-Car Pile-Up
Source: William Sellers/Twitter
Michael Akuchie is a Tech Writer reporting the latest trends in the electric vehicle space. He has recently included electric micro-mobility solutions (scooters/bikes) and artificial intelligence in his coverage for Screen Rant.
He strongly believes that self-driving cars are possible in this lifetime, but we may be a long way from ever owning one. Despite broadening his scope, he remains an EV enthusiast and attempts to stay on top of the scene’s latest happenings.
His other writing covers the Customer Experience, HR Tech, Digital Transformation, and Automobile industries. He's also an anime enthusiast, with Naruto and AOT being his two all-time favorite anime series.