A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.
I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.
They can’t stop and ask for assistance at 100 km/h on a highway.
I hope Tesla/Musk address this accident and get the telemetry from the car, because there’s no evidence that FSD was even on.
According to the driver, it was on FSD and running the latest software update available.
https://www.reddit.com/user/SynNightmare/
Maybe the point, then, is that Tesla FSD shouldn’t be legally used on a highway.
But it probably shouldn’t be used anywhere, because it’s faulty as shit.
And why can’t it slow down to let the driver take over in a timely manner, when it can brake for no reason?
It was tested in Germany on the Autobahn, where it did that 8 times within 6 hours!!!