A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.
I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.
The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.
No. If you look at Waymo as an example, they are actually autonomous, and they stop to ask for assistance in situations they are "unsure" how to handle.
But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? The road ahead was clear, with nothing in view to indicate any kind of problem, when the car made a sudden, abrupt left that caused it to roll upside down.
The attention required to prevent these types of sudden crashes negates the purpose of FSD entirely.