Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner. Only this time, the test […]
He didn’t even use Tesla’s Full Self-Driving. He used the ancient Autopilot software, and even with that he apparently manually disengaged it before impact. Seems pretty disingenuous to me.
Source?
For example, this article, among many others:
https://tribune.com.pk/story/2534978/mark-rober-faces-backlash-over-tesla-autopilot-test-accused-of-misleading-viewers
Did you finish reading that article?
Because the end talks about Mark’s response to these accusations. Apparently he posted some unedited video on X showing that Full Self-Driving disengaged itself 17 frames prior to hitting the wall.
They program it to do that on purpose so that Elon Musk can go to the media and claim it wasn’t in Full Self-Driving mode at the moment of the collision. It’s wrong, but arguably technically correct, and you’d have to dig into it to learn the truth, which is what he wants. We shouldn’t give him the benefit of the doubt, because we know what he’s up to.
For sure.
They did the first kid test with both Autopilot and self-driving (or whatever you call it). Was that different for the later tests?
Both use the same sensors. It’s not like Tesla has some hidden, better cameras that only activate when you use FSD.
My point is that FSD is a much, much more advanced piece of software. It’s wrong to label the video “self-driving” when you are not using FSD. Autopilot is just adaptive cruise control that keeps the car in its lane.
It’s wrong to label a Tesla or any of its software as ‘full self driving’.
Quite clearly, Mark demonstrated that the safety systems are engaged in whatever mode he had it in; otherwise the vehicle would never have stopped for the obstacle in front of it.
@lemmyingly @yesmeisyes Tesla’s safety systems only do emergency stops for certain stationary objects (cars, bicyclists, pedestrians). The real test would be to see whether FSD would actively plan to drive through that wall.
You can see that even AP wasn’t enabled in most of the test, so it’s not a test of FSD.
If you close your eyes, it doesn’t matter whether you’re wearing glasses or not.
If the car’s sensors can’t pick up the wall, it doesn’t matter what software version it’s running.
Tesla is only using vision. Software makes ALL the difference. If you don’t have a brain, it doesn’t matter whether you have eyes or not.
You clearly don’t design software. As you’re saying yourself, without the hardware, you cannot run the software.
Anyone who watches the video in question knows this statement is misleading. Autopilot also stops when it detects an obstacle in the way (well, it’s supposed to, but the video demonstrates otherwise). Furthermore, decades-old adaptive cruise from other brands will stop too, because even they have classic radar or laser range-finding.
If even the most basic go/no-go-plus-steering operation based on computer vision can’t detect an obstacle and stop before it, why trust an even more complicated solution? And if they don’t back-port whatever detection upgrade FSD supposedly has to the basic case, that just demonstrates further neglect anyway.
The whole point that everyone is dancing around is that Tesla gambled that cheaping out by using only cameras would be fine, but it cannot even match decades-old technology for the basic case.
Did they test it against decades-old adaptive cruise? No, that’s a solved problem, but they did test it against that technology’s next generation, and it ran circles around vision not backed by a human brain.
Autopilot hasn’t received any updates in years. Tesla is only focusing on FSD. That makes your point invalid.
Like I said, demonstrates neglect.
OK, so every other car manufacturer is also demonstrating negligence, because they can’t even get their updates working in the first place.
Other car manufacturers update their cars all the time. Some OTA, some requiring a trip to the service center, but that doesn’t mean they don’t do it. The easy-to-steal Kias are an example of a recent recall that just required a software update at a service center. My ID.4 got an OTA update just last week. Tesla is showing negligence by not updating a feature that’s still in a lot of vehicles, even if they don’t actively sell it anymore.
Mark posted a short clip of unedited footage on X showing that Full Self-Driving disengaged itself 17 frames before hitting the wall. According to the end of the article you posted below, anyway. I don’t go on X myself.