A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, and able to post about it on social media.
I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.
The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.
What I don’t get is how years of this false advertising haven’t bankrupted Tesla already.
Because the US is an insane country where you can straight-up break the law, and as long as you’re rich enough you don’t even get a slap on the wrist. If some small startup had done the same thing, they’d have been shut down.
What I don’t get is why Teslas aren’t banned all over the world for being so fundamentally unsafe.
I’ve been arguing this point for the past year: there are obvious safety problems with Teslas, even without considering FSD.
Like the turn signal being a button on the steering wheel, manual door handles that are hard to find in emergencies, and common controls buried in on-screen menus instead of having directly accessible buttons. With Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead, which can also create dangerous situations.
Well, because 99% of the time, it’s fairly decent. That 1%'ll getchya tho.
To put your number in perspective: if it failed even once every hundred miles, it would kill you multiple times a week at the average commute distance.
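The arithmetic behind that claim can be sketched quickly. The commute figures below are assumptions for illustration (roughly 40 miles of driving per day, five days a week), not numbers from the thread:

```python
# Back-of-envelope check of the "1 failure per 100 miles" scenario above.
# Assumed (hypothetical) numbers: ~40 miles of round-trip commuting per day,
# 5 commute days per week.
failure_rate_per_mile = 1 / 100   # hypothetical: one failure per 100 miles
miles_per_week = 40 * 5           # assumed weekly commute mileage

expected_failures_per_week = miles_per_week * failure_rate_per_mile
print(expected_failures_per_week)  # 2.0 -> "multiple times a week"
```

Even with conservative commute assumptions, a 1-in-100-miles failure rate lands well above one failure per week, which is the point of the comment.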
…It absolutely fails miserably fairly often, and would likely crash that frequently without human intervention. Not to the extent here, where there wasn’t even time for human intervention, but I frequently had to take over when I used to use it (post v13).
Someone who doesn’t understand math downvoted you. This is the right framework for thinking about autonomy: the failure rate needs to be astonishingly low for the product to have any net positive value. So far, Tesla has not credibly demonstrated that.
You’re trying to judge the self-driving feature in a vacuum, and you can’t do that. You need to compare it to the alternatives, and for automotive travel the alternative to FSD is to continue to have everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD, as bad as it is. So FSD doesn’t need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last time I saw anything on it, FSD was statistically that “bit better” than you.
FSD isn’t perfect. No such system will ever be perfect. But, the goal isn’t perfect, it just needs to be better than you.
Yeah, people keep bringing that up as a counterargument, but I’m pretty certain humans don’t swerve off a perfectly straight road into a tree all that often.
So unless you have numbers suggesting that humans are less safe than FSD, you’re being equally obtuse.
Humans do swerve off perfectly straight roads into trees, I know because I’ve done it!
Can you confirm that to the best of your knowledge you are not a robot?
A simple Google search (which YOU could have done yourself) shows it’s about one accident per 1.5 million miles driven with FSD vs. one per 700,000 miles for manually driven cars. I’m no Tesla stan (I think they’re overpriced and deliberately for rich people only), but that’s an improvement, a noticeable improvement.
And as an old retired medic who has worked his share of car accidents over nearly 20 years: yes, humans swerve off perfectly straight roads and hit trees and anything else in the way. And they do so at a higher rate.
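Taking the two per-mile figures quoted above at face value (with the big caveat that they aren’t adjusted for road type, weather, or driver interventions), the comparison works out like this:

```python
# Comparing the accident rates quoted in the comment above.
# These are the commenter's figures, not independently verified.
fsd_miles_per_accident = 1_500_000    # quoted FSD figure
human_miles_per_accident = 700_000    # quoted figure for manual driving

# Accidents per mile: lower is safer, all else being equal.
fsd_rate = 1 / fsd_miles_per_accident
human_rate = 1 / human_miles_per_accident
print(human_rate / fsd_rate)  # ~2.14: roughly twice the accidents per mile
```

If the quoted numbers hold, manual driving produces a bit more than twice as many accidents per mile, which is the "noticeable improvement" the comment refers to.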
The worst part is that this problem has already been solved by using LIDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.
I wouldn’t really call it a solved problem when Waymo, with lidar, is crashing into physical objects.
NHTSA stated that the crashes “involved collisions with clearly visible objects that a competent driver would be expected to avoid.” The agency is continuing its investigation.
It’d probably be better to say that lidar is the path to solving these problems, or a tool that can help solve them. But not solved.
Just because you see a car working perfectly, doesn’t mean it always is working perfectly.
Lidar doesn’t completely solve the issue lol. Lidar can’t see lane markings, speed signs, pedestrian crossings, etc. Cars equipped with lidar crash into things too.
I oversold it in my original comment, but lidar still performs better than the plain cameras Tesla uses, including in bad weather and other scenarios. Elon is dumb, though, and doesn’t think lidar is needed for self-driving.
Let me guess… you watched Mark Rober’s video? Lol
Are those the ones that you can completely immobilize with a traffic cone?
The same is true when you put a cone in front of a human driver’s vision. I don’t understand why “haha I blocked the vision of a driver and they stopped driving” is a gotcha.
Probably Zoox, but conceptually similar, LiDAR backed.
You can immobilize them by setting anything large on them. Your purse, a traffic cone, a person :)
Probably makes sense to be a little cautious with the gas pedal when there is anything on top of the vehicle.
That, and if you put your toddler on the roof or the trunk of the car for a quick second to grab something from your pocket… VROooOMMM, baby gone.
A human also (hopefully anyway) wouldn’t drive if you put a cone over their head.
Like yeah, if you purposely block the car’s vision, it should refuse to drive.
You say that like it’s a bad thing lol if it kept going, that cone would fly off and hit somebody.
“I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.”
That’s because it won’t, because Elmo Musk is, gasp, a liar. Always has been. That robotaxi is actually an older lie he used a couple of years prior, but he dusted it off and re-used it.
Anytime Elmo says he’s confident they can do it now, he means they’re nowhere near a real product. Anytime he says “next year,” it means it won’t ever happen. Anytime he says they already have a product and it just needs to be produced, it means it’ll never happen.
He is a vaporware con man who has been cheating people (and mostly the US government) out of billions.
Literally look at all of his promises over the last decade and you start seeing patterns. It’s always “almost there.”
SpaceX, arguably his most successful company and the one he actually built under his leadership, is a shit show of lies. According to him, we’d have colonies on Mars by now; that’s what he took three billion dollars in funding for, and he isn’t even at 1% of that. Yet he keeps claiming: within a few years now! Three billion dollars, and he managed to blow up a banana over the Indian Ocean and obliterate a launch pad.
If I committed fraud in the thousands, took the money and didn’t deliver, I’d go to jail. He does it with countless billions and he’s still out there. But alas, his behavior is finally catching up with him: Tesla is going off a cliff now that nobody wants to drive a Nazi brick anymore.
If it were open-source tech, people could check it and see for themselves whether it’s really capable. But because it’s not, we don’t know what it’s missing to be way, way better.
Nah, on the SAE levels of autonomous driving (0–5), Teslas are at Level 2.
Elmo isn’t even close, but that won’t stop him from just lying about it, because that is what Elmo does best.