    Tesla changes meaning of 'Full Self-Driving', gives up on promise of autonomy (electrek.co)
    154 points by MilnerRoute - 13 hours ago

  • This looks to me like they are acknowledging that their claims were premature, possibly due to claims of false advertising, but are otherwise carrying forward as they were.

    Maybe they'll reach level 4 or higher automation, and will be able to claim full self driving, but like fusion power and post-singularity AI, it seems to be one of those things where the closer we get to it, the further away it is.

    by dlcarrier - 12 hours ago
  • Most Honest Company (Sarcasm)
    by ares623 - 12 hours ago
  • How have they gotten away with such obvious false advertising for this long? It has undeniably misled customers and inflated their stock value.
    by an0malous - 12 hours ago
  • > Since 2016, Tesla has claimed that all its vehicles in production would be capable of achieving unsupervised self-driving capability.

    > CEO Elon Musk has claimed that it would happen by the end of every year since 2018.

    Even as a Tesla owner, it baffles me how rational adults can take this conman seriously.

    by pm90 - 12 hours ago
  • The lesson here is to wait for a chill SEC and friendly DOJ before you recant your fraudulent claims, because then they won’t be found to be fraudulent
    by yieldcrv - 12 hours ago
  • Looking forward to the class action on this one…
    by RyanShook - 12 hours ago
  • This is clickbait from a publication that's had it out for Tesla for nearly a decade.

    Tesla is pivoting its messaging toward what the car can do today. You can believe that FSD will deliver L4 autonomy to owners or not -- I'm not wading into that -- but this updated website copy does not change the promises they've made to prior owners, and Tesla has not walked back those promises.

    The most obvious tell of this is the unsupervised program in operation right now in Austin.

    by freerobby - 12 hours ago
  • It was a fool's game from the start, with only negative aspects. What could possibly go wrong?
    by aurizon - 12 hours ago
  • Feels like Musk should step down from the CEO role. The company hasn’t really delivered on its big promises: no real self-driving, the Cybertruck turned into a flop, and the affordable Tesla never materialized. The Model S was revolutionary, but the Model 3 is basically a cheaper version of that design, and in the last decade there hasn’t been a comparable breakthrough. Innovation seems stalled.

    At this point, Tesla looks less like a disruptive startup and more like a large-cap company struggling to find its next act. Musk still runs it like a scrappy startup, but you can’t operate a trillion-dollar business with the same playbook. He’d probably be better off going back to building something new from scratch and letting someone else run Tesla like the large company it already is.

    by starchild3001 - 12 hours ago
  • "Full Self Driving (Supervised)." In other words: you can take your mind off the road as long as you keep your mind on the road. Classic.

    Tesla is kind of a joke in the FSD community these days. People who have worked on this problem a lot longer than Musk's folks have been saying for years that Tesla's approach fundamentally ignores decades of research on the topic. Sounds like Tesla finally got the memo. I mostly feel sorry for their engineers (both the ones who bought the hype and thought they'd discover the secret sauce that a quarter-century-plus of full-time academic research couldn't find, and the old salts who knew this was doomed but soldiered on anyway... but only so sorry, since I'm sure the checks kept clearing).

    by shadowgovt - 12 hours ago
  • What I don't understand about this is that in my experience being driven around in friends' Teslas, it's already there. It really seems like legalese vs. technical capability. The damn thing can drive with no input and even find a parking spot and park itself. I mean, where are we even moving the goalposts at this point? Because there have been some accidents, it's not valid? The question is how that compares to the accident rate of human drivers, not whether there should be an expectation of zero accidents ever.
    by asdff - 12 hours ago
  • I thought we would have almost AGI by now? https://x.com/elonmusk/status/1858747684972048695
    by guluarte - 11 hours ago
  • They made tons of money on the Scam of the Decade™ from Oct 2016 (see their "Driver is just there for legal reasons" video) to Apr 2024 (when they officially changed it to Supervised FSD), and now it's not even that.
    by jesenpaul - 11 hours ago
  • Earlier:

    Tesla’s autonomous driving claims might be coming to an end [video]

    https://news.ycombinator.com/item?id=45133607

    by ChrisArchitect - 10 hours ago
  • One problem might be that American driving is not exactly... well great, is it? Roads are generally too straight and driving tests too soft. And for some weird reason, many US drivers seem to have a poor sense of situational awareness.

    The result is that many drivers look unaware of the benefits of defensive driving. Take all that into account, and safe 'full self driving' may be tricky to achieve?

    by jaggs - 10 hours ago
  • War Is Peace. Freedom Is Slavery. Ignorance Is Strength. FSD is... whatever Elon says it is.
    by ciconia - 8 hours ago
  • I strongly believe LIDAR is the way to go and that Elon's vision-only move was extremely "short-sighted" (heheh). There are many reasons, but the one that drives it home for me multiple times a week is that my Tesla's wipers will randomly sweep the windshield for absolutely no reason.

    This is because the vision system thinks there is something obstructing its view when in reality it is usually bright sunlight -- and sometimes, absolutely nothing that I can see.

    The wipers are, of course, the most harmless way this goes wrong. The more dangerous type is when it phantom-brakes at highway speeds with no warning on a clear road and a clear day. I've had multiple other scary incidents of different types (swerving back and forth at exits is a fun one), but phantom braking is the one that happens quasi-regularly. Twice when another car was right behind me.

    As an engineer, I find this tells me volumes about what's going on in the computer vision system, and it's pretty scary. Basically, the system detects patterns that it infers as its vision being obstructed, and so it is programmed to brush away some (non-existent) debris. Like, it thinks there could be a physical object where there is none. If this were an LLM, you would call it a hallucination.

    But if it's hallucinating crud on a windshield, it can also hallucinate objects on the road. And it could be doing it every so often! So maybe there are filters to disregard unlikely objects as irrelevant, which act as guardrails against random braking. And those filters are pretty damn good -- I mean, the technology is impressive -- but they can probabilistically fail, resulting in things we've already seen, such as phantom braking or, worse, driving through actual things.
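
    To illustrate the kind of guardrail I mean, here's a toy Python sketch. The names, thresholds, and structure are pure guesses for illustration, not anything from Tesla's actual stack:

      # Toy confidence/persistence gate: act on a detection only when a
      # confident hit persists across consecutive frames. All values invented.
      from collections import deque

      CONF_THRESHOLD = 0.85   # below this, treat a detection as noise
      PERSIST_FRAMES = 3      # detection must survive this many frames

      class BrakeGate:
          def __init__(self):
              self.history = deque(maxlen=PERSIST_FRAMES)

          def update(self, detections):
              """detections: list of (label, confidence) pairs for one frame."""
              confident = any(c >= CONF_THRESHOLD for _, c in detections)
              self.history.append(confident)
              # Brake only if every one of the last PERSIST_FRAMES frames
              # contained a confident detection.
              return len(self.history) == PERSIST_FRAMES and all(self.history)

      gate = BrakeGate()
      frames = [[("debris", 0.91)], [("debris", 0.88)], [("debris", 0.90)]]
      for detections in frames:
          if gate.update(detections):
              print("brake")  # fires once the (possibly hallucinated) object persists

    Raise CONF_THRESHOLD or PERSIST_FRAMES and you suppress phantom braking but risk sailing through a real obstacle; lower them and you brake for ghosts. Either way it's a probabilistic dial, not a guarantee.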

    This raises so many questions: What other things is it hallucinating? And how many hardcoded guardrails are in place against these edge cases? And what else can it hallucinate against which there are no guardrails yet?

    And why not just use LIDAR that can literally see around corners in 3D?

    by keeda - 6 hours ago
