A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

  • Buffalox@lemmy.world · 1 month ago

    The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

    What I don’t get is how years of this false advertising haven’t bankrupted Tesla already.

      • ayyy@sh.itjust.works · 1 month ago

        To put your number into perspective: if it failed only once every hundred miles, it would kill you multiple times a week at an average commute distance.
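
        A rough sketch of that arithmetic in Python (the 1-in-100-miles rate is the hypothetical above; the ~16-mile one-way commute is an assumption, roughly the often-cited US average):

        ```python
        # Back-of-envelope estimate: expected failures per week at a given
        # per-mile failure rate. All inputs are illustrative assumptions.
        failure_rate_per_mile = 1 / 100     # hypothetical: one failure per hundred miles
        one_way_commute_miles = 16          # assumed, roughly the often-cited US average
        commute_days_per_week = 5

        weekly_miles = one_way_commute_miles * 2 * commute_days_per_week
        expected_failures_per_week = weekly_miles * failure_rate_per_mile

        print(f"{weekly_miles} miles/week -> {expected_failures_per_week:.1f} expected failures/week")
        # 160 miles/week -> 1.6 expected failures/week
        ```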

        • KayLeadfoot@fedia.ioOP · 1 month ago

          Someone who doesn’t understand math downvoted you. This is the right framework for understanding autonomy: the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not demonstrated that value in a credible way.

          • bluewing@lemm.ee · 1 month ago

            You are trying to judge the self-driving feature in a vacuum, and you can’t do that. You need to compare it to the alternatives. For automotive travel, the alternative to FSD is to keep having everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD (as bad as it is). So FSD doesn’t need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last time I saw anything about that, FSD was statistically that “bit better” than you.

            FSD isn’t perfect. No such system will ever be perfect. But the goal isn’t perfection; it just needs to be better than you.
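
            As a minimal sketch of what that claim boils down to (the rates below are made-up placeholders, not real statistics), it’s a comparison of per-mile incident rates:

            ```python
            # Sketch of the argument: FSD doesn't need a zero failure rate,
            # only a lower per-mile incident rate than the human baseline.
            # Both rates below are made-up placeholders, NOT real statistics.

            def beats_human_baseline(fsd_incidents_per_million_miles: float,
                                     human_incidents_per_million_miles: float) -> bool:
                """True if the system's per-mile incident rate is below the human baseline."""
                return fsd_incidents_per_million_miles < human_incidents_per_million_miles

            human_rate = 2.0   # placeholder
            fsd_rate = 1.5     # placeholder

            print(beats_human_baseline(fsd_rate, human_rate))  # True under these made-up inputs
            ```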

            • Echo Dot@feddit.uk · 1 month ago

              FSD isn’t perfect. No such system will ever be perfect. But the goal isn’t perfection; it just needs to be better than you.

              Yeah, people keep bringing that up as a counterargument, but I’m pretty certain humans don’t swerve off a perfectly straight road into a tree all that often.

              So unless you have numbers to show that humans are less safe than FSD, you’re being equally obtuse.

  • orca@orcas.enjoying.yachts · 1 month ago

    The worst part is that this problem has already been solved by using LiDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.

      • KayLeadfoot@fedia.ioOP · 1 month ago

        Probably Zoox, but conceptually similar: LiDAR-backed.

        You can immobilize them by setting anything large on them. Your purse, a traffic cone, a person :)

        Probably makes sense to be a little cautious with the gas pedal when there is anything on top of the vehicle.