The incident in northern California marked the latest mishap blamed on the electric vehicle company’s Autopilot tech

  • NotMyOldRedditName@lemmy.world
    2 months ago

    It’s exceptionally unlikely that AUTOPILOT did this.

    AP only follows the lane/road you’re in.

    Maybe it was FSD, but this is terrible reporting.

    • xthexder@l.sw0.com
      2 months ago

      A perfect example of why calling it autopilot in the first place was a bad idea. The name misrepresents the feature, which is really just lane keeping and a few other minor things.

        • JovialMicrobial@lemm.ee
          2 months ago

          I feel like driver assist is a better representation of what this feature is.

          GPS autopilot on sailboats has been around for a long time now (not talking about windvane self-steering). It will hold a course via the rudder, but if the wind changes it won’t trim the sails, so you still have to keep watch and either adjust the sails or change course to follow the wind.
          If you don’t pay attention and the wind shifts, the sails start flapping and shit can get messy.

          The point is that while car autopilot does match the definition of nautical autopilot and how it functions (it needs human oversight), I would never expect someone who’s never gone sailing with an autopilot to know how it works, that someone needs to watch it, and why. It’s niche knowledge, and it’s kinda foolish to expect people to just know stuff like that.
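          To make the sailing analogy concrete: a boat autopilot is basically a heading-hold loop acting on the rudder, and the sails are simply outside that loop, which is why a wind shift still needs a human. A minimal sketch (gain and rudder limits are made-up illustrative numbers, not any real device’s):

```python
def rudder_command(heading, target, gain=0.8, max_deflection=30.0):
    """Proportional heading-hold: steer the rudder toward the target
    heading (degrees). Note what's missing: nothing here touches the
    sails -- a wind shift is invisible to this controller."""
    # Smallest signed angle from current heading to target, in degrees.
    error = (target - heading + 180) % 360 - 180
    # Clamp to the rudder's physical range.
    return max(-max_deflection, min(max_deflection, gain * error))

# Crossing north from 350° toward 10° correctly steers starboard,
# but a large course error just saturates the rudder.
print(rudder_command(350, 10))  # 16.0
print(rudder_command(0, 90))    # 30.0 (saturated)
```

          The loop only ever sees heading error, so “keeps course but can’t handle a wind change” falls straight out of the structure.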

          • NotMyOldRedditName@lemmy.world
            2 months ago

            I don’t expect random people to know, but the media who are reporting on it? Yes, I do expect that, and it’s terrible reporting when they get it wrong. It just continues to spread incorrect information, which gets further turned into misinformation.

            Also, anyone who actually activates it in the car goes through a little explainer on how it works and what its shortfalls are. If they didn’t know prior, they’ll know then, just like boaters who learn how their autopilot works and what its shortfalls are.

  • 0x0@programming.dev
    2 months ago

    Was the driver asleep or something? The car drove quite a bit on the tracks… sure, blame Tesla all you want (and rightly so), but you can’t really claim today that the car has “autopilot” unless you’re hunting for a lawsuit. So what was the driver doing?

    • mannycalavera@feddit.uk
      2 months ago

      California, so I’m guessing the driver was getting head at the time whilst drinking beer.

    • NutWrench@lemmy.world
      2 months ago

      Who needs a driver? This car has AUTOPILOT.

      But seriously, Tesla “autopilot” is nothing more than a cruise control you have to keep an eye on. Which means it’s NOT “autopilot.” This technology is not ready for the real world. Sooner or later it’s going to cause a major, horrible accident involving dozens of people. Musk has enough connections to avoid any real-world consequences, but maybe enough people will get over their child-like worship of billionaires and stop treating him like he’s the next Bill Gates.

      • nevemsenki@lemmy.world
        2 months ago

        Somewhat ironically, autopilot for airplanes was more or less attitude/speed holding for most of its history. More modern systems can now autoland or follow a preprogrammed route (the flight plan plugged into the FMS), but even then changes like TCAS advisories are usually left up to the pilots to handle. Autopilots are also expected to hand control back to the pilots in any kind of unexpected situation.

        So in a way Tesla’s naming here isn’t so off; it’s the generic understanding of the term “autopilot” that is somewhat off. That said, their system is also not doing much more than most other Level 2 ADAS systems offer.

        On the other hand, Elon loves going off about Full Self Driving mode a lot, and that’s absolutely bullshit.

        • Wrench@lemmy.world
          2 months ago

          Commercial pilots also have a lot of training, a huge list of regulations and procedures for every contingency, and a copilot to double-check their work.

          Tesla has dumb fuck drivers who are actively trying to find ways to kill themselves, with an orange wedged in the steering wheel as the copilot, to trick the sensors.

          Maybe the latter should not be trusted with the nuance that is the “autopilot” branding.

        • rottingleaf@lemmy.world
          2 months ago

          Because the years since 2007 have seen an influx of new computer users, mostly on mobile devices, many of them thinking that this is what computer use looks like now and that this is the future.

          Now the iPhone generation (including adults and seniors who haven’t used anything smarter) thinks that you can replace any expert UI with an Angry Birds-like arcade on a touchscreen.

          If a genuinely trustworthy autopilot were possible for airplanes today, we’d see fully automated drone swarms in all warzones and likely automated jets (not having the constraint of G-forces survivable by a human, and not requiring life support systems at all), but in real life it’s still human-controlled FPV drones and human-piloted jets.

          Though I think drone swarms are coming. It’s, of course, important to have control over where the force is applied, but a bomb that destroys a town when you need to destroy a house is often preferable to no bomb at all.

          The point was that people want magic now and believe crooks who promise them magic now. Education is the way to counter this.

      • T156@lemmy.world
        2 months ago

        It’s rather reminiscent of the old days of GPS, when people would follow it to the letter, and drive into rivers, go the wrong way up a one-way street, etc.

        • Echo Dot@feddit.uk
          2 months ago

          Delicately put. But essentially that’s why self-driving cars are not really seen outside of Tesla. Unless the technology is basically perfect, there’s essentially no point to it.

          Tesla have it because they use the public as guinea pigs.

          I wouldn’t mind if they all had to go to some dedicated test track to try it out and train it and outside of those environments it wouldn’t turn on. If they want to risk their lives that’s their prerogative, my problem is that it might drive into me one day and I don’t own a Tesla so why should I take that risk?

      • Takumidesh@lemmy.world
        2 months ago

        I just don’t understand how someone can read all the warnings, get a driver’s license (implying their knowledge of the rules of the road) and presumably have years of driving experience and magically think it’s ok to just stop paying attention.

        It doesn’t matter how much the car promotes itself as self-driving; the laws surrounding it still require you to be present and in control.

        It’s no different from 1000hp cars: just because the car is marketed as such doesn’t magically make it legal to go 200mph.

        • Todd Bonzalez@lemm.ee
          2 months ago

          You don’t understand why people would think that a feature called “full self-driving” is capable of fully driving the car itself?

          • NotMyOldRedditName@lemmy.world
            2 months ago

            Anyone who agrees to this warning and still thinks the car is capable of fully driving itself is an idiot:

            "Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent. … Use Full Self-Driving in limited Beta only if you will pay constant attention to the road, and be prepared to act immediately …

            Now, I don’t know if that warning still exists after it became Full Self-Driving (Supervised) when V12.X became a thing and he gave free demos to everyone, but at that point its name literally includes SUPERVISED, so really, they’re idiots either way.