The incident in northern California marked the latest mishap blamed on the electric vehicle company’s Autopilot tech

  • 0x0@programming.dev

    Was the driver asleep or something? The car drove quite a bit on the tracks… sure, blame Tesla all you want (and rightly so), but you can’t really claim today that the car has “autopilot” unless you’re hunting for a lawsuit. So what was the driver doing?

    • TheEighthDoctor@lemmy.world

      That’s what I was thinking, your car starts doing something fucking stupid and you just let it?

        • Echo Dot@feddit.uk

          Delicately put. But that’s essentially why you don’t really see self-driving cars outside of Tesla. Unless the technology is basically perfect, there’s no point to it.

          Tesla have it because they use the public as guinea pigs.

          I wouldn’t mind if they all had to go to some dedicated test track to try it out and train it, and outside of those environments it wouldn’t turn on. If they want to risk their own lives, that’s their prerogative; my problem is that it might drive into me one day, and I don’t own a Tesla, so why should I take that risk?

      • T156@lemmy.world

        It’s rather reminiscent of the old days of GPS, when people would follow it to the letter, and drive into rivers, go the wrong way up a one-way street, etc.

        • Echo Dot@feddit.uk

          There was a legal case recently where somebody drove off a bridge that wasn’t there. At some point you have to take personal responsibility since the outcomes will be extremely personal.

    • BruceTwarzen@lemm.ee

      It’s hard to tell who’s worse at driving: Tesla owners or their Autopilot.

    • NutWrench@lemmy.world

      Who needs a driver? This car has AUTOPILOT.

      But seriously, Tesla “autopilot” is nothing more than cruise control you have to keep an eye on. Which means it’s NOT “autopilot.” This technology is not ready for the real world. Sooner or later, it’s going to cause a major, horrible accident involving dozens of people. Musk has enough connections to avoid any real-world consequences, but maybe enough people will get over their child-like worship of billionaires and stop treating him like he’s the next Bill Gates.

      • nevemsenki@lemmy.world

        Somewhat ironically, autopilot for airplanes has been more or less attitude/speed holding for most of its history. More modern systems can autoland or follow a preprogrammed route (the flight plan plugged into the FMS), but even then, things like TCAS advisories are usually left to the pilots to handle. Autopilots are also expected to hand control back to the pilots in any kind of unexpected situation.
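
        To give a sense of scale, a classic autopilot mode is little more than a feedback loop holding a single target value. A purely illustrative toy in Python (made-up gains, units and limits, nothing resembling real avionics code):

        ```python
        # Toy altitude-hold loop: proportional on altitude error, derivative on
        # climb rate to damp it. All gains, units and limits are hypothetical.
        def altitude_hold(target_ft, current_ft, climb_rate_fps,
                          kp=0.02, kd=0.5, max_pitch_deg=10.0):
            error = target_ft - current_ft
            pitch_cmd = kp * error - kd * climb_rate_fps
            # Clamp the command; anything outside normal limits is left to the
            # pilots, roughly like an autopilot handing control back.
            return max(-max_pitch_deg, min(max_pitch_deg, pitch_cmd))

        # 500 ft below target, climbing at 5 ft/s -> a gentle nose-up command
        print(altitude_hold(10_000, 9_500, 5.0))  # ~7.5 degrees
        ```

        That is roughly the scope of what “autopilot” has historically promised: hold a number steady, not understand the situation.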

        So in a way, Tesla’s naming here isn’t so off; it’s the popular understanding of the term “autopilot” that is somewhat off. That said, their system also isn’t doing much more than most other Level 2 ADAS systems offer.

        On the other hand, Elon loves going off about Full Self-Driving mode a lot, and that’s absolutely bullshit.

        • Wrench@lemmy.world

          Commercial pilots also have a lot of training, a huge list of regulations and procedures for every contingency, and a copilot to double-check their work.

          Tesla has dumb-fuck drivers who are actively trying to find ways to kill themselves, and the copilot is an orange wedged in the steering wheel to trick the sensors.

          Maybe the latter should not be trusted with the nuance that is the “autopilot” branding.

        • Echo Dot@feddit.uk

          I think the counter to that is that aircraft manufacturers know that the people flying their aircraft are not idiots and actually know what the autopilot button does. Meanwhile Tesla knows that the people driving their cars are idiots and don’t know what the autopilot does.

          In the US they let kids drive for god’s sake. Sure they’ve passed a test but what does that mean in the real world?

        • rottingleaf@lemmy.world

          Because the years since 2007 have seen an influx of new computer users, mostly on mobile devices, many of them thinking that this is what computer use looks like now and that this is the future.

          Now the iPhone generation (including adults and seniors who haven’t used anything smarter) thinks that you can replace any expert UI with an Angry Birds-like arcade on a touchscreen.

          If a real, trustworthy autopilot were possible for airplanes now, we’d see fully automated drone swarms in every warzone and probably automated jets (not constrained by the G-forces a human can survive, and not needing life-support systems at all), but in real life it’s still human-controlled FPV drones and human-piloted jets.

          Though I think drone swarms are coming. It’s, of course, important to have control over where the force is applied, but a bomb that destroys a town when you need to destroy a house is often preferable to no bomb at all.

          The point was that people want magic now and believe crooks who promise them magic now. Education is the way to counter this.

    • mannycalavera@feddit.uk

      California, so I’m guessing the driver was getting head at the time whilst drinking beer.

  • NutWrench@lemmy.world

    This is that “AI” that investors keep jerking themselves purple over.

    Real “self-driving cars” will not be available in our lifetimes.

    • DaTingGoBrrr@lemmy.ml

      I don’t believe that. Based on how far AI has come in recent years, I think it’s only a matter of time before someone (other than Tesla) manages to do it well.

      The biggest problem with Tesla Autopilot is Elon. Just the fact that he insists on using only camera-based vision because “people only need their eyes to drive” should tell you all you need to know about their AI.

      • rottingleaf@lemmy.world

        And why do you believe and think that? Most of us who criticize do so because we have some idea of what machine learning is and what it simply doesn’t solve. It’s not hard-to-come-by knowledge.

        • DaTingGoBrrr@lemmy.ml

          I think it’s unreasonable to state that it won’t happen within our lifetime. That’s hopefully 60+ years away for me, which is a long time for computing and general AI development to advance. Just look at how much has happened in the technology field over the past 30 years.

          “It always seems impossible until it’s done.” - Nelson Mandela

          • rottingleaf@lemmy.world

            Sorry, but this is again abstractions and philosophy, in the genre of Steve Jobs-themed motivational texts. Which I hate with boredom (I get tired of hating with passion quickly).

            Many things have been called “AI” and many will be. I’m certain some will bring very important change. And those may even use ML somewhere. For classification and clustering parts most likely, and maybe even extrapolation, but that’d be subject to a system of symbolic logic working above them at least, and they’ll have to find a way of adding entropy.

            What they call “AI” now definitely won’t. Fundamentally.

            “It always seems impossible until it’s done.” - Nelson Mandela

            Quoting the one guy who didn’t get hanged/shot/beheaded, while many, many more people who tried the same did. Survivorship bias and such.

            • DaTingGoBrrr@lemmy.ml

              Innovators and visionaries are what drive us forward. If they listened to all the naysayers, we would get nowhere. I will keep being optimistic about future developments in technology.

              • rottingleaf@lemmy.world

                “Innovators and visionaries are what drive us forward. If they listened to all the naysayers, we would get nowhere.”

                How can you not see that these sentences say nothing?

                What’s “forward”? Is going from the HyperCard and Genera days to today “forward”?

                Who are “innovators and visionaries”? I mean, that’d be many people, but no Steve Jobs in the list, if that’s what made you write this.

                Who are “nay-sayers”? If that’s, say, Richard Stallman, then all his nays on technology (as it happens, he’s kinda weird on other things) were correct.

                And the final question: why do you think you can in any way feel the wind of change when you don’t know the fundamental basics of the area of human knowledge you “believe” in? Don’t you think it’s not a wind of change, just the usual marketing aimed at clueless people?

                Say, I do see a lot of promising and wonderful things, but people who don’t know the fundamentals get excited over something stupid that’s being advertised to them.

                • DaTingGoBrrr@lemmy.ml

                  Blue LEDs, the TV, radio, airplanes, the personal computer, the light bulb, nuclear fission, optical microscopes, and attosecond laser pulses are among the things previously thought to be impossible.

                  Who said anything about Steve Jobs? I never mentioned anyone specific and as you say, there are many people that would make that list.

                  I would consider the “experts” and laymen with a sceptical attitude towards innovation to be the naysayers.

                  I think it’s weird how so many people suddenly became experts on AI as soon as OpenAI released ChatGPT.

                  I don’t like the current trend of companies putting half-assed AI into everything. AI is the new buzzword to bring in hype. But that doesn’t mean I can’t see the value it can potentially bring in the future once it’s more developed. The developments in the AI field have only just begun.

                  My use of the word AI is very broad. I am not saying that ChatGPT could drive a car. But I 100% believe that we will have self-driving cars before I die of old age.

            • scratchee@feddit.uk

              They don’t have to be any good, they just have to be significantly better than humans. Right now they’re… probably about average; there are plenty of drunk or stupid humans bringing the average down.

              It’s true that isn’t good enough: unlike humans, self-driving cars will be judged collectively, so people will focus on their dumbest antics. But once their average is significantly better than the human average, that will start to outweigh the individual examples.

              • rottingleaf@lemmy.world

                Right now they are not that at all.

                When people say neural nets are unable to reason, they don’t mean something fuzzy-cloudy like normies do, which can be rebutted by some other fuzzy-cloudy stuff. They literally mean that neural nets are unable to reason. They are not capable of logic.

                • scratchee@feddit.uk

                  Reasoning is obviously useful, but I’m not convinced it’s required to be a good driver. In fact, most driving decisions have to be made rapidly, and I doubt humans can be described as “reasoning” when we’re just reacting to events. Decisions that take long enough could be handed to a human (“should we rush for the ferry, or divert for the bridge?”). It’s only the middling bit in between where we’ll maintain this big advantage (“that truck ahead is bouncing around, I don’t like how the load is secured, so I’m going to back off”). That’s a big advantage, but how much of our time is spent with our minds fully focused and engaged anyway? Once we’re on autopilot ourselves, is there much reasoning going on?

                  Not that I think this will be quick; I expect at least another couple of decades before self-driving cars can even start to compete with us outside of specific curated situations. And once they do, they’ll continue to fuck up royally whenever the situation is weird and outside their training, causing big news stories. The key question will be whether they can compete with humans on average by outperforming us in quick responses and by consistently not getting distracted, tired, or drunk.

    • Echo Dot@feddit.uk

      Now AI may or may not be overhyped, but Tesla’s self-driving nonsense isn’t AI regardless. It’s just pattern recognition; it is not the neural net everyone assumes it is.

      It really shouldn’t be legal. This tech will never work, because it doesn’t include lidar and so lacks depth perception. Of course humans don’t have lidar either, but we have depth perception built in, thanks to billions of years of evolution. Computers don’t do too well with stereoscopic vision for 3D calculations, and could really do with actual depth information being provided to them.

      If you lack depth perception and higher reasoning skills, for a moment you might actually think that a train driving past you is a road. 3D perception would have told the software that the train was vertical, not horizontal, and thus a barrier rather than a driving surface.
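
      To make the stereo point concrete: with a calibrated two-camera rig, depth follows from disparity as Z = f·B/d, and the uncertainty grows roughly with the square of the distance, which is why camera-only depth gets shaky at range while lidar returns a direct measurement. A minimal sketch, assuming a hypothetical focal length, baseline and pixel noise (none of these numbers come from a real car):

      ```python
      # Pinhole stereo model: depth Z = f * B / d (focal length f in pixels,
      # baseline B in metres, disparity d in pixels). All numbers hypothetical.
      FOCAL_PX = 1000.0    # assumed focal length of each camera, in pixels
      BASELINE_M = 0.30    # assumed spacing between the two cameras, in metres

      def depth_uncertainty(depth_m: float, disparity_noise_px: float = 0.5) -> float:
          """dZ ~ Z^2 / (f * B) * dd: the error grows with the square of the
          distance, so stereo is decent up close and shaky far away."""
          return depth_m ** 2 / (FOCAL_PX * BASELINE_M) * disparity_noise_px

      for z in (10, 50, 100):
          disparity_px = FOCAL_PX * BASELINE_M / z   # what the cameras must resolve
          print(f"{z:>3} m: disparity ~{disparity_px:.1f} px, "
                f"depth error ~{depth_uncertainty(z):.1f} m")
      ```

      Even with optimistic assumptions, a camera-only estimate at highway distances can be off by metres, which is the gap lidar or radar closes with a direct range measurement.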

      • FishFace@lemmy.world

        “It’s just pattern recognition; it is not the neural net everyone assumes it is.”

        Tesla’s current iteration of self-driving is based on neural networks. Certainly the computer vision is; there’s no other way we have of doing computer vision that works at all well, and according to this article from last year, it’s true of the decision-making too.

        Of course, the whole task of self-driving is “pattern recognition”; neural networks are just one way of achieving that.

    • FishFace@lemmy.world

      We have gone from cruise control to cars being able to drive themselves quite well in about a decade. The last percentage points of reliability are of course the hardest, but that’s a tremendously pessimistic take.

  • IllNess@infosec.pub

    Roads really need a standard for sensors specifically for autopilots.

    GPS and cameras reading lines, signs, and lights aren’t good enough.

    • wieson@feddit.org

      Yeah, like a digital “ideal line” that the cars can follow.

      Maybe even a physical guiding line.

      We could even connect all the cars via WLAN (WiFi) to exchange info when they are braking and accelerating. That would increase efficiency.

      Maybe we could even connect them physically to have a stronger engine pulling more cars more efficiently.

      If we already have an ideal guiding line, we might actually save some asphalt and make the roads more optimised. Use different materials so the tyre particles don’t pollute as much.

      Ah, let’s just build a train.

        • FireRetardant@lemmy.world

          Or just build a tram that rides on rails. More efficient, and no need to over-engineer an autopilot system.

          • Echo Dot@feddit.uk

            You have to lay tracks for a tram though, which means changing the route isn’t that easy. Self-driving buses would actually be more efficient, since you could alter the route on an hourly basis if you wanted, optimised by traffic and the destinations required.

            All that could easily be automated, but you lose that if you have a physical track you have to run along.

    • FireRetardant@lemmy.world

      I don’t want to waste any more tax money trying to make one of the least efficient modes of transport more autonomous. Just build an electrified tram if that’s what you want.

      • blarth@thelemmy.club

        The United States is simply too large and distributed for everyone to use public transportation. It will never happen, so get used to it and try to optimize what will be part of our future.

        • FireRetardant@lemmy.world

          The majority of trips people make are within their own city or local region. That’s where transit should be implemented first. Your country is not “too big” for transit.

          If your country is too big for transit, it is certainly too big for all sorts of sensors and such in the roads to assist autonomous driving.

          • blarth@thelemmy.club

            It is too distributed in too many places for mass transit. The religious fervor over the fuck cars movement is not going to get people in highly populated, low density areas to walk a mile to catch a bus to catch a train full of homeless people to catch another bus to walk a half mile to their destination, when they could have completed that same journey in the comfort of their own car in 1/4 of the time.

            Take Dallas for instance. I’m not going to do the work for you, but feel free to plan a trip from a random house in Allen, TX to a business 5-10+ miles away using both the public transit system and then a car. No one sane with limited time in their day is opting for the public transit option. And this is in a city with a decent passenger rail system.

            • FireRetardant@lemmy.world

              All of those issues could be fixed by building around transit as the priority instead of the car. Some cities actually have transit that is faster than driving, because transit gets priority at intersections and can take a more direct route.

                • FireRetardant@lemmy.world

                  The New York City subway is often faster than driving. Many cities in the Netherlands have faster transit or cycling times than driving thanks to careful planning and priority. Japan has high-speed rail connecting many of its cities, with most trains going faster than highway speeds and some doubling or even tripling them.

                  Also, North America was built on trains. If we could build trains 100 years ago, we can build better ones now.

  • Wolfeh@lemmy.world

    News tomorrow: Police department retracts statement after Elon Musk lawsuit