Tesla Under Investigation After Fatal Crash May Have Involved Autopilot System, Report Says

  • Dojan@lemmy.world · 43 · 1 year ago

    The autopilot that tries to mince cyclists? That believes tram tracks are for Teslas? The autopilot that mistakes the moon for a stop sign?

    No, I don’t think a reputable brand whose cars fall apart in the rain, that sells vehicles with internal testing firmware, whose spokesperson does nothing but bluster and lie, that resells defective vehicles, and that has more or less weekly recalls, would ever release a car with a dangerous feature.

    The best thing to come out of Tesla is the Finnish bloke who blew his up in anger over the insane quote he was given for repairs.

  • pixelscience@lemm.ee · 13 · 1 year ago

    I don’t trust the tech at this point and I certainly don’t trust the people who are using the full-self driving mode.

    It’s endangering everyone on the roads right now.

    • CmdrShepard@lemmy.one · 8 · 1 year ago

      Statistically they’re still safer than human drivers, but nobody writes articles about your car dodging the woman drifting into your lane while eating a bowl of cereal and applying mascara on her morning commute.

      • iWidji@lemmy.world · 2 · 1 year ago

        I think your statement and the fear of self-driving can both be true at the same time.

        Self driving is safer than humans most of the time… but not all the time. Nothing is perfect.

        Self driving currently assumes that a human can intervene when it fails. It assumes that a human is present and not eating a bowl of cereal and applying mascara. It assumes that the human is actually paying attention, in a situation where they usually don’t have to because self driving is usually safer.

        Yes, self driving is statistically safer. Yes, self driving will one day be perfect.

        But I don’t think we can fault anyone for being worried about self driving, especially with companies like Tesla, who sell the promise that you don’t really have to pay attention… even though you kinda have to right now.

        • CmdrShepard@lemmy.one · 1 · 1 year ago

          I don’t fault anyone for being uncomfortable with new things that they aren’t familiar with, but I absolutely do fault them for making wild accusations or fear mongering from a place of ignorance.

          If we listened to half the comments on this post, the tech would be completely banned, the developers in prison, the companies bankrupt, and the number of avoidable collisions and deaths on the road would increase. People here want to cut off their nose to spite their face, while the consequences fall on the rest of us who share the road. This is what I take issue with.

    • Synthead@lemmy.ml · 6 · 1 year ago

      This is my opinion, too. Their “autopilot” feature is a glorified driving aid. It’s not self-driving. It’s supposed to help with driver fatigue, and you’re supposed to keep both hands on the wheel. If it makes a mistake, that’s okay, because you’re driving the car, right?

      Traditional cruise control without radar will maintain the speed you set, and it won’t stop for emergency vehicles, but we don’t blame it for that. Even though the “autopilot” feature does more automation, you’re supposed to drive the car in an identical fashion, with identical attention, compared to traditional cruise control.

      But safety is still what matters first. If you’re sending a freeway-speed land missile into motorcyclists and police cars, I don’t care if you were driving a 90s Civic or a car with automated driving features. The car hit someone. Fix that problem first, then figure out who to blame later.

      In my opinion, until we have cars that are guaranteed to function as a completely autonomous experience, and the manufacturer of the car doesn’t tell you to keep your hands on the wheel, you’re still driving it. It’s your responsibility. You can still steer, brake, change lanes, evade, etc. That’s on you. As far as I’m concerned, anyone who thinks otherwise might as well blame their heated seats or radio station.

      I understand that Tesla will keep improving their software, and I agree with this, too. It’s not great that they’re fudging things quite a bit by pushing the self-driving rhetoric. They should focus on improving it. But I still think that negligent drivers are at fault.

  • flossdaily@lemmy.world · 8 · 1 year ago

    Not a fan of Tesla or Musk, but I think it always bears repeating in these conversations that AI driving will be much safer than human driving, if it isn’t already.

    Unfortunately, accidents will happen, but when an accident happens with an AI, ALL the other AIs get to learn from that failure going forward.

    I’m very happy that in my old age, I’ll have some future version of this driving me around… or more likely, taking the wheel from me if I do something stupid.

    • givesomefucks@lemmy.world · 18 · 1 year ago

      AI driving is only as good as its sensors.

      While most other companies use LIDAR, Musk switched to video cameras because it’s cheaper.

      Which is why Tesla “FSD” is worse than competitors.

      • CmdrShepard@lemmy.one · 4 · 1 year ago

        This isn’t necessarily accurate. More sensors means more raw data that needs to be parsed and computed, and you can run into issues where the two systems don’t agree and the computer won’t know which to trust. Additionally, things like rain and snow can confuse LIDAR.

        It may very well be that LIDAR is a required component for autonomous driving, but no companies have a fully functional system yet, so none of us can do more than speculate on what sensors are necessary.

        • Cloudless ☼@feddit.uk · 1 · 1 year ago

          Computers don’t require two systems to agree. They just need good algorithms to analyse the data from both sensors.

          The human body has sight and hearing sensors. Sometimes our sensors disagree (seeing lightning before hearing the thunder from a distance), but we have the algorithms to analyse the inputs and come up with the correct conclusion.
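The fusion idea above can be sketched in a few lines. This is a toy illustration, not anything any carmaker actually ships: inverse-variance weighting (the same update rule a Kalman filter uses) combines two noisy range estimates so the more precise sensor gets more say. All sensor names and numbers here are hypothetical.

```python
# Toy sensor fusion: combine two independent, noisy distance estimates
# by inverse-variance weighting. The lower-variance (more trustworthy)
# sensor dominates the fused result.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> tuple[float, float]:
    """Return the fused estimate and its (reduced) variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical readings: a camera says the obstacle is 21 m away but is
# noisy (variance 4.0); a lidar says 20 m and is precise (variance 0.25).
dist, var = fuse(21.0, 4.0, 20.0, 0.25)
print(round(dist, 3), round(var, 3))  # → 20.059 0.235
```

The fused variance is lower than either sensor's alone, which is the usual argument for fusing rather than picking one sensor and discarding the other.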

          • CmdrShepard@lemmy.one · 1 · 1 year ago

            With good enough algorithms, you don’t even need two systems. Humans can drive perfectly fine off vision alone.

    • atp@lemmy.world · 11 · 1 year ago

      Autopilot is terrible and the fact that they advertise it as a reputable system is abhorrent. And yes, I own a tesla.

      • navi@lemmy.tespia.org · 1 · 1 year ago

        I’m pretty happy with autopilot in our cars, especially on road trips. It really helps with driving fatigue.

        • atp@lemmy.world · 1 · 1 year ago

          a straight flat interstate with well painted lines in clear conditions is the only time that i trust it anymore. waaaaaay too many close calls in every other situation.

          • navi@lemmy.tespia.org · 1 · 1 year ago

            The FSD stack understands the road MUCH better than any other car I’ve used out there. But its decision-making can still be dumb when deciding which lane to be in.

    • eksb@programming.dev · 8 · 1 year ago

      “This thing that does not exist and nobody has any idea how to make it” will totally be safer than human driving.

      You know what is safer than human driving and we know how to make? Trains.

      • iopq@lemmy.world · 4 · 1 year ago

        I’m still waiting for my train from LA to SF. It’s been in the works since I was in college. I’ve already graduated, had multiple jobs, early retired, and there’s still no sign of it.

    • Thorny_Thicket@sopuli.xyz · 6 · 1 year ago

      I believe statistically FSD is already a better driver than a human. Of course there are situations that confuse the AI and it makes errors a human wouldn’t, but this kind of stuff slowly gets ironed out over time. People also seem to forget that human drivers make pretty fucking stupid mistakes too. Enough so that 40,000 of them die every year in the US alone. 100% safety is probably impossible to achieve, and 99.99% safety still means 33,000 accidents per year.
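For what it's worth, that 33,000 figure works out if you read "99.99% safety" as a 0.01% chance per person per year and apply it to the roughly 330 million people in the US — both assumptions, not anything the statistics bureaus publish in that form:

```python
# Sanity check on the comment's arithmetic (rough, assumed inputs):
# ~330 million US residents, "99.99% safety" read as a 0.01% annual
# chance per person of a serious accident.
us_population = 330_000_000
failure_rate = 1 - 0.9999          # 0.01%
accidents = us_population * failure_rate
print(round(accidents))  # → 33000
```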

      It’s easy to pick on Tesla due to the CEO being quite unpopular, so every time a Tesla does something it’s not supposed to, it gets such wide media attention that it seems way more common than it really is.

      Nevertheless, self-driving cars are here to stay, and there will come a time when wanting to drive yourself will be considered irresponsible and unsafe. And I say this as someone with zero interest in owning such a car.

    • silvercove@lemdro.id · 4 · 1 year ago

      AI driving will probably be safer one day, but there is no real data today demonstrating that its current state is. At the same time, we’re seeing lots of examples of it failing at the most basic stuff.

  • bauhaus@lemmy.ml · 5 · 1 year ago

    just to think that James Cameron thought Terminator would be a walking, talking robot man and not a car