YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead

The complementary lawsuits claim that the 2022 massacre was made possible by tech giants, a local gun shop, and the gunman’s parents.

  • TIEPilot@lemmy.world · 17 points · 1 year ago (edited)
    • RMA Armament is named for providing the body armor Gendron wore during the shooting.

    No, he bought it.

    • Vintage Firearms of Endicott, New York, is singled out for selling the shooter the weapon used in the attack.

    Not their issue; he passed the background check.

    • The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.

    Any knob with a Dremel can make a gun full auto, let alone defeat a mag lock. And he broke NY law doing this.

    • YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.

    This is just absurd.

    My guess is they’re hoping for settlements rather than going to trial, where they’d lose.

    • vertigo3pc@lemmy.world · 6 points · 1 year ago

      Only responding to the last point, but if they can prove that Google somehow curated his content to push him towards fringe, terroristic websites, they could be found liable in a civil suit.

      • dx1@lemmy.world · 5 points · 1 year ago

        Any basic “you may like this” algorithm can produce those results.
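
        To illustrate the point, here is a minimal, hypothetical sketch of a bare-bones “viewers who watched X also watched Y” recommender (made-up video IDs and watch histories, not YouTube’s actual system). Nothing in the logic knows what the videos are about; it surfaces whatever co-occurs with a user’s history, fringe channels included.

```python
# Hypothetical co-watch recommender: made-up data, not any real platform's code.
from collections import Counter

# Toy watch histories keyed by user id.
histories = {
    "u1": {"ww2_tactics", "ww2_weapons", "fringe_channel_a"},
    "u2": {"ww2_tactics", "fringe_channel_a", "fringe_channel_b"},
    "u3": {"ww2_weapons", "cold_war_spies"},
}

def recommend(user_history, k=3):
    """Rank unseen videos by how often they co-occur with the user's history."""
    scores = Counter()
    for other in histories.values():
        if other & user_history:              # any overlap with another viewer
            for video in other - user_history:
                scores[video] += 1            # score = raw co-occurrence count
    return [video for video, _ in scores.most_common(k)]

# 'fringe_channel_a' tops the list purely on co-occurrence, with no content judgment.
print(recommend({"ww2_tactics"})[0])
```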

        • Neil@lemmy.ml · 6 points · 1 year ago

          I don’t know about Google, but there is something severely wrong with YouTube’s algorithm. I’m far left-leaning, and it only takes one or two wrong clicks to throw my entire recommended page into a right-wing conspiracy hell.

          • TechnoBabble@lemm.ee · 2 points · 1 year ago

            That is legitimately a problem.

            For some reason, YouTube’s algorithm heavily favors extremist content if you show even a casual interest in related material.

            It’s probably as simple as “shocking content gets more clicks”, but still, it’s not good for our society to have entertainment platforms recommending extremist views.
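
            As a rough, hypothetical illustration of that dynamic (invented numbers, not YouTube’s actual ranking): if a recommender sorts purely by observed click-through rate, the most inflammatory item floats to the top even when most viewers never asked for it.

```python
# Hypothetical engagement-only ranking: made-up numbers, no real platform data.
videos = {
    "measured_policy_analysis": {"impressions": 10_000, "clicks": 150},  # 1.5% CTR
    "calm_history_lecture":     {"impressions": 10_000, "clicks": 200},  # 2.0% CTR
    "outrage_conspiracy_rant":  {"impressions": 10_000, "clicks": 800},  # 8.0% CTR
}

def rank_by_ctr(catalog):
    """Sort titles by click-through rate alone; no notion of accuracy or harm."""
    return sorted(catalog,
                  key=lambda v: catalog[v]["clicks"] / catalog[v]["impressions"],
                  reverse=True)

print(rank_by_ctr(videos))  # the rant ranks first on engagement alone
```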

            In the old days, you’d have to seek out this kind of fringe content on your own. And you’d get pushback from your community if you started talking nonsense.

            Nowadays, my aunt is getting blasted with reptilian democrat stuff after showing an interest in typical conservative lady content years ago. And there is not much of a community left to help her out. The algorithms just amplify all the worst shit.

          • Edgelord_Of_Tomorrow@lemmy.world · 1 point · 1 year ago (edited)

            Oh, you watch WWII videos because you like hearing about how liberal democracy stomped fascism with superior tactics, weapons, and intelligence?

            Here are some videos by actual fascists! Women are the patriarchy!

            Oh you like videos about Cold War Russia and espionage?

            How about this video about why Ukraine is run by Jewish paedophile Nazis?

    • Hype@lemm.ee · 2 points · 1 year ago

      Next they’ll announce they’re suing Disney because he watched the History Channel, which had violence on it that contributed to his actions.