• drone509@discuss.tchncs.de · 4 points · 3 months ago

    I understand those concerns, but I’m not sure this really improved the security of Mastodon, an inherently insecure piece of software, and it definitely deprived us of a useful tool. Defederation works for stopping spam, but I don’t think it helps much when it comes to preventing people from seeing things you post. It stops a single server, but bad actors can just migrate to a new one or spin up a new hostname.

    • Blaze (he/him)@feddit.org · 4 points · 3 months ago

      but bad actors can just migrate to a new one, or spin up a new hostname.

      Then you defederate from it too. I just went through some instance lists; some servers have been defederating Mastodon instances like crazy.

      • rglullis@communick.news · 4 points · 3 months ago

        Then you defederate from it too.

        Okay, let me create an account on mastodon.social and use it to scrape content from every other instance.

        Better yet, let me create an account on “i-want-privacy-in-a-public-internet.example.com” and access the federated timeline directly; then I can push everyone’s content into this discovery service.
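
        To make that concrete, here is a minimal sketch of the kind of scraping I mean, in Python. It assumes the target instance serves its public timeline API without authentication, which is still the default on most Mastodon servers (instances can turn that off); mastodon.social is just the example instance named above.

          import requests

          # Fetch the newest public posts from an instance's federated timeline.
          # No account, no token: just a plain HTTP request to the public API.
          resp = requests.get(
              "https://mastodon.social/api/v1/timelines/public",
              params={"limit": 5},
              timeout=10,
          )
          resp.raise_for_status()

          for status in resp.json():
              # Author handle, post URL, and content come back as plain JSON,
              # which is exactly what a scraper or "discovery service" would store.
              print(status["account"]["acct"], status["url"])

        And note that defederating from whatever instance a scraper signed up on does nothing against a request like this, because it never goes through ActivityPub in the first place.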

        What are they going to do? Unless they go to the point of demanding physical proof of identity from anyone asking for an account, and/or only giving invitations to people they already know, and completely shutting down their own servers to the outside world, they will never be able to avoid data leakage.

        And if they do go that far, then what is the point of using anything based on ActivityPub? They would be better off just using any of the existing group chat servers like Discord (or Matrix/XMPP if they still care about FOSS).

        • Blaze (he/him)@feddit.org · 3 points · edited · 3 months ago

          The point we were discussing was not data leakage; it was the inability to defederate from a huge instance that would swamp everyone else with users, similar to what people imagined would happen if Threads federated and Lemmy were suddenly overwhelmed by people who usually hang out on Facebook.

          It’s not a bad thing per se (everyone can form their own opinion), but not even having the option to defederate is the issue.

          • rglullis@communick.news · 2 points · 3 months ago

            No, admins might think of defederation as a way to avoid interaction with larger instances, but in the case of the bridge it was mostly regular users crying “I don’t want my content going to a place I don’t control”, with “lack of opt-in” and “this violates the GDPR” being the main reasons cited against it.

            With Threads it’s the same thing. The users asking their admins to block Threads were not worried about Threads pushing too much content to the smaller instances; they wanted to block Threads from mining data from the Fediverse for its own profit.

            • Blaze (he/him)@feddit.org · 1 point · 3 months ago

              I wouldn’t be so sure; a lot of people pointed out that the privacy argument wasn’t really an argument, since everything is publicly accessible anyway.

              • rglullis@communick.news · 1 point · 3 months ago

                Yeah, lots of people were trying to point that out, but those people were not the ones screaming at snarfed. It was the “mah privacy” crowd that was panicking at the thought of data being available and searchable on a server outside their own.

                • TimLovesTech (AuDHD)(he/him)@badatbeing.social · 1 point · 3 months ago

                  I think what you are talking about is instances that may have a large population of marginalized groups, and the fear that someone is creating a database that could be used to easily seek them out for trolling and the like. Which I think is a very valid concern.

                  And as mentioned above, you have the crowd that wants to take an instance and hand all of its posts over to for-profit corporations like Threads and Bluesky, which IMHO should not even be called part of the fediverse.

                  I don’t know how you make a global search for the fediverse that avoids both of those issues though.

                  • rglullis@communick.news · 1 point · edited · 3 months ago

                    marginalized groups, and the fear that someone is creating a database that could be used to easily seek them out for trolling and the like.

                    The fear might be justified. I don’t question that the issue exists; what I question is the belief that they can stop it.

                    Let me repeat: there is no real privacy in any social network. If people are genuinely afraid of being targeted because of what they write online, the solution is not to give them a false sense of privacy, but to educate and empower them to use messaging platforms that are provably secure.

                    Those telling marginalized folks to use instance XYZ because “it doesn’t federate with Threads and therefore is safe” think they are being helpful, but in reality they are putting those people at even more risk: telling them all to concentrate in the same place makes targeted tracking even easier for malicious actors.