I have seen some critical views on Nostr in decentralized-network discussions, but most seem to be focused on culture, not function.

What are the functional / protocol differences that make you prefer ActivityPub over Nostr?

  • lmmarsano@lemmynsfw.com · 16 hours ago (edited)

    Why is that better?

    User control & flexibility > illegitimate authority. Also, I remember an earlier, untamed, unrulier, more subversive internet than this corporate-friendly crap: it was funner.

    any community anywhere online still needs to remove CSAM and gore and other things.

    Legal compliance is different from legally unnecessary moderation.

    Because a hashtag under a no-moderation concept could still be hijacked.

    Not really: Nostr content is cryptographically signed. A user’s client can subscribe to content curators, who publish their labels for other events as signed events of their own. The client processes these labeling events to filter content according to the user’s preferences.

    Some proposals already exist:
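    To illustrate (my own sketch, not necessarily the specific proposals being referenced): NIP-32 defines label events of kind 1985 that curators sign and publish like any other event, and a client can match those labels against the curators a user has opted into. The names, labels, and thresholds below are assumptions for the example.

    ```typescript
    // Minimal sketch of client-side filtering from curator label events (NIP-32, kind 1985).
    // The choice of labels and all function names are illustrative assumptions.

    interface NostrEvent {
      id: string;
      pubkey: string;      // author's public key (hex)
      kind: number;
      tags: string[][];
      content: string;
      created_at: number;
      sig: string;
    }

    // Collect ids of events that trusted curators have labeled with any blocked label.
    function blockedEventIds(
      labelEvents: NostrEvent[],
      trustedCurators: Set<string>,
      blockedLabels: Set<string>,
    ): Set<string> {
      const blocked = new Set<string>();
      for (const ev of labelEvents) {
        if (ev.kind !== 1985) continue;                 // only NIP-32 label events
        if (!trustedCurators.has(ev.pubkey)) continue;  // only curators the user opted into
        const labels = ev.tags
          .filter(([name]) => name === "l")
          .map(([, value]) => value);
        if (!labels.some((l) => blockedLabels.has(l))) continue;
        for (const [name, value] of ev.tags) {
          if (name === "e") blocked.add(value);         // labeled target events
        }
      }
      return blocked;
    }

    // Drop labeled events before rendering; signature verification is assumed to happen elsewhere.
    function filterTimeline(timeline: NostrEvent[], blocked: Set<string>): NostrEvent[] {
      return timeline.filter((ev) => !blocked.has(ev.id));
    }
    ```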

    the fediverse will never be what you want it to be

    Not the topic of discussion, which is function & protocol.

    • Skavau@piefed.social · 18 hours ago

      Legal compliance is different from legally unnecessary moderation.

      Right, so there would still need to be moderators. And if they can remove that, they can remove anything.

      Not the topic of discussion, which is function & protocol.

      Right, but you’re just gonna have an unpleasant time here if you loathe all moderation. It’s that simple.

      • lmmarsano@lemmynsfw.com · 18 hours ago (edited)

        And if they can remove that, they can remove anything.

        That’s why there’s no limit on the number of relays (events are usually published to multiple relays) or on subscriptions to them (clients usually subscribe to multiple).
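        A rough sketch of what that looks like on the wire, using the NIP-01 message format over plain WebSockets (the relay URLs and the browser-style WebSocket global are assumptions here): removing an event from one relay leaves the copies held by every other relay untouched.

        ```typescript
        // Sketch only: the same signed event can be published to any number of relays,
        // and a client can hold subscriptions to any number of relays at once.
        // Relay URLs are placeholders; no reconnects, error handling, or closing is shown.

        const relays = [
          "wss://relay.example.one",
          "wss://relay.example.two",
          "wss://relay.example.three",
        ];

        function publishToAll(signedEvent: object): void {
          for (const url of relays) {
            const ws = new WebSocket(url);
            ws.onopen = () => ws.send(JSON.stringify(["EVENT", signedEvent]));
          }
        }

        function subscribeToAll(onEvent: (ev: object) => void): void {
          for (const url of relays) {
            const ws = new WebSocket(url);
            ws.onopen = () =>
              ws.send(JSON.stringify(["REQ", "sub-1", { kinds: [1], limit: 50 }]));
            ws.onmessage = (msg) => {
              const [type, , event] = JSON.parse(msg.data);
              if (type === "EVENT") onEvent(event); // the same note may arrive from several relays
            };
          }
        }
        ```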

        unpleasant time here if you loathe all moderation

        Still off topic, and the moderators here aren’t reddit moderator scum so far. The modlog offers decent transparency.

        • Skavau@piefed.social · 11 hours ago

          That’s why there’s no limit on the number of relays (events usually publish to multiple) or subscriptions to them (clients usually subscribe to multiple).

          I’m lost. If a mod removes child porn on NOSTR, it removes it for everyone, right?

          Still off topic, and the moderators here aren’t reddit moderator scum so far.

          Well, sure, but you can still be moderated.

          • lmmarsano@lemmynsfw.com · 10 hours ago

            I’m lost. If a mod removes child porn on NOSTR, it removes it for everyone, right?

            No moderators. A relay operator can remove it from their relay, and there can be any number of relays. It’s roughly like Usenet with respect to server-side content removal.

            Strange to ask me when authoritative information is public online. Seems you don’t know much about nostr.

            • Skavau@piefed.social · 10 hours ago

              So the child porn still remains present, effectively.

              So Ada described Nostr like this:

              "Nostr uses relays. In some ways, a relay is like an instance on the fediverse. Where they differ though, is that a) relays don’t talk to each other and b) users can sign up to many different relays and pull/push content to all of them.

              So in practice, in order to see a wide amount of content, you need to end up connecting to multiple relays. And even though a relay does have some moderation capabilities to block content, unless every relay you use blocks the content from the bigoted account, you’ll see it.

              If you signed up only to a single relay, and that relay had good moderation, then in theory, your Nostr experience wouldn’t be terrible, but a single niche relay like that will mean you see basically no content. And as soon as you connect to a larger public relay to get more content, you lose all of the moderation advantages offered by your first instance. Which means in practice, there is no incentive to run a well moderated instance.

              And so all of the moderation ends up on the end user, who has to manually block accounts only after they appear and dump their load of hate (at which point, the bigot will just spin up another account). Some people prefer that experience, but when you’re the regular target of hate, that approach just doesn’t work for many folk."

              This is accurate?

              • lmmarsano@lemmynsfw.com · 5 hours ago (edited)

                So the child porn still remains present, effectively.

                Compulsory legal compliance still exists; beyond that, it’s the free, open internet. Did you know laws existed before, too?

                Ada

                Don’t know about that. I think the brief descriptions on websites like nostr.com did a good enough job: there’s not much to get.

                It’s a protocol, not a platform. There’s no global moderation/censorship, just as there isn’t on the internet as a whole. Relay operators have full discretion over the content available on their relays: if they want to do more than the bare minimum, they can. Clients are free to subscribe to other relays, or to several at once. Technically it’s free association rather than anti-moderation.

                A user can choose to see only the content of followed users: that should eliminate most unwanted content. Apart from that, there’s no perfect moderation solution even on centralized platforms, so there isn’t one here either.

                Client-side filtering remains the best approach for those who care, and it doesn’t have to be manual, as I mentioned before.
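                For example (names here are illustrative, and this assumes the standard NIP-02 contact list and NIP-01 filters), a follows-only feed is just a subscription whose authors field is the user’s follow list:

                ```typescript
                // Sketch of a follows-only feed: read the user's follow list (NIP-02, kind 3),
                // then request notes (kind 1) authored only by those pubkeys (NIP-01 filter).
                // Function and variable names are illustrative assumptions.

                interface NostrEvent {
                  id: string;
                  pubkey: string;
                  kind: number;
                  tags: string[][];
                  content: string;
                  created_at: number;
                  sig: string;
                }

                // Extract followed pubkeys from the user's latest contact-list event.
                function followedPubkeys(contactList: NostrEvent): string[] {
                  return contactList.tags
                    .filter(([name]) => name === "p")
                    .map(([, pubkey]) => pubkey);
                }

                // Build a NIP-01 REQ message asking relays for notes from followed authors only.
                function followsOnlyRequest(contactList: NostrEvent): string {
                  const filter = {
                    kinds: [1],                              // text notes
                    authors: followedPubkeys(contactList),   // only people the user follows
                    limit: 100,
                  };
                  return JSON.stringify(["REQ", "follows-feed", filter]);
                }
                ```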

                I recall earlier days of the internet when no one gave a fuck about this, and internet rage was just entertaining, easily ignored nonsense. Then came the Eternal September, and tightass n00bs started acting like moderating the entire internet & foisting their dumbass expectations on everyone made perfect sense, without ever learning the zen of not giving a fuck. That was the start of when it all turned to shit.

                • Skavau@piefed.social · 6 hours ago

                  I think you will find that most forums before Digg and Reddit did have rules. There is some revisionism from muh-free-speech types who seek to redefine them as free speech zones, when many were not.

                  Moreover, many of the clients and apps that existed back then had much smaller userbases.


                  The point about the CP here is that, by your own account, it would still technically remain on the systems you are referring to.