I have seen some critical views on Nostr as a part of decentralized network discussions, but most seem to be focused on culture not function.
What are the functional / protocol differences that make you prefer ActivityPub over Nostr?
This is a good thing: bitchasses need to learn words are harmless & they can ignore them like humanity has done for millennia. It’s not built into the servers. Client-side tooling would handle it, so it’s entirely at the discretion of the user, which seems better to me.
Client-side curation sounds like whitelisting, effectively: if you follow only the curated feed, and that feed re-signs all events posted by selected keys, that’s a whitelist. It seems like a decent solution for casual users, so long as they can find trusted curators and clients that make those curators easy to discover and subscribe to. What client is best for this currently?
On the flipside, if those “curators” could export and import lists of keys to automatically exclude from feeds, that would be very useful for curators who have to sort events and new users, manually or automatically, to build their feeds. Is that feature currently available? Excluding known bot accounts from feeds seems like the minimum viable feature set for new curators in the current state of play. A rough sketch of what I mean is below.
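Roughly what I’m picturing (a hypothetical sketch: the event fields follow Nostr’s basic event shape, but the import/export format is just invented JSON, not any standard I know of):

```typescript
// Hypothetical sketch: sharing a blocklist of author pubkeys between curators
// and filtering a feed against it, entirely client-side.

interface NostrEvent {
  id: string;      // event id (hex)
  pubkey: string;  // author public key (hex)
  kind: number;
  content: string;
}

// A curator exports their blocklist as a plain JSON array of hex pubkeys...
function exportBlocklist(blocked: Set<string>): string {
  return JSON.stringify([...blocked]);
}

// ...and another curator imports it, merging it into their own set.
function importBlocklist(json: string, into: Set<string>): Set<string> {
  for (const pubkey of JSON.parse(json) as string[]) into.add(pubkey);
  return into;
}

// The client drops any event authored by a blocked key before display.
function filterFeed(events: NostrEvent[], blocked: Set<string>): NostrEvent[] {
  return events.filter((ev) => !blocked.has(ev.pubkey));
}
```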
It’s not about that, even if you reduce it to purely that. Without moderation, every platform becomes infested with spammers, trolls, and astroturfers. Topical communities lose focus and become little more than hashtags.
Yup, better. Moderation should be opt-in & is better handled at the client: the user could opt in to a “moderation community” that publishes tags their client would follow. Such curation, for anyone who wants it, is a better idea. Far better than moderators we don’t get to choose.
So in your world, I could spam the network with CSAM, gore, rape, and everything else and it would be up to a small group of people to filter that out for the rest so they can subscribe to what that small group thinks is appropriate?
Why is that better?
Debatable, given that any community anywhere online still needs to remove CSAM, gore, and other such material.
And what do you mean by “tags” here? Just hashtags, or something else? Because a hashtag under a no-moderation scheme could still be hijacked.
Well I suggest you go there then, because the fediverse will never be what you want it to be.
User control & flexibility > illegitimate authority. Also, I remember an earlier, untamed, unrulier, more subversive internet than this corporate-friendly crap: it was funner.
Legal compliance is different from legally unnecessary moderation.
Not really: Nostr content is cryptographically signed. The user’s client subscribes to content curators, who publish their tags for other events as signed events of their own. The client processes these tagging events to filter the feed according to the user’s preferences.
Some proposals already exist:
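NIP-32 (“Labeling”), for instance, defines kind 1985 label events. A rough sketch of how a client might consume them (the helper names, the trusted-curator set, and the hide policy are my own illustration, not part of the proposal):

```typescript
// Rough sketch: hiding feed events that trusted curators have labeled.
// Assumes NIP-32-style label events (kind 1985), which carry
// ["l", <label>, <namespace>] tags and ["e", <event-id>] tags pointing
// at the content being labeled. The filtering policy below is made up.

interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  tags: string[][];
  content: string;
}

const LABEL_KIND = 1985;

// Labels the user has opted in to hiding, e.g. via a “moderation community”.
const hiddenLabels = new Set(["spam", "gore"]);
// Curators the user chose to trust, identified by their pubkeys.
const trustedCurators = new Set(["<curator-pubkey-hex>"]);

// Collect ids of events that trusted curators gave a hidden label.
function collectHiddenIds(labelEvents: NostrEvent[]): Set<string> {
  const hidden = new Set<string>();
  for (const ev of labelEvents) {
    if (ev.kind !== LABEL_KIND || !trustedCurators.has(ev.pubkey)) continue;
    const labels = ev.tags.filter((t) => t[0] === "l").map((t) => t[1]);
    if (!labels.some((l) => hiddenLabels.has(l))) continue;
    for (const t of ev.tags) if (t[0] === "e") hidden.add(t[1]);
  }
  return hidden;
}

// The feed is filtered entirely on the client; relays are untouched.
function applyLabels(feed: NostrEvent[], labelEvents: NostrEvent[]): NostrEvent[] {
  const hidden = collectHiddenIds(labelEvents);
  return feed.filter((ev) => !hidden.has(ev.id));
}
```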
Not the topic of discussion, which is function & protocol.
Right, so there would still need to be moderators. And if they can remove that, they can remove anything.
Right, but you’re just gonna have an unpleasant time here if you loathe all moderation. It’s that simple.
That’s why there’s no limit on the number of relays (events are usually published to multiple) or on subscriptions to them (clients usually subscribe to multiple).
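A minimal sketch of that fan-out (the relay URLs are placeholders; the wire frames are Nostr’s standard REQ/EVENT messages, and deduplicating by event id is just the obvious client-side approach):

```typescript
// Minimal sketch: the same subscription sent to several relays, with the
// client deduplicating events by id, since the same signed event often
// arrives from more than one relay.

const relays = ["wss://relay.example.one", "wss://relay.example.two"];
const seen = new Set<string>();

for (const url of relays) {
  const ws = new WebSocket(url);
  ws.onopen = () => {
    // Ask each relay for the latest 50 text notes (kind 1).
    ws.send(JSON.stringify(["REQ", "feed", { kinds: [1], limit: 50 }]));
  };
  ws.onmessage = (msg) => {
    const [type, _subId, event] = JSON.parse(msg.data.toString());
    if (type === "EVENT" && !seen.has(event.id)) {
      seen.add(event.id);
      console.log(event.content);
    }
  };
}
```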
Still off topic, and the moderators here aren’t reddit moderator scum so far. The modlog offers decent transparency.
I’m lost. If a mod removes child porn on Nostr, it removes it for everyone, right?
Well, sure, but you can still be moderated.
No moderators. A relay operator can remove it from their relay, and there are any number of relays. It’s roughly like Usenet with respect to server-side content removal.
Strange to ask me when authoritative information is public online. Seems you don’t know much about Nostr.