Calling everything censorship is confusing and a waste of time. Using the term in its broadest sense leaves so much room for confusion that you'd be better off using a different word, and that may just be the case at this point. For example: a government putting you in jail for saying you hate war is censorship. X removing child pornography is censorship in the broadest sense, but it's also removal of illegal and immoral material.

Or say someone on X posts a video detailing how much they hate some particular race and want them all to die. Absent a specific threat, that isn't necessarily illegal in the US, but that doesn't mean it isn't a violation of the service's terms of use or moral standards. Removing it is censorship in a broad sense, but it's the platform's right, and I'd guess a large portion of X's users and customers would prefer not to be associated with such content.

I prefer to reserve "censorship" for the context of force, such as government force against pure speech or thought. The lines are a lot clearer that way: private property versus government force.

Nostr clients and relays should absolutely be able to filter certain objectionable content. I would not use Nostr if I had to be subjected to whatever anyone anywhere wanted to post. People who want an unfiltered feed can run their own relay and client, and that's the freedom aspect to me. I don't value the freedom to subject anyone to anything at any time; that isn't what free speech even means. And I don't think telling someone to simply not use Nostr is a strategic response. That's a great way to go nowhere fast and end up associated with something that becomes known as a place for child pornography creators and the like to hang out. Good luck with that.

What gets filtered will always be debated, and in my opinion Nostr is beautiful because we can all choose the level of filtration that's right for us. I think this issue will become a bigger focus as more people join; it has to. I've already seen things on certain clients that were annoying, and other things that were absolutely disgusting and immoral. I would have already dipped if there had been nothing I could do about it as a user. Some baseline safety is going to be required for most people to comfortably use Nostr, and that means there will always be some "censorship" involved. It isn't always bad, and having it as an option isn't a bad thing either. What is bad is having no control as a user one way or the other, which is what I think Nostr will actually solve.
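(To make the "choose your own level of filtration" point concrete: a Nostr client can apply a user-chosen policy locally before rendering anything, without deleting a single event from any relay. Below is a minimal, illustrative TypeScript sketch. The event shape follows NIP-01, but the `FilterPolicy` type and the `shouldDisplay` helper are hypothetical names invented for this example, not any particular client's API.)

```typescript
// Minimal NIP-01 event shape.
interface NostrEvent {
  id: string;
  pubkey: string;
  created_at: number;
  kind: number;
  tags: string[][];
  content: string;
  sig: string;
}

// Hypothetical user-chosen policy: each user picks their own filtration level.
interface FilterPolicy {
  mutedPubkeys: Set<string>; // authors this user never wants to see
  blockedWords: string[];    // case-insensitive content terms to drop
  allowedKinds: Set<number>; // event kinds this client renders at all
}

// Decide locally whether to render an event. Nothing is removed from any
// relay; the event simply isn't shown to *this* user.
function shouldDisplay(event: NostrEvent, policy: FilterPolicy): boolean {
  if (!policy.allowedKinds.has(event.kind)) return false;
  if (policy.mutedPubkeys.has(event.pubkey)) return false;
  const text = event.content.toLowerCase();
  return !policy.blockedWords.some((w) => text.includes(w.toLowerCase()));
}

// Example: a strict policy and a permissive one can run over the same feed.
const strict: FilterPolicy = {
  mutedPubkeys: new Set(["<hex pubkey of a muted author>"]),
  blockedWords: ["spam phrase", "another blocked term"],
  allowedKinds: new Set([0, 1]), // profiles and text notes only
};
```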

Replies (1)

banjo 1 year ago
So... Let's suppose a relay operator "filters" out spam--users are happy, and they sign up for that relay. And it prospers--in fact, it becomes one of the most utilized relays on Nostr... And then let's suppose that relay operator at some point in the future decides that "Hunter's Laptop" is disinformation, and for the good of his users he begins "filtering" that...but...he never tells his users. You can (of course) think of many such examples. And yes, in this model what and how to "filter" becomes the choice of each relay operator...and yet (if so) it also then becomes the RESPONSIBILITY of those relay operators to act altruistically, and to not become individual arbiters of truth. This then becomes the proverbial "slippery slope"... And while advocates would say "I'd never censor 'Hunter's Laptop'", the unfortunate truth is (likely) that some relay operators will be tempted to inject their own biases into their relays. How will relay users then know what's being "filtered" (censored) by the relay operators? Or will users need to blindly trust those operators to not censor something else? And isn't that exactly what happened with Facebook and Twitter (and why Nostr was "born" in the first place)?
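(One partial, existing answer to the transparency question: NIP-11 lets a relay publish a machine-readable information document over HTTP when queried with the `Accept: application/nostr+json` header. It can't prove a relay isn't silently filtering, but it does give operators a standard place to state their policy and users a place to check it. A rough sketch, assuming only the documented NIP-11 request convention; the fields shown are defined in NIP-11, but any given relay may omit them, and the relay URL here is a placeholder.)

```typescript
// Fetch a relay's NIP-11 information document: an HTTPS GET to the relay's
// websocket address, with the application/nostr+json Accept header.
async function fetchRelayInfo(relayUrl: string): Promise<any> {
  const httpUrl = relayUrl.replace(/^wss?:\/\//, "https://");
  const res = await fetch(httpUrl, {
    headers: { Accept: "application/nostr+json" },
  });
  if (!res.ok) throw new Error(`Relay info request failed: ${res.status}`);
  return res.json();
}

// Usage: inspect what the operator *claims* before trusting the relay.
// (Whether the document is honest still depends on the operator.)
fetchRelayInfo("wss://relay.example.com").then((info) => {
  console.log(info.name, info.description);
  // posting_policy is an optional NIP-11 field: a URL to a human-readable policy.
  console.log("Stated posting policy:", info.posting_policy);
});
```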