I did read the links, and I still strongly feel that no automated mechanical system of weights and measures can outperform humans when it comes to understanding context.
But this is not a way to replace humans; it’s just a method to grant users moderation privileges based on their tenure on a platform. Currently, most federated platforms offer only two moderation roles, moderator and admin, which makes running an instance tedious because of the time spent managing the report inbox. Automating the assignment of moderation levels would streamline this, letting admins simply adjust the trust level of select users to customize their instance as desired.
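To make that concrete, here’s a rough sketch of what automated assignment could look like (the thresholds, the level meanings, and the `manual_override` escape hatch are all made up for illustration, not taken from any existing platform):

```python
from datetime import datetime, timedelta

# Hypothetical thresholds -- purely illustrative, not from any real platform.
TRUST_LEVELS = [
    (0, timedelta(days=0)),    # new user: can post, everything else is reviewed
    (1, timedelta(days=14)),   # basic: their reports start carrying some weight
    (2, timedelta(days=90)),   # member: can hide obvious spam pending review
    (3, timedelta(days=365)),  # regular: can handle routine reports
]

def auto_trust_level(joined: datetime, manual_override: int | None = None) -> int:
    """Pick the highest level whose tenure requirement the user meets.
    Admins can still pin a user to a specific level via manual_override."""
    if manual_override is not None:
        return manual_override
    tenure = datetime.now() - joined
    return max(level for level, required in TRUST_LEVELS if tenure >= required)

# Example: a user who joined 100 days ago lands at level 2.
print(auto_trust_level(datetime.now() - timedelta(days=100)))
```

The point is that admins only intervene for the exceptions; the defaults take care of everyone else.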
There has to be a way to federate trust levels; otherwise none of this is applicable to the fediverse. One of the links I posted discusses how to federate trust levels. As for appeals: an appeal is processed by a user with a higher trust level than the moderator who made the original decision.
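Something like this is what I have in mind for routing appeals (the `User` and `Appeal` shapes and their field names are hypothetical; actually federating the levels between instances is the hard part that the link goes into):

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    instance: str      # home instance, e.g. "lemmy.ml"
    trust_level: int   # as vouched for by their home instance

@dataclass
class Appeal:
    action: str        # e.g. "comment removal"
    decided_by: User   # the user whose decision is being appealed

def eligible_reviewers(appeal: Appeal, users: list[User]) -> list[User]:
    """An appeal can only be handled by someone with a strictly higher
    trust level than the person who made the original decision. Each
    instance vouches for its own users' levels, so a remote instance
    could discount levels coming from instances it doesn't trust."""
    return [u for u in users if u.trust_level > appeal.decided_by.trust_level]

mod = User("alice", "lemmy.ml", 3)
pool = [User("bob", "beehaw.org", 2), User("carol", "lemmy.world", 4)]
print([u.name for u in eligible_reviewers(Appeal("comment removal", mod), pool)])
# -> ['carol']
```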
A system like this rewards frequent shitposting over slower qualityposting. It is also easily gamed by organized bad-faith groups. Imagine if this were Reddit and T_D users just gave each other high trust scores, valuing their own contributions over more “organic” posts.
You are just assuming that this would work like Reddit karma. I don’t know why you would assume the worst possible implementation just so you can complain about it. If you had read the links, you would know that shitposting wouldn’t help much, because what contributes most to Trust Levels in Discourse is reading posts.
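For reference, here’s roughly how that works. The thresholds below are loosely modeled on Discourse’s defaults (they’re configurable per site, so treat the exact numbers as placeholders):

```python
# Illustrative thresholds loosely modeled on Discourse's configurable
# defaults for trust levels 1 and 2 -- placeholders, not gospel.
REQUIREMENTS = {
    1: {"topics_entered": 5,  "posts_read": 30,  "minutes_reading": 10},
    2: {"topics_entered": 20, "posts_read": 100, "minutes_reading": 60},
}

def trust_level(stats: dict[str, int]) -> int:
    """Highest level whose reading requirements are all met. Nothing
    here counts posts *written*, so churning out shitposts doesn't
    move the needle."""
    level = 0
    for lvl, req in sorted(REQUIREMENTS.items()):
        if all(stats.get(k, 0) >= v for k, v in req.items()):
            level = lvl
    return level

# A prolific poster who barely reads stays at level 0:
print(trust_level({"posts_written": 500, "posts_read": 12, "topics_entered": 3}))
```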
Having AGI moderators would be a futuristic dream come true. Until that becomes a reality, though, it’s crucial to consider the well-being of human moderators who are exposed to disturbing content like CSAM and graphic images. I believe moderators should be able to lower their own moderation level so they can opt out of seeing such content.
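Something as simple as a self-set exposure ceiling would do it. A minimal sketch, with made-up category tiers:

```python
from dataclasses import dataclass

# Hypothetical severity tiers for report categories -- purely illustrative.
SEVERITY = {"spam": 0, "harassment": 1, "graphic_violence": 2, "csam": 3}

@dataclass
class Moderator:
    name: str
    max_severity: int  # self-chosen ceiling; lowering it is always allowed

def queue_for(mod: Moderator, reports: list[str]) -> list[str]:
    """Only hand a moderator reports at or below the exposure ceiling
    they set for themselves."""
    return [r for r in reports if SEVERITY[r] <= mod.max_severity]

mod = Moderator("dana", max_severity=1)  # opted out of the worst content
print(queue_for(mod, ["spam", "csam", "harassment"]))  # -> ['spam', 'harassment']
```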
Why would anyone contribute? Would you pay someone to work for you if they refused to listen to anything you have to say? When they close issues without allowing the community to provide input, that’s exactly what they are doing. If they were simply too busy to engage with the issue tracker, I wouldn’t mind. But when they show up only to close issues that have numerous upvotes and no downvotes, it frustrates me.
I think an appeal process that can sanction moderators who abuse their power would help with that.
You are probably thinking of StackExchange; I don’t see anybody mentioning popularity when talking about Discourse. It’s a matter of doing it like Discourse, not like StackExchange.
I have the same right they have: to ask for contributions and support for something they are going to do their way.
I don’t really care all that much about any particular issue. I enjoy taking ideas suggested by others in the fediverse and turning them into new issues, since many people don’t take that initiative themselves.