I was just interested to see how much had changed since December when I was randomly banned with no reason given and the appeal went unheard.
I was just planning to browse without logging in, but it immediately popped up a prompt to create an account.
Curious whether it would let me, I submitted the application, and it went through without a problem.
I didn’t want to draw too much attention to myself, so I didn’t comment or post for the first month.
Then I made another mistake: I got invested again, and it didn’t take long before things inevitably became political.
I posted a reply to someone who had clearly been chugging the orange kool-aid. Nothing terrible, just what I thought was objective reality, easily verifiable through past reporting from many credible media outlets.
But reddit is going to reddit. Somehow objective reality has been classified as hate speech on reddit.
I’m not surprised. Just a little disappointed.
I appealed; I do not expect a reply.
Previously, I had been on reddit for 10 years without much incident.
This time I lasted only two months.
Huh, interesting that you can say the most vile shit over there, and as long as it’s directed at the “correct” people you won’t get banned.
The same thing will happen here if we’re not vigilant. Here’s the problem:
Conservatives love trolling, and they don’t care if they break every rule to push their agenda. Progressives do not love trolling as much.
A troll is out to start a fight and cause problems. This means that conservatives are out there in every subreddit/community trolling. One way to troll is to become an expert at reporting things: read all the comments, looking for anything that could get someone banned. If trolls don’t find something that can get their targets banned, then they join the conversation. They can either pretend to agree with their targets and try to bait them into saying something that’s technically against the rules, or they can argue with them, trolling just within the bounds of the rules, in the hope that the targets retaliate outside of the rules and can then be reported.
Meanwhile, progressives are less likely to troll. They’re not looking for reasons to report. They’re trying to discuss the issues, not abuse the system.
And conservative moderators are going to ban people for reporting things if they detect political disagreement, which makes it harder to bother reading and participating at all. Meanwhile, normal moderators will judge based on the actual reports.
They are also known to brigade and mass-report someone, plus a lot of them are propaganda bots too.
I like how reddit made their “violations” more vague; “identity-based hate” can mean anything, lol. I’m guessing that now Republican-promoted misinformation counts as protected speech, making it easier for everyone else to be banned.
reddit allows heavy astroturfing by the right, drowning out left-leaning comments.
Also, I once received a message from a Reddit admin, and I thought their grasp of the English language was… tenuous. It wouldn’t surprise me if they offshore this work, and if the people doing it don’t speak English very well, then there’s no chance they’ll make good decisions about violations.
AI moderation gives them plausible deniability, so they can ignore most people’s ban appeals. A lot of their responses are most likely AI-generated too.
We don’t love trolling as much because we have actual jobs.
It’s not just a matter of time, but a matter of mental fortitude.
I’m sure anyone could find tons of content to get conservatives banned, but that requires reading conservative content.
You know, this might actually be a good use case for everyone’s least favorite tech… AI. You could find bannable content without having to wade through all of the surrounding mind-numbing garbage.
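Something like an off-the-shelf zero-shot classifier could do the pre-screening, so a person only ever reads the comments that get flagged. Rough sketch below; the model, labels, and threshold are just placeholders I picked for illustration, not a vetted moderation setup:

```python
# Rough sketch: pre-screen comments with a zero-shot classifier so a
# human only reads the ones flagged as likely rule violations.
# Model, labels, and threshold are placeholders, not a recommendation.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# Hypothetical moderation labels; swap in whatever taxonomy you care about.
labels = ["hate speech", "harassment", "ordinary political disagreement"]

comments = [
    "I disagree with this policy, and here's some reporting that says why.",
    "People like you don't deserve to exist.",
]

for comment in comments:
    result = classifier(comment, candidate_labels=labels)
    top_label, top_score = result["labels"][0], result["scores"][0]
    # Only surface comments the model is reasonably confident are violations,
    # so nobody has to wade through the rest.
    if top_label != "ordinary political disagreement" and top_score > 0.7:
        print(f"flag for human review ({top_label}, {top_score:.2f}): {comment}")
```

Obviously a model like that will misfire sometimes, so it should only queue things up for a human to look at, never report or ban anything on its own.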
I’m really looking forward to post-AI-bubble tech, because it’s going to be really interesting to see all of the creative use cases for machine learning where it actually works and does well at what it’s meant to do. Machine learning had been a mostly academic field, and the sudden crazy amounts of money being thrown around for it over the last three years are absolutely insane even without that context. Now there are tons of nice tools for working with AI models that didn’t exist before, and tons of new educational material that didn’t exist previously. It’s going to get really interesting for sure!
I knew someone pre-pandemic who wanted to get into AI research (machine learning, I assume, back then). I discussed it with him, but he has a degree in another field, and I told him to just get a job in tech right off the bat, since he has a lot of programming experience, instead of wasting his time trying to get a master’s. Plus, universities often don’t allow a second bachelor’s, which he’s trying to get for some reason.