Owner and writer of CovertWiki.org. It’s basically a wannabe spy handbook in wiki format. Feel free to leave a bookmark until more content is released, or message me on Discord under the same username to become a contributor.

  • 7 Posts
  • 57 Comments
Joined 3 months ago
Cake day: July 23rd, 2024

  • A semi-auto rifle ban is also one issue where, if we laid off of it, I believe Republicans would be more willing to play ball with common-sense gun regulation, knowing negotiations weren’t being made in bad faith with an ultimate goal of opening a pathway to banning semi-auto rifles.

    All of what I’ve said is already common knowledge to Republicans, but polls show they are open to things like universal background checks and mandatory licensing. Just not when they feel those things are being used as a buffer for less justifiable regulatory ambitions. The Democratic attempt at voter appeasement, paired with a hardball “all or bust” approach and a low willingness to hold regulatory talks without a semi-auto rifle ban on the table, has been very counterproductive at the federal level.




  • Remind me in ten years, after semi-auto rifles are banned and handguns are next on the chopping block because they [offer some specific advantage to criminals that rifles didn’t]. I could probably sit here and name a good few, but the handgun’s overwhelming majority share of gun crime compared to other weapon types already testifies to the dishonesty of the campaign against semi-auto rifles. A captivating or tragic story coupled with a classic alarmist piñata gets better ratings and speaks much louder than statistics, and as a consequence of effective campaigning, a conversation about banning handguns now would feel like an irrational leap to the public until semi-auto rifles are out of the way first.

    Mass shootings specifically with semi-auto rifles are of the perfect (relative) rarity to make the rounds on national news occasionally, whereas the same coverage of handguns would have to run almost every morning. And once you realize that only about 3% of criminal gun homicides are related to mass shootings, it should be clear that anyone whose concern is public safety should keep their television off when considering how attention and resources should be allocated to the gun violence issue. The conversation about semi-auto rifle bans has put politicians and the media into symbiosis, and truthfully, there’s a chance the ban might not even happen.

    Many of the common sense gun laws proposed are long overdue, but a ban on semi-auto rifles isn’t one of them right now.



  • I didn’t read very far up into the thread. Sorry.

    Automated filters will just drive determined botters to game the system and perfect their craft until they can no longer be automatically identified, in my opinion. My stance is more that accounts should be reviewed manually, so that the leap to convincing bot accounts has to be much more dramatic, and therefore more difficult. If it’s done the hard way from the start, with staff who know how to identify these accounts, it may keep the problem from growing into an issue to begin with.

    Any threshold for being automatically flagged for review should be relatively low, but the review process itself should also be quick and efficient. Adding more metrics to the flagging process only narrows the range of behavior botters have to avoid. Once they start crunching the numbers and streamline mimicking real user accounts, it’s game over.
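    To sketch what I mean (the function, fields, and thresholds here are all invented for illustration, not any real platform API), a deliberately low-bar flag that routes accounts to a human review queue rather than an automated ban could look like:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Account:
        name: str
        posts_per_day: float     # recent average contribution rate
        account_age_days: int
        distinct_forums: int     # how many separate forums it posts in

    def needs_manual_review(acct: Account) -> bool:
        """Cheap, low-threshold flag: any single tell is enough to
        queue the account for a human to look at."""
        score = 0
        if acct.posts_per_day > 20:       # unusually high volume
            score += 1
        if acct.account_age_days < 7:     # brand-new account
            score += 1
        if acct.distinct_forums <= 2:     # very narrow activity
            score += 1
        return score >= 1  # low bar by design; humans make the call
    ```

    The point of keeping the metric list short is exactly the concern above: every extra published signal is one more thing a botter can tune their accounts to sidestep.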


  • Signup safeguards will never be enough because the people who create these accounts have demonstrated that they are more than willing to do that dirty work themselves.

    Let’s look at the anatomy of the average Reddit bot account:

    1. Rapid points acquisition. These are usually new accounts, but they don’t have to be. The posts and comments are often made manually by the seller if the account is being sold at a significant premium.

    2. A sudden shift in contribution style, usually preceded by a gap in activity. The account has now been fully matured to the desired number of points and is pending sale, or is set aside to be “aged”. If the seller hasn’t loaded it with any points, the account is much cheaper, but the activity gap still exists.

    • When the end buyer receives the account, they probably won’t post anything related to what the seller was originally involved in as they set about their own mission, unless they’re extremely invested in the account. Staying active in the old forums becomes much easier if the account is now AI-controlled, but the account suddenly ceases making image contributions and mostly sticks to comments instead. Either way, the new owner is probably accumulating far fewer points than the account was before.
    • A buyer may attempt to hide this obvious shift in contribution style by deleting all the activity from before the account came into their possession, but now they have months of inactivity leading up to the beginning of the account’s contributions and thousands of points unaccounted for.
    3. Limited forum diversity. Fortunately, platforms like this one have a major advantage over platforms like Facebook and Twitter, where propaganda bots can post on their own pages and gain exposure through hashtags without ever interacting with other users or separate forums. On Lemmy, an effective bot has to interact with separate forums to achieve meaningful outreach, and those forums probably have to be manually programmed in. When a bot has one sole objective with a specific topic in mind, it makes heavy and telling use of a very narrow swath of forums. This makes platforms like Reddit and Lemmy less preferred for automated propaganda bot activity, and more preferred for OnlyFans sellers, undercover small-business advertisers, and scammers who do most of the legwork of posting and commenting themselves.
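    A rough sketch of how the first two points-history tells could be scored (all thresholds invented for illustration; real ones would need tuning against actual account data, and forum diversity would be a separate signal):

    ```python
    def resale_suspicion_score(points_by_month: list[int]) -> int:
        """Score a month-by-month points history for the classic
        'matured, parked, then sold' pattern described above."""
        score = 0
        # Tell #1: rapid early points acquisition
        if max(points_by_month[:3], default=0) > 500:
            score += 1
        # Tell #2a: a dormancy gap (consecutive zero-activity months)
        gap = longest_gap = 0
        for p in points_by_month:
            gap = gap + 1 if p == 0 else 0
            longest_gap = max(longest_gap, gap)
        if longest_gap >= 3:
            score += 1
        # Tell #2b: activity resumes after the gap at a far lower level
        recent = points_by_month[-1] if points_by_month else 0
        if longest_gap >= 3 and 0 < recent < max(points_by_month) // 5:
            score += 1
        return score  # 0-3; higher = more like a farmed/resold account
    ```

    An account scoring 2–3 here would go straight to a human review queue, not an automatic ban.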

    My solution? Implement a weighted visual timeline of a user’s points and posts to make it easier for admins to single out accounts that have already been flagged as acting suspiciously. There are other troublesome types of malicious accounts, such as self-run engagement farms that consistently push front-page contributions featuring their own political (or whatever) lean, but the type described above is a major player in Reddit’s current shitshow and is much easier to identify.

    Most important is moderator and admin willingness to act. Many subreddit moderators already know their subreddit has a bot problem but choose to do nothing because it drives traffic. Others are just burnt out and rarely lift a finger to answer modmail, doing the bare minimum to keep their subreddit from being banned.