The investigation looked at what content from political parties and politicians is shown to newly created accounts set up to appear interested in either left-leaning or right-leaning parties and politicians.

It found that the algorithms push content from the far right to the right-leaning and left-leaning accounts and, to a lesser extent, push content from the radical left to left-leaning accounts.

  • Diplomjodler@lemmy.world · 4 hours ago

    Algorithmic content is not bad per se. It’s the algorithms that are specifically designed to create division and hate. The people who design these things know what they’re doing, and they do it because those are the outcomes they want to achieve.

    • TubularTittyFrog@lemmy.world · 2 hours ago

      They aren’t designed to do that.

      They are designed to give people what they want. People want division and hate. It fuels their attention.

      In the 1980s, the people with the longest listening times for Howard Stern were the ones who claimed to hate him. The outrage held their attention; they couldn’t turn away because they wanted to see if he would get even more outrageous and offensive.

      People are like kids in a candy store when it comes to negative content. They cannot get enough.

      Before the algorithms, feeds were more or less random and people had to try really hard to go find that stuff. The algorithms spoonfed it to them and they lapped it up eagerly. Back when IG was just my friends, I saw one angry rant every few weeks; today it’s 20 angry rants and 20 ads, and my friends’ posts about their kid or cat being cute are buried under 50 other posts I can’t get rid of.