Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?

Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.

  • deffard@lemmy.world · 2 months ago

    The author demonstrated that the challenge can be solved in 17 ms, however, and it only has to be solved once every 7 days per site. That works out to less than a second of compute time per site to send unlimited requests 365 days a year.

    The deterrent might work temporarily, until the challenge pattern is recognised, but there’s no actual protection here, just obscurity. The downside is real, however, for a user on an old phone who has to wait 30 seconds, or, like the blogger, a user of a text browser without JavaScript. The very need to support that old phone is what defeats any approach based on compute power: whatever the phone can handle is a trivial amount for a data centre.
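
    For scale, here’s a back-of-the-envelope sketch in Python, using only the figures from the post (17 ms per solve, 7-day validity); the one-million-site count is an arbitrary assumption for illustration:

    ```python
    # Crawler-side cost of the proof-of-work deterrent, using the figures
    # quoted above: ~17 ms to solve a challenge whose token stays valid
    # for 7 days per site. The number of sites is a made-up assumption.
    SOLVE_TIME_S = 0.017
    TOKEN_LIFETIME_DAYS = 7
    SITES = 1_000_000

    solves_per_site_per_year = 365 / TOKEN_LIFETIME_DAYS            # ~52
    cpu_seconds_per_site = solves_per_site_per_year * SOLVE_TIME_S  # ~0.89 s
    total_cpu_hours = SITES * cpu_seconds_per_site / 3600           # ~246 h

    print(f"{cpu_seconds_per_site:.2f} CPU-seconds per site per year")
    print(f"{total_cpu_hours:.0f} CPU-hours per year across {SITES:,} sites")
    ```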

    • Scrubbles@poptalk.scrubbles.tech · 2 months ago

      That’s counting on one machine using the same cookie session continuously, or on them coding up a way to share the tokens across machines. That’s not how the bot farms work.
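
      For what that token sharing would even look like, here’s a hypothetical sketch (not anything these crawlers are known to run): workers keep one solved cookie per site in a shared store, so only the first worker to hit a site pays the solve cost. The function names, cookie name, and the Redis setup are assumptions for illustration.

      ```python
      # Hypothetical: crawler workers share solved proof-of-work tokens per
      # site through a common Redis cache instead of re-solving on every
      # machine. solve_challenge() and fetch() are placeholder stubs.
      import redis

      store = redis.Redis(host="shared-cache", port=6379)

      def solve_challenge(site: str) -> str:
          # Placeholder for actually running the site's JavaScript challenge.
          return f"solved-token-for-{site}"

      def fetch(url: str, cookies: dict) -> None:
          # Placeholder for an HTTP request that carries the solved cookie.
          pass

      def get_token(site: str) -> str:
          cached = store.get(f"pow:{site}")
          if cached:
              return cached.decode()
          token = solve_challenge(site)                      # ~17 ms of CPU
          store.set(f"pow:{site}", token, ex=7 * 24 * 3600)  # 7-day validity
          return token

      def crawl(site: str, url: str) -> None:
          fetch(url, cookies={"pow_token": get_token(site)})
      ```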