Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?

Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.

  • pipe01@programming.dev · 2 months ago

    I wouldn’t be surprised if most bots just don’t run any JavaScript so the check always fails
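    The check the comment refers to could work roughly like this (a minimal sketch with hypothetical names, not any particular site's implementation): the server embeds a script that sets a cookie, and clients that never execute JavaScript, which includes most simple crawlers, never send that cookie back.

    ```python
    # Hypothetical sketch of a JavaScript-based bot check.
    # The page embeds a script that sets a cookie in the browser;
    # a crawler that fetches HTML without running JS never sets it.
    CHALLENGE_SNIPPET = '<script>document.cookie = "js_ok=1";</script>'

    def classify_request(cookies: dict) -> str:
        """Requests lacking the JS-set cookie are treated as likely bots."""
        return "human" if cookies.get("js_ok") == "1" else "likely-bot"

    # A browser that ran the snippet sends the cookie back on the next request:
    print(classify_request({"js_ok": "1"}))  # human
    # A crawler that ignored the script sends no cookie:
    print(classify_request({}))              # likely-bot
    ```

    As the comment suggests, a non-JS crawler always lands in the "likely-bot" branch, so the check "fails" for it every time.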