Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple of images, and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.


The robots are a problem, but luckily we’re not into the hepamegaquintogilarillions… Yet.
12,000 visits, with 181 of those going to the robots.txt file, makes way, way more sense, albeit still an abusive number of visits.
I couldn’t wrap my head around how large the misread number really was, or what rate of visits it would take to reach it in 25 days. Turns out that would be roughly a quinquinquagintillion visits per nanosecond. Call it a hunch, but I suspect my server might not handle that.
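For the curious, here’s a quick back-of-the-envelope sketch in Python. It assumes the misread figure really is 12,181 followed by 181 zeros, spread evenly over 25 days, and uses the short-scale “-illion” names; the result lands squarely in quinquinquagintillion territory.

```python
# Back-of-the-envelope check of the misread robot count.
# Assumptions: the misread figure is 12,181 followed by 181 zeros
# (12_181 * 10**181), spread evenly over 25 days, and
# "quinquinquagintillion" is the short-scale 10**168.

misread_visits = 12_181 * 10**181
nanoseconds_in_25_days = 25 * 24 * 60 * 60 * 10**9  # 2.16e15 ns

visits_per_nanosecond = misread_visits // nanoseconds_in_25_days

# Short scale: an n-illion is 10**(3n + 3), so n = 55 gives 10**168.
quinquinquagintillion = 10**168

print(f"{visits_per_nanosecond / quinquinquagintillion:.0f} "
      "quinquinquagintillion visits per nanosecond")
```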