• xthexder@l.sw0.com · 21 hours ago

    Unfortunately, robots.txt only stops well-behaved scrapers. Even with a disallow-all rule, you'll still get loads of bots. Configuring the web server to block those user agents works a bit better, but even then there are bots out there crawling with regular browser user agents.
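
    To illustrate the two layers described above, here's a disallow-all robots.txt (advisory only, honored just by polite crawlers) and a minimal nginx sketch of server-side user-agent blocking. The bot names are only examples, and as noted, none of this catches bots that spoof a regular browser user agent:

    ```
    # robots.txt -- only well-behaved crawlers obey this
    User-agent: *
    Disallow: /
    ```

    ```nginx
    # nginx: reject requests whose User-Agent matches known scrapers
    # (example names; extend the list as needed)
    map $http_user_agent $blocked_agent {
        default     0;
        ~*GPTBot    1;  # case-insensitive substring match
        ~*CCBot     1;
    }

    server {
        listen 80;

        if ($blocked_agent) {
            return 403;
        }

        # ... normal site config ...
    }
    ```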