This may sound like a weird thing to do, but I realised that many crawlers and bots are somehow still able to get past my Anubis. I presume they have gotten smarter and are capable of using JavaScript.

To counter this, I want to link my Anubis to an Iocane setup such that:

Internet > nginx reverse proxy > Anubis > Iocane > my site/app
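For concreteness, a minimal sketch of what the nginx end of that chain might look like. Everything here is an assumption, not a working config: the ports (8923 for Anubis, 8924 for Iocane) and the server name are placeholders, and you'd still need to point Anubis's upstream target at Iocane and Iocane's at the actual app:

```nginx
# nginx terminates TLS and hands everything to Anubis (assumed port 8923)
server {
    listen 443 ssl;
    server_name example.com;  # placeholder

    location / {
        proxy_pass http://127.0.0.1:8923;  # Anubis listens here (assumption)
        # Forward client info so the downstream filters see real IPs
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

The idea would then be that Anubis forwards passed traffic to Iocane (e.g. on an assumed 127.0.0.1:8924) rather than directly to the app, so only requests that clear the challenge ever reach the tarpit layer.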

My hope is that two different filtering mechanisms (one of which will actively poison and waste the bot's resources) will protect my system better.

I thought I’d ask before actually trying out something like this.

  • nemecle@jlai.lu · 1 day ago

    Yes, bots are starting to get around it, so you need to keep it up to date. But it turned two of my services from inaccessible to usable (not just for a few hours; it's been running for months).