This may sound like a weird thing to do, but I realised that many crawlers and bots are somehow still able to get past my Anubis. I presume they have gotten smarter and are capable of using JavaScript.
To counter this, I want to link my Anubis to an Iocane setup such that:
Internet > nginx reverse proxy > Anubis > Iocane > my site/app
My hope is that two different filtering mechanisms (one of which actively poisons and wastes the bots’ resources) will protect my system better.
I thought I’d ask before actually trying out something like this.
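For what it's worth, a chain like that is usually just nested reverse proxies. A rough sketch of the nginx side, purely as an illustration (the ports, hostname, and paths here are made-up placeholders, not from any real deployment):

```nginx
# Hypothetical sketch of: Internet > nginx > Anubis > Iocane > app.
# All addresses/ports below are assumptions for illustration only.

server {
    listen 443 ssl;
    server_name example.com;

    location / {
        # Hand everything to Anubis first; Anubis then proxies
        # passing clients to its own configured upstream target.
        proxy_pass http://127.0.0.1:8923;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

The second hop would then be configured in Anubis itself: point its upstream target at the Iocane listener instead of directly at the site, and let Iocane proxy (or poison) on to the app. I haven't run exactly this stack, so treat it as a starting point rather than a known-good config.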


What do you mean? Where does it do its own detection?
Around here. In the default configuration, it is using the built-in handler. The script can be replaced with something like Nam-Shub of Enki (used by pretty much everything I host, and by Codeberg too, for example).
Ah, that wasn’t there when I deployed it.
Scriptability has been a thing since 2.2.0, released on 2025-06-16, but the built-in script appeared in 3.0 (2025-11-14).