This plugin is included in:
i) Paid (unbranded) Tac Anti Spam Collection
This plugin is not included in the free Tac Anti Spam Collection.
What this plugin does:
- lowers bandwidth & query resource usage from spam bots
- lowers bandwidth & query resource usage from scrapers
- lowers bandwidth & query resource usage from simple DoS attacks by humans (although humans are not what this add-on targets)
- avoids catching spiders/crawlers
- dynamically synchronises the .htaccess file, for a truly zero-query method of stopping bots from hammering the site
- can turn DeDos off for logged-in users
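The .htaccess synchronisation means that cached IPs are refused by the web server itself, before XenForo (or the database) is ever reached. A hypothetical sketch of the kind of rules such a sync might maintain (the IP addresses are illustrative, not real blocked addresses):

```apache
# Hypothetical rules of the kind the .htaccess sync might write.
# Requests from these cached IPs are refused by Apache itself,
# so zero application queries are spent on them.
order allow,deny
allow from all
deny from 203.0.113.45
deny from 198.51.100.7
```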
This plugin primarily targets spam bots/scrapers with high resource usage. When this plugin is used in combination with FoolBotHoneyPot, a large percentage of spam bots are detected and cached. Once cached, these spam bots use only limited server resources.
FoolBotHoneyPot: Detects bots that attempt to register, and then caches them.
DeDos: Detects bots that rapidly log in, register or scrape pages over and over, and then caches them.
By default, the ACP options for DeDos are set up so that humans should very rarely see the warning page (if at all, unless they are malicious), while still catching spam bots that would otherwise have used significant resources.
- Please note, this is not a preventive measure for DDoS attacks; Distributed DoS attacks (those from many thousands of IP addresses, usually from botnets) should be prevented with hardware, not software.
Many spam bots, scrapers and some users will often hit your site many times within a small time range. When they do this, they can take up a significant amount of bandwidth (from downloading page content over and over) and can also hit your database with many queries and take up server resources (from hitting query-heavy pages over and over). Spam bots and scrapers do not cache pages, so each time they visit, the full content of the page is often downloaded.
- DeDos reduces the number of times this is possible. By default, if the user hits 6 pages or more within 7 seconds, a friendly message is displayed to the user. This friendly message counts down and then redirects them to the original page. If they continue to hit more pages after seeing the message (bots will, humans shouldn't) and, by default, reach 8 pages or more within 7 seconds, they are locked out of the site and their IP is cached. From then onwards, that IP will only see a 401 Unauthorized page (and only take up 1 query instead of 15-25 queries).
Friendly User Dos Message if the user hits 6 pages or more within 7 seconds:
After further attempts and hitting 8 pages or more within 7 seconds, this message is all they will see site wide (their IP is cached):
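The thresholds above amount to a per-IP sliding-window rate limit. A minimal sketch of the idea (this is not the plugin's actual code; only the default thresholds are taken from the description above):

```python
import time
from collections import defaultdict, deque

WINDOW = 7    # seconds (default window described above)
WARN_AT = 6   # pages within the window -> friendly warning page
BLOCK_AT = 8  # pages within the window -> IP cached, 401 from then on

hits = defaultdict(deque)  # ip -> timestamps of recent page hits
blocked = set()            # "cached" IPs that only ever see a 401

def check(ip, now=None):
    """Return 'ok', 'warn' or 'block' for one page hit from ip."""
    if ip in blocked:
        return "block"             # cached IP: one cheap lookup, no page build
    now = time.time() if now is None else now
    window = hits[ip]
    window.append(now)
    while window and now - window[0] > WINDOW:
        window.popleft()           # drop hits older than the window
    if len(window) >= BLOCK_AT:
        blocked.add(ip)            # lock out: only the 401 page from now on
        return "block"
    if len(window) >= WARN_AT:
        return "warn"              # friendly countdown/redirect page
    return "ok"
```

For example, eight hits in under a second would return "ok" five times, "warn" on the sixth and seventh, and "block" from the eighth onwards.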
When spam bots visit your site, they will often unintentionally DoS attack your site by hitting many pages in a very small time frame. For instance, many bots think they have already registered several accounts (or some of their accounts), so they hit the login page with various user accounts to see which (if any) were successful. They can hit the login page 10 times in just a few seconds; if many spam bots do this with many attempts, it can have an impact on your resources. They will often also jump from page to page (finding relevant forums to post into) and attempt to register many times within a small time frame. All of these attempts can have an impact on resources (particularly if many spam bots do this and you have limited server resources).
Scrapers are bots that go from page to page, either looking for relevant threads/posts to automate against or leeching the data for their own purposes, including harvesting email addresses they can spam. Scrapers often do not care how malicious they are to your site and can go through thousands of pages (taking up gigabytes of data) in a very small time frame.
Some human users will DoS attack your site by simply pressing Ctrl-F5 (refresh) over and over. This is a very simple DoS attack, and its impact can be reduced with DeDos. However, if you notice a human performing a simple DoS attack in the logs, blocking their IP address with .htaccess / cPanel should provide earlier protection (this plugin focuses on spam bots/scrapers, where continuously updating .htaccess/cPanel is tiresome).
Spiders/crawlers can hit many pages quite quickly; however, DeDos avoids detecting this type of bot. DeDos uses the XenForo core methods to avoid flagging these bots and also looks at the user agent. Spam bots will almost always disguise themselves as normal browser users (I have never seen a case in recent years where they haven't), while spiders/crawlers will always expose themselves with the user_agent (the user_agent is always logged to confirm spiders have not been stopped). If the user agent does not look like a browser, the DoS is ignored (since it could be an unknown spider/crawler). The user_agent of each DoS attempt is always shown in the logs.
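The user-agent check described above can be sketched as a simple heuristic. This is an illustrative sketch, not the plugin's actual logic or lists; the substrings below are hypothetical examples:

```python
# Hypothetical sketch of the user-agent heuristic described above:
# only agents that claim to be ordinary browsers are counted against
# the DeDos limits; declared spiders/crawlers, and anything that does
# not look like a browser at all, are ignored (it could be an unknown
# legitimate crawler).

KNOWN_CRAWLERS = ("googlebot", "bingbot", "yandex", "baiduspider")  # illustrative
BROWSER_HINTS = ("mozilla", "chrome", "safari", "opera")            # illustrative

def should_count_as_dos(user_agent: str) -> bool:
    ua = user_agent.lower()
    if any(bot in ua for bot in KNOWN_CRAWLERS):
        return False  # declared spider/crawler: never treated as a DoS
    if not any(hint in ua for hint in BROWSER_HINTS):
        return False  # doesn't look like a browser: could be an unknown crawler
    return True       # claims to be a normal browser: subject to DeDos limits
```

Note that the crawler check runs first, since crawlers such as Googlebot also include "Mozilla" in their user agent string.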