Before deploying Roboo for protection against DDoS, it is worth weighing its pros and cons.
Nothing comes without drawbacks and side effects. Here are some you need to consider:
- Search-engine crawlers have trouble indexing the site. You never want that.
- Web-service clients run into issues: API calls may break, and an SVN server over HTTPS does not work well.
- The developers' own website, http://www.ecl-labs.org, does not itself use Roboo.
A good whitelisting plan must be developed to let legitimate non-browser clients through.
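To make the idea concrete, here is a minimal sketch of the kind of whitelist decision such a plan boils down to. This is illustrative Python, not Roboo's actual configuration syntax; the agent substrings and path prefixes are made-up examples.

```python
# Hypothetical whitelist logic in front of an anti-bot challenge:
# known crawlers and machine-client paths pass through unchallenged.
# The names and patterns below are illustrative, not Roboo's config.

WHITELISTED_AGENT_SUBSTRINGS = ("Googlebot", "bingbot", "Baiduspider")
WHITELISTED_PATH_PREFIXES = ("/api/", "/svn/")

def should_challenge(user_agent: str, path: str) -> bool:
    """Return True if the request should receive the challenge."""
    if any(bot in user_agent for bot in WHITELISTED_AGENT_SUBSTRINGS):
        return False  # let search-engine crawlers index the site
    if any(path.startswith(p) for p in WHITELISTED_PATH_PREFIXES):
        return False  # let API and SVN-over-HTTPS clients through
    return True  # everyone else must pass the challenge first

print(should_challenge("Mozilla/5.0 (compatible; Googlebot/2.1)", "/"))  # False
print(should_challenge("curl/7.68.0", "/api/v1/items"))                  # False
print(should_challenge("Mozilla/5.0 (Windows NT 10.0)", "/gallery"))     # True
```

The point is that every cons item above maps to a whitelist entry: crawlers by user agent, API and SVN clients by path.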
I did some quick brute-force performance testing with three virtual machines on VMware. The target was WackoPicko, a deliberately vulnerable web application used for testing web vulnerability scanners, running on 1 core and 1 GB of RAM. The Roboo machine was an Ubuntu server, also with 1 core and 1 GB of RAM. The third, more powerful machine ran httperf. All three ran on a single physical server under VMware ESXi.
Here are the testing results:
- The target alone was able to handle around 140 req/s.
- Roboo was able to process around 400 req/s.
- Roboo's memory usage on Ubuntu stayed under 150 MB even at 100% CPU usage.
- Setting the Roboo cookie took around 250 ms, which was added to the load time of the first request.
- While Roboo was protecting the website and its CPU load was at 100%, the site felt more responsive than under a direct attack.
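The 250 ms first-request penalty in the results above comes from the challenge round-trip: a client without the verification cookie is first answered with a challenge, and only on retry, cookie in hand, gets the real content. A minimal simulation of that flow (the cookie name and token scheme are invented for illustration, not Roboo's actual implementation):

```python
# Minimal simulation of a challenge-cookie flow like Roboo's: the first
# request from a cookie-less client gets a challenge page; once the
# client presents the issued cookie, content is served directly.
# The cookie name and token scheme are invented for illustration.

import secrets

ISSUED = set()  # tokens the "server" has handed out

def server(cookies: dict) -> str:
    token = cookies.get("anti_robot")
    if token in ISSUED:
        return "content"               # verified client: serve the page
    new_token = secrets.token_hex(8)
    ISSUED.add(new_token)
    cookies["anti_robot"] = new_token  # challenge response sets the cookie
    return "challenge"                 # extra round-trip on first request

jar = {}
print(server(jar))  # challenge  (first request pays the round-trip)
print(server(jar))  # content    (cookie accepted, normal serving)
```

Dumb flood tools never complete that first round-trip, which is why Roboo can discard their traffic cheaply while real browsers only pay the cost once.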
Even this simple scenario shows that Roboo makes a website roughly three times more DDoS-resistant on the same hardware (400 vs. 140 req/s), not tens or hundreds of times.
Even though Roboo is not a miracle machine, it still has uses worth considering. For example, once the easiest and most active DDoS sources have been blocked, Roboo helps pick off the last attackers that managed to get through the firewalls. More on this later.