History

A brief history of the project from the horse’s (founder’s and lead developer’s) mouth:

The short version is: It was summer and I was bored.

The slightly longer version: I had lots of free time in the summer of 2010 because I couldn’t get a job while I was waiting to see if I had been accepted into Royal Holloway’s InfoSec MSc programme.
And with a whole summer free and nothing to do but sit on my ass, I had to find something to keep myself busy.

Thus, Arachni was initially developed as a way to kill time and as a Ruby exercise; I had been curious about the language for quite some time and a web application security scanner seemed like something that would cover a lot of programming aspects.

I was also curious about this sort of scanner, so writing one in Ruby seemed like a good way to kill two birds with one stone — here’s a monologue from when I caught the bug: http://trainofthought.segfault.gr/2010/06/20/thinking-of-developing-a-foss-web-application-vulnerability-scanner/.

Things kind of snowballed from there; the challenges posed by that sort of system were too interesting to give up, so I kept working on it even after I got accepted into the MSc programme.
Truth be told, I attended a grand total of around one and a half hours of classes, as I devoted most of my time to developing Arachni instead of studying — that was money well spent (luckily I’m a freakishly quick study, so a 2-month preparation prior to the exams was enough for me to pass :) ).

At this point Arachni started leaking into the industry, with a few companies using it both internally and to provide web application security scanning services to their clients. This increased my resolve to keep working on it, and it meant that Arachni had outgrown its origins as a hacky side-project time-killer and started to become a viable solution that could stand side by side with the established products/projects.

This brings us to now (30/01/2011 at the time of writing): Arachni is steadily gaining in popularity and industry adoption with the inclusion of brand new distributed capabilities and sophisticated analysis techniques. A lot of people consider it one of the best solutions out there, and a few have even gone so far as to say that it is the future of such systems and will soon overtake the massive commercial systems — fingers crossed.