(Open Source Web Application Security Scanner Framework)
There's a new version of Arachni, an Open Source, modular and
high-performance Web Application Security Scanner Framework written in Ruby.
Brief list of changes:
* Massively decreased RAM consumption.
* Number of performed requests cut by 1/3 -- and thus a 1/3 decrease in scan times.
* Overhauled timing attack and boolean/differential analysis algorithms to fix
SQLi false-positives with misbehaving webapps/servers.
* Vulnerability coverage optimizations with 100% scores on WAVSEP's tests for:
* SQL injection
* Local File Inclusion
* Remote File Inclusion
* Non-DOM XSS -- DOM XSS not supported until Arachni v0.5.
* Implemented Scan Scheduler with support for recurring scans.
* Redesigned Issue table on the Scan progress screen, grouping
and filtering issues by type and severity.
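To illustrate the boolean/differential analysis mentioned above, here is a conceptual sketch (not Arachni's actual code -- all names are illustrative) of how a scanner can compare responses to true/false injected conditions, with a baseline stability check like the one that helps avoid false positives on misbehaving webapps/servers:

```ruby
# Crude similarity ratio based on shared lines; real scanners use
# refinements such as fuzzy hashing or repeated sampling.
def similarity(a, b)
  la, lb = a.lines, b.lines
  return 1.0 if la.empty? && lb.empty?
  common = (la & lb).size
  (2.0 * common) / (la.size + lb.size)
end

def similar?(a, b, threshold = 0.9)
  similarity(a, b) >= threshold
end

# baseline1/baseline2: two plain requests to the same page.
# true_resp:  response to an always-true condition  (e.g. ' AND 1=1 -- )
# false_resp: response to an always-false condition (e.g. ' AND 1=2 -- )
def boolean_sqli?(baseline1, baseline2, true_resp, false_resp)
  # If the page isn't stable on its own, differential analysis is
  # unreliable -- bail out instead of risking a false positive.
  return false unless similar?(baseline1, baseline2)

  # Signal: the true condition leaves the page unchanged while the
  # false condition visibly alters it.
  similar?(baseline1, true_resp) && !similar?(baseline1, false_resp)
end
```

Requiring the two baseline fetches to agree before comparing payloads is what keeps pages with naturally volatile content from being reported as vulnerable.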
Issues table

The issues table has been massively redesigned to provide more context at a glance and help you prioritize and focus on the issues that interest you most.
While the scan is running and new issues appear, High and Medium severity groups are expanded by default, showing each logged issue, while Low and Informational ones are collapsed. This way your attention is drawn to where it’s most needed.
Of course, you can change the visibility settings to suit your preferences, using the controls on the left of the table, as well as reset them to their default configuration.
Scan scheduling

The major change for the web interface is the addition of the much-awaited Scheduler, which, combined with the existing incremental/revisioned scans, makes for quite a powerful feature. In essence, it allows you to schedule a scan to run at a later time and optionally configure it to recur.
What’s interesting here is the recurring bit: each scan occurrence is not a separate entity but a revision of the previous scan, so you can track changes in your website’s security with ease. It also speeds things up by letting you feed the sitemaps of previous revisions to the next one (either to extend or restrict the scope), making the crawl process much faster (or skipping it altogether).
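The sitemap-reuse idea can be sketched as follows -- a hypothetical helper (the names are illustrative, not Arachni's API) showing the two scope modes a recurring revision could use when seeded with the previous revision's sitemap:

```ruby
# Hypothetical sketch of seeding a recurring scan revision with the
# previous revision's sitemap: either extend the fresh seeds with the
# known URLs, or restrict the audit to them and skip crawling.
def seed_urls(previous_sitemap, fresh_seeds, mode: :extend)
  case mode
  when :extend   # crawl fresh seeds, plus everything seen last time
    (fresh_seeds + previous_sitemap).uniq
  when :restrict # audit only what the previous revision discovered
    previous_sitemap.uniq
  else
    raise ArgumentError, "unknown mode: #{mode}"
  end
end
```

In `:restrict` mode the crawl phase can be skipped entirely, which is where the biggest speed-up for recurring scans comes from.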