Sunday, January 5, 2014

SpiderFoot 2.1.0 - Open Source Footprinting Tool


SpiderFoot's simple web-based interface enables you to kick off a scan immediately after installation: just give your scan a name, the domain name of your target, and select which modules to enable.

You will quickly obtain information such as URLs handling passwords, network ranges (netblocks), web servers, open ports, SSL certificate details, and much more.

"Footprinting" is the process of understanding as much as possible about a given target in order to perform a more complete security penetration test. Particularly for large networks, this can be a daunting task.

The main objective of SpiderFoot is to automate this process to the greatest extent possible, freeing up a penetration tester's time to focus on the security testing itself.

SpiderFoot is designed from the ground up to be modular. This means you can easily add your own modules that consume data from other modules to perform whatever task you desire.

As a simple example, you could create a module that automatically attempts to brute-force usernames and passwords any time a password-handling webpage is identified by the spidering module.
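The modular, data-driven design described above can be sketched as a small event bus: modules subscribe to the event types they care about and react when another module publishes matching data. This is an illustrative pattern only, not SpiderFoot's actual API; every class, method, and event name below is hypothetical.

```python
# Minimal sketch of an event-driven module system, in the spirit of
# SpiderFoot's architecture. All names here are hypothetical, not the
# real SpiderFoot API.

class EventBus:
    def __init__(self):
        self.listeners = {}  # event type -> list of handler callables

    def subscribe(self, event_type, handler):
        self.listeners.setdefault(event_type, []).append(handler)

    def publish(self, event_type, data):
        for handler in self.listeners.get(event_type, []):
            handler(data)


class BruteForceModule:
    """Hypothetical module that reacts to password pages found by a spider."""

    def __init__(self, bus):
        self.targets = []
        bus.subscribe("PASSWORD_PAGE_FOUND", self.handle)

    def handle(self, url):
        # A real module would attempt logins here; we just queue the URL.
        self.targets.append(url)


bus = EventBus()
module = BruteForceModule(bus)
# A spidering module would publish this event when it finds a login form:
bus.publish("PASSWORD_PAGE_FOUND", "http://example.com/login")
print(module.targets)  # ['http://example.com/login']
```

The point of the pattern is that the brute-force module never calls the spider directly; it only declares which data it consumes, so new modules can be dropped in without changing existing ones.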


SpiderFoot 2.1.0 is now available, a major update over 2.0.5, which was released back in September.

Major improvements are as follows:

- Identifies sites co-hosted on IPs of your target.
- Checks whether your target, affiliates or co-hosts have a bad reputation (PhishTank, Google SafeBrowsing, McAfee SiteAdvisor, and many more).
- Identifies the ISPs and BGP AS of your target.
- Smarter at identifying owned netblocks.
- UI enhancements, including some data visualizations.
- More comprehensive searches across other Internet TLDs.
- Identifies the use of non-standard HTTP headers.
- Bing searches.
- Many tweaks, improvements and bug fixes.

Website & Download:
Source Forge:

Thursday, January 2, 2014

Arachni v0.4.6-0.4.3 (Open Source Web Application Security Scanner Framework)

Arachni v0.4.6-0.4.3 has been released :

There's a new version of Arachni, an Open Source, modular and high-performance Web Application Security Scanner Framework written in Ruby.

Brief list of changes:

* Massively decreased RAM consumption.
* Number of performed requests cut by one third -- and thus a one-third decrease in scan times.
* Overhauled timing attack and boolean/differential analysis algorithms to fix
  SQLi false-positives with misbehaving webapps/servers.
* Vulnerability coverage optimizations with 100% scores on WAVSEP's tests for:
  * SQL injection
  * Local File Inclusion
  * Remote File Inclusion
  * Non-DOM XSS -- DOM XSS not supported until Arachni v0.5.

* Implemented Scan Scheduler with support for recurring scans.
* Redesigned issues table on the scan progress screen to group
  and filter issues by type and severity.
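The boolean/differential analysis mentioned in the changelog can be illustrated with a toy comparison: inject a payload pair that should evaluate as logically true and logically false, and flag a likely injection only when the "true" response matches the baseline while the "false" response differs. This is a simplified sketch of the general technique, not Arachni's implementation; the endpoints and function names are invented for illustration.

```python
# Toy sketch of boolean-based differential analysis for SQL injection.
# Not Arachni's code; a minimal illustration of the general technique.

def likely_sqli(fetch, url):
    """fetch(url, param_value) -> response body (simulated below)."""
    baseline = fetch(url, "1")
    true_resp = fetch(url, "1 AND 1=1")   # logically a no-op
    false_resp = fetch(url, "1 AND 1=2")  # logically always false
    # Vulnerable if the true injection behaves like the baseline
    # but the false injection changes the response.
    return true_resp == baseline and false_resp != baseline


# Simulated vulnerable endpoint: it "evaluates" the injected condition.
def vulnerable_fetch(url, value):
    if "1=2" in value:
        return "no rows"
    return "row: admin"


# A well-behaved endpoint ignores the payload entirely.
def safe_fetch(url, value):
    return "row: admin"


print(likely_sqli(vulnerable_fetch, "http://example.com/item"))  # True
print(likely_sqli(safe_fetch, "http://example.com/item"))        # False
```

Requiring both conditions (true payload unchanged, false payload changed) is what cuts down false positives on misbehaving servers whose responses vary for unrelated reasons.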

Issues table

The issues table has been massively redesigned to provide more context at a glance and help you prioritize and focus on the issues that interest you most.

While the scan is running and new issues appear, High and Medium severity groups are expanded by default to show each logged issue, while Low and Informational groups are collapsed. This way your attention is drawn to where it's most needed.
Of course, you can change the visibility settings to suit your preferences using the controls on the left of the table, as well as reset them to their default configuration.

Scan scheduling

The major change in the web interface is the addition of the much-awaited Scheduler, which, combined with the existing incremental/revisioned scans, provides quite a powerful feature. In essence, it allows you to schedule a scan to run at a later time and optionally configure it to be a recurring one.

What's interesting here is the recurring bit: each scan occurrence is not a separate entity but a revision of the previous scan, so you'll be able to track changes in your website's security with ease. It also allows you to speed things up by feeding the sitemaps of previous revisions to the next one (either to extend or restrict the scope), making the crawl process much faster (or skipping it altogether).
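The extend/restrict choice described above can be sketched as a simple seeding decision for the next revision's crawl. This is a conceptual sketch of the idea, not Arachni's code; the mode names and function are hypothetical.

```python
# Toy sketch of seeding a recurring scan's crawl from the previous
# revision's sitemap. Hypothetical names; not Arachni's implementation.

def seed_crawl(previous_sitemap, mode, start_urls=()):
    if mode == "restrict":
        # Re-audit only the pages already known; the crawl can be
        # skipped entirely because the scope is fixed in advance.
        return sorted(set(previous_sitemap))
    if mode == "extend":
        # Start from the known pages plus the usual entry points, so
        # the crawler only has to discover what's new since last time.
        return sorted(set(previous_sitemap) | set(start_urls))
    raise ValueError("mode must be 'restrict' or 'extend'")


previous = ["http://example.com/", "http://example.com/login"]
print(seed_crawl(previous, "restrict"))
print(seed_crawl(previous, "extend", ["http://example.com/new"]))
```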

 For more details about the new release please visit:

Download page:     

Homepage -
Blog -
Documentation -
Support -
GitHub page -
Code Documentation -
Copyright - 2010-2014
License - Apache License v2