Some of you might have read my little tutorial about how to use Google Skipfish for web vulnerability scanning. Fascinated by the efficiency and speed of this application, I started to use it more often. Although manual testing can’t be replaced by a machine, web vulnerability scanners are still a helping hand.

During my tests, Google Skipfish discovered some vulnerabilities within websites (CMS, blogs etc.) and did a very good job of revealing XSS vectors in particular. But as the title of this blog post already states, I am no longer excited about Skipfish.

Too noisy about unimportant stuff
Skipfish is very fast in comparison to other tools, but for a reason I fail to understand, the application also scans for charset declarations and numeric names (which it tries to enumerate). This means that the scan takes longer than necessary and that the log files are spammed with false positives. Yes, you can switch some of that stuff off, but you still get results which can’t be used for security purposes.

Log files get generated _after_ the scan
When you start Skipfish and know that the scan will take a while, you are normally curious about the first results while the scan is still in progress. Right? Yes, me too. Sadly, the log files only get generated when the scan is completed (or aborted), and sometimes even this log file generation fails when there is not enough disk space. It would be awesome if the log file were created when the scan starts and then extended during the scan.

Obvious vulnerabilities are not found
Skipfish constantly fails to find the LFI or SQLi vulnerabilities within prepared websites I crafted. Where manual testing succeeds, this application fails to discover most of the issues.
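
To give you an idea of what I mean by “obvious”, here is a minimal sketch in Python of the kind of deliberately vulnerable endpoints my prepared test pages contain. This is an illustration only, not my actual test setup: a query parameter concatenated straight into SQL, and a file name opened without any checks.

```python
# Illustration only: a tiny HTTP app with the two textbook flaws I test for.
# Nothing here is part of Skipfish or of my real test pages.
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

DB = sqlite3.connect(":memory:", check_same_thread=False)
DB.execute("CREATE TABLE users (id INTEGER, name TEXT)")
DB.execute("INSERT INTO users VALUES (1, 'admin')")

class VulnerableHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        params = parse_qs(url.query)
        try:
            if url.path == "/user":
                # SQL injection: the id parameter is concatenated into the
                # query, so a single quote or "1 OR 1=1" changes the statement.
                uid = params.get("id", ["1"])[0]
                rows = DB.execute("SELECT name FROM users WHERE id = " + uid).fetchall()
                body = repr(rows)
            elif url.path == "/page":
                # Local file inclusion: the file name is opened unchecked,
                # so ?name=../../../../etc/passwd walks out of the web root.
                name = params.get("name", ["index.txt"])[0]
                with open(name) as handle:
                    body = handle.read()
            else:
                body = "try /user?id=1 or /page?name=index.txt"
            self.send_response(200)
        except Exception as exc:
            body = "error: %s" % exc  # error-based probes should trip on this
            self.send_response(500)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), VulnerableHandler).serve_forever()
```

Manual testing finds both of these within minutes; that is the level of “obvious” I am talking about.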

Too many false positives
For no apparent reason, Skipfish declares secure websites as vulnerable to e.g. SQL injection attacks. An example is Joomla: While scanning my test installation, Skipfish triggered “high impact vulnerabilities” by calling the URL /joomla/index.php/index.php. Later in the scan, Skipfish also thought that /joomla/index.php’ is vulnerable (which is wrong). Another example would be that Skipfish sometimes declares websites as vulnerable to XSS attacks when the search term “skipfish” appears somewhere in the source code. Skipfish fills out all forms in the test website and then sometimes discovers itself in the source code, although the filters are effective in protecting from XSS attacks.
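
For context, this is roughly what the output filter on my test search forms looks like (a simplified sketch, not the exact code of my test sites): the submitted term is HTML-escaped before it is echoed back, so a probe string shows up literally in the page source but cannot break out into markup.

```python
# Simplified sketch of the output filter on my test search forms.
import html

def render_search_results(term: str) -> str:
    # < > & " ' are turned into entities, so injected markup stays inert text.
    safe = html.escape(term, quote=True)
    return "<p>Results for: %s</p>" % safe

print(render_search_results('skipfish<svg onload=alert(1)>'))
# -> <p>Results for: skipfish&lt;svg onload=alert(1)&gt;</p>
```

The probe text is still visible in the rendered page, just neutralized – which seems to be exactly the situation that trips Skipfish in my tests.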

Skipfish loves to enumerate its own log directories
Don’t make the mistake of running Skipfish on the same machine your test object is hosted on :) Skipfish loves to crawl its own log directories and tries to enumerate file names there (e.g. /var/www/skipfish/log_dir_1/admin.tar.gz). In fact this is not really wrong, since Skipfish should find log files on _other_ web servers, but it is still very annoying: scanning the log directories takes very long and brings little benefit.

Please don’t get me wrong – I like Skipfish. It does a good job in many ways, and it is fast and easy to use. I think it just needs some improvements, and maybe in one or two years it will be the leading application on the free vulnerability scanner market.

Update 2010-09-20: I have received an email from Michal Zalewski, the guy (or at least one of the guys) behind Google Skipfish. He comments on my blog post, and I feel obligated to share his opinion with you. This is only fair, right? :)

Hey,

Some comments :-)

1) “Skipfish is very fast in comparison to other tools, but for a
reason I fail to understand the application also scans for charset
declarations” – actually, there are very good, security-related
reasons for this – see item #12 here:

http://code.google.com/p/skipfish/wiki/KnownIssues

You can limit the verbosity of these checks by using the -J option, though.

Brute force of file names and directories can be trivially disabled,
too – but it’s done for a very specific purpose – to discover things
such as index.php.old, secret /admin/ directories, etc.

2) “Obvious vulnerabilities are not found”

Have you reported these to me?:-) The only way I can improve the
scanner is when I get feedback from users, and it’s actually extremely
frustrating that people are so hesitant to do so.

3) “Another example would be that Skipfish sometimes declares websites
as vulnerable to XSS attacks when the search term “skipfish” appears
somewhere in the source code.” – that’s hopefully not true. Skipfish
considers pages to be vulnerable to XSS only when it successfully
managed to inject a special, unique HTML tag, or its own HTML
parameter, on the page. Again, if you see any examples to the
contrary, please let me know.

“Skipfish fills out all forms in the test website and then sometimes
discovers itself in the source code.. although the filters are
effective in protecting from XSS attacks.” – again, this is unlikely.
The XSS checks are actually one of the strongest suits of the tool,
and usually alert you to valid XSS vectors, even though some of them
may be very subtle.

4) “Skipfish declares secure websites as vulnerable to e.g. SQL
injection attacks. An example is Joomla: While scanning my test
installation, Skipfish triggered “high impact vulnerabilities” by
calling the URL /joomla/index.php/index.php.” – please report false
positives if you see any. See problem #10 for one possible
explanation, though:

http://code.google.com/p/skipfish/wiki/KnownIssues

Cheers,
/mz
