Fuzzing Lies at the Heart of 2006 Vulnerability Increases
Posted by Gunter Ollmann on October 12, 2006 at 1:09 PM EDT.

OK, so developers are, in general, writing better code, and the applications of 2006 are more secure overall.  So why are so many vulnerabilities still being disclosed?  In fact, why is 2006 already a record year for vulnerabilities?  As with most problems in security, the answer isn’t ‘in-your-face’ obvious.

I think the reason for the massive (and continued) up-tick in vulnerabilities is largely due to three key factors:
(1) There are more applications out there.
(2) The applications out there do more ‘stuff’.
(3) Every application wants to share something-or-other with something else.

The consequence of this is that applications have become much more complex and it has become increasingly difficult for the developer (and vendor) to assess every aspect of how their application (and components of their application) may be called and used when deployed on the host.

In the meantime, security testing tools have advanced substantially over the last couple of years, and the fuzzers of today are orders of magnitude better at uncovering new bugs.  It’s just unfortunate for many vendors that (upcoming) security professionals are seemingly more proficient in their use and have a better eye for differentiating between a bug and a vulnerability.

In fact, if you look closer at the makeup of vulnerabilities disclosed thus far in 2006, you will notice that the majority of them are ranked medium ‘risk’ and are associated with content-level flaws (e.g. office documents, media files, HTML/PHP/XML pages, etc.) – things that can be very easily fuzzed.  In a way this is good news.  While those (upcoming) security professionals are manically fuzzing every file format they can lay their hands on, they are spending less time looking for vulnerabilities in network services (i.e. Critical and High risk vulnerabilities) – good.  I don’t think anyone likes to hear about a new network-routable vulnerability that doesn’t require authentication and grants system-level access – anyone but the botworm designers anyhow.
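To show just how little machinery this kind of file-format fuzzing takes, here is a minimal mutation-fuzzer sketch in Python.  It is only an illustration, not any particular tool’s method: the target command and the mutated-file name are hypothetical placeholders, and crash detection here simply assumes a POSIX target that dies with a signal (negative return code).

```python
import random
import subprocess

def mutate(data: bytes, flips: int = 8) -> bytes:
    """Return a copy of the seed with a few randomly chosen bytes corrupted."""
    buf = bytearray(data)
    for _ in range(flips):
        pos = random.randrange(len(buf))
        buf[pos] ^= random.randrange(1, 256)  # XOR with non-zero, so the byte changes
    return bytes(buf)

def fuzz(seed_path: str, target_cmd: list, iterations: int = 1000) -> int:
    """Feed mutated copies of a known-good seed file to the target; count crashes."""
    with open(seed_path, "rb") as f:
        seed = f.read()
    crashes = 0
    for _ in range(iterations):
        with open("mutated.bin", "wb") as f:   # hypothetical scratch file name
            f.write(mutate(seed))
        proc = subprocess.run(target_cmd + ["mutated.bin"], capture_output=True)
        if proc.returncode < 0:  # killed by a signal (e.g. SIGSEGV) on POSIX
            crashes += 1
    return crashes
```

A real fuzzer adds coverage feedback, format awareness, and crash triage on top of this loop, but the core idea really is this small.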

However, I do wish some software vendors would be a little more rigorous in their testing processes.  It doesn’t take much effort to learn about fuzzing and incorporate it into a testing process.  In comparison to the relative cost of developing commercial applications, the cost of getting in a security consultant to do some security code auditing (and fuzzing) with a little shoulder-surfing thrown in is pretty low, and I think many vendors could quickly reap the benefits.
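As a sketch of what folding fuzzing into an ordinary QA process might look like, consider a smoke test that hammers a parser with random input inside the regular test suite.  The `parse_record` function below is a hypothetical stand-in for whatever parsing routine a product actually exposes; the point is the test harness pattern, not the parser.

```python
import random

def parse_record(blob: bytes) -> dict:
    """Hypothetical stand-in for a product's real parser under test."""
    if len(blob) < 4:
        raise ValueError("truncated header")
    length = int.from_bytes(blob[:4], "big")
    return {"length": length, "body": blob[4:4 + length]}

def test_parser_survives_random_input(trials: int = 500) -> None:
    """Smoke test: the parser may reject garbage, but must never die uncontrolled."""
    random.seed(1234)  # fixed seed so any failure is reproducible in QA
    for _ in range(trials):
        blob = bytes(random.randrange(256) for _ in range(random.randrange(64)))
        try:
            parse_record(blob)
        except ValueError:
            pass  # a controlled rejection is fine; any other exception is the bug
```

Run under any test runner on every build, a loop like this catches the easy crashes before an outside fuzzer does.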

So, while code is gradually getting more secure, the opportunities for someone to uncover new bugs and vulnerabilities are on the increase.  I guess, if (as a software vendor) you’re happy enough to rely upon the new generation of script-kiddies to fuzz your application in their quest for fame and fortune (in case you haven’t been reading the news, there’s money to be had for vulnerabilities!), and think of them as a free QA resource – good luck to you.  I just hope you’re putting that money you’re saving from security testing towards your PR team to help manage the effects of their public vulnerability disclosures at a later date.  
    Copyright 2001-2007 © Gunter Ollmann