Web Site Application Security Testing Still Surprisingly Neglected

Ralph DeFrangesco

The Web Application Security Consortium (WASC), a non-profit organization dedicated to improving Web application security standards, released its Web site vulnerability report in September. The report is available free of charge from the WASC in a document titled "Web Application Statistics Project 2007." What is interesting about the WASC report is the percentage of sites that are still subject to well-known vulnerabilities. Of the 32,717 sites scanned, 31 percent were vulnerable to cross-site scripting, 23 percent to information leakage, 8 percent to SQL injection, and 10 percent to predictable resource location.
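
To put one of those statistics in concrete terms, here is a minimal sketch, not taken from the WASC report, of what a SQL injection flaw typically looks like: a query built by string concatenation next to its parameterized equivalent. The users table, column names, and function names are hypothetical, and Python's built-in sqlite3 module stands in for whatever database a real site would use.

```python
import sqlite3

def find_user_vulnerable(conn, username):
    # Builds SQL by string concatenation -- the classic injection flaw.
    # Input such as "x' OR '1'='1" changes the meaning of the query.
    query = "SELECT id, email FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice', 'alice@example.com')")

    attack = "x' OR '1'='1"
    print(find_user_vulnerable(conn, attack))  # returns every row in the table
    print(find_user_safe(conn, attack))        # returns nothing
```

A scanner, automated or manual, is essentially probing for the first pattern; the statistics above suggest how often it is still found.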


The report also pointed out that manual scanning (black box or white box) was superior to automated scanning. Although I think this claim needs more supporting data, IT executives should consider where they are spending their IT dollars for Web site vulnerability testing. The report does not detail the automated tool(s) used or their configuration, nor does it indicate the level of effort required for the manual scans. Still, when you look at the overall numbers, automated scanning detected only 8 percent of high-severity vulnerabilities, whereas manual scanning detected 97 percent of them, a difference too large to ignore.


The WASC categorized vulnerabilities into six classes: authentication, authorization, client-side attacks, command execution, information disclosure, and logical attacks. Manual testing outperformed automated testing in every class by at least a factor of two.


Today, more than ever, Web site application security testing should be an important part of an organization's overall security program. I think this report is a good starting point for deciding where to spend our IT dollars and security efforts.



Comments
Dec 17, 2008 5:30 AM Gerald says:
The difference between automated scanning catching only 8% of vulnerabilities and manual scanning catching 97% sure stands out. What also stands out is that the report is a year old, having been done in 2007 (or is that a typo?). Do current scanning tools catch more?
Dec 18, 2008 8:57 AM Ralph DeFrangesco says:
Gerald, good question. If you look at the report, it collects data from companies for the prior year. So the data from this project was collected in 2007 and published in 2008. I don't know why they take so long to publish it. -Ralph
