Security agencies release Top 25 programming errors

Public-private effort looks to tie list to contract language

A coalition of government, academic and private-sector security organizations today released an updated version of their Top 25 list of programming errors considered to be responsible for the majority of security vulnerabilities plaguing software.

The project was managed by Mitre Corp. and the SANS Institute, but impetus came from the National Security Agency with funding from the Homeland Security Department’s National Cyber Security Division, and help from more than 30 organizations that volunteered time and effort. The list, first created last year, builds on the work done by Mitre and DHS in developing the Common Weakness Enumeration scheme, which provides a common language for identifying and discussing software errors that can create security vulnerabilities.


The List: Top 25 programming errors


“We’re trying to understand the fundamentals” of software errors, said Joe Jarzombek, director for software assurance in the DHS National Cyber Security Division. Before the CWE was developed, “we didn’t have a way of identifying explicitly exploitable software errors.”

This year’s Top 25 was selected from 41 common errors, and the list prioritizes and ranks the errors based on prevalence and severity. A set of mitigations and best practices also has been developed that addresses many of the problems. But one of the key efforts this year is to use the list as a basis for standard contract language that would require developers to test for and remediate the bugs.

“The purpose is to shift the responsibility for fixing errors to the vendor so that the vendor fixes it before it is delivered,” said Alan Paller, director of research for SANS.

New York State has produced draft procurement standards that include errors specified in the list. The language is being developed by a working group with the state’s Cyber Security and Critical Infrastructure Coordination office, and drafts will be posted at www.sans.org/appseccontract. The standards would be aimed at custom software developers who create programs under contract, rather than at retail vendors such as Microsoft. The goal is to put responsibility for basic security on the vendor rather than the customer. By defining a set of minimum standards of due care, the language also could give developers a legal safe harbor from liability.

“My guess is that the language will go through two more drafts” before it is finalized and begins showing up in government procurements, Paller said.

Use of the standardized language would be voluntary, but some security experts are not enthusiastic about tying the Top 25 list to contracts.

“There are some good points to lists like this,” said Gary McGraw, chief technology officer of Cigital and an outspoken critic of the scheme. “Making people aware that there are security bugs is good. But if you start tying this to liability and create safe harbors, I think that borders on silly and it worries me.”

McGraw last year published a list of 11 reasons why Top 10 (or Top 25) lists don’t work. The lists are too generic to fit all cases, he said, and cannot address all of the issues that need attention in managing risk.

“Instead of a popularity contest, I’d like to see some data used,” to quantify the importance of a given programming error, McGraw said.

This year’s list takes a step in that direction by ranking the 25 errors with numerical scores based on severity and prevalence. The errors, in order of importance, are:

1. CWE-79: Failure to Preserve Web Page Structure ('Cross-site Scripting').

2. CWE-89: Improper Sanitization of Special Elements used in an SQL Command ('SQL Injection').

3. CWE-120: Buffer Copy without Checking Size of Input ('Classic Buffer Overflow').

4. CWE-352: Cross-Site Request Forgery (CSRF).

5. CWE-285: Improper Access Control (Authorization).

6. CWE-807: Reliance on Untrusted Inputs in a Security Decision.

7. CWE-22: Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal').

8. CWE-434: Unrestricted Upload of File with Dangerous Type.

9. CWE-78: Improper Sanitization of Special Elements used in an OS Command ('OS Command Injection').

10. CWE-311: Missing Encryption of Sensitive Data.

11. CWE-798: Use of Hard-coded Credentials.

12. CWE-805: Buffer Access with Incorrect Length Value.

13. CWE-98: Improper Control of Filename for Include/Require Statement in PHP Program ('PHP File Inclusion').

14. CWE-129: Improper Validation of Array Index.

15. CWE-754: Improper Check for Unusual or Exceptional Conditions.

16. CWE-209: Information Exposure Through an Error Message.

17. CWE-190: Integer Overflow or Wraparound.

18. CWE-131: Incorrect Calculation of Buffer Size.

19. CWE-306: Missing Authentication for Critical Function.

20. CWE-494: Download of Code Without Integrity Check.

21. CWE-732: Incorrect Permission Assignment for Critical Resource.

22. CWE-770: Allocation of Resources Without Limits or Throttling.

23. CWE-601: URL Redirection to Untrusted Site ('Open Redirect').

24. CWE-327: Use of a Broken or Risky Cryptographic Algorithm.

25. CWE-362: Race Condition.
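To make the list concrete, here is a minimal sketch of the #2 entry, CWE-89 (SQL injection), using Python's built-in sqlite3 module. The table, data, and function names are hypothetical, invented for this illustration; the point is the contrast between splicing untrusted input into SQL text and passing it as a bound parameter.

```python
import sqlite3

# Hypothetical in-memory database for demonstration purposes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def lookup_unsafe(name):
    # Vulnerable (CWE-89): attacker-controlled input is spliced into the
    # SQL text, so a quote character can change the query's structure.
    query = "SELECT role FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Mitigated: a parameterized query binds the input as pure data,
    # so special characters cannot alter the SQL statement.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(lookup_unsafe(payload))  # the injection returns every row
print(lookup_safe(payload))    # returns [] -- no user has that literal name
```

The same data-versus-code confusion underlies several other entries on the list, including OS command injection (CWE-78) and PHP file inclusion (CWE-98), and parameterization or strict input validation is the standard mitigation in each case.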

Despite his lack of enthusiasm for Top 25 lists, McGraw said he believes the attention such efforts draw to software quality and security has helped improve software.

“I am optimistic we have made a lot of progress,” he said. “Ten years ago there was little attention being paid to it. Now I am aware of 58 large-scale software security initiatives by companies that produce a lot of software.”

Results are not always immediately visible, he said. “Product cycles take a long time. But if you look at Windows 7 compared to Windows 2000, things are a lot better.”

About the Author

William Jackson is a freelance writer and the author of the CyberEye blog.


Reader comments

Thu, Feb 18, 2010

There is a software company in Cambridge, MA (WebLayers?) that offers a product that prevents these types of things from happening. I saw them at a trade show, and they talked about how they automate standards throughout the SDLC; they specifically mentioned being able to prevent the CWEs. Cool technology.

Wed, Feb 17, 2010 Mike Salisbury DC

You call the errors "programming" errors, but if they are not mentioned in the requirements, they are requirements or contract errors. If they are in the requirements but not addressed in the design, they are design errors. If they are in the design but not addressed in the program, only then are they programming errors. Your (correct) use of "developer" to mean the development team or organization tends to lose out to the incorrect use of "programming" here. That, in turn, exacerbates the widespread misconception that programmers are the only ones to be called "developers." If programmers are all we need to fill the role of developers, then we tend to lose sight of the requirements analysts, design engineers, tech writers, etc. An understanding of configuration management and the benefits of configuration reviews would help to alleviate the problem of missing requirements elements, missing design elements and missing source code elements in our documents. -- Mike
