Using ASVS for real

The last time I talked about OWASP’s new Application Security Verification Standard, I had performed a Level 2B-3 review of my forum software, UltimaBB.

This time, I’m working on a real project for a real customer. It’s been interesting.

  • Level 1A and, in particular, 1B have been emasculated. I’m not really sure of the value of these reviews, as they have only 22 and 13 controls to review respectively. Taken together at Level 1, it might have *some* value, but I’m not sure you’re getting a whole lot of assurance. I would only recommend this level if you have something like 1000 apps to review and you need to see which are the most atrocious so you can target Level 2B / 3 / 4 reviews. Sure, they can be done quickly, but I’m not sure they prove anything in particular. The good news is I reckon the current state-of-the-art dynamic and static tools could produce reports absolutely compliant with this level with no real changes other than adopting the ASVS report format (R1-R4) and risk rating methodology. The problem is that there are no tools today that do both automated dynamic testing (a la IBM’s AppScan or HP’s WebInspect) and automated static code analysis (like Fortify’s SCA or Ounce O2) in one tool, so combining the output of two different tools in the time available would be a challenge (whilst being incredibly boring and unrewarding to a skilled assessor).
  • The basis of a Level 2B review is “manual” verification, most likely using the results of automated scanning. You don’t HAVE to do the automated scan – you can do it by hand. Based on my experience of both UltimaBB and this review (a mixture of ASP and ASP.NET), the automated scan was more of a waste of time than a blessing. Yes, it found XSS and was quite specific about its location, but honestly, in terms of performing an ASVS review, as long as you know how XSS works (lack of input validation and output encoding), you should be able to find the issues with grep or your favorite search feature in far less time (a minimal sketch of that kind of hunt appears after this list). The problem is that the scanners are asked to scan for 88 things at Level 2B, and I’m pretty certain that without AI, the scanners are only going to be able to produce around 50 of the findings automatically. For example, I have a CSRF token in my forum. The scanner I use claimed that each of the forms had no CSRF protection – a false positive that took nearly all of the allotted 30 minutes to eliminate, and thus put me behind schedule.
  • Time ~= money ~= quality. We tried to make Level 2B reviews work in five days, as this is the current sweet spot in this rather depressed market. Five days was chosen based on a best-practice productivity of 15 minutes per control, plus slop time and QA. With the new ASVS release, there are now 88 controls to review. That’s about one control needing to be written up every 20 minutes, assuming four hours for QA (10%) and four hours for the Exec Summary (the arithmetic is sketched after this list). Doesn’t sound like much, but there are many controls that simply take longer than that. Some don’t. I honestly think Level 2B reviews cannot be completed satisfactorily on large or complex code bases in five days.
  • Level 3 reviews have 109 controls in 13 categories. This is at least 10 working days’ worth of work – each control gets roughly 30 minutes (assuming one day for QA and four hours for the Exec Summary), which is about right. I personally think Level 3 is about the right level for most code reviews, as it does nearly everything a Level 4 review does but is realistic in terms of schedule, budget and outcomes.
  • Level 4 reviews have 121 controls in 15 categories. If anyone is offering a Level 4 review of a code base of any size in less than four weeks, take a very hard look at their methodology. I just don’t believe you can satisfy the need to look through each file for malicious content in less than that time frame and produce a “professional indemnity insurance proof” result. This level of review is painstaking, and I doubt many people will end up asking for it.
  • Report length will be an issue. Using the reporting format (R1-R4), the average Level 2B report in a 12-point font will be about 100-120 pages long. This is far too long for the paying customer (the execs, who are different from the consuming customer – the developers), so spending time on the Executive Summary and the follow-up report-out is essential if you’re to communicate the results in any meaningful way. So, those timelines I mentioned above: take a good four-plus hours out of them to write up a proper Executive Summary and other report-out materials. This reduces your time per control down to 20 minutes for a five-day report and about 30 minutes for a ten-day report. Obviously, if you have a large or complex code base, you’re going to be hosed if you’re not on top of your game every single work day. Put away the nerf guns – it’s work time!
  • For those of you used to writing reports that eliminate sections that don’t apply, you’re going to get a shock with ASVS. You need to write up ALL 22, 13, 56, 88, 109 or 121 findings – regardless of whether the code is awesome or awful. Leave time for it!
  • The OWASP risk rating scheme is a monster. 16 elements per item x say 88 items = A BUCKET LOAD OF CALCULATING. If you’re still using Word to write your reports, you may want to write macros to automate the calculation and Executive Summary elements, or else you’ll be here next year working out what the risk is (a small sketch of the calculation appears after this list). You should also check the calibration of the OWASP risk ratings: I find they produce a lot of mediums and highs. I will talk to Jeff about making the scale 0, 1..5, and producing a universal 1, 3, and 5 set of elements to make it easier to produce a more balanced risk rating. Find some of your previously QA’d risks and try them out on the OWASP risk rating to see if you get similar results (and you should!). If you don’t, adjust the risk rating methodology and document it in your report. You don’t want to be known as the risk manager’s nightmare. Too many highs == less work in the future if clients constantly (and successfully) argue that your ratings are unrealistic and too high.
  • Missing controls. By design, ASVS does not have every control under the sun. Some of the missing controls are actually very surprising. As there’s so much work in an average ASVS review already, I doubt many folks would add these missing controls back in. However, I think the ASVS team is making a mistake by not including some of these more common controls in Levels 3 and 4, particularly since the clients asking for Level 3 / 4 reviews probably already have these controls in their IT security policies and would like to know their status in their apps. We’re talking things that are found in ISO 27001 and COBIT 4, so I’m not just being tin-foil-hat crazy here.
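
For the XSS point above, this is roughly the kind of grep-style hunt I mean. It’s a minimal sketch, assuming classic ASP / ASP.NET source; the sink and encoder patterns and the file extensions are my own illustrative picks, not anything ASVS prescribes:

    import os
    import re
    import sys

    # Illustrative output sinks and encoders for an ASP / ASP.NET code base.
    # A sink on a line with no encoder is a candidate for a manual look;
    # this is triage, not a scanner.
    SINKS = re.compile(r'(<%=|Response\.Write\s*\()', re.IGNORECASE)
    ENCODERS = re.compile(r'(Server\.HTMLEncode|HttpUtility\.HtmlEncode|AntiXss)',
                          re.IGNORECASE)
    EXTENSIONS = ('.asp', '.aspx', '.ascx', '.cs', '.vb')

    def scan(root):
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                if not name.lower().endswith(EXTENSIONS):
                    continue
                path = os.path.join(dirpath, name)
                with open(path, errors='replace') as source:
                    for lineno, line in enumerate(source, 1):
                        if SINKS.search(line) and not ENCODERS.search(line):
                            print(f'{path}:{lineno}: {line.strip()}')

    if __name__ == '__main__':
        scan(sys.argv[1] if len(sys.argv) > 1 else '.')

It will false-positive and it will miss things, but a reviewer who knows what an output sink looks like can triage the candidates faster than waiting for a scanner run to finish.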
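
The per-control timing above is just arithmetic. Here’s the back-of-the-envelope version so you can plug in your own day counts; the eight-hour day and the QA / Exec Summary deductions are the assumptions from the bullets above:

    def minutes_per_control(days, controls, qa_hours, exec_summary_hours, day_hours=8):
        """Writing time left per control once QA and the Exec Summary are carved out."""
        writing_hours = days * day_hours - qa_hours - exec_summary_hours
        return writing_hours * 60 / controls

    # Level 2B: 5 days, 88 controls, 4 hours QA, 4 hours Exec Summary -> about 22 minutes.
    print(round(minutes_per_control(5, 88, 4, 4)))
    # Level 3: 10 days, 109 controls, 1 day QA, 4 hours Exec Summary -> about 37 minutes,
    # close enough to the half-hour figure above.
    print(round(minutes_per_control(10, 109, 8, 4)))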
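
On the risk rating drudgery: the sixteen elements boil down to two averages and a lookup, so it’s trivially scriptable. This is a minimal sketch of the OWASP Risk Rating arithmetic as I read the methodology; check the factor names and the severity matrix against the current OWASP document before trusting the output:

    # Each of the 16 OWASP risk factors is scored 0-9. Likelihood and impact
    # are the averages of their eight factors; severity comes from a lookup.
    LIKELIHOOD_FACTORS = ['skill_level', 'motive', 'opportunity', 'size',
                          'ease_of_discovery', 'ease_of_exploit', 'awareness',
                          'intrusion_detection']
    IMPACT_FACTORS = ['loss_of_confidentiality', 'loss_of_integrity',
                      'loss_of_availability', 'loss_of_accountability',
                      'financial_damage', 'reputation_damage',
                      'non_compliance', 'privacy_violation']

    def band(score):
        # Map a 0-9 average onto the LOW / MEDIUM / HIGH bands.
        if score < 3:
            return 'LOW'
        if score < 6:
            return 'MEDIUM'
        return 'HIGH'

    # Overall severity keyed by (likelihood band, impact band).
    SEVERITY = {
        ('LOW', 'LOW'): 'NOTE',     ('LOW', 'MEDIUM'): 'LOW',       ('LOW', 'HIGH'): 'MEDIUM',
        ('MEDIUM', 'LOW'): 'LOW',   ('MEDIUM', 'MEDIUM'): 'MEDIUM', ('MEDIUM', 'HIGH'): 'HIGH',
        ('HIGH', 'LOW'): 'MEDIUM',  ('HIGH', 'MEDIUM'): 'HIGH',     ('HIGH', 'HIGH'): 'CRITICAL',
    }

    def rate(scores):
        # scores: dict mapping every factor name above to its 0-9 value.
        likelihood = sum(scores[f] for f in LIKELIHOOD_FACTORS) / len(LIKELIHOOD_FACTORS)
        impact = sum(scores[f] for f in IMPACT_FACTORS) / len(IMPACT_FACTORS)
        return likelihood, impact, SEVERITY[(band(likelihood), band(impact))]

Run a handful of previously QA’d findings through something like this and the calibration check in the bullet above becomes a five-minute job instead of an afternoon of Word macros.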

So what do I think about ASVS for code reviews? The more I use it, the better I feel that we’re meeting our customers’ need for something that doesn’t spit out HIGH risks for information disclosures like internal IP addresses (which are irrelevant). The customer is in control of how much scrutiny their code gets, and we’re producing developer-ready results that affect the design and architecture of the code, which hopefully means much safer applications in a few months’ time.
