Posted by Synopsys Editorial Team on March 30, 2011
A short but important one, while I hop a train. Static analysis proponents, myself especially, have taken up the flag of “visibility” and paraded around chanting “customize to reduce false positives”; I apologize. Customization provides tremendous benefit, but the slogan misleads. Discussing the topic with @Wh1t3Rabbit, it occurred to me: it’s time to change that perception.
So, why talk about false positives in the first place? Well, the tool adopters I correspond with uniformly indicated that “dealing with false positives” was their biggest challenge. I thought, “Address the pain directly.”
However, this is not strictly how I went about it with more savvy folks. Instead, we focused on positively identifying those vulnerabilities we could easily verify as needing to be fixed. This may sound like two sides of the same coin, and to some extent it is: reducing false positives reduces the pile of hay in which one must look for needles.
However, other techniques bore just as much (or more) fruit; we successfully tuned through other means as well.
This tuning scheme is very myopic: it fixates on assessment. That is, it focuses on the relative effort associated with the various security assessment activities. It ignores broader risk management factors (discoverability, probability of exploit, and impact) and remediation factors (effort to fix, regression rate, and so on).
Advanced practices use all of these factors to tune. But organizations without a risk management practice or established development relationships can use these assessment-driven measures to drive improvement today, without external dependencies.
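To make the factor-driven tuning concrete, here is a minimal sketch of what prioritizing findings by risk and remediation factors might look like. The field names, weights, and scoring formula are my illustrative assumptions, not any particular tool’s API or a method the post prescribes:

```python
# Hypothetical sketch: rank static-analysis findings by risk factors
# (discoverability, exploit probability, impact) divided by remediation
# effort. All names and numbers below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Finding:
    name: str
    discoverability: float      # 0..1: how easily an attacker finds the flaw
    exploit_probability: float  # 0..1: likelihood of successful exploit
    impact: float               # 0..1: damage if exploited
    fix_effort: float           # relative cost to remediate (> 0)

def priority(f: Finding) -> float:
    """Higher score = fix sooner: estimated risk per unit of fix effort."""
    risk = f.discoverability * f.exploit_probability * f.impact
    return risk / f.fix_effort

findings = [
    Finding("SQL injection in login", 0.9, 0.8, 1.0, 2.0),
    Finding("verbose error page", 0.7, 0.3, 0.2, 0.5),
]

# Work the queue from highest priority down.
for f in sorted(findings, key=priority, reverse=True):
    print(f"{f.name}: {priority(f):.2f}")
```

The point of the division by `fix_effort` is the assessment-versus-remediation trade-off the paragraph above describes: a modest-risk finding that is trivial to fix can reasonably outrank a scarier one that demands a large rewrite.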
Expect more on the topic of “Focusing on what to fix” from me.