Dan Geer explores the DevOps ‘Law of the Jungle’ dilemma

As humans have evolved over time, so has cybersecurity, arguably at a faster rate. Just as nature weeds out weak designs through survival of the fittest, attackers on the internet weed out the most vulnerable systems. The dilemma facing DevOps today is which evolutionary strategy is the correct approach to cybersecurity going forward: produce a few strong systems, or produce numerous systems knowing that some may not survive.

Dan Geer, CISO for In-Q-Tel and a security researcher, explored this topic in his closing keynote at SOURCE Boston last Friday. As with any of Geer’s keynotes, this one was dense with powerful observations. And, as with his keynotes at Black Hat and elsewhere, it was also a call to action.

Windows 95

Geer traces the dawn of cybersecurity to Microsoft’s introduction of the TCP/IP stack to Windows 95. He said that Microsoft effectively took an operating system designed for a single owner on a private net and connected it to “a world where every sociopath is your next door neighbor.” Of course no one recognized it as such at the time.

The result of millions of personal computers joining the internet was an early example of Geer’s premise: predators took advantage of end-user inexperience. Microsoft learned that much of its code had to be better protected on the internet, a hardening process that continues today.

Script kiddies got paid

Geer says the next major moment came around 2006, when “adventurers and braggarts on the internet” became paid professionals, albeit criminal ones. This was when the skills associated with breaking into remote computers began commanding prices on the black market. If you needed a virus written, you could buy one for a price.

This continues today, with nation-states contributing to the rise of new exploits. There is now a market for zero-day vulnerabilities, flaws the vendor does not know about and therefore has no patch to resolve. This market is valued at between $4 million and $10 million annually.

Machine code

The third major moment occurred last year, according to Geer. The DARPA Cyber Grand Challenge demonstrated that machine learning algorithms may someday be capable of a high level of coding and cybersecurity work. Security experts, he said, will still be in demand.

With such a rapid rise in machine learning there is potential for both good and evil. Geer has previously argued that all security technologies are dual use, serving offense as well as defense. Mike Walker, the DARPA program manager who ran the Cyber Grand Challenge, agreed: “I cannot change the reality that all security tools are dual use.”

Machines aren’t better

If machines can rapidly generate new lines of code, then all should be good, right? Geer notes that colleagues have informed him that machine-written code is not without vulnerabilities. Even intelligent machines write vulns.

So, if machines can generate more vulns, where does that leave us? Writing in The Atlantic, Bruce Schneier once asked whether vulnerabilities in software are dense or sparse. It turns out the difference matters.

Are vulnerabilities dense or sparse?

If vulnerabilities are dense, Schneier argues, then fixing them is a waste of time: there will always be many, many more, and you won’t have made a dent. But if vulnerabilities are sparse, then every one fixed moves us closer to achieving information security. To act on this, Geer proposed in his Black Hat USA 2014 keynote that the US Treasury buy zero-day vulnerabilities as a means of artificially forcing down their numbers and thereby improving security.

Robert Graham at ErrataSec builds on Schneier’s question and proposes another option, that “vulns are sparse, but code is dense.” Graham explains that “vuln disclosure helps specific software, and the overall way to create software, even while code is so ‘dense’ that disclosure makes no dent in the total number of vulns in the universe.”
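
To make the dense-versus-sparse intuition concrete, here is a minimal Monte Carlo sketch. It is only an illustration: the codebase size, vulnerability counts, and probe budget below are invented numbers, not figures from Geer, Schneier, or Graham.

```python
import random

def attacker_success_rate(latent_vulns, patched, probes=50,
                          locations=10_000, trials=2_000):
    """Estimate how often an attacker probing random code locations
    hits a vulnerability that defenders have not yet fixed.
    All parameters are illustrative, not empirical."""
    hits = 0
    for _ in range(trials):
        vulns = set(random.sample(range(locations), latent_vulns))
        fixed = set(random.sample(sorted(vulns), min(patched, latent_vulns)))
        live = vulns - fixed
        if any(p in live for p in random.sample(range(locations), probes)):
            hits += 1
    return hits / trials

# The same patching effort (15 fixes) against dense and sparse codebases.
for latent in (1_000, 20):          # dense vs. sparse latent vulnerabilities
    for patched in (0, 15):
        rate = attacker_success_rate(latent, patched)
        print(f"latent={latent:5d} fixed={patched:2d} "
              f"attacker success ~ {rate:.3f}")
```

In the dense case the attacker’s odds barely move no matter how many bugs are fixed; in the sparse case the same 15 fixes cut those odds substantially, which is exactly why the answer to Schneier’s question matters.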

Parallel goals

Geer cites Daniel Bilar, who showed in his analysis of the Conficker malware that “attackers and defenders each present moving targets to the other.” This is like nature’s predator-prey dynamic: as we evolve better security, attackers must up their game.

The problem is we’re only human. Attackers exploit, Geer says, the assumptions on which your code was built. Researcher Sandy Clark writes in the paper “Familiarity breeds contempt: The honeymoon effect and the role of legacy code in zero-day vulnerabilities” that if software security is your goal, then “software re-use is more harmful to software security than beneficial.” That’s because once an attacker learns how to attack your code, you help them by re-using it.
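
Clark’s “honeymoon” finding can be sketched with a deliberately simple model: assume attacker familiarity with a component carries over from every earlier product that shipped it, so the expected time before the first exploit shrinks with each reuse. The base period and learning factor below are invented for illustration and are not parameters from the paper.

```python
def expected_honeymoon_days(base_days, reuse_count, learning_factor=0.5):
    """Toy model: each prior reuse of a component multiplies the expected
    time-to-first-exploit by `learning_factor`, because attackers arrive
    already knowing how to attack the shared code. Illustrative only."""
    return base_days * (learning_factor ** reuse_count)

for reuses in range(5):
    days = expected_honeymoon_days(base_days=365, reuse_count=reuses)
    print(f"component reused {reuses} time(s): ~{days:6.1f}-day honeymoon")
```

Under these made-up numbers, a component on its fifth shipment gets roughly a three-week honeymoon instead of a year, which is the qualitative shape of Clark’s argument: reuse hands attackers a head start.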

The DevOps dilemma

The downside is that we reach a point where we trust only code we write ourselves. This has consequences for DevOps, Geer said. Like the dense-or-sparse debate, DevOps teams must ask whether it is better to invest in stronger defenses or to reduce the number of software components they must struggle to secure. Both have consequences, and neither is ideal. If we only beef up defenses, we may never address the fundamental problem. On the other hand, a financial institution that limits itself to code it produces itself may lack innovation and ultimately become prey through predictable programming.

Geer proposes a different goal. “The pinnacle goal of security engineering,” he says, “is no silent failure; it is not and cannot be no failure.” In other words, we know there will be vulnerabilities and failures; hoping for anything different is a mistake.

A silent failure is one that gives no warning: the device simply stops doing its job without telling anyone. As our world depends more and more on digital devices, we cannot accept shutting down as a safeguard, a self-imposed denial of service. Rather, we need mitigations pre-built and within easy reach.
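
As an engineering pattern, “no silent failure” with a pre-built mitigation might look like the hedged Python sketch below. The function and data-source names are hypothetical stand-ins, not anything from Geer’s talk.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("rates")

class SilentFailure(Exception):
    """Raised when a dependency degrades without reporting an error."""

def fetch_exchange_rate(primary, fallback):
    """Fail loudly, then fall back; never pass on a quietly-wrong answer.

    `primary` and `fallback` are callables returning a rate; both names
    are hypothetical stand-ins for real data sources.
    """
    try:
        rate = primary()
        if rate is None or rate <= 0:
            # A plausible-looking but bogus value is the silent failure
            # Geer warns about: surface it instead of passing it along.
            raise SilentFailure(f"primary returned implausible rate {rate!r}")
        return rate
    except Exception as exc:
        log.error("primary source failed (%s); using pre-built mitigation", exc)
        return fallback()   # the mitigation existed before the failure did

# Usage: a failing primary source and a cached fallback value.
print("rate:", fetch_exchange_rate(primary=lambda: None,
                                   fallback=lambda: 1.0843))
```

The point is not the specific fallback but the ordering: the failure is reported the moment it happens, and the mitigation was built before it was needed, so nothing fails silently.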

The InfoSec dilemma

Geer concludes his evolution metaphor with a choice: either we damp down the rate of change in information security to give it predictive operational validity, or we purposely increase unpredictability so that the opposition’s targeting exercise becomes too hard. With the first, he says, we give up many forms of progress. With the second, we give up freedom, as we begin to let the machines decide for us.

Geer does not answer his own question, leaving it for us. “You have not picked a career,” he said. “You have picked a crusade.”
