Software Integrity Blog

 

Best practices for secure application development

Application security is your best defense against the hackers who want your organization’s data. Here are best practices for secure application development.

Apps rule. They’re everywhere. Every business with an online presence has web applications—sometimes hundreds or thousands. They are in constant use in just about every workplace. There are about 2 million available on the Google Play Store and another 1.83 million on the Apple App Store, according to Statista. Smartphone users carry an average of 80 apps in their pockets and use at least 40 of them every month.

Which is good—and bad.

Everybody knows about the good. Apps enable everything from entertainment to education, fitness, friendship, efficiency, and convenience—magical convenience. You can lock your doors from hundreds of miles away. You can check on the best way to avoid traffic jams before you start driving. You can find which club your friends chose to hang out at tonight, and more.

The bad: Cyber attackers know all this too. They know that apps are built with, and run by, software. They know software is rarely perfect. They also know that not all organizations prioritize secure application development. Even when patches are issued for bugs or other vulnerabilities, they know that not every organization or user installs them.

That, predictably, makes apps one of the favorite “attack surfaces” of hackers looking to get inside your device or your system.

But there are ways to increase the good and decrease the bad. And National Cybersecurity Awareness Month (NCSAM) is the perfect time to talk about them.

Web apps are a top attack vector

The bad is well documented. As multiple reports on data breaches have found, web applications are one of the most frequent ways attackers breach both individuals and organizations.

In Forrester’s The State of Application Security, 2019, author Amy DeMartine opens with this declaration: “Application weaknesses and software vulnerabilities continue to be the most common means by which cybercriminals carry out external attacks.”

The most recent Verizon Data Breach Investigations Report (DBIR) found that web applications are among the top three attack vectors in eight of the nine industry verticals it covered. They are No. 1 in four of them.

And according to SAP, 84% of cyber attacks happen on the application layer, making it the number one attack surface for hackers.

There is no mystery about this. If an attacker can exploit a vulnerability in an app, it offers what that attacker is seeking—potentially unlimited access. “Malicious attackers who exploit an application through a vulnerability or weakness will also have access to the data that application has access to, no matter what data security or network protections you may have in place,” DeMartine wrote in the Forrester report.

Bottom line: Insecure applications put organizations at risk in multiple ways—financial, legal, brand damage, and more. Which is something everybody should know all the time, not just during NCSAM.

But there is a major gap between “should know” and “do know,” not to mention that many who do know still don’t do what they should. So, given that the online world in which we live remains riddled with application vulnerabilities, and that the theme of the month is “Own IT. Protect IT. Secure IT,” it makes sense to turn some extra focus on one of the most fundamental elements of how to do that: application security, or AppSec.

Secure application development is a well-established “thing.” Every major security conference features dozens of presentations or keynotes on its importance and ways to do it better. There are well-established tools to help developers “shift left,” or “build security in” to the software development life cycle (SDLC).

Know what’s in your code

For starters, if you’re going to “own IT,” you have to know what you own. While most organizations create proprietary software, virtually all—99%, according to the 2018 Synopsys Open Source Security and Risk Analysis (OSSRA) report—also use open source.

Which is why software composition analysis (SCA) is useful. It identifies the open source components in an application, along with the known vulnerabilities and license obligations they carry, while the app is still in development.
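To make the idea concrete, here is a minimal sketch of what an SCA pass does: inventory the dependencies an app declares and flag any with known advisories. The manifest and the one-entry advisory list are illustrative stand-ins, not a real vulnerability feed.

```python
# Minimal SCA sketch: inventory declared open source dependencies and flag
# any with known advisories. The advisory "database" below is a stand-in
# for a real vulnerability feed.

KNOWN_ADVISORIES = {
    ("requests", "2.19.0"): "CVE-2018-18074 (credentials leaked on redirect)",
}

def parse_manifest(text):
    """Parse 'name==version' lines from a pip-style requirements file."""
    deps = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, version = line.partition("==")
        deps.append((name.lower(), version))
    return deps

def scan(text):
    """Return (dependency, advisory) pairs for components with known issues."""
    return [(dep, KNOWN_ADVISORIES[dep])
            for dep in parse_manifest(text)
            if dep in KNOWN_ADVISORIES]

manifest = """\
# third-party components
requests==2.19.0
flask==2.0.3
"""
for (name, version), advisory in scan(manifest):
    print(f"{name} {version}: {advisory}")
```

Real SCA tools go further, fingerprinting components even when no manifest declares them, but the core loop is the same: know what you own, then check it against what is known to be broken.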

Know how your apps will be used

Beyond knowing that, developers need to know how an app is going to be used. “The core challenge is that appropriate tooling and strategies will depend upon your development paradigm,” said Tim Mackey, technical evangelist at Synopsys.

“For example, in a highly agile DevOps-centric engineering team where the applications are deployed as microservices in containers, there are capabilities within containerized deployments that can be described as security mitigation models—the result of which are more difficult to attain when developing IoT firmware or mobile applications.”

So it’s critical for engineering teams to “understand how their applications are deployed and incorporate that info into their threat models,” he said.

Use the right tools

There are also multiple tools for software security testing throughout the SDLC: SAST (static application security testing), DAST (dynamic application security testing), IAST (interactive application security testing), RASP (runtime application self-protection), and penetration testing. They all play a role in delivering a product that, while it won’t be bulletproof (nothing is), will be secure enough to discourage all but the most motivated and expert hackers.
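As a toy illustration of the static (SAST) side of that list, the sketch below scans source text for patterns that often signal injection risk. Real SAST tools parse the code and trace data flow; this regex pass, with made-up rules, only hints at the idea.

```python
import re

# Toy SAST-style check: scan source lines for patterns that commonly
# indicate injection risk. Real static analyzers build a parse tree and
# a data-flow model; this regex pass is only an illustration.

RULES = [
    (re.compile(r"\beval\("), "use of eval() on dynamic input"),
    (re.compile(r"execute\([^)]*%"), "SQL built with string formatting"),
]

def lint(source):
    """Return (line number, message) pairs for every rule hit."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, message in RULES:
            if pattern.search(line):
                findings.append((lineno, message))
    return findings

sample = 'cursor.execute("SELECT * FROM users WHERE id = %s" % user_id)'
print(lint(sample))  # → [(1, 'SQL built with string formatting')]
```

The point of running a check like this early, and automatically, is that the finding reaches the developer while the code is still in front of them, which is the essence of “shifting left.”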

While it may seem like splitting hairs, it is important to note that these tools don’t “build” anything on their own, any more than a hammer, saw, screwdriver, and drill build a cabinet on their own. They help the builder.

But, of course, those tools have to be used. So do software testing tools. And the reality is that security testing is too often perceived as a drag on development—that it slows it down and makes teams less likely to meet their deadlines.

Create security requirements

One reason for that, according to Sammy Migues, principal scientist at Synopsys, is that security testing isn’t always written into the specifications for an app.

“Building security in slows you down only if you weren’t going to do it in the first place,” he said. “If you were going to build security in, then doing it takes exactly the expected amount of time. That’s not a perception issue; it’s a fact.”

He notes that product managers routinely specify the features an app should have, and those features also take time to build, yet nobody frets about features slowing down development.

“When the product management industry learns to write nonfunctional security requirements, then developers will build security in, allotting the proper time estimates to achieve the acceptance criteria that include building security in. So, no friction,” he said.
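One way to make such a nonfunctional requirement concrete is to write its acceptance criteria as an executable test. The sketch below is a hypothetical example rather than anything Migues prescribes: it encodes the requirement “passwords must be stored salted and hashed, never in plaintext” as assertions a build pipeline can run.

```python
import hashlib
import hmac
import os

# Hypothetical acceptance test for a security requirement written into the
# spec: "passwords must be stored salted and hashed, never in plaintext."
# hash_password/verify_password stand in for the real user service.

def hash_password(password, salt=None):
    """Derive a salted PBKDF2-HMAC-SHA256 digest; never store the plaintext."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# The acceptance criteria, expressed as assertions:
salt, digest = hash_password("s3cret!")
assert b"s3cret!" not in digest          # plaintext never appears in storage
assert verify_password("s3cret!", salt, digest)
assert not verify_password("wrong", salt, digest)
print("security requirement satisfied")
```

Written this way, the requirement gets estimated, built, and regression-tested like any feature, which is exactly the “no friction” outcome Migues describes.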

Enable developers

Tanya Janca, co-founder and CEO of her recently launched company Security Sidekick, said security tools are evolving to meet the need for speed. “The most up-to-date tools and strategies all focus around the following goals: automating as much as possible, not slowing down developers, eliminating entire bug classes and customizing solutions/fixes if you feel the need to,” she said.

But she also sees a need for a culture change to improve understanding between security and development. “When I was a developer, I never had the chance to work with a security person who knew how to build software, and I heard a lot of ‘no, you can’t do that,’ with few suggestions of what I could do,” she said. “Security still has a lot of culture around ‘no’ and gatekeeping, rather than the newer approaches (which are working quite well) of building guardrails.”

There are also incentives—or a lack of incentives—to push development teams to make security a priority. Mackey said if a software vulnerability that could have been caught and fixed leads to an exploit, it’s not the development team that suffers the consequences, at least at first. “When a security incident occurs, it’s the production operations team that bears the bulk of the cost in remediating the issue, not the development team,” he said. “The core cost to development teams occurs after the forensic analysis is complete, when they need to rework or refactor their code.”

“So anything not deemed a ‘feature’ will be perceived as a speed bump to functional feature development.”

Will privacy concerns boost secure application development?

However, ironically enough, the privacy laws now blossoming throughout the world—the EU’s General Data Protection Regulation (GDPR) was just the start—could force companies to improve their app security, Mackey said. Speed bump or not, the cost of failing to do it will be greater than the cost of doing it.

“Privacy regulations impose functional requirements on products that simultaneously expect a security review to occur,” he said.

“For example, if data on a user is collected, GDPR dictates that the collection occur under limited scenarios and that users have a right to receive an accounting of what data an organization collected but also how it was used.

“The security aspect of this is that now data collection no longer is ‘because it’s useful’ and data retention is no longer ‘as long as I need it’ because in reality, the only information subject to a data breach is what was collected and retained.”

Ultimately, he said, this paradigm change will be good for both the companies that make apps and the consumers who use them.

“The good news is that privacy paradigms have a direct mapping to security objectives, making the task of complying with privacy dictates while building secure applications a seamless endeavor,” he said.
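The retention point Mackey raises can be made mechanical in code: if every record carries the purpose it was collected for, and each purpose has a defined retention window, purging expired data becomes a routine job rather than a judgment call. The field names and the 30-day window below are illustrative assumptions, not anything GDPR itself specifies.

```python
from datetime import datetime, timedelta, timezone

# Sketch of retention as a rule rather than a habit: each record names the
# purpose it was collected for, each purpose has a retention window, and
# anything past its window is purged. Names and windows are illustrative.

RETENTION = {"order_fulfilment": timedelta(days=30)}

def purge_expired(records, now):
    """Keep only records still inside their purpose's retention window."""
    kept = []
    for record in records:
        window = RETENTION.get(record["purpose"], timedelta(0))
        if now - record["collected_at"] <= window:
            kept.append(record)
    return kept

now = datetime(2019, 10, 15, tzinfo=timezone.utc)
records = [
    {"user": "alice", "purpose": "order_fulfilment",
     "collected_at": now - timedelta(days=5)},
    {"user": "bob", "purpose": "order_fulfilment",
     "collected_at": now - timedelta(days=90)},
]
print([r["user"] for r in purge_expired(records, now)])  # → ['alice']
```

The security payoff is the one Mackey identifies: data that was never retained cannot be exposed in a breach.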

Don’t wait for privacy laws: Start now

Migues is a bit less optimistic, saying he thinks it will be a while before the new privacy laws push secure application development to a new level.

He notes that the multiple social media apps—Facebook, Instagram, WhatsApp, TikTok, Snapchat, and so many more—aren’t seeing any erosion of users even though they collect, store, and share massive amounts of personal data.

So far, he said, “the public seems to understand the difference between a breach of privacy caused by an attacker and a breach of trust by companies that share our private data. Still, the evidence seems to indicate the public is pretty fickle about who they will punish for either kind of breach. It’s easy to move your savings account or your dating profile, but how easy is it to move your entire social existence?” he said.

But for those who take secure application development seriously, besides the available tools, there is a roadmap to doing it better: the BSIMM (Building Security In Maturity Model). The annual report, based on observations of software security initiatives (SSIs) across eight industry verticals, came out last month. Now in its tenth iteration, the BSIMM covers 119 “activities” already in use.

Migues, a co-author of the report since it began, called those 119 activities “perhaps the best collection of controls ever amassed for the purpose of building security capabilities into an organization’s SSI. It captures the state of the industry—what’s being done right now.”

So the means for secure application development are available. Now all it takes is doing it.
