Bug bounty programs are becoming more popular. Do they work? What are the pitfalls of crowdsourcing application security testing? Our experts weigh in.
The original version of this post was published in Forbes.
Bug bounties are hot. They are everywhere.
Of course, popularity doesn’t guarantee quality. Just because everybody is doing it doesn’t necessarily mean it’s the best way to maintain the security of your organization.
But the word from most experts is that bug bounties are a good thing, as long as they’re not the only thing—they’re not a cheap substitute for rigorous testing throughout the entire software development life cycle (SDLC).
Whether that’s happening is, at least at the moment, a question essentially drowned out by the sheer number of bug bounty programs in existence and the number of hackers hoping to cash in on them.
According to one list from vpnMentor, there are 734 programs in operation this year, not just from the predictable giants like Google, Apple, Facebook, Microsoft, Alibaba and Amazon Web Services, but seemingly everybody else too, from Craigslist to Dropbox, GitHub to GoDaddy, Netflix to PayPal, the United Nations to United Airlines, WordPress to Walmart and Yahoo to Yelp.
And HackerOne, a company that hosts bug bounty programs, says more than 300,000 people have signed up for them, although critics say some of those are zombie accounts.
The surface appeal to those involved is obvious. For the companies, it’s kind of like crowdsourcing your security. You get thousands of eyes—typically those of some of the best white hat (i.e., ethical) hackers—on your software, looking to find weaknesses.
If they find problems, you agree to pay them—the amount depends on the severity of the vulnerability—but you don’t have to put them on the payroll full time.
And of course it is vastly cheaper to pay anybody, on staff or not, to find bugs in your network, system or applications than to deal with a major data breach, with the potential to cause what is now a well-known list of horrors—major brand damage, possible fines or other sanctions, liability that can run into the hundreds of millions, etc.
For ethical hackers, it’s a chance to get paid a few extra bucks—sometimes more than a few—for doing what they love to do.
Google reported recently that it paid $3.4 million in 2018 to hackers through its Vulnerability Reward Program. The details: 1,319 reported bugs by 317 researchers from 78 countries. The largest single reward was $41,000. Not exactly megabucks, although the biggest bounty paid in 2017 was close to three times that, at $112,000.
Facebook recently paid a hacker $25,000 for finding a vulnerable endpoint that could have tricked users into accessing a malicious URL.
But even relatively high paydays are not the norm. Parsia Hakimian, senior consultant at Synopsys, views bug bounties as the equivalent of multilevel marketing operations (sometimes labeled pyramid schemes), where very few people make most of the money while the rest don’t make much at all.
“In general, the bug bounty platforms are hyping large payouts, like $10,000 to top-paid researchers, while the overwhelming majority do not get paid or get paid less than minimum wage,” he said.
“These programs make a lot of noise—I mean a lot—but only 4-5% of reported bugs receive payouts. Companies need to sift through a lot of noise to get anything meaningful.”
Indeed, a recent post on Trail of Bits cited “Fixing a Hole: The Labor Market for Bugs,” a chapter in a book titled New Solutions for Cybersecurity that makes the case that “trying to make a living as a programmer participating in bug bounties is the same as convincing yourself that you’re good enough at Texas Hold ’Em to quit your job.”
According to the authors, a select few earned the vast majority of the bounties while the rest fought over the remaining “table scraps.”
And even the top earners weren’t making all that much. “The top seven participants in the Facebook data set averaged 0.87 bugs per month, earning an average yearly salary of $34,255; slightly less than what a pest control worker makes in Mississippi,” Trail of Bits said.
Of course, given the freelance nature of it, the perception is that most of those working on bug bounties are doing it as a side gig, not their main gig. And if that’s true, $34,000 wouldn’t be bad for a second income.
For companies with a bug bounty program, the question is whether they are using it effectively. While it is better to detect and fix bugs at any time than to have them be exploited by attackers, just about any security expert will tell you it is better to “build security in” to your software throughout the SDLC than it is to try to “bolt it on” with patches when systems, networks and applications are in use.
But even the best software security initiative (SSI), using the best tools for multiple kinds of testing, doesn’t guarantee bulletproof software.
So while a bug bounty program doesn’t, or shouldn’t, replace an SSI, it can serve as a crowdsourced version of one element of it—penetration testing—which is generally done by hired hackers at the end of the SDLC.
“Bug bounties are fine—everyone should have them,” said Gary McGraw, author of Software Security and former vice president of security technology at Synopsys, “as long as they have a way to diagnose, triage and fix the bugs. Any company that can’t do that shouldn’t turn on the bug-bounty firehose.
“It’s helpful for people who are already doing software security right,” he said.
That is also the view of Christopher Littlejohns, senior manager, sales engineer at Synopsys. “Bug bounty programs are a good thing, but not as a substitute for appropriate security diligence during software development,” he said, calling them “a backstop that rewards individuals for reporting hard-to-find, complex vulnerabilities that can be exploited by advanced cyber criminals.”
He said using a bug bounty program as a substitute for an entire SSI would be like “paying the neighbors to check that your windows and doors are shut and the alarm is enabled because you don’t have the time to do it. As opposed to asking them to keep an eye on the property to see if there are any suspicious characters around.”
Organizations should use bug bounty programs to “maintain their software post-deployment so they know if new vulnerabilities are identified that need to be addressed,” he said.
And Hakimian, even though he opposes them in principle, said bug bounties can “fit as a complement to post-release vulnerability assessments and to provide a disclosure channel.”
Finally, any organization with a bug bounty program—even used as intended—should be aware that it is playing in a somewhat free market where it has to compete for talent. Hyatt, for example, is offering $4,000 for critical bugs, $1,200 for high severity, $600 for medium and $300 for low.
That is unlikely to attract top talent. “Bounties attract people who can automate mass scanners and researchers living in countries with a low cost of living,” Hakimian said, noting that since he was born in the Middle East and knows the cost of living there, he is aware that a few thousand dollars a year in that region can be significant. “But it’s not a lot in most of the industrialized world,” he said.
Littlejohns agreed. The pros, he said, “will chase the money in places where they have the skills to generate income.”
Taylor Armerding is an award-winning journalist who left the declining field of mainstream newspapers in 2011 to write in the explosively expanding field of information security. He has previously written for CSO Online and the Sophos blog Naked Security. When he’s not writing he hikes, bikes, golfs, and plays bluegrass music.