Facebook has extended its long-running bug bounty program to include data misuse by third-party application providers.
I applaud Facebook for taking this stand. Although the news centers on one social media platform, one third-party application collecting data for a purported psychological survey, and the firms and people surrounding the incident, it is important to note that over 1,400 apps did the exact same thing before 2014. In 2014, Facebook changed how apps could access profiles and social graphs. This change coincided with the demise of many “free” apps that were simply harvesting personal data. Good riddance to bad rubbish.
If you’ve ever allowed apps access to your Facebook profile to find out what hair color is best, learn why 97% of people cannot spell simple words, or run a virtual farm, you’ve granted access to an app similar to the one now causing everyone to #deleteFacebook. I hope researchers will investigate this thoroughly, as that one app is only the tip of the iceberg.
Sadly, 2018 is far too late to delete social media accounts except as a way of encouraging change in social media providers. The internet never forgets. I’ve been on the internet since 1989, and if you search hard enough, you can find the 18-year-old me making a fool of myself.
Modern kids know nothing else; they are taught how quickly information spreads and how private details, once shared, cannot be deleted. I support the right to be forgotten, but sadly, I do not think it is technically possible: if you are using something for free, you are the product. Bad actors will not delete the data they collected about you long ago just because you deleted your social media profile today. Your collected data circulates forever.
The best advice I give to friends and family—and the way I now run my social media—is to share only things that you do not mind being in the press or in search engines for all time. Do not share anything—sensitive personal information, photos, or your location—that you would not want to share with a criminal casing your house. This does not mean giving up social media or deleting specific apps; it means being smarter about how best to use the platforms you have been using.
Have I always done this? No. I used to post about my travels on Twitter, a social media platform I deliberately keep open and public. I now announce only upcoming talks, not the travel itself. I used to post photos of family events to Facebook, which I deliberately keep private to my 700 closest friends, but my daughter has asked me to seek her permission first, which is only fair.
In the latest OWASP Top 10, I pushed for A3:2017 “Sensitive data exposure” to be primarily about protecting data about humans. That would mean, for the first time, mentioning data privacy laws, such as the EU’s GDPR.
Previously, the OWASP Top 10 concentrated on “sensitive” data exposure in terms of stack traces and other technical issues, with only passing reference to privacy, but that has always been insufficient. Unfortunately for the billions of victims, the data breaches of recent years gave my fellow Top 10 co-leaders and me more than sufficient reason to redraft A3:2017 to protect personal information and to encourage application owners, designers, and coders to protect against data misuse.
I recently updated Synopsys’ OWASP Top 10 instructor-led training for the OWASP Top 10 2017. To explain the differences between the older Top 10s and the Top 10 2017, I teach that A3:2017 “Sensitive data exposure” is about regulated sensitive or private data exposure regarding humans, and A6:2017 “Security misconfiguration” is about unregulated and nonsensitive data exposure regarding system configuration or internal application state.
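To make the A3/A6 distinction concrete, here is a minimal sketch (my own illustration, not from the OWASP Top 10 itself) of one common A3:2017 mitigation: masking regulated personal fields before they reach logs or error output, while nonsensitive system data passes through untouched. The field names are hypothetical examples.

```python
# Hypothetical set of regulated personal fields (A3:2017 territory).
# System-configuration keys like "id" or "cache_size" (A6 territory)
# are deliberately left out and pass through unmasked.
SENSITIVE_FIELDS = {"email", "ssn", "dob", "location"}

def mask(value: str) -> str:
    """Keep only the first character so logs stay debuggable but safe."""
    return value[:1] + "*" * (len(value) - 1) if value else value

def scrub_for_logging(record: dict) -> dict:
    """Return a copy of the record with regulated personal fields masked."""
    return {
        k: mask(str(v)) if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

user = {"id": 42, "email": "ana@example.com", "ssn": "123-45-6789"}
print(scrub_for_logging(user))
```

The design point is that sensitivity is a property of the data about the human, not of the technical channel: the same masking applies whether the record is headed for a log file, a stack trace, or an analytics pipeline.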
More recently, I have taken on technical leadership of our security testing services. I am reviewing and modernizing every single service. As part of this overhaul, I will be including data misuse as a first-class security issue. Users expect their data to be transmitted, processed, and stored securely. Users expect a clear understanding of how their data will be handled and used by collecting organizations. Meeting these expectations is the minimum we can do as an industry.
Unfortunately, automated security tools and services struggle to understand what data is sensitive and have no idea about data misuse, so as I come to revise each service, I am reminded of how far our industry has to go. We need to improve the way we build and test software, and this will be a focus for my team as we innovate our security testing services over the next 12–18 months. In the meantime, data misuse testing will be included in every offering that has a manual testing component.
I encourage all organizations to insist their security teams and security partners do the same. Users demand nothing less, and as one popular social network is finding out, mishandling personal data comes at an extreme cost—not only in financial damages and shareholder value but also in trust.
Trust is the fundamental currency of the internet. It is time to make data misuse a first-class security concern. Design and build data protection into everything you do, and test for it.
Andrew van der Stock is a senior principal consultant at Synopsys, providing technical leadership in security architecture, threat modeling, security architecture reviews, secure coding guidelines and reviews, assurance and penetration tests, risk assessments, and developer training. He has worked in the IT industry for over 20 years and is a seasoned web application security specialist and enterprise security architect. Andrew currently leads the OWASP Top 10 2017 and Application Security Verification Standard projects.