Black Duck (now part of Synopsys) held its inaugural European user conference this month in Amsterdam. Turnout was strong, with almost 100 representatives from European businesses attending our training and presentations. I was privileged to lead a panel discussion on the security implications of open source in connected cars. Gordon Haff, Technology Evangelist at Red Hat, and Simon Gutteridge, Global Information Security Manager at TomTom, joined me to explore the topic.
“Car hacking” is certainly a fun subject to talk about (and even more fun to watch). But it’s also a serious topic as the volume of code increases in modern automobiles. The trend started in the 1977 Oldsmobile Toronado, in which a small amount of code managed electronic spark timing. As the chart shows, a high-end car today can include over 100 million lines of code. This software provides convenience (driver assistance), entertainment (infotainment systems), safety (blind spot detection, collision avoidance), and vehicle management benefits.
As Gordon and Simon pointed out, there are a number of security challenges in connected cars. Today, I’ll focus on four.
When we think of building software, we first think of our internal development teams. But the connected car is different, relying on hundreds of independent vendors supplying hardware and software components to Tier 1 and 2 vendors, as well as the OEMs.
The software from each of those vendors is likely a mix of custom code written by the vendor and third-party code (commercial and open source). Synopsys' 2017 Open Source Security and Risk Analysis research found that 96% of the commercial applications analyzed included open source. On average, over 35% of an application's code base consisted of open source, spread across an average of 147 individual open source components. Multiply this by many vendors, and understanding exactly which components are part of a connected car becomes extremely difficult for the OEMs. Add the fact that over 3,000 vulnerabilities are reported in open source every year, and the security implications are clear.
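The bookkeeping problem described above can be illustrated with a small sketch: given a bill of materials for one vendor-supplied module, cross-reference each component and version against a list of known-vulnerable releases. The component names, versions, and advisory mappings below are illustrative assumptions, not data from the OSSRA report; a real program would query a vulnerability database rather than a hard-coded table.

```python
# Illustrative sketch: check a software bill of materials (SBOM) against a
# local table of known-vulnerable component versions. All data is made up
# for demonstration purposes.

from dataclasses import dataclass


@dataclass(frozen=True)
class Component:
    name: str
    version: str


# SBOM for one hypothetical vendor-supplied module
sbom = [
    Component("openssl", "1.0.1f"),
    Component("zlib", "1.2.11"),
    Component("busybox", "1.21.0"),
]

# Hypothetical mapping of (name, version) pairs to advisory IDs
known_vulnerable = {
    ("openssl", "1.0.1f"): ["CVE-2014-0160"],  # e.g., Heartbleed
}


def find_vulnerable(sbom):
    """Return SBOM components that match a known-vulnerable version."""
    hits = {}
    for component in sbom:
        advisories = known_vulnerable.get((component.name, component.version))
        if advisories:
            hits[component] = advisories
    return hits


for component, advisories in find_vulnerable(sbom).items():
    print(f"{component.name} {component.version}: {', '.join(advisories)}")
```

Even this toy version makes the OEM's problem visible: the check is only as good as the SBOM each of the hundreds of vendors provides, and the vulnerability table changes daily.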
Several steps are required to correct a software issue in the connected car. Let's assume a Tier 2 vendor is using an open source component, and a vulnerability is disclosed.
Even when all this is done, the software update needs to go to the OEM or Tier 1 vendor, be incorporated into an update of that entity’s component (hardware and software) and, ultimately, be updated in each consumer’s vehicle. The Jeep hacking referenced earlier was addressed by sending a USB stick to each affected vehicle owner (how many car owners are comfortable updating their own software?). Alternatively, vehicles can be updated during routine service, IF that service is provided by an authorized dealer, a prospect that decreases as a vehicle ages. Over-the-air updates of software are still the exception rather than the rule, and may require that the vehicle be running but not moving (we don’t want to reboot systems when the vehicle is at highway speed).
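The constraint at the end of that update chain, that an over-the-air update should only be applied to a vehicle that is powered on but not moving, can be sketched as a simple gate. The `VehicleState` fields and the conditions below are illustrative assumptions, not any OEM's actual update policy.

```python
# Illustrative sketch of an over-the-air (OTA) update gate: only apply a
# software update when the vehicle is running but stationary. The state
# fields and conditions are hypothetical.

from dataclasses import dataclass


@dataclass
class VehicleState:
    ignition_on: bool   # vehicle powered on (update process can run)
    speed_kph: float    # current speed
    in_gear: bool       # transmission engaged


def safe_to_update(state: VehicleState) -> bool:
    """Permit an OTA update only for a running, parked vehicle."""
    return state.ignition_on and not state.in_gear and state.speed_kph == 0.0


# A parked, running car may update; a car at highway speed may not.
parked = VehicleState(ignition_on=True, speed_kph=0.0, in_gear=False)
moving = VehicleState(ignition_on=True, speed_kph=110.0, in_gear=True)
print(safe_to_update(parked), safe_to_update(moving))
```

A production system would add far more: battery-level checks, rollback images, and cryptographic verification of the update itself, but even this minimal gate shows why OTA updates cannot simply be pushed whenever a patch is ready.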
The product lifecycle of vehicles presents challenges as well. Your cell phone may have a practical life of 2-3 years, but it receives regular operating system updates and perhaps hundreds of app updates each year. The laptop I'm using will likely be replaced after 3-5 years, and likewise receives regular updates and patches. This is the typical lifecycle software vendors are used to addressing.
A modern car, however, is in design for years prior to production, and the average vehicle may be on the road for 10-15 years. Supporting software over that period of time requires a different thought process. Vendors (and open source communities) need to be considered in light of the operational risk they present. Questions vendors need to ask include:
While not a direct security threat in terms of vehicle safety, gaining control over this data is critical. The connected car may collect more personal data on a driver than any device other than a personal computer. It has the ability to record where you drive, how long you stay in a location, how fast or erratically you drive (e.g., frequent lane changes or blind spot alerts, hard braking, auto-assisted braking), who you call, what you search for, and even your musical and news preferences.
The use of this data is largely uncontrolled at this point. Privacy policies may differ between OEM and vendors, as well as between countries where the vehicle is operated.
Mike Pittenger has 30 years of experience in technology and business, more than 25 years of management experience, and 15 years in security. He previously served as Vice President and General Manager of the product division of @stake. After @stake’s acquisition by Symantec, Pittenger led the spin-out of his team to form Veracode. He later served as Vice President of the product and training division of Cigital. For the past several years, he has consulted independently, helping security companies identify, define and prioritize the benefit to customers of their technologies, structure solutions appropriately and bring those offerings to market. Mike earned his AB in Economics from Dartmouth College and an MBA with a finance concentration from Bentley College.