Technology due diligence may turn up surprises, but it’s better to find them sooner rather than later. Third-party audits can help you assess your risk.
With thousands of transactions and billions of dollars spent on technology mergers and acquisitions (M&A) every year, acquirers must vet their investments by conducting due diligence. M&A due diligence allows acquirers to do three important things:
M&A is high stakes for everyone involved, with jobs and capital on the line. To ensure a solid transaction, a buyer needs to examine the acquisition from all angles:
Software and systems companies derive much of their worth from their technology. In a “tech transaction,” the product is software or driven by software, and the intellectual property embodied in the software is central to the value. So, while an acquirer will need to dig into all of the above angles, confirming assumptions, planning integration, and finding risks in the technology is of paramount importance.
The “tech” in a tech transaction comprises:
Presumably the acquirer has a high-level understanding of these aspects of the target and the fit with their own business. But they will want to go into more depth during due diligence through more extensive demos, examination of the roadmap details, and strategic discussions with the product leaders.
While the buyer will have become acquainted with the seller’s top brass, more personnel will become involved during due diligence. This is the time to assess the overall talent of the team, to plan post-acquisition roles, and to identify key and weak players in the mix.
If the development team will be kept intact and will continue current processes, it’s critical to assess the associated supporting processes and tools. This will almost always be the case for private equity deals, though a strategic acquirer may plan to merge the team into their own system. Modernizing inefficient processes takes time and investment. Poor processes, or a lack of processes, can raise questions about management competence. Lastly, weak process areas should direct deeper diligence efforts. For example, if there is no process for managing open source risks, it’s extra-critical to examine the technology for such issues.
Due diligence on the software itself must look at its architecture and code. The architecture provides the foundation, defining how the code is assembled and structured. As with a home inspection, a firm foundation is critical to maintainability. Software that needs to be “refactored,” meaning its architecture must be overhauled, brings with it significant technical debt that needs to be paid off. Digging into how well the architecture is implemented in code also reveals technical debt in the form of bugs, security vulnerabilities, licensing issues, and other problems that require code modification.
Technology due diligence requires a combination of resources to perform the range of inquiries. For starters, the acquirer’s senior technical managers will have formed a view based on publicly available information and pre-diligence discussions. Diligence usually kicks off with requests for additional information from the target in the form of disclosures. Assessing the technology disclosures about policies, process documentation, bills of materials of open source components, security breaches, and others related to software can help direct exploration into the software itself.
Most acquirers will make their own assessment of strategy, product, and people. Private equity firms may go outside or may recruit a CTO from a portfolio company to look at the development process. Strategic acquirers usually have the expertise, access, and skills to do it themselves. A good starting point involves looking at process documentation and then probing into how adherence is monitored and ensured.
However, delving deeper into the architecture and code requires access to material that targets are particularly sensitive about. Even during formal due diligence, most targets shy away from sharing architecture details, let alone the code itself. Acquirers, too, should be wary of becoming “contaminated” by too much information. The concern is with downstream claims of IP theft by the seller if the deal does not close and the would-be buyer later comes out with a competing product.
For these reasons (among others), many buyers feel that performing due diligence on the architecture and code is best handled by a third party. Because the acquirer has little insight into the details of the code going into diligence, this evaluation is less about confirming assumptions and more about identifying risks. There may also be an element of integration planning if the acquirer plans to integrate at the software level.
Acquirers need to ensure that the target has rights to use the third-party code incorporated into their software. In extreme cases, misuse of third-party software can result in lawsuits or even the loss of proprietary intellectual property. Today it is typical for half or more of a target’s code to be third-party code—mostly open source. The starting point is to generate a comprehensive list of open source and other third-party code. Buyers often request this in a disclosure from the target, but few targets manage their open source well enough to provide anywhere near an accurate list. The complexity of software requires specialized tools as well as auditor expertise to create a comprehensive list.
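As a minimal illustration of the first step, the sketch below builds a first-pass inventory of declared dependencies from a Python-style requirements file. A real audit goes far beyond this—combining manifest parsing with code scanning and snippet matching across many ecosystems—and the manifest contents here are hypothetical.

```python
# Sketch: first-pass component inventory from a declared-dependency manifest.
# Real audits also detect undeclared and copy-pasted third-party code.

def parse_requirements(lines):
    """Parse 'name==version' entries into (name, version) pairs."""
    components = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if "==" in line:
            name, version = line.split("==", 1)
            components.append((name.strip(), version.strip()))
        else:
            components.append((line, "unknown"))  # unpinned dependency
    return components

# Hypothetical manifest for illustration
manifest = [
    "# third-party dependencies",
    "requests==2.31.0",
    "flask==2.3.2",
    "leftpad",
]
bom = parse_requirements(manifest)
print(bom)  # [('requests', '2.31.0'), ('flask', '2.3.2'), ('leftpad', 'unknown')]
```

Declared manifests are only a starting point; the gap between what a target declares and what its codebase actually contains is exactly why auditor tooling and expertise are needed.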
Once the list (also known as a software bill of materials) is established, an auditor can identify the associated licenses and obligations for each component. By understanding how the third-party components are implemented, auditors can flag potential license conflicts and areas where obligations may not be met. Obligations range from simply putting copyright information in a notice file to making source code available; for commercial code, the obligation is often to pay license fees. Although a competent auditor can give directional advice on issues, software licensing can be complex, so it is prudent to involve a knowledgeable lawyer.
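The component-to-obligation mapping can be pictured with a small sketch. The license table and policy rules below are deliberately simplified assumptions; real license analysis involves hundreds of licenses, usage context (linking, distribution, SaaS), and legal review.

```python
# Sketch: mapping BOM components to licenses and flagging obligations.
# License assignments and policy rules here are illustrative assumptions.

LICENSES = {
    "requests": "Apache-2.0",
    "flask": "BSD-3-Clause",
    "readline": "GPL-3.0-only",
}

# Simplified policy: copyleft licenses may conflict with distributing
# proprietary code; permissive licenses typically require attribution.
COPYLEFT = {"GPL-2.0-only", "GPL-3.0-only", "AGPL-3.0-only"}

def review(bom):
    """Return (component, license, finding) triples for each BOM entry."""
    findings = []
    for name, _version in bom:
        lic = LICENSES.get(name, "UNKNOWN")
        if lic in COPYLEFT:
            findings.append((name, lic, "review: copyleft obligations"))
        elif lic == "UNKNOWN":
            findings.append((name, lic, "review: license not identified"))
        else:
            findings.append((name, lic, "attribution required"))
    return findings

print(review([("flask", "2.3.2"), ("readline", "8.0")]))
```

Note that "license not identified" is itself a finding: components with unknown provenance are a risk, not a pass.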
In today’s climate, software must be secure. That doesn’t require a lot of explanation. Security vulnerabilities are defects in the code that can be exploited by bad actors. An acquirer’s security experts may be able to perform some level of security analysis themselves, but to examine the code for vulnerabilities will require more access, which most targets aren’t willing to provide.
A comprehensive security evaluation must first identify high-level design flaws—which account for 50% of all security vulnerabilities—and ensure security controls are designed into the architecture. Next, it must analyze the proprietary code for security bugs from the outside in via ethical hacking and from the inside out via application security testing tools. Finally, it should draw upon the bill of materials from the open source audit to evaluate known vulnerabilities in the components. Here again, most targets do not have processes to keep open source components patched with the latest security fixes.
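The last step—cross-referencing the bill of materials against known vulnerabilities—can be sketched as a simple version comparison. The advisory data below is entirely made up for illustration; a real check would query a feed such as the NVD or OSV and handle the many non-numeric versioning schemes in the wild.

```python
# Sketch: matching BOM components against a known-vulnerability table.
# The advisory entries are hypothetical; real checks query CVE databases.

ADVISORIES = {
    # (component, fixed-in version): advisory id — versions below
    # the fixed-in version are considered vulnerable.
    ("examplelib", "1.4.0"): "CVE-XXXX-0001",  # hypothetical advisory
}

def version_tuple(v):
    """Convert a dotted numeric version string into a comparable tuple."""
    return tuple(int(p) for p in v.split("."))

def known_vulns(bom):
    """Return (component, version, advisory) for each vulnerable entry."""
    hits = []
    for name, version in bom:
        for (comp, fixed_in), advisory in ADVISORIES.items():
            if comp == name and version_tuple(version) < version_tuple(fixed_in):
                hits.append((name, version, advisory))
    return hits

bom = [("examplelib", "1.2.0"), ("otherlib", "2.0.0")]
print(known_vulns(bom))  # [('examplelib', '1.2.0', 'CVE-XXXX-0001')]
```

This is why the open source audit and the security evaluation feed each other: without an accurate bill of materials, known-vulnerability matching has nothing reliable to match against.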
Software quality risk may not be as acute as legal or security risks, but it’s more insidious. It’s the “gift that keeps on giving.” Low-quality software written in a nonmodular way is hard to maintain, which makes it difficult to add features, fix bugs, and patch vulnerabilities. Thus, poor quality imposes a burden that steals resources from the future, requiring non-value-added work and reducing developer productivity.
The first step in evaluating overall software quality is to assess the quality of the design and architecture to determine whether the software is hierarchically structured and modular, and therefore scalable. It’s not unusual for the actual architecture to differ from what is on paper. As such, this evaluation needs to go beyond reviewing a diagram and get into the real code.
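One concrete signal of modular, hierarchical structure is the absence of dependency cycles between modules. The sketch below runs a standard depth-first cycle check over an illustrative module graph; real architecture analysis tools extract such graphs from the code itself, which is how they catch the gap between the diagram and reality.

```python
# Sketch: detecting dependency cycles as a modularity signal.
# The module graphs below are illustrative examples.

def find_cycle(graph):
    """Return True if the directed dependency graph contains a cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / in progress / done
    color = {node: WHITE for node in graph}

    def visit(node):
        color[node] = GRAY
        for dep in graph.get(node, []):
            if color.get(dep, WHITE) == GRAY:
                return True  # back edge: cycle found
            if color.get(dep, WHITE) == WHITE and visit(dep):
                return True
        color[node] = BLACK
        return False

    return any(visit(n) for n in graph if color[n] == WHITE)

layered = {"ui": ["logic"], "logic": ["data"], "data": []}   # clean layering
tangled = {"ui": ["logic"], "logic": ["data"], "data": ["ui"]}  # circular
print(find_cycle(layered), find_cycle(tangled))  # False True
```

A layered graph with no back edges supports the scalability claim; a tangle of cycles is a sign the paper architecture has drifted from the code.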
A great architecture can have a poor implementation, and vice versa, so it’s also important to dig into the quality of the code with specialized static analysis tools and human review. Ideally, the proprietary code is well-written and the open source components are up-to-date and supported by the community.
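As a flavor of what static analysis looks for, the sketch below flags overly long functions as a rough maintainability signal, using Python's standard `ast` module. The 50-line threshold is an arbitrary assumption; real tools measure complexity, coupling, duplication, and many other metrics.

```python
# Sketch: one simple static analysis check—flagging long functions
# as a maintainability signal. Threshold is an illustrative choice.

import ast

def long_functions(source, max_lines=50):
    """Return (name, line_count) for functions exceeding max_lines."""
    tree = ast.parse(source)
    flagged = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            span = node.end_lineno - node.lineno + 1
            if span > max_lines:
                flagged.append((node.name, span))
    return flagged

sample = "def f():\n    x = 1\n    return x\n"
print(long_functions(sample, max_lines=2))  # [('f', 3)]
```

No single metric proves quality either way, which is why tool output is paired with human review of representative code.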
Inevitably, due diligence will uncover issues; buyers expect it and should have worked some expectation of remediation into their offer and plans. Some issues rise to a level of importance where they need to be handled with language in the definitive agreement. Sellers may agree to remediation and a timeline, or provide a warranty for persisting issues they deem low risk. Where there may be real risk of loss—for example, a license violation with legal exposure—acquirers may increase the holdback value or timeframe. Extreme due diligence surprises sometimes lead to an adjustment in valuation. Deals sometimes go south, usually not solely because of technical problems, but software risks can certainly be straws on the camel’s back.
For a tech acquisition to be successful in the long run, it’s best if buyer and seller go in with a shared, clear picture of the state of the technology. Third-party experts can help complete that technical picture. A thorough assessment allows buyers to know what they are getting into and to plan accordingly. Though technology due diligence may turn up surprises, it is always better to find them before close.
To learn more, check out our webinar Top Considerations for Software Audits in M&A Due Diligence.
Phil is General Manager, Black Duck On-Demand. He works closely with Black Duck’s law firm partners and the open source community. A frequent speaker at industry events, Phil chairs the Linux Foundation's Software Package Data Exchange (SPDX) working group. With over 20 years’ software industry experience, Phil came to Black Duck from Empirix where he served as Vice President of Business Development and in other senior management positions, and was a pioneer in VoIP testing and monitoring. Prior to Empirix, Phil was a partner and ran consulting at High Performance Systems, a startup computer simulation modeling firm. He began his career with Teradyne's electronic design and test automation (EDA) software group in product, sales and marketing management roles. Phil has an AB in Engineering Science and an MS in System Simulation from the Thayer School of Engineering at Dartmouth College.