Like the other transitions discussed, this one happened slowly at first and then picked up momentum exponentially. For many years, chips defined industries. Intel is a great example: its x86 architecture created the PC industry, and generation after generation of computing was built around migrating to the next Intel microprocessor. Nvidia rode the same wave for graphics acceleration, and Qualcomm similarly defined a lot of new smartphone products based on what it announced. Life was good for these companies.
About 10 to 15 years ago, something started to change. Up to that point, the chip companies introduced new products, sent the software team the specs, and said, “Build your software to run on our hardware, and we hope you prosper.” The change, subtle at first, was a reversal of roles. The software and the user experience became the leader, and the chips needed to deliver that user experience became the supporting cast. One can argue that Apple saw this first.
I don’t want to get into too much history here, but consider that Apple bought PA Semi in 2008 and essentially turned its product development paradigm upside down. The software team used to read the processor manual and then write code. Now the software team wrote the code first, and the hardware team built a chip to run that code in the most efficient, lowest-latency, lowest-power way. The user experience was king in this model.
Apple has since become the most valuable company in the world.
The widespread deployment of AI accelerated the whole process. The pattern-recognition algorithms used in self-driving cars have been around since the 1950s; until recently, there was simply no way to run those (very complex) algorithms fast enough to actually drive a car.
In the New World of Innovation, software leads the way and chips make it possible. If you can harmonize the software and hardware design flows, you can, in many cases, dominate a market and print money.