The devices we use every day have raised consumer expectations for sensing and interpreting environmental conditions. In the car, sensors help avoid collisions by alerting the driver to traffic in the blind spot before a lane change, or efficiently control wiper blades by detecting rain on the windshield. In the home, video games can be controlled by body movement, and touch screens are ubiquitous interfaces on appliances, computers, and phones. On the go, we rely on sensors in our mobile devices to help us navigate, interpret weather conditions, and determine proximity.
The increasing sophistication of sensor technology has led to multiple sensors being combined to provide higher-order functions. For example, geo-fencing combines movement, gesture, and geo-location sensors to customize a mobile device's user experience for a particular location. This combination of multiple sensor elements (analog and digital) is known as sensor fusion.
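As a toy illustration of the idea, a geo-fence check might fuse a GPS fix with an accelerometer reading so that an event fires only when the user is both inside the fence and physically moving. The function names, fence parameters, and motion threshold below are illustrative assumptions, not part of any particular product's API:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def geofence_event(lat, lon, accel_magnitude_g, fence):
    """Fuse two sensors: fire only when the position is inside the fence
    AND the accelerometer magnitude suggests the user is moving.
    The 1.1 g threshold is a crude, hypothetical stand-in for a real
    motion classifier."""
    inside = haversine_m(lat, lon, fence["lat"], fence["lon"]) <= fence["radius_m"]
    moving = accel_magnitude_g > 1.1
    return inside and moving

# Hypothetical 100 m fence around a storefront in San Francisco.
store = {"lat": 37.7749, "lon": -122.4194, "radius_m": 100.0}
print(geofence_event(37.7750, -122.4194, 1.3, store))   # inside and moving
print(geofence_event(37.8000, -122.4194, 1.3, store))   # too far away
```

The point of the sketch is that neither sensor alone is sufficient: location without motion would trigger on a parked phone, and motion without location would trigger everywhere. Combining them yields the higher-order behavior the text describes.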
The number of systems incorporating sensor fusion technology is exploding as semiconductor suppliers push to integrate sensor interfaces into their SoC offerings. According to Semico Research (see Figure 1), the number of systems incorporating sensor fusion is predicted to grow from 400M units in 2012 to over 2.5B units in 2016, an annual growth rate of almost 60%.