AI Vision: Creating Chips Today for Tomorrow’s Cars and Robots

Kevin Wei

Aug 11, 2025 / 3 min read

The global race to build smarter and safer vehicles is driving record demand for artificial intelligence (AI) that can understand the world in real time. For chip designers, the opportunity is enormous: Global demand for automotive-grade AI chips is expected to expand six-fold in the next 8 years, growing from a $10 billion market in 2025 to roughly $60 billion by 2033.

iCatch Technology is moving quickly to seize the moment. The Taiwan-based fabless design house specializes in AI-driven, imaging-focused systems-on-chip (SoCs) for automotive, robotics, and smart vision markets. Founded in 2009 by a team with deep experience in the field, iCatch is now building on its success in consumer imaging to deliver advanced automotive imaging solutions — and partnering with Synopsys to accelerate its progress.

For iCatch, like all IC design shops, time to market is critical.

Automotive OEMs are clamoring for chips that deliver real-time scene perception, sensor fusion, and machine-learning inference without compromising power budgets or safety. Chipmakers that launch first can lock in design wins for an entire vehicle generation, so shaving even a few quarters off the delivery timeline is a strategic advantage.

By adopting a “shift-left” approach and leveraging our tools and proven IP, iCatch is shortening its development cycles to address rising demand for AI vision systems.


Powering next-gen vision solutions

iCatch recently marked a major milestone in its pursuit: after less than 12 months of development, it taped out its ThetaEye AI Solution SoC for automotive, drone, and robotics deployments. The complex platform incorporates ThetaEye.ai, an innovative, high-performance, low-power modular vision subsystem that can be rapidly integrated and customized to process data from an array of sensors, including RGB, thermal, radar, LiDAR, and event-based neuromorphic sensors that mimic the way biological neurons signal changes in a scene.
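To make the multi-sensor idea concrete: a fusion subsystem must time-align readings arriving asynchronously from different sensors before a model can reason over them. The Python sketch below is a purely hypothetical illustration of that alignment step, not iCatch's implementation; all names (`Reading`, `fuse`, the 33 ms frame window) are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str       # e.g. "rgb", "thermal", "radar", "lidar", "event"
    timestamp_ms: int  # capture time in milliseconds
    data: list         # placeholder payload

def fuse(readings, window_ms=33):
    """Group readings whose timestamps fall in the same frame window,
    so a downstream model sees one time-aligned bundle per frame."""
    frames = {}
    for r in readings:
        frame_id = r.timestamp_ms // window_ms
        frames.setdefault(frame_id, {})[r.sensor] = r.data
    return [frames[k] for k in sorted(frames)]

readings = [
    Reading("rgb", 0, [1]), Reading("lidar", 10, [2]),
    Reading("rgb", 40, [3]), Reading("thermal", 45, [4]),
]
bundles = fuse(readings)
# bundles[0] holds the rgb + lidar data from the first 33 ms window
```

In a real SoC this alignment happens in hardware at line rate, but the grouping logic is the same in spirit: bucket by time, key by sensor, hand one fused bundle per frame to the inference engine.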

The key to achieving an accelerated development timeline?

iCatch chip developers used our advanced prototyping tools for system-level verification, which allowed functional, power, and performance validation to occur much earlier in the design cycle. Traditional simulation and field-programmable gate array (FPGA) flows are insufficient for full system-level validation, which can lead to critical runtime issues that aren’t apparent until after silicon has been delivered, causing costly delays.

Instead, iCatch utilized Synopsys ZeBu (emulation) and HAPS (prototyping) as its main system-level verification platforms. With ZeBu, designers observed real system behavior and could debug complex interactions between CPU, NPU, and memory at an early stage.

The result: fewer surprises and first-pass silicon success. 

[Figure: iCatch's ThetaEye.ai AI vision subsystem diagram]

Cultivating AI leadership

iCatch is building on that success with a next-generation automotive AI project in partnership with Taiwan’s AI Chip Design Lab.

The lab is a joint venture between Synopsys and the Industrial Technology Research Institute (ITRI), with support from the Ministry of Economic Affairs and Department of Industrial Technology. It was established to foster AI innovation in the country’s semiconductor industry by equipping Taiwanese engineers with state-of-the-art design tools and services.

One of those tools is the Synopsys ARC Processor AI Reference Design Platform. Providing everything from architecture design to virtual prototyping and system verification, the reference platform dramatically lowers the barrier to entry into AI and shortens design cycles.

iCatch is adopting this comprehensive platform to design its fifth-generation 6 nm Vision SoC, a scalable vision solution for automotive, drones, and robotics. To tackle the complex sensing and computation tasks on the SoC, iCatch is leveraging our VPX5/NPX6 AI engine, advanced memory subsystems (LPDDR5), high-speed connectivity (PCIe, MIPI3, USB3.1, Ethernet), and more.

Targeting tape-out in early 2026, the N6 Gen5 SoC project is expected to once again shrink the gap between architectural design and real-world deployment of advanced AI systems, giving iCatch a competitive edge in fast-paced markets.

Driving automotive AI vision forward

From our perspective, the partnership reaffirms how collaborative innovation accelerates time-to-market while meeting stringent reliability benchmarks. Joint efforts promote knowledge transfer and create standardized workflows that benefit the entire semiconductor ecosystem — in Taiwan and beyond.

To satisfy the demands of automotive and robotics customers for AI vision systems, fabless IC firms like iCatch will require continual breakthroughs in chip design, system integration, and verification. By leveraging our scalable design platform and robust emulation capabilities, and by committing to industry best practices, iCatch is well positioned to extend its market reach.

Additional projects and business growth are on the horizon — and next-gen AI vision systems are bringing them into focus.
