
Semiconductors play an immeasurable role in our daily lives and are essential components in appliances, transportation, infrastructure, and computational devices. Furthermore, demands on chip performance and energy efficiency continue to increase with the rise of the cloud, AI, and IoT, driving ongoing advances in the semiconductor industry. Here, we’ll cover some of the latest chip design technologies and tech-sector trends you should know.


IBM’s New Two-Nanometer Chip Technology

IBM recently unveiled the world’s first 2-nanometer chip technology, kicking off a new era of semiconductors. This essential innovation helps advance the state-of-the-art in the semiconductor industry and is projected to achieve 45 percent higher performance, or 75 percent lower energy use, compared to today’s 7 nm chips. This technology could help increase phone battery life, reduce the carbon footprint of data centers, accelerate processing, allow faster reaction times in autonomous vehicles, and assist in AI tasks such as language translation.

IBM’s “nanosheet technology” fits up to 50 billion transistors on a chip the size of a fingernail. The transistor designs, called nanosheets or gate-all-around transistors, have been in development since 2017. Nanosheet transistors contain a channel of stacked layers surrounded by a gate on all sides. This architecture provides better current control through the channel while preventing leakage. Smaller transistors allow for faster, more reliable, and more efficient devices that enable core-level innovations for processor designs, accelerate AI and cloud computing workloads, and support hardware-enforced security and encryption.


Application-Specific SoC Developments

We have seen Apple and Google release custom SoCs—Apple’s M1 and M2 chips across its Mac lineup and Google’s Tensor Processing Unit (TPU)—that run AI workloads more efficiently. Many other industry giants are following similar custom SoC trends.

Microsoft has invested in chip designs for data centers—among them a class of programmable chips based on Intel FPGAs for running AI tasks. Furthermore, Microsoft has been developing Pluton, a security-focused coprocessor that adds another layer of hardware and software protection.

Amazon, on the other hand, has been designing a wide range of networking chips and specialized processors for its data centers, as cloud computing services based on its in-house Arm server CPUs cost significantly less than those running on the more commonly used Intel processors.


Rising and In-Development Technologies

Beyond those advances occurring at large companies such as IBM, Apple, Google, Amazon, and Microsoft, there are a variety of additional technology trends:

  • 3-nanometer processes entering mass production. In October 2021, TSMC announced its N3 process. Although widespread adoption will take time, N3 is approaching mass production, and Samsung’s 3 nm gate-all-around (GAA) process is also expected to enter mass production.
  • DDR5 becoming more accessible. Since demand for DDR5 began to emerge in 2020, the standard has continued to gain market share and will soon reach mainstream adoption, with phones, notebooks, and PCs utilizing DDR5 technology.
  • Growth of DPUs. Chips targeting data centers are an increasing trend (e.g., NVIDIA’s acquisition of the Israeli company Mellanox). These chips operate at very high speeds and are specifically suited to large servers and data centers.
  • Storage and computing integration technologies. In traditional architectures, moving data from memory outside the processing unit can take multiple orders of magnitude longer than the computation itself. Advances in low-voltage subthreshold digital logic ASICs, neuromorphic designs, and analog computing that integrate memory and compute on the same chip can address this.
  • EDA tools utilizing AI in chip design. AI and machine learning are beginning to be applied, in a variety of ways, throughout the chip design process.
  • Adopting the Matter protocol into IoT and smart home devices. Matter, a home automation connectivity standard, is the application layer that unifies devices operating with various IP protocols and standards, supporting Ethernet, Wi-Fi, and Thread (in addition to Bluetooth Low Energy as a pairing method). It runs on top of existing protocols and should support Zigbee and Z-Wave. With version 1.0.0 released in early October 2022, many upcoming devices will begin utilizing this standard.


Synopsys, EDA, and the Cloud

Synopsys is the industry’s largest provider of electronic design automation (EDA) technology used in the design and verification of semiconductor devices, or chips. With Synopsys Cloud, we’re taking EDA to new heights, combining the availability of advanced compute and storage infrastructure with unlimited access to EDA software licenses on demand so you can focus on what you do best: designing chips, faster. Delivering cloud-native EDA tools and pre-optimized hardware platforms, an extremely flexible business model, and a modern customer experience, Synopsys has reimagined the future of chip design on the cloud, without disrupting proven workflows.


Take a Test Drive!

Synopsys technology drives innovations that change how people work and play using high-performance silicon chips. Let Synopsys power your innovation journey with cloud-based EDA tools. Sign up to try Synopsys Cloud for free!
