Can AI-Driven Chip Design Meet the Challenges of Tomorrow?

Synopsys Editorial Staff

Apr 05, 2024 / 4 min read

While we don’t yet know the full impact AI will have on chip design, one thing is certain: the impact will be far-reaching, with the potential to transform both engineering productivity and the chips themselves. This transformation comes at an opportune time, given the conflicting pressures of rising compute demands on chips and a shortage of engineering talent.

Since we’re in the early stages of this AI journey in the semiconductor industry, there’s no shortage of perspectives and questions raised around the topic. Some of these thoughts were covered during a lunchtime panel that packed a ballroom at the Santa Clara Convention Center on the first day of SNUG Silicon Valley 2024.

Moderated by Dave Altavilla, co-founder and principal analyst at HotTech Vision, “AI-Driven Chip Design: Meeting the Challenges of Tomorrow” featured an esteemed roster of industry experts:

  • Prith Banerjee, CTO, Ansys
  • Silvian Goldenberg, partner and GM, Microsoft
  • Sabyasachi Sengupta, senior director, PDCAD, Intel
  • Kelvin Low, VP, Market Intelligence, Marcom, Strategy and Business Development, Samsung Foundry
  • Thomas Andersen, VP of AI and ML, Synopsys

Altavilla solicited viewpoints on issues ranging from future possibilities with generative AI (GenAI) to the impact of AI on engineering jobs. Read on for a recap of the panel. 


At SNUG Silicon Valley 2024, panelists discussed and debated the impacts and opportunities of AI in chip design.

Faster, Better Decision Making

Much as touchscreens created a more natural way to interact with mobile devices, large language models (LLMs) are providing a natural way to interact with the computing power working behind the scenes, noted Sengupta. How this will evolve is hard to say at this point, he said, but engineers will undoubtedly tap their creativity to bring new possibilities to the forefront.

Chip design involves a lot of heuristics and, as such, there are virtually infinite ways to do things. We now rely on humans to figure out the best methods through their expertise and experience, Sengupta said, but AI can uncover new things to consider that can, in turn, reveal new or better solutions. As confidence in AI-driven tools grows, and as the tools themselves mature, engineers should be positioned to rely on the tools to handle bigger chunks of work, he noted.

“At the end of the day, our industry has always increased productivity by leaps and bounds,” Sengupta said. “How do we explore that search space quicker and get to a better decision quicker? These are interesting possibilities which will make designers more productive and the designs better.”
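To make the search-space idea concrete, here is a minimal, purely illustrative sketch in Python of randomized design-space exploration. The parameters (clock period, placement density, threshold-voltage mix) and the cost model are hypothetical stand-ins, not any real EDA flow or Synopsys tool; in practice, each evaluation would be a lengthy synthesis or place-and-route run, which is exactly the cost that AI-driven exploration aims to reduce.

```python
# Illustrative sketch only: random search over a toy chip-design parameter space.
# The parameters and cost model below are hypothetical stand-ins, not a real flow.
import random

SEARCH_SPACE = {
    "target_clock_ns": [0.8, 1.0, 1.2, 1.5],           # candidate clock periods
    "placement_density": [0.6, 0.7, 0.8],               # how tightly cells are packed
    "vt_mix": ["hvt_heavy", "balanced", "lvt_heavy"],    # threshold-voltage mix
}

def evaluate(config):
    """Toy stand-in for a slow signoff run: returns a single cost to minimize."""
    timing_penalty = 1.0 / config["target_clock_ns"]
    power_penalty = {"hvt_heavy": 0.8, "balanced": 1.0, "lvt_heavy": 1.3}[config["vt_mix"]]
    congestion_penalty = config["placement_density"] ** 2
    return timing_penalty + power_penalty + congestion_penalty

def random_search(trials=50, seed=0):
    """Propose random configurations, evaluate each, and keep the best one seen."""
    rng = random.Random(seed)
    best_config, best_cost = None, float("inf")
    for _ in range(trials):
        config = {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}
        cost = evaluate(config)
        if cost < best_cost:
            best_config, best_cost = config, cost
    return best_config, best_cost

if __name__ == "__main__":
    config, cost = random_search()
    print(f"best config: {config} (cost {cost:.3f})")
```

A production system would replace random sampling with smarter strategies such as reinforcement learning or Bayesian optimization, but the basic loop (propose, evaluate, keep the best) is the same.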

Banerjee highlighted how AI is suited to repetitive tasks such as simulation. A single simulation can take 1,000 hours, and in reality design teams must simulate across many parameters, which requires substantial compute power. The Holy Grail of simulation combines three factors: fast runtime, high accuracy, and an easy-to-use system, he said, but the tradeoffs involved make all three difficult to achieve at once. Neural networks, he noted, can be trained to do this work with a tremendous speedup while maintaining accuracy, because they can be exhaustive in their approach. Meanwhile, the addition of generative AI “copilots” makes such systems easier to use.
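As a rough illustration of the surrogate idea (not Ansys’s or Synopsys’s actual technology), the sketch below trains a small neural network on a batch of “simulation” results and then sweeps thousands of parameter points through it almost instantly. The voltage and frequency parameters, the cheap analytic stand-in for the simulator, and the use of scikit-learn’s MLPRegressor are all assumptions made for the example; a real flow would train on logged results from the true solver.

```python
# Illustrative sketch only: a neural-network surrogate for an expensive simulation.
# expensive_simulation() is a cheap analytic stand-in for a multi-hour solver run.
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_simulation(params):
    """Stand-in for a slow physics simulation: params -> scalar result (e.g., peak temperature)."""
    voltage, frequency_ghz = params
    return 25.0 + 40.0 * voltage**2 * frequency_ghz + np.random.normal(0, 0.5)

# 1. Run the "real" simulation a limited number of times to collect training data.
rng = np.random.default_rng(0)
X_train = rng.uniform(low=[0.6, 1.0], high=[1.1, 4.0], size=(200, 2))  # (voltage, frequency)
y_train = np.array([expensive_simulation(p) for p in X_train])

# 2. Train a small neural network as a fast surrogate for the simulator.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)

# 3. Sweep thousands of parameter combinations through the surrogate in milliseconds.
X_sweep = rng.uniform(low=[0.6, 1.0], high=[1.1, 4.0], size=(5000, 2))
predictions = surrogate.predict(X_sweep)
best = X_sweep[np.argmin(predictions)]
print(f"lowest predicted result at voltage={best[0]:.2f} V, frequency={best[1]:.2f} GHz")
```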

Enhancing Semiconductor Engineering Productivity

Each speaker stressed the continued importance of engineering talent in this changing world. Rather than viewing AI as a replacement for human engineers, the panelists highlighted how the technology can boost engineering productivity.

Concerns about machines taking jobs away date back to the Industrial Revolution, Andersen noted. EDA tools, he continued, are still the primary workhorse, with AI wrapping around them to help optimize workflows. “You have to adjust, and you have to use the skill sets that humans are good at,” he said. “AI right now is taking away tasks that many people don’t want to do, so people can focus on creative work.”

Banerjee chimed in that universities also need to adapt their instruction to train the designers of the next century. Currently, this means teaching GenAI-based chip design. For Goldenberg, AI helps to democratize chip design. Rather than the traditional hand-offs to different teams along the design process, “now we have every single design engineer creating and continuing to do the work of others. That has made a huge difference in terms of opportunities,” he said.

Low, too, feels that AI will enhance productivity while mitigating talent shortages in the short term. He called on engineers to embrace AI and find ways to build their skillsets on top of the technology.

Creating New Opportunities

In the chip design space, GenAI has emerged in an assistant role, largely because LLMs don’t yet have the needed accuracy, said Andersen. “Every year is different. Next year, there might be a breakthrough in the accuracy. It is theoretically possible that we’ll someday have capabilities that will generate designs or layouts themselves,” he said. “GenAI is just one technique. There are many AI techniques, but the point where AI will supersede a human is very far away.”

If we do reach a point where GenAI can generate RTL, that could raise new concerns that would, in turn, create new opportunities for innovation. As an example, Andersen noted the prospect of malicious components being generated, which would open an opportunity to create verification software that screens designs for security threats.

Andersen also highlighted other potential applications for GenAI beyond the support assistants already in place: RTL and code optimization, workflow automation, improved human/machine interfaces, and content creation from specs. Before these applications can be realized, the industry will need to address issues such as copyright and intellectual property protection, problem complexity, access to high-quality data for model training, and the compute resources needed to train models from scratch.

Banerjee imagined a scenario where designers could train an AI tool with a multitude of different designs, provide the tool with a new requirement, “and out comes a design. We’re not there yet,” he said, adding, “That is the future.” 

Low mused that bespoke silicon and bespoke systems may not be too far-fetched in the future. Goldenberg sees potential for GenAI platforms that tap into the optimal methods and work across the design flow, as well as the ability to use AI to generate chips from much higher-level description languages. These are indeed exciting times, Low said, as AI provides yet another tool for accelerated innovation. 
