AI and the Real Capacity Crisis in Chip Design

Chip industry veterans are used to the cyclical nature of semiconductor supply and demand, but the ongoing chip shortage has been particularly difficult for many. Supply chain disruptions are likely to persist for years to come, and the semiconductor industry is unlikely to return to its old norms.

There is, however, a more pressing crisis on the horizon that will bring the semiconductor industry to its next turning point: the lack of engineering throughput will persist unless we optimize the chip design process.

Ongoing chip shortages appear to be driven by relatively short-term economic factors. But if we start thinking about chip design in a different way, it could provide new opportunities for advancements in chip production. Disruptions in semiconductor design certainly did not trigger the global chip shortage, but they are helping to exacerbate the crisis.

Talent crisis

Stelios Diamantidis, Senior Director of Synopsys AI Solutions

Supply and demand economics dictate that when a supply shortage occurs, demand will quickly drive new investment to fill the supply gap. Why is this not the case with the current supply crisis?

This is partly because each chip is specifically designed and optimized to the specifications of a manufacturing process. Photomasks, used to produce a pattern on a substrate, are extremely rigid and cannot be easily redesigned to accommodate different specifications. Think of analog vinyl records: once the music is cut into the lacquer disc, you can’t modify the vinyl to play a different track.

The effort involved in optimizing a design for multiple target processes is at best uneconomical and at worst unfeasible. The reason: it requires not only doubling the size of a design team, but also access to scarce expertise. This is where the chip industry hits another wall.

A critical shortage of technical talent is catching up with the chip industry and everything it powers, while understaffed teams face more complex challenges. The chips that seemingly enable everything in our daily lives – from mobile devices, appliances and cars to the industrial equipment manufacturers use to create these machines – are becoming increasingly complex.

These ever-evolving technologies are placing increasing pressure on chip designers to keep pace with demands from consumers, market leaders, corporate competitors and stakeholders, pushing the boundaries of what manufacturers can produce and how fast. The limits of Moore’s Law are looming, and a new AI-based approach might be what it takes to avoid a reckoning.

AI for chip optimization

AI is opening up new areas of technology such as autonomous vehicles and smart devices. Many of these AI-powered applications require complex, power-hungry chips. The level of research, experimentation and management required to design these devices is beyond human capability.

As customers continue to reveal how overstretched and under-resourced engineering teams are, it’s crucial that we find ways to free them up from tasks that could be handled by AI.

Engineers continue to develop AI-based tools that work autonomously, analyze huge streams of data, and learn from experience. The same engineers who build AVs, smart devices, and machine-learning systems that process data at lightning speed now rely on the very technologies they produce: it’s the dawn of the self-designing chip. Autonomous chip design can relieve some of the pressure on engineering teams and maximize productivity.

The transition of chip design from a manual process to an automated one is not a new concept. Since the 1980s, engineers have used software tools to deploy automation and improve chip design. EDA tools are a necessity for designing modern chips, but the appetite for bigger and better technology, which in turn requires bigger and better chips, continues to grow. Designers are learning that AI can help engineers meet this demand.

Take, for example, digital implementation, one of the most complicated aspects of the IC design process. Placement and routing tools have mostly kept pace with advancing process technologies, determining where to place logic and IP blocks as well as how to route traces and interconnects, as the toy sketch below illustrates.
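
To make the placement objective concrete, here is a minimal, purely illustrative Python sketch. The block names, netlist, and 2×2 grid are hypothetical; a real placer handles millions of cells under timing, congestion, and power constraints rather than a wirelength toy like this.

```python
from itertools import permutations

blocks = ["cpu", "cache", "io", "dsp"]                     # hypothetical blocks
nets = [("cpu", "cache"), ("cpu", "dsp"), ("io", "dsp")]   # toy netlist
slots = [(0, 0), (0, 1), (1, 0), (1, 1)]                   # 2x2 placement grid

def wirelength(placement):
    # Total Manhattan distance across all connected block pairs.
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in nets)

# Exhaustive search works only at toy scale; at realistic sizes the search
# space explodes, which is exactly the throughput problem described here.
best = min((dict(zip(blocks, perm)) for perm in permutations(slots)),
           key=wirelength)
print(best, "-> wirelength", wirelength(best))
```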

The inputs to placement and routing tools define a large search space of potential solutions. These cover functionality (macro-architecture), form (micro-architecture) and fit (silicon technology). Manually processing and analyzing all this data is time-consuming and resource-intensive. AI technology could drastically reduce the load by discovering new ways to optimize the design.
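
The scale of that search space is easy to underestimate. The following sketch, with entirely hypothetical knob names and value counts, shows how a handful of independent choices already multiply into hundreds of candidate configurations; real flows add dozens more dimensions.

```python
from itertools import product

# Hypothetical design knobs grouped along the three axes named above.
design_space = {
    "macro_architecture": ["4-core", "8-core", "8-core+accel"],  # functionality
    "micro_architecture": ["7-stage", "9-stage", "11-stage"],    # form
    "process_node":       ["7nm", "5nm", "3nm"],                 # fit
    "target_clock_ghz":   [1.8, 2.2, 2.6],
    "placement_strategy": ["timing", "congestion", "power"],
}

configs = list(product(*design_space.values()))
print(f"{len(configs)} candidate configurations")  # 3^5 = 243 for this toy alone
```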

One approach is design space optimization (DSO), which uses machine learning to evaluate candidate designs and iteratively improve solutions. Before DSO, human engineers relied on design space exploration: the process of manually sifting through terabytes of data from a variety of inputs.

This manual effort is arduous and requires exhaustive experimentation often hampered by human limitations. Data is available to optimize the design, but it is too much and too complex for engineers to manage. Engineers end up partitioning data to make it more manageable, limiting design potential.
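To see why partitioning limits design potential, consider this toy comparison. Everything in it is hypothetical: evaluate_ppa() stands in for a costly synthesis and place-and-route run, and the three anonymous knobs stand in for real tool settings. Freezing two dimensions keeps the sweep manageable but hides most of the space.

```python
import random

def evaluate_ppa(config):
    # Hypothetical proxy for a costly EDA run returning a quality score.
    rng = random.Random(hash(config))    # deterministic pseudo-score per config
    return rng.uniform(0.0, 1.0)

# 8 x 8 x 8 = 512 combinations of three tool settings.
full_space = [(u, c, s) for u in range(8)
                        for c in range(8)
                        for s in range(8)]

# Manual exploration: freeze two dimensions and sweep one (8 of 512 points).
partitioned = [cfg for cfg in full_space if cfg[1] == 3 and cfg[2] == 3]

best_manual = max(partitioned, key=evaluate_ppa)
best_global = max(full_space, key=evaluate_ppa)
print("manual slice:", best_manual, round(evaluate_ppa(best_manual), 3))
print("full space:  ", best_global, round(evaluate_ppa(best_global), 3))
```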

AI can use reinforcement learning to generate robust optimizations and continuously analyze the results for further improvement at a much higher rate than human designers. As the AI learns from experience and expands its capabilities, it becomes the engineering team’s catalyst for earlier tape-outs that meet power, performance, and area (PPA) goals.
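
As a rough illustration of that act-observe-update loop, here is a toy epsilon-greedy sketch. It is emphatically not Synopsys’ DSO.ai algorithm: the actions and reward_from_run() are hypothetical stand-ins for tool settings and a full implementation run.

```python
import random

random.seed(0)  # reproducible toy run

# Actions: hypothetical (utilization, effort) tool-setting pairs.
ACTIONS = [(u, e) for u in (0.6, 0.7, 0.8) for e in ("low", "med", "high")]

def reward_from_run(action):
    # Hypothetical noisy PPA reward; (0.7, "high") is secretly best.
    base = {(0.7, "high"): 1.0}.get(action, 0.5)
    return base + random.gauss(0, 0.1)

q = {a: 0.0 for a in ACTIONS}   # value estimate per action
n = {a: 0 for a in ACTIONS}     # times each action was tried

for step in range(500):
    if random.random() < 0.1:   # explore occasionally
        a = random.choice(ACTIONS)
    else:                       # otherwise exploit the best estimate so far
        a = max(q, key=q.get)
    r = reward_from_run(a)
    n[a] += 1
    q[a] += (r - q[a]) / n[a]   # incremental mean update

print("best learned setting:", max(q, key=q.get))
```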

AI seems to be the perfect assistant engineer, able to perform tasks, generate and analyze design data, and produce results faster than humans. This allows engineers to spend more time on value-added work: improving chip efficiency, finding bugs, and differentiating designs. By taking advantage of AI, design teams can also focus on reducing power leakage and improving chip performance.

The implementation of AI has already begun, resulting in record design productivity. Tasks that would normally take an entire team months to complete can now be finished by individual engineers in weeks thanks to AI.

Science fiction? Well, Samsung has already announced AI-designed silicon developed with the help of Synopsys’ AI-based DSO.ai system. Japan’s Renesas Electronics also achieved previously unattainable PPA using AI tools, meeting timing constraints weeks ahead of schedule while increasing maximum frequency by hundreds of megahertz.

Synopsys DSO.ai optimization process. (Source: Moor Insights & Strategy)

AI can improve more than speed. A North American manufacturer of embedded devices was able to achieve a 10% improvement in total power at the SoC level in just a few weeks by replacing manual methods with AI tools.

Which brings us back to the chip shortage. What if AI could help bridge the productivity gap, allowing design teams to optimize content not only for different market needs, but also for different process technologies? Could today’s rigid photomasks be easily repurposed by AI to meet the needs of a dynamic global economy also driven by AI applications? Could trained AI assistants remaster creative content on silicon in the same way recordings were turned into multitrack digital media?

Potential benefits for businesses large and small include reduced labor requirements. AI also addresses the growing demands of chip design and the current talent shortage. As automation spreads, we become increasingly comfortable handing over our keys, metaphorical and tangible, to AI. In a bit of poetic symmetry, AI for chip design could be the key to future chip innovation.

Abdul J. Gaspar