
AI Is Helping Create The Chips That Design AI Chips

The human role is slowly being handed over to machines and algorithms.

Designing a chip in reverse. Photographer: Tim Culpan/Bloomberg Opinion and Nvidia

As artificial intelligence drives demand for more advanced semiconductors, new techniques in AI are becoming crucial to continued progress in chip manufacturing.

The entire semiconductor supply chain, from design through to final fabrication, is now dominated by data. Over 100 petabytes of information is created and collated during the manufacturing process, according to one estimate by Intel Corp. That’s equivalent to a 170-year-long YouTube video.
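
As a rough sanity check of that comparison, the arithmetic below assumes a very high-bitrate video stream of about 150 megabits per second; the bitrate is my assumption, not Intel’s:

```python
# Back-of-the-envelope check: how long a video does 100 petabytes hold?
# The 150 Mbps bitrate is an assumed figure for a very high-quality stream.
PETABYTE = 1e15                      # bytes
data = 100 * PETABYTE                # Intel's estimate, in bytes

bitrate = 150e6                      # bits per second (assumption)
bytes_per_hour = bitrate / 8 * 3600

hours = data / bytes_per_hour
years = hours / (24 * 365.25)
print(f"about {years:.0f} years of continuous video")   # ~169 years
```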

Data analytics and machine learning, a discipline within AI, are so integral to the process of making and testing chips that Taiwan Semiconductor Manufacturing Co. employs dozens of AI engineers and has its own machine-learning department. Whereas humans were once trained to visually inspect a chip for defects, the small scale and increasing complexity of electronic components have seen that function handed over to AI systems.

Photolithography is one of the most critical steps. This is the process of shining light through a glass mask onto a chemically treated slice of silicon to create a circuit. It’s similar to old-school photography, where a final print is developed in a darkroom.

The problem is that light diffracts, which means that the lines actually drawn on the surface of a chip differ from the mask’s pattern. At larger geometries these flaws didn’t matter too much because the design had enough wiggle room to still be functional. But as dimensions shrank in line with Moore’s Law, the tolerance for errors disappeared. For decades, engineers tackled these distortions by deploying a technique called optical proximity correction (OPC), which adds extra shapes to the original design so that the final result more closely matches the intended circuitry.
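
To make the correction concrete, here is a minimal sketch of the OPC idea; it is my own toy model, not an excerpt from any vendor’s tool, and it stands in for real optics with a simple Gaussian blur:

```python
# Toy OPC demo: diffraction is modeled (crudely) as a Gaussian blur and
# the photoresist as a hard threshold. Line ends print short, and
# extending them on the mask -- a classic OPC move -- compensates.
import numpy as np
from scipy.ndimage import gaussian_filter

target = np.zeros((64, 64))
target[20:44, 30:34] = 1.0               # the line we want printed

def simulate_print(mask, sigma=2.0):
    """Blur the mask (diffraction), then threshold it (resist)."""
    return gaussian_filter(mask, sigma) > 0.5

def pixel_error(printed):
    """Count pixels where the printed pattern misses the target."""
    return int((printed ^ (target > 0)).sum())

naive = simulate_print(target)           # the line ends print short

# OPC-style fix: extend the line ends on the mask to compensate.
corrected_mask = target.copy()
corrected_mask[18:20, 30:34] = 1.0
corrected_mask[44:46, 30:34] = 1.0

print("error, naive mask:    ", pixel_error(naive))
print("error, corrected mask:", pixel_error(simulate_print(corrected_mask)))
```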

Today’s chips have connections as thin as 5 nanometers, 20 times smaller than the Covid-19 virus, spurring the need for new approaches. Thankfully, the errors between design and result aren’t entirely random. Engineers can predict the variations by working backward: Start with what you hope to achieve, then crunch a lot of numbers to work out what the photolithography mask should look like to produce it. This technique, called inverse lithography, was pioneered 20 years ago by Peng Danping at Silicon Valley software startup Luminescent. That Peng, who has since moved to TSMC as a director of engineering, completed his PhD not in electrical engineering but in applied mathematics hints at the data-centric nature of inverse lithography technology (ILT).
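
A minimal sketch of that working-backward loop, under toy assumptions (a Gaussian blur for the optics and a sigmoid for the resist; production ILT solvers use rigorous physical models at vastly larger scale): treat every mask pixel as an unknown and run gradient descent until the simulated print matches the target.

```python
# Toy inverse lithography: optimize mask pixels so that a simulated
# print (Gaussian blur + sigmoid resist) reproduces the target pattern.
import numpy as np
from scipy.ndimage import gaussian_filter

target = np.zeros((64, 64))
target[20:44, 28:36] = 1.0            # the shape we want printed

SIGMA, STEEP = 2.0, 20.0

def simulate_print(mask):
    """Blur (diffraction), then soft-threshold (resist response)."""
    aerial = gaussian_filter(mask, SIGMA)
    return 1.0 / (1.0 + np.exp(-STEEP * (aerial - 0.5)))

mask = target.copy()                  # start from the naive mask
for _ in range(200):
    printed = simulate_print(mask)
    resid = printed - target          # gradient of the squared error
    d_resist = STEEP * printed * (1.0 - printed)
    # A symmetric Gaussian blur is its own adjoint, so the chain rule
    # sends the error back through another blur of the same width.
    grad = gaussian_filter(resid * d_resist, SIGMA)
    mask = np.clip(mask - 0.5 * grad, 0.0, 1.0)

final = simulate_print(mask) > 0.5
print("pixels still wrong:", int((final ^ (target > 0)).sum()))
```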

Designing a chip in reverse

With hundreds of different parameters to consider — such as light intensity, wavelength, chemical properties, and the width and depth of circuitry — this process is extremely data intensive. At its core, inverse lithography is a mathematical problem. An ILT mask takes 10 times longer to compute than older OPC-based approaches, and the file holding the pattern can be up to seven times larger.

Collating data, formulating algorithms, and running thousands of mathematical computations are precisely what semiconductors are made for, so it was only a matter of time before artificial intelligence was deployed to design artificial intelligence chips more efficiently.

It is, in many respects, a very complicated graphics problem. The goal is to build a microscopic three-dimensional structure from multiple layers of two-dimensional images.

Nvidia Corp., which is now the world’s leader in AI chips, started off designing graphics processing units for computers 30 years ago. It stumbled upon AI because, like graphics, it’s a sector of computing that requires massive amounts of number-crunching power. The company’s central role in AI saw it forecast sales this quarter that surpassed expectations on Wednesday, driving the stock up around 25% in pre-market trading and pushing it toward a $1 trillion valuation.

Images on a computer screen are little more than a superfine grid of colored dots. Calculating which dots to light up as red, green or blue can be done in parallel because each point on the screen is independent of every other. For a graphics-heavy computer game to run smoothly, these calculations need to be done quickly and in bulk. While central processing units are good at performing a variety of operations, including juggling multiple tasks at once, modern GPUs are created specifically for parallel computing.
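
A toy example of why that parallelism works, with a made-up per-pixel “shader” of my own: every pixel’s color depends only on its own coordinates, so the same arithmetic can be applied to all pixels at once.

```python
# Each pixel depends only on its own (x, y), so pixels can be computed
# in any order -- or simultaneously, which is what a GPU does in hardware.
import numpy as np

H, W = 270, 480    # kept small so the pure-Python loop finishes quickly

def shade(x, y):
    """Toy per-pixel shader: a simple RGB gradient."""
    return (x / W, y / H, 0.5)

# One pixel at a time, the way a single CPU core would work.
img_serial = np.empty((H, W, 3))
for y in range(H):
    for x in range(W):
        img_serial[y, x] = shade(x, y)

# The same math applied to every pixel at once (the parallel-friendly form).
ys, xs = np.mgrid[0:H, 0:W]
img_parallel = np.dstack([xs / W, ys / H, np.full((H, W), 0.5)])

assert np.allclose(img_serial, img_parallel)   # identical results
```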

Now Nvidia is using its own graphics processors and a library of software it created to make semiconductor lithography more efficient. In a blog post last year, the California-based company explained that by using its graphics chips it could run inverse lithography computations 10 times faster than on standard processors. Earlier this year, it upped that estimate, saying its approach could accelerate the process by 40 times. With a suite of design tools and its own algorithms, collectively marketed under the name cuLitho, the company is working with TSMC and semiconductor design-software provider Synopsys Inc.
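
Nvidia hasn’t published cuLitho’s internals, so the snippet below is only an illustration of the general pattern it exploits: array math written against NumPy-style APIs ports almost verbatim to the GPU via the CuPy library, where every output pixel is computed in parallel.

```python
# Illustrative only -- not cuLitho. CuPy mirrors NumPy's API, so the toy
# lithography model above moves to the GPU with minimal code changes.
import cupy as cp
from cupyx.scipy.ndimage import gaussian_filter

mask = cp.zeros((4096, 4096), dtype=cp.float32)
mask[1000:3000, 2000:2100] = 1.0           # a single long line, as before

aerial = gaussian_filter(mask, sigma=2.0)  # the blur runs on the GPU
printed = aerial > 0.5                     # so does the threshold

result = cp.asnumpy(printed)               # copy back to host memory
```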

This collection of software and hardware wasn’t developed by Nvidia for altruistic reasons. The company wants to find more uses for its expensive semiconductors, and it needs to ensure that the process of bringing its chip designs to market remains smooth and as cheap as possible. While we all marvel at the ability of ChatGPT to write software, we’ll also see the increasing role of AI chips in creating AI chips.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Tim Culpan is a Bloomberg Opinion columnist covering technology in Asia. Previously, he was a technology reporter for Bloomberg News.

©2023 Bloomberg L.P.