Engineers can push the boundaries of integrated circuit design and optimization with OnScale, which combines powerful multiphysics solver technology with the virtually limitless compute power of cloud supercomputers.
Gordon Moore, a co-founder of Intel, wrote in a 1965 magazine article that he expected the speed and capability of computers to roughly double every two years as a result of increases in the number of transistors a microchip can contain.
Remarkably, this postulation has held true for over 50 years, and at the intersection of every technological revolution the usual questions about the validity of Moore’s Law are asked again. Arguably, Moore’s Law has slowed over the last few years, but transistor scaling has continued to grow at a healthy rate in absolute terms.
TSMC, a leading semiconductor foundry based in Taiwan, recently announced that it expects to produce 5 nm transistors at full scale by 2020. Apple’s latest A12 Bionic chip contains 7 nm transistors manufactured by TSMC. With advancements in fabrication techniques and heavy research investment in packaging technologies, it seems, after all, that the famous Moore’s Law will continue to hold for a few more years, if not longer.
What’s driving the miniaturization trend?
Rapid advances in micro-fabrication techniques have fueled the miniaturization of electronic devices. Miniaturization is both a design and a fabrication challenge. Furthermore, IC packages are being pushed toward smaller form factors not only in the length and width directions, but also in package height. Smaller volumes lead to greater power density, which can cause serious issues such as timing delays and higher resistances in copper/aluminum interconnects. The physical design challenges at this scale are unprecedented. Design margins have been reduced considerably, and a holistic approach to chip design is required early in the design cycle.
Physical prototyping of early designs is time consuming and expensive. To reduce such costs, creating virtual prototypes and exploring multiple designs through computer simulation has been one of the major innovation drivers. The complex interdependencies of the electrical, thermal, and mechanical characteristics of integrated circuits (ICs) are well known. For example, voltage drop and the resulting temperature distribution in the die impact the power integrity of the circuits. Further, given the nature of the packaging materials and the non-uniform temperature distribution, differences in the coefficient of thermal expansion produce thermo-mechanical stresses that affect overall reliability through debonding, fatigue, and warpage. Early identification and mitigation of these effects yields significant performance and cost benefits.
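To give a feel for the coefficient-of-thermal-expansion (CTE) mismatch effect mentioned above, a common first-order estimate for the biaxial stress in a thin layer fully constrained by its substrate is σ ≈ E·Δα·ΔT/(1−ν). The sketch below applies it to copper on silicon; the material values are textbook-style assumptions for illustration, not data from this article:

```python
# First-order estimate of thermo-mechanical stress from CTE mismatch:
#   sigma = E * (alpha_layer - alpha_substrate) * dT / (1 - nu)
# for a thin layer biaxially constrained by its substrate.
# All material values below are illustrative assumptions.

E_CU = 110e9        # Young's modulus of copper, Pa
NU_CU = 0.34        # Poisson's ratio of copper
ALPHA_CU = 17e-6    # CTE of copper, 1/K
ALPHA_SI = 2.6e-6   # CTE of silicon, 1/K

def thermal_stress(E, nu, alpha_layer, alpha_substrate, delta_T):
    """Biaxial stress in a thin layer fully constrained by its substrate."""
    return E * (alpha_layer - alpha_substrate) * delta_T / (1 - nu)

if __name__ == "__main__":
    # A 100 K temperature rise across a copper interconnect on silicon.
    sigma = thermal_stress(E_CU, NU_CU, ALPHA_CU, ALPHA_SI, 100.0)
    print(f"Estimated thermal stress: {sigma / 1e6:.0f} MPa")
```

Even this crude estimate lands in the hundreds of megapascals, which is why CTE mismatch drives the debonding, fatigue, and warpage failures described above.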
A complex web of parameters to optimize
Gaining comprehensive insights at different scales without oversimplifying the design of an IC is a serious simulation challenge. Combined with the interdependent effects discussed above and with cost, manufacturability, and reliability concerns, this produces a complex web of parameters competing with each other for optimization. Of course, different products dictate different needs, but maximizing the potential of each Key Performance Indicator (KPI) is a difficult ask. For example, each of the parameters in the image below is influenced by a range of factors with interconnected effects, making a holistic package-aware and system-aware chip design approach critical. Thousands of design iterations, if not millions, are required to achieve maximum efficiency across the board.
OnScale: The cloud engineering simulation platform enabling the full exploration of your design space
With infinite time and resources, the best answers to most pressing design questions could be found. But given the competitive nature of the semiconductor landscape and demanding customers, time is a luxury that even the companies with the deepest pockets cannot afford. With OnScale, users pay only for simulation execution time, and thanks to proprietary multiphysics solvers that scale across thousands of HPC cores, problems previously thought impossible to tackle can be solved. Users save significant time and can run thousands of simulations in parallel. OnScale delivers a wealth of engineering simulation insight at breathtaking speed, eliminating the constraints of legacy simulation tools that have long been a bottleneck to engineering innovation.
A thermo-mechanical simulation example
Using OnScale, engineers can run fully-coupled thermal and mechanical simulations. In this example, thermo-mechanical simulations are run on the patch antenna arrays in a package.
Let’s examine the package depicted in the image below. The main heat dissipation happens through the patch antenna arrays. Due to differences in the coefficient of thermal expansion, the materials deform differently under the effects of temperature. Moreover, given a set of manufacturing tolerances, those deformations can be amplified significantly, which in turn affects the electromagnetic behavior of the antenna. To investigate such effects, the interconnect width and the thicknesses of the cover and the ground are varied between predefined margins in a Monte Carlo study. The goal is to review the outputs and calculate the yield percentage for the preset tolerances.
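The sampling side of such a study can be sketched in a few lines. The parameter names, nominal dimensions, and tolerance band below are illustrative assumptions, not values from the study; in practice each sampled design would be handed to the solver as a separate job:

```python
import random

# Sketch of building a Monte Carlo design population for a tolerance study.
# Nominal dimensions and the tolerance band are illustrative assumptions.
NOMINALS = {
    "cover_thickness_um": 100.0,
    "ground_thickness_um": 35.0,
    "interconnect_width_um": 50.0,
}
TOLERANCE = 0.05  # assumed +/-5% manufacturing tolerance

def sample_design(rng):
    """Draw one design, uniform within each parameter's tolerance band."""
    return {name: rng.uniform(nom * (1 - TOLERANCE), nom * (1 + TOLERANCE))
            for name, nom in NOMINALS.items()}

def build_population(n_designs, seed=0):
    """Build a reproducible population of design variants."""
    rng = random.Random(seed)
    return [sample_design(rng) for _ in range(n_designs)]

population = build_population(1000)
print(len(population))  # 1000 design variants, ready to submit in parallel
```

Fixing the seed makes the population reproducible, so the same study can be rerun against a modified package design and the results compared like for like.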
How to run 1000 different models in 2 minutes
Continuing with the example above, varying the thicknesses of the cover and the ground along with the interconnect width produced a design population in the thousands. Using a Monte Carlo method, we determined the yield of this package for a given set of tolerances. The entire design population of 1,000 models was simulated in parallel, which means the OnScale run took the same execution time as a single simulation.
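Once the per-design results come back, the yield calculation itself is simple. The pass/fail criterion below (a maximum allowed deformation) and the sample numbers are assumptions standing in for real solver output, purely to show the bookkeeping:

```python
# Sketch of computing a yield percentage from Monte Carlo results.
# The deformation limit and the sample results are illustrative assumptions;
# a real study would use outputs extracted from the simulations.

def yield_percentage(max_deformations_um, limit_um):
    """Percentage of designs whose peak deformation stays within the limit."""
    passed = sum(1 for d in max_deformations_um if d <= limit_um)
    return 100.0 * passed / len(max_deformations_um)

# Hypothetical per-design peak deformations (micrometres) from the solver.
results = [1.8, 2.1, 1.9, 2.6, 2.0, 1.7, 2.4, 1.95]
print(f"Yield: {yield_percentage(results, limit_um=2.2):.1f}%")
```

Because the criterion is applied after the fact, the same simulated population can be re-scored against tighter or looser tolerances without rerunning a single simulation.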
In summary, increasing density in electronic packages reduces design margins across the board. Capturing all the relevant length scales in a single model, along with an accurate representation of materials and power distribution, is a prerequisite. OnScale’s simulation speed and scalability enable exploration of large design spaces, breaking down the barriers that keep engineers from solving complex, multifaceted design challenges.