
IC design in flux: the "constants" and "variables" of EDA tools

Release time: 2022-03-17 | Author source: Slkor | Browse: 1988

In this era, it is no longer feasible for EDA vendors to respond to every change by standing still. Flexibility will become a key performance metric of EDA tools, and flexibility is an innate strength of the cloud. As Frank Schirrmeister put it, "The industry is welcoming the era of SaaS as a usage model for EDA tools. Automation will provide users with the best heterogeneous architecture to execute EDA workloads in the most efficient way."


High-performance computing (HPC), heterogeneous computing (CPU+X), chiplets, 3nm mass production just around the corner … In the post-Moore's Law era, IC design technology is changing by the day, which brings great challenges to the downstream stages of the IC industry chain such as process, test, and packaging. EDA tools are also caught in this technological storm, and the pace of design-software updates is accelerating.


Cadence is a software company specializing in electronic design automation (EDA), formed by the merger of SDA Systems and ECAD in 1988. As one of the three EDA giants, how does Cadence define the current era?


Frank Schirrmeister, Senior Director of Solutions and Ecosystems at Cadence, said: "Many applications in the semiconductor industry place higher demands on EDA tools, and design complexity keeps increasing. The five transformative drivers are artificial intelligence (AI)/machine learning (ML), autonomous driving, hyperscale computing, the industrial Internet of Things, and 5G communications. Development in these five areas raises a number of key technical challenges, including tool performance, growing compute requirements, and higher-quality, more integrated design flows."



Photo: Frank Schirrmeister, Senior Director of Solutions and Ecosystems at Cadence



Beyond Moore, full of variables

Over the past half century, the semiconductor industry followed Moore's Law to deliver large leaps in computing power. Today, however, intelligent devices are becoming more varied, smaller, and lighter, while silicon chips step ever closer to their physical and economic limits. Below 7nm, short-channel effects and quantum tunneling pose great challenges to chip manufacturing. When the industry can no longer solve its problems simply by advancing the process node, going beyond Moore's Law becomes the new path.


Frank Schirrmeister said: "Beyond Moore's Law is becoming the new mainstream, and the key to realize it is system innovation, including integration, software and system analysis." As follows:


Integration innovation mainly covers heterogeneous 3D-IC chiplet dies and packages, high-performance RF development, chip-package-board co-design, hardware-software co-development, and electromechanical effects.

Software innovation mainly focuses on "shifting left" software bring-up and debugging to earlier stages, while taking system safety and security-critical design into account.

The key innovation in system analysis is multi-physics analysis using techniques such as computational fluid dynamics, scalable finite element methods, and mesh generation. The functionality and speed of 3D and thermal analysis and thermal solvers need further improvement, and the massively parallel computing capability of cloud architectures should be fully exploited.


We all know that the current chiplet wave was started by AMD, but it has now become one of the methodologies the whole semiconductor industry uses to go beyond Moore's Law. The 3D-IC packaging technology Frank Schirrmeister mentioned is part of this discussion. He pointed out that the trend of using 3D-IC packaging to develop SoCs and ASICs is driving the development of special-purpose chips. Existing EDA technology therefore needs to be enhanced with computational software algorithms and to provide system-level analysis of thermal, electromagnetic, and fluid effects.

When it comes to going beyond Moore's Law, flexibility is key to a company's success. For example, with the same line width and the same process, the value of a chip can be maximized through innovative layout design. In this process, the value of IP has become more and more prominent, the variety of IP has grown richer, and IP's share of chip design and manufacturing cost has climbed steadily.

On this topic, Frank Schirrmeister offered his view. He noted that the trend of IP outsourcing will continue, as manufacturers increasingly want their development teams to focus on developing unique, differentiated IP. Standardized protocol IP is particularly well suited to outsourcing; at the same time, the latest versions of many protocols are very complex and demand deep domain expertise. This applies not only to protocol IP but also to the design and verification of digital and mixed-signal IP.


As Professors Hennessy and Patterson predicted in their 2018 Turing Lecture, we are in a golden age of domain-specific architectures and languages. Development teams can customize deeply for specific applications and workloads. Configurable and extensible processor IP architectures have become the norm for custom accelerators, and interface IP blocks are likewise being optimized.


The intrinsic value of processor and design IP is rising, allowing development teams to focus more on developing differentiated custom hardware, architectures, and software.



Demand is endless, but some things stay constant

It is true that the rapid development of today's semiconductor industry can hardly be overstated, and technology iterates all the time. However, the ambition for ever-higher performance is endless, and blindly chasing whatever is new and fast can cause companies to lose their way.


EDA tools are a necessary part of chip design. In this era of great change, some things must be held to and continuously advanced. Frank Schirrmeister made three points about this in the interview:


The pursuit of the best tools is constant;

The exploration of efficient verification is constant;

The development of FPGA potential is constant.


First, achieving the best tool performance requires better numerical solvers, silicon signoff accuracy, single-CPU performance, memory-management efficiency, and the creation of memory and interface IP. On this basis, a fully integrated tool flow can achieve the optimal power, performance, and area (PPA) goals, covering digital design at advanced process nodes, custom/analog and radio-frequency (RF) design, mixed-signal design, and a fully integrated verification flow. Integrating computational engines with machine learning (ML) capabilities to raise workflow productivity is an important way to realize innovation. To this end, Cadence has formulated an Intelligent System Design strategy, continuously delivering innovative computational software capabilities to help achieve excellent electronic system design.


Secondly, verification is a never-ending task. Even as computing power keeps improving, the design team can always fill the extra capacity with more verification work. The most important aspect of functional verification is verification throughput: the verification team needs to focus on finding and fixing as many bugs as possible per unit of time. To run a successful verification process, the team needs enough flexibility to match the right compute to each verification task. Achieving this requires support for multiprocessor architectures to meet the demands of formal verification, simulation, and hardware-assisted development. Notably, hardware emulation and prototyping need to provide enough flexibility and make full use of custom-processor-based and FPGA-based architectures on top of a consistent front end. This is why Cadence calls this powerful combination of emulation and prototyping the "double swords of system power".
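To make the notion of verification throughput concrete, below is a minimal, hypothetical Python sketch (not from the interview and not any Cadence tool) that tallies bugs found per compute-hour for each verification engine, which is one simple way a team might decide where extra elastic compute pays off most. The class and function names are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class VerificationRun:
    """One regression run on a given verification engine (illustrative only)."""
    engine: str          # e.g. "formal", "simulation", "emulation"
    cpu_hours: float     # compute consumed by the run
    bugs_found: int      # unique failures traced to real design bugs


def bugs_per_compute_hour(runs):
    """Aggregate a simple throughput metric per engine: bugs found per CPU-hour."""
    totals = {}
    for run in runs:
        hours, bugs = totals.get(run.engine, (0.0, 0))
        totals[run.engine] = (hours + run.cpu_hours, bugs + run.bugs_found)
    return {engine: (bugs / hours if hours else 0.0)
            for engine, (hours, bugs) in totals.items()}


if __name__ == "__main__":
    # Numbers below are made up purely to show how the metric behaves.
    nightly = [
        VerificationRun("simulation", cpu_hours=1200.0, bugs_found=18),
        VerificationRun("formal",     cpu_hours=300.0,  bugs_found=9),
        VerificationRun("emulation",  cpu_hours=800.0,  bugs_found=5),
    ]
    # Engines with the highest bugs-per-hour are the first candidates
    # for any extra elastic compute the team can bring online.
    for engine, rate in sorted(bugs_per_compute_hour(nightly).items(),
                               key=lambda kv: kv[1], reverse=True):
        print(f"{engine:>10}: {rate:.3f} bugs per CPU-hour")
```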


On the point that "the development of FPGA potential is constant," Frank Schirrmeister pointed out that the complexity of software development and verification is the key driver of hardware-assisted verification. Today, more than 80% of designs use FPGA-based prototyping, and the share using emulation acceleration is rising steadily. Without bringing up the software on emulation and prototyping platforms, going straight to tape-out is rarely viable: the risk that design defects persist into bring-up is too high, and the cost and revenue loss from a delayed time to market would be enormous. Striking the optimal balance among the simulation, emulation-acceleration, and prototyping engines is crucial; the three need to play to their strengths and each do their part. Achieving optimal reuse of the verification infrastructure across project stages through a unified front-end design is the key to productivity optimization.



EDA goes to the cloud, as flexible as you need

Process scaling remains one of the most important features of integrated circuit manufacturing technology. Although scaling has slowed, more advanced processes are still being rolled out, allowing more devices to be integrated in the same die area, improving chip performance and reducing unit manufacturing cost. At the same time, however, design complexity is increasing significantly, which poses many challenges for EDA tools.


Frank Schirrmeister said: "With the rapid increase of design and product complexity, what we need to think about is not just the chip, but the optimal system design from the overall system level. This change in thinking mode has driven many innovations in the field of chip-package-circuit board collaborative design and integration. These innovations not only need to use the engine with the best performance for design, assembly, analysis and sign-off, but also need to build an integrated development platform to manage complex multi-domain integration challenges. "


As the chip design process grows more complex, the cloud computing + EDA model is showing great promise. In terms of flexibility, traditional EDA tools need large numbers of servers to provide the compute required for complex chip design. Yet this compute is not needed throughout the entire project cycle; especially in the early stages, a considerable amount of it sits idle and is wasted. The cloud can scale with demand and charge by usage, so moving EDA to the cloud greatly improves the economics.


Looking at the front end and back end of chip design, the advantages of EDA in the cloud become even more obvious. Front-end design workloads favor high concurrency, multi-threading, and mixed random access, while back-end workloads are largely single-threaded, sequential-access, and memory-intensive. Both can be accommodated by adjusting the cloud configuration.
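As a rough illustration of how such workload profiles could be mapped onto cloud configurations, here is a small hypothetical sketch. The profile names and instance shapes are assumptions for illustration only and do not describe any particular EDA vendor's or cloud provider's offering.

```python
# Hypothetical mapping from EDA workload profiles to cloud instance shapes.
# The profiles follow the front-end vs. back-end characteristics described above;
# the instance shapes are illustrative, not tied to any real provider's catalog.

WORKLOAD_PROFILES = {
    # front-end: many concurrent, multi-threaded jobs with mixed random access
    "frontend_simulation": {"parallel_jobs": 512, "threads_per_job": 8, "mem_gb_per_job": 16},
    # back-end: few long-running, largely single-threaded, memory-hungry jobs
    "backend_place_route": {"parallel_jobs": 8, "threads_per_job": 1, "mem_gb_per_job": 512},
}


def suggest_configuration(profile_name: str) -> dict:
    """Return an illustrative cloud configuration for a named workload profile."""
    p = WORKLOAD_PROFILES[profile_name]
    return {
        "instances": p["parallel_jobs"],                # scale out for concurrency
        "vcpus_per_instance": p["threads_per_job"],
        "memory_gb_per_instance": p["mem_gb_per_job"],  # scale up for memory-bound jobs
    }


if __name__ == "__main__":
    for name in WORKLOAD_PROFILES:
        print(name, suggest_configuration(name))
```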


"Users can take advantage of the elastic computing power provided by the cloud architecture to gain the advantages of productivity and scalability. In some cases, users don't have to face the complexity of cloud architecture, because some core engines have been redefined to support massively parallel computing in cloud computing. Therefore, users can fully enjoy the upgrade of performance, capacity and productivity. " Frank Schirrmeister said, "In some cases, if it involves the price of cloud services or the optimal configuration of cloud instances for a specific EDA workload, ML is being used as a key technology to help developers choose the best match between EDA workload and computing configuration."


In the post-"Moore's Law" era, people are exploring various ways to surpass it, such as advanced packaging, heterogeneous integration, Chiplet mode … various countries are also setting up various alliances and foundations to surpass Moore. The ever-changing new technology makes the edge of chip design constantly expanding, and these innovations can not be separated from the help of EDA tools. From the perspective of EDA vendors, to meet these new demands, EDA tools also need constant innovation.


Frank Schirrmeister spoke about innovation in verification: "There are many innovations in the field of verification. Intelligent verification management will be able to automatically generate tests, submit verification tasks to the most appropriate verification engine, collect and analyze coverage information, and support debug and failure management."

He also stressed: "Intelligent verification must make full use of the massively parallel computing of the cloud and adopt ML technology to improve verification productivity and throughput."
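To give the flavor of the dispatch-and-coverage flow described above, here is a small hypothetical Python sketch: it routes each verification task to a suitable engine with a crude heuristic and merges per-task coverage. All names and the selection policy are assumptions for illustration, not a description of any real verification manager.

```python
from dataclasses import dataclass, field


@dataclass
class VerificationTask:
    name: str
    needs_software_bringup: bool = False   # long software workloads favor emulation
    is_property_check: bool = False        # assertions/properties favor formal


@dataclass
class CoverageReport:
    covered: set = field(default_factory=set)

    def merge(self, other: "CoverageReport") -> None:
        self.covered |= other.covered


def choose_engine(task: VerificationTask) -> str:
    """Very rough engine-selection heuristic (illustrative, not a real tool's policy)."""
    if task.is_property_check:
        return "formal"
    if task.needs_software_bringup:
        return "emulation"
    return "simulation"


if __name__ == "__main__":
    tasks = [
        VerificationTask("bus_protocol_assertions", is_property_check=True),
        VerificationTask("os_boot_regression", needs_software_bringup=True),
        VerificationTask("block_level_random_tests"),
    ]
    merged = CoverageReport()
    for t in tasks:
        print(f"{t.name} -> {choose_engine(t)}")
        # A real flow would run the task and collect its coverage database here;
        # we just merge a placeholder coverage point per task.
        merged.merge(CoverageReport(covered={t.name}))
    print("merged coverage points:", sorted(merged.covered))
```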


Therefore, no matter which path chip design takes, following Moore's Law or going beyond it, moving EDA to the cloud has become an inevitable trend. On this point, Frank Schirrmeister said: "Simulation acceleration must use the massively parallel computing provided by the cloud together with machine learning (ML) technology, especially for hardware regressions. The efficient use of abstraction is driving many innovative technologies, such as building hybrid verification environments that combine register-transfer-level accuracy with transaction-level virtual prototypes for designs that require high fidelity. We are on the threshold of a new stage in which design teams and software developers will use rich hybrid setups to trade off the accuracy of efficient evaluation models against simulation performance."


Final thoughts

Through the conversation with Frank Schirrmeister, the author found that today's chip design has entered an era in which a hundred schools of thought contend. Different chip manufacturing approaches inevitably require different chip design methods, and what they have in common is that they all need the support of EDA tools.


In this era, it is no longer feasible for EDA vendors to respond to every change by standing still. Flexibility will become a key performance metric of EDA tools, and flexibility is an innate strength of the cloud. So I would like to close with Frank Schirrmeister's words: "The industry is welcoming the era of SaaS as a usage model for EDA tools, and automation will provide users with the best heterogeneous architecture to execute EDA workloads in the most efficient way."





Disclaimer: This article is reproduced from "TechSugar". It represents only the author's personal views, not those of Sacco Micro or the industry, and is reprinted and shared solely in support of intellectual property protection. Please credit the original source and author when reprinting. If there is any infringement, please contact us to have it removed.
