Introduction
Design Tools - [2001]
In the 1990s, the industry began shifting to EDA tools to handle the increasing
complexity of ASIC designs. Any reasonable engineer can handle a design
of up to 30,000 gates. When 6 to 11 million gates are involved, it would take
multiple engineers years to complete one design.
By the 1990s, designs shifted from schematic capture, with the engineer
selecting the appropriate macros from a library, to HDL code. VHDL is
currently favored in Europe and Verilog in the United States; Verilog is winning at the moment. Soft IP (predesigned blocks) is available as Verilog RTL. Design reuse became the norm.
Design Tools - [1990s]
To perform a logical circuit design for an array-based circuit, the designer
in 1990 could choose between schematic capture, direct netlist creation, and the
use of hardware description languages such as Verilog and VHDL (now called RTL code). Direct netlist generation, as
was done with Tegas, is too tedious an approach for ASIC-based circuits
past a minimal size. Netlist generation via a behavioral language or from
schematic capture is the more usual approach. Schematics went out of use when synthesis programs like Design Compiler came along. Design Compiler at one point dominated the industry for synthesis. PrimeTime dominates it at the moment for STA, or static timing analysis. Both products come from Synopsys.
Translation programs exist to move a netlist in one format to a netlist
in another format. The industry in 1990 was still trying to expand the idea of
EDIF, a common netlist format that would allow input to any simulator and any
placement system.
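The idea of netlist translation can be sketched in a few lines. This is an illustrative toy only: both the line-oriented input format and the parenthesized output format are invented for the example (real EDIF is far richer), and the function names are my own.

```python
# Toy netlist translator: reads a made-up "GATE output inputs..." line
# format and emits a made-up parenthesized, EDIF-flavored form.
# Both formats are hypothetical; this only illustrates the concept of
# moving one netlist representation to another.

def parse_netlist(text):
    """Parse lines like 'AND2 y a b' into (gate_type, output, input_list)."""
    cells = []
    for line in text.strip().splitlines():
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue  # skip blank lines and comments
        gate, out, *ins = parts
        cells.append((gate, out, ins))
    return cells

def emit_paren_form(cells, design="top"):
    """Emit the parsed cells in a parenthesized, EDIF-like syntax."""
    lines = [f"(design {design}"]
    for gate, out, ins in cells:
        lines.append(f"  (instance {gate} (out {out}) (in {' '.join(ins)}))")
    lines.append(")")
    return "\n".join(lines)

src = """
# half adder
XOR2 sum a b
AND2 carry a b
"""
print(emit_paren_form(parse_netlist(src), design="half_adder"))
```

A production translator must also carry ports, buses, hierarchy, and attributes across formats, which is where the real difficulty (and the motivation for a common format like EDIF) lies.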
For example: Verilog to Mentor translation was possible using a Verilog
netlist to create Mentor schematics. (Back-generation of schematics will
remain a necessary step in spite of the push for behavioral descriptions
as the preferred design tool. Many GUI interfaces will show a schematic of sorts.)
Once an acceptable netlist has been generated by whatever means, the
designer needs to check or verify that the design rules have not been
violated. When the circuit is certified as acceptable and buildable, the
circuit must be simulated according to the design submission requirements
of the chosen vendor. The simulation must be checked. The design must
be documented.
Simulations involve control programs, stimulus generation, delay
annotation files, and descriptions. AC test analysis requires additional documentation.
Which simulator can be used, and whether any timing verifier or other
tools are available, is limited to what the array vendor supports.
The simulation output files must be formatted according to vendor rules
to allow the generation of test vectors. (We now have ATG programs to automate this process.) These will be transferred to
the placement software and to test-generation software. A submission may
include dozens of files that must be tracked, controlled for revision
level and managed to verify that the design submitted to the vendor is
the one intended to be submitted. And yes, errors do occur. Still.
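The revision-control concern above can be reduced to a simple idea: fingerprint every file in the submission package, then verify the fingerprints before anything goes to the vendor. The sketch below is hypothetical tooling, not any vendor's actual utility; the function names and file names are invented.

```python
# Minimal sketch of submission-package verification: hash every file so
# the design sent to the vendor can be checked against the one intended.
# Files are represented as an in-memory dict of name -> bytes to keep the
# example self-contained; a real script would read from disk.

import hashlib

def manifest(files):
    """Map each filename to the SHA-256 digest of its contents."""
    return {name: hashlib.sha256(data).hexdigest()
            for name, data in files.items()}

def changed_files(expected, files):
    """Return names of files whose contents no longer match the manifest."""
    actual = manifest(files)
    return sorted(name for name in expected
                  if actual.get(name) != expected[name])

# Stand-ins for the dozens of files in a real submission.
submission = {"top.v": b"module top; endmodule\n",
              "vectors.dat": b"0101\n1010\n"}
expected = manifest(submission)

submission["vectors.dat"] = b"0101\n1111\n"  # an unnoticed late edit
print(changed_files(expected, submission))   # reports the altered file
```

Checking a stored manifest at submission time catches exactly the failure mode described above: a file quietly revised after sign-off.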
Framework Systems
Framework systems are under development as a means of alleviating the
design-management problem, but they are in their infancy, and industry sages
are predicting at least five years before they meet any goals. Further,
those developing framework systems disagree about those goals.
Four basic functions of a framework are agreed upon:
- integrate the design tools,
- provide a common user interface,
- manage the design data, and
- manage the design process.
The integration of design tools includes tools from non-framework vendors.
Allowing access to different design tools requires that the interfaces
to those tools be reasonably similar and easy to use. (We are not there yet!)
What actually happened? Cadence and Synopsys bought up smaller companies until the two of them could offer complete EDA design flows.
Up to 42 different pieces of software (or more) could be required at one time to go from RTL to wafer fab. By putting all the tools into one "system" (i.e., a framework) from a single vendor, the user has fewer interfacing concerns. Of course, this ties you to one vendor, and that is not how things are being done. People naturally look for better third-party tools, newer tools, and new ways to solve problems, and climb out of the box. This means the interfaces to the tools are still an issue (2008), although many third-party tool vendors try very hard to match their user interfaces to those of the larger EDA houses. No tool set is perfect.
Array (Process) Selection as the First Task
Whatever the framework systems end up providing, the basic design flow
that exists today will remain intact. The first and most difficult task,
array selection (process selection), will not change, nor will the basic goals of the current
design methodology.
It is the ease of satisfying those goals that will change.
The process of selecting an implementation for a circuit involves two
basic decision processes.
- First, a decision must be made on the process technology that will satisfy
the design criteria for power and speed. This includes how fast the process foundry can turn your wafers.
- Second, a selection must be made from the components (arrays, macros,
IP, I/O, etc.) available within those technologies. This is fairly minor these days.
Selecting what to use from what is available remains a high priority.
Even with all the changes made in software tools, these two key items
remain unchanged. Choose the process, which defines the technology, and
then choose the components, for even with high-level synthesis, the astute
designer can "guide" the software to a better solution. The
software [Synopsys, Cadence, and Avant! (oops - they were purchased by Synopsys) are the leaders] is chosen by the
design group with input from the selected foundry as to the product
design flow.