HDL simulation: Key to concurrent verification
New EDA tools allow complex electronic designs to be verified from initial
concept to production
BY DAVID ALLENBAUGH
Zycad Corp.
Fremont, CA
With today's ever-shrinking product development cycles–currently on the order
of only 12 to 18 months–there is less room than ever for errors or misguided
projects. Products must be defined, developed, and introduced without error.
This requirement is in direct conflict with the simultaneous trend toward
increasingly complex systems, which boosts opportunities for design errors
geometrically.
Both of these trends are making traditional end-of-process design-verification
methods and tools obsolete. Taking their place is a new generation of
feature-rich high-performance EDA tools that enable electronic designers to
iteratively verify their designs from initial concept to production. This
process of verifying throughout the design cycle–called concurrent
verification–ensures that a design produces a final delivered product that
meets customer requirements and expectations.
Moving upstream
The introduction and growth of hardware description languages (HDLs)–most
notably Verilog, then VHDL–in the design of application-specific ICs and
complex systems has been a key factor in the movement toward concurrent
verification. HDLs are human-readable languages that have been developed for
the description of electronic components and their functionality. They describe
instances, interconnections, directions of signals, behavior of components, and
concurrency of actions, imposing the rules of electronic hardware where a
programming language cannot.
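A few lines of VHDL are enough to show most of these properties. In the hypothetical sketch below, the port clause fixes signal directions, a component instance is wired up through an internal signal, and the final assignment is a concurrent statement that re-executes whenever its input changes rather than in program order:

    -- Hypothetical example: a 1-bit comparator built around an XOR component.
    entity compare1 is
      port (a, b : in  bit;          -- signal directions are explicit
            eq   : out bit);
    end compare1;

    architecture structure of compare1 is
      component xor2                 -- reusable component, bound to a
        port (x, y : in bit;         --   real gate elsewhere
              z    : out bit);
      end component;
      signal diff : bit;             -- internal interconnection
    begin
      u1: xor2 port map (a, b, diff);  -- an instance, wired to signals
      eq <= not diff;                -- concurrent: re-evaluated whenever
                                     --   diff changes, not in program order
    end structure;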
Logic and fault simulation tools have enabled designers to exercise software
models of their designs for many years. HDL simulations are unique in that
designers can define, then simulate their systems using high-level,
technology-independent language constructs before they begin implementing their
design. By describing their systems at a high level of abstraction (that is,
behaviorally), designers can focus on functionality and design intent rather than on the details of the structural implementation, such as component specification.
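A minimal behavioral sketch makes the point (the entity and all names here are hypothetical): it states only what the design must do on each clock, with no gates, components, or vendor libraries mentioned.

    -- Hypothetical behavioral model: a clocked averager.
    entity averager is
      port (clk      : in  bit;
            sample_a : in  integer;
            sample_b : in  integer;
            result   : out integer);
    end averager;

    architecture behavior of averager is
    begin
      process (clk)
      begin
        if clk'event and clk = '1' then
          -- Only the intended function is stated; no gates, no components,
          -- no technology. Implementation detail comes later.
          result <= (sample_a + sample_b) / 2;
        end if;
      end process;
    end behavior;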
These high-level descriptions then provide the input to logic synthesis tools
that automatically synthesize the design into structural components, which, in
turn, can be input to simulators for system verification. This
methodology–starting from high-level functional descriptions and moving down
to gate-level design–is called top-down design.
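As a sketch of that hand-off (again with hypothetical names), a register-transfer-level description like the one below is typical of what a synthesis tool consumes; the tool emits the equivalent flip-flops and gates, which can then be simulated at the gate level.

    -- Hypothetical RTL description of a loadable register with synchronous
    -- clear; a synthesis tool maps this onto flip-flops plus an enable mux.
    entity load_reg is
      port (clk, clear, load : in  bit;
            d : in  bit_vector(7 downto 0);
            q : out bit_vector(7 downto 0));
    end load_reg;

    architecture rtl of load_reg is
    begin
      process (clk)
      begin
        if clk'event and clk = '1' then
          if clear = '1' then
            q <= (others => '0');   -- becomes flip-flops with sync reset
          elsif load = '1' then
            q <= d;                 -- becomes a load-enable mux
          end if;
        end if;
      end process;
    end rtl;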
Concurrent verification
Like the broader strategy of concurrent engineering, of which it is a subset,
concurrent verification requires pulling what is traditionally a downstream
function–design verification–into the earliest design engineering stages. It
also requires continuing verification of the product as the design evolves. Top-down design using HDL simulation is
the key to this process.
Concurrent verification and concurrent engineering have the same goal: to
minimize downstream problems and risks–that is, to avoid the expensive and
time-consuming process of redesign should a final design or manufactured
product not meet customer requirements. As with all manufacturing strategies,
the ultimate objective is to produce the best product, at the lowest cost, as
early as possible.
As today's hottest ticket to implementing concurrent verification
methodologies, HDL design tools are booming. The HDL design market is expected
to grow from a $118 million market in 1993 to a $209 million market in 1997,
representing 77% growth over the next four years (see Fig. 1).
The market is largely driven by the consumer electronics segment, where the
adage “get it right the first time” has become an absolute imperative. In 1994,
the consumer market will purchase $135 billion worth of electronic equipment.
Two major application areas fueling that growth are wireless data
communications and data compression.
The larger share of that growth will be in the market dedicated to VHDL. Though
both Verilog and VHDL continue to coexist, most industry watchers expect VHDL
to soon overtake Verilog (see Fig. 2). In any case, both languages will be
around for a while and several vendors of HDL-based simulation tools now offer
support for both languages.
According to industry studies, HDLs are replacing schematic capture as the primary method of design input. Those studies show that HDL users are about 3 to 10 times more productive than schematic-capture users, as measured by the number of gates created in a given time frame. Because
HDLs are human-readable, designers can much more easily see the intent of their
design by reading an HDL model than they can by looking at a schematic.
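A one-line illustration (the signal names are hypothetical): the statement below reads almost as its own specification, where the equivalent schematic would show only a multiplexer symbol and anonymous wires.

    -- Select the CPU's address when it holds the bus, else the refresh address.
    bus_addr <= cpu_addr when cpu_request = '1' else refresh_addr;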
Speed differentiates
The all-important factor in HDL-based concurrent verification today is speed.
Since VHDL, in particular, is a standard (and Verilog is being promoted for
standardization), there isn't much room for differentiation based on features
and functions. Rather, vendors today are competing based on price and
performance. Lower prices can help the budget, but they don't have much of an
impact on the design cycle itself.
Speed, however, is crucial to reaping the full benefits of HDLs. Despite their
obvious advantages, HDLs continue to suffer from one major shortcoming:
simulation for large, complex systems is often too slow to support rapid
development cycles. Unfortunately, this is where top-down design is most
critical.
In theory, HDLs streamline the design of a system's overall architecture by
allowing the designer to simulate the entire system before moving on to
defining detailed logic. In the real world, when a design is initially defined
using an HDL, simulation is used to develop and verify small design partitions.
The process tends to be iterative, and the number of clock cycles simulated tends to be small.
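In practice, that early loop looks like a small, self-contained testbench of the kind sketched below (all names hypothetical), which drives one partition for only a handful of cycles:

    -- Hypothetical unit testbench: exercise one partition for a few clocks.
    entity tb_partition is
    end tb_partition;

    architecture test of tb_partition is
      signal clk    : bit := '0';
      signal enable : bit := '0';
    begin
      clk <= not clk after 10 ns;    -- free-running clock

      -- The partition under test would be instantiated here.

      stimulus: process
      begin
        wait for 25 ns;
        enable <= '1';               -- exercise one corner of the design
        wait for 100 ns;             -- only a handful of cycles
        enable <= '0';
        wait;                        -- end of this short, focused run
      end process;
    end test;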
However, once the design partitions are completed and performing as specified,
the total system–which may be a single chip or a complete system–must be
integrated and tested to ensure that it meets specifications and customer
expectations before implementation. Unfortunately, there is rarely time for this: it can take days or weeks to adequately simulate just a design
subset at the higher, more abstract levels of HDL–yet that is where the
greatest design benefits accrue.
Performance equals productivity
The first generation of software-based VHDL simulators was interpretive, yielding tools that compiled quickly but simulated slowly. The second generation compiles instead: these tools analyze the VHDL source and generate C code, which is then built with the workstation's native C compiler. Compiled-C-code simulators run faster, but compile times are longer because of the extra compilation step.
The latest generation of simulators–native compiled-code simulators–skip
the C compiler and generate object code directly. This technique has been used
by vendors such as Cadence Design Systems and Model Technology to boost
execution speeds by 2 to 3 times. Vantage Analysis Systems, meanwhile, is using
what the company calls biLevel Adaptive Signal Trees to boost the speed of its
VHDL simulations. Many software vendors are also developing both
multiprocessing and multithreaded versions of their VHDL simulation engines.
The greatest boost in simulation speeds has been achieved by recently
introduced hardware-based processors–also known as HDL accelerators–that are
optimized to simulate HDLs at high speeds. Combined with native-code compilers,
this new category in HDL simulation is achieving unprecedented simulation
speeds (see table)–up to 70 times faster than workstation-based software
simulators.
Since the hardware is developed exclusively for the target HDL, no operating-system overhead or external interrupts slow the system down. The
generated code executes directly on the HDL processors and is custom generated
and partitioned with each new design. Hardware-based HDL processors are
currently finding their greatest acceptance with companies designing various
types of video or image-processing chips and others that require long
simulation times for a single test set.
Figure 3 shows the difference that a conservative nine-times performance gain in HDL simulation can make. Here, it is assumed that verifying this particular design requires a total of 20 simulation runs: 15 initial runs and five re-simulations to fix problems. Assuming one run per day and a five-day work week, a software simulator would require 26 calendar days to complete the simulations. A hardware HDL processor, even at this very conservative nine-times performance level, allows three runs per day, for a total saving of about 17 days.
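A rough reconstruction of that arithmetic, assuming runs cannot spill over a weekend:

    Software:  20 runs at 1 run/day  = 20 working days = 26 calendar days
    Hardware:  20 runs at 3 runs/day =  7 working days =  9 calendar days
    Saving:    26 - 9                = 17 calendar days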
Target data simulations
Linear performance gains don't tell nearly the whole story, however. Higher
performance HDL simulation is actually changing the way companies work, which
can have a much greater impact on development cycles. The more a company is
able to simulate and verify its designs before committing to implementation,
the more time and money it will save by avoiding repeated synthesis loops as a
result of functional errors.
Traditional software-based HDL simulators, however, are often too slow to
perform full-system verification or regression tests in an acceptable time. The
ultimate goal–iterative design and verification with real-life target
data–has yet to become a reality for most designers. For example, simulation
of an MPEG (Moving Picture Experts Group) encoder chip may take over 24 hours
to process a single image. Ideally, however, multiple images should be
processed to ensure the algorithms are correct before the expensive synthesis
process begins.
End-customer validation
Rapid verification of designs with real target data is also enabling designers
to achieve end-customer requirements validation by demonstrating whether a
design-in-progress is really the one the customer wants. This demonstration is achieved through “proof of implementation” at any point during the design process: the customer is given a set of verification results showing that the design is being implemented correctly and will yield a final, delivered product that satisfies the stated requirements.
By contrast, conventional “discrete” verification uses a static specification
or refers to the previous or next step in the design process–with the risk
that the product is no longer on target with the customer's desires. Again, the
earlier that designers can demonstrate that some part of a design does not
actually meet customer requirements, the less expensive the problem will be to
fix. The process also allows for a more flexible response to a dynamic market.
Verifying designs at each step and providing continuous feedback allows the
customer to change the design specification as early as possible based on
changing market conditions or other requirements.
Surgical debugging
Lower costs and higher performance of HDL simulators are allowing designers to
move toward a more creative methodological approach called surgical debugging.
The closer designers come to having undelayed access to simulation, the more they tend to interact creatively with their design. Rather than creating large
chunks of code to be simulated overnight, designers create little pieces in a
rapid flow of design, immediate verification, and highly focused debugging.
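A focused check of this kind can be as small as a single assertion placed beside the new piece of code (signal names hypothetical); the simulator flags a violation the moment it occurs, pointing the debugging effort at one construct rather than the whole design.

    -- Hypothetical focused check, placed in the architecture or testbench:
    -- the two bus-grant lines must never be active at the same time.
    assert not (cpu_grant = '1' and refresh_grant = '1')
      report "bus granted to two masters at once"
      severity error;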
The definition of “undelayed access” depends on the number, type, and size of the designs being simulated. For some, workstation-based
software simulators may be just the ticket; for others, only networked
hardware-based accelerators can keep the creative juices flowing.
Breaking through the simulation bottleneck will not necessarily speed up the
pre-implementation design process itself. But it will give the designer time to
find the best–not just the fastest–solution to a problem. It will also allow
designers to verify their designs more often and more thoroughly, thus reducing
downstream risk and overall development cycles by weeks or months.
HDLs seek higher ground
VHDL: standard bearer seeks libraries
Very High Speed Integrated Circuit (VHSIC) Hardware Description Language (VHDL)
was an outgrowth of the DoD VHSIC program of the early 1980s. VHDL was
originally designed as a documentation language for digital designs after the
DoD realized that it was getting documented designs from its prime contractors
in myriad forms. Since the designs were typically documented in proprietary or
vendor-specific forms, the original design intent was lost by the time the
military wanted to procure spare or replacement parts years later.
The DoD later refined VHDL so that it became a language that could be
simulated. This process, in which commercial industry participated, was the
catalyst for the language becoming the only official hardware description
language of the IEEE. VHDL became IEEE Standard 1076 in December 1987, and was
updated in 1993.
According to industry observers, the greatest impediment to full top-down
design using VHDL is the lack of VHDL-format ASIC libraries for sign-off
simulation–meaning that a VHDL model has sufficient credibility to serve as
the interface specification between design and manufacturing. That is the
motive behind the VHDL Initiative Toward ASIC Libraries (VITAL), a multivendor
initiative working to introduce a standard way of expressing timing-accurate
ASIC libraries in VHDL that will work from simulator to simulator.
Verilog: library bearer seeks standard
Verilog-HDL was originally developed as a proprietary language by Gateway Design Automation almost 10 years ago. After purchasing Gateway in 1989, Cadence
Design Systems put Verilog in the public domain and allowed the Verilog “clone”
industry to thrive as an alternative to VHDL. Cadence also helped to form Open
Verilog International (OVI), which has proposed the establishment of Verilog as
an IEEE standard. While users of VHDL struggle with the issue of building ASIC
libraries, Verilog users currently enjoy the benefit of hundreds of existing
Verilog device models.
A bilingual future
The respective strengths of VHDL and Verilog have a great deal to do with the
level of abstraction at which a designer wishes to work. VHDL, which has a
syntax based on Ada, is generally considered to be much stronger at conceptual
design, where technology independence is desired. Conceptual design is more
difficult with Verilog, which forces designers to deal with implementation
details. Verilog, however, has a much simpler syntax based on C, and designers
who are familiar with working at the RTL or gate levels find it to be much
“closer” to the hardware and, therefore, easier to learn and use.