Functional Verification Advancements and UVM

Summary:

Functional verification is the task of confirming that a logic design conforms to its specification, in other words, answering the question: "Is this design working correctly?" This article reviews why verification has grown into the dominant cost of digital VLSI development and how the industry converged on the Universal Verification Methodology (UVM).

 

Functional verification, in the VLSI industry, is the task of verifying that a logic design conforms to its specification. In simpler words, it is the task of answering the question: "Is this design working correctly?" That question is quite hard to answer when we are dealing with SoCs containing gates on the order of 100 million. Nevertheless, functional verification is an essential task in the digital VLSI design flow. Because it is performed in the early stages of the project, as shown in Figure 1, it helps reduce the problems that appear later in the final stages of the flow and lowers the probability of silicon failure upon fabrication.

Figure 1 Standard Digital VLSI Design Cycle (Source: http://www.ece.unm.edu/)

To tackle the emerging verification challenges, a shift in the mindset around functional verification began roughly 15 years ago, culminating in the Universal Verification Methodology (UVM), which we discuss in this article. Before doing so, we need to clear up a common confusion between verification and two related but distinct tasks: validation and testing. Validation is the task of checking whether the design meets the customer's requirements, while verification is concerned with the design specification. Verification is done in software, in the pre-silicon stage, to make sure that the design's functionality behaves according to the specification. Testing, on the other hand, is performed on the hardware, in a lab, in the post-silicon stage, to make sure the design was fabricated correctly.

Design – Verification Gap

 

During the past decade, the time spent by system-on-chip developers in functional verification has risen to 60% or more of the development time on some projects. Even developers of smaller chips and FPGAs are struggling with older verification approaches. In 2001, the International Technology Roadmap for Semiconductors predicted that the verification process would fail to keep pace with design capabilities, as shown in Figure 2. To enhance functional verification, many proven and promising technologies have been developed, such as simulation, emulation, object-oriented programming (OOP), constrained-random stimulus, coverage-based verification, formal verification, etc.

Figure 2 The verification gap leaves design potential unrealized: the potential for something to go wrong is greater, and the verification task has become exponentially more complex. (Source: SIA Roadmap, 2001)

It is worth mentioning that many of these techniques build on the capabilities of the SystemVerilog language, which combined the RTL capabilities of Verilog with the verification capabilities of the OpenVera language developed by Synopsys. SystemVerilog was adopted as an IEEE standard in 2005.

The problem is that we now have many verification techniques and no standard way of combining them. This has led to several problems, among them:

 

1) Miscommunication: Different teams use different verification techniques, which makes communication between teams hard. Moreover, it is hard to bring a new team member up to speed on the techniques a particular team has adopted.

2) Reusability: As there is no standard way of doing things, it is hard to reuse parts of a project, either horizontally across other projects or vertically within the same project.

This is where a methodology comes in. We need teams and engineers to do the same things in the same ways. A methodology provides guidance on when, where, why and how each technique should be applied for maximum efficiency. It also provides building-block libraries, documentation, coding guidelines and many examples. A methodology lets the verification engineer focus on verification planning and the test effort rather than on building a complex test-bench architecture from scratch.

 

 

The Universal Verification Methodology (UVM) was announced by Accellera, a standards organization specializing in electronic design automation and IC design and manufacturing. It is a complete methodology that includes the best practices for efficient and exhaustive verification.

Functional Verification – Design Separation

 

To understand how the field of functional verification came to be separated from design, we must go back about 15 years. In 2000, Verisity Design Inc. introduced a collection of best-known verification practices targeted at the e user community. Later, in 2002, Verisity introduced the first verification library, the e Reuse Methodology (eRM). In 2004, the nine-year-old company was featured at edaForum04, held in Dresden, Germany, in a talk titled "Improving Shareholder Value by Separating Verification from Design". In this presentation, Verisity made the case for the unique value that can be generated when the concerns of functional verification are separated from design. In 2005, Cadence Design Systems acquired Verisity in a deal estimated to be worth $285 million.


However, Verisity's were not the only efforts towards separating functional verification from design or towards a unified verification methodology. In 2003, Synopsys announced its Reuse Verification Methodology (RVM) library for the Vera verification language. It did not include architecture guidelines and was considered a subset of the eRM. Over time, it evolved into the Verification Methodology Manual (VMM) for SystemVerilog, supporting the evolving SystemVerilog standard. Later, in 2006, Mentor introduced its Advanced Verification Methodology (AVM). It was the first open-source methodology and the first to adopt the SystemC Transaction-Level Modeling (TLM) standard.

After its acquisition of Verisity, Cadence started converting the eRM to SystemVerilog, introducing the Universal Reuse Methodology (URM). It included the proven capabilities of eRM, used TLM, and was also released as open source. In 2008, Cadence and Mentor collaborated to release the Open Verification Methodology (OVM). The impact of OVM was significant, as it was the first multi-vendor methodology tested against different vendors' simulators. This mattered because SystemVerilog was still in its early stages and many of its constructs lacked clarity. The OVM collaboration proved so successful that Synopsys joined Cadence and Mentor to introduce a unified methodology. In 2010, OVM 2.1.1 was chosen as the basis for the UVM standard. It is tested by all vendors, so technical comparisons between VMM and OVM are no longer needed. UVM is currently an Accellera standard. It represents an alignment on verification methodology across the industry, supported by the major EDA suppliers and their ecosystems.

 

“Methodologies and tools for constructing and implementing hardware have dramatically improved, while verification processes appear to have not kept pace with the same improvements.  As hardware construction is simplified, then there is a trend to have less resources building hardware but same or more resources performing verification.  Design teams with 3X verification to hardware design are not unrealistic and that ratio is trending higher.” – Bill Grundmann, Fellow at Xilinx, DVcon 2014

 

Universal Verification Methodology

 

The UVM is a complete methodology that codifies the best-known verification practices. One of its key principles is to produce reusable verification components, called Universal Verification Components (UVCs). It targets both small designs and large IP-based SoCs.

The key features of UVM are:

  1. Data Structures

The UVM provides the ability to clearly partition your verification environment into a set of data objects and components. Moreover, it provides means for setting and getting data values hierarchically, textually printing and graphically viewing objects and automating commonplace activities, such as copying, comparing and packing items, which we will refer to later as transactions. This allows engineers to focus on what objects contain and how they work, instead of the supporting code.
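As a minimal sketch of what this looks like in practice (the transaction class bus_item and its fields are hypothetical names, not part of the UVM library), the field automation macros below give an object its copy, compare, print and pack behavior without hand-written code:

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// Hypothetical bus transaction; only the fields are user code, the
// copy/compare/print/pack support comes from the field macros.
class bus_item extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  rand bit        write;

  `uvm_object_utils_begin(bus_item)
    `uvm_field_int(addr,  UVM_ALL_ON)
    `uvm_field_int(data,  UVM_ALL_ON)
    `uvm_field_int(write, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "bus_item");
    super.new(name);
  endfunction
endclass
```

With this in place, calls such as item.print() or item.compare(other) work out of the box, so the engineer only describes what the object contains.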

 

  2. Stimulus Generation

The UVM provides infrastructure and built-in stimulus generation that can be customized to include user-defined transactions and transaction sequences. These sequences can be randomized and controlled based on the current state of the design under test, its interfaces or previously generated data.
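For illustration, here is a sketch of a user-defined sequence built on the hypothetical bus_item above; the `uvm_do_with macro randomizes each item under an inline constraint before handing it to the driver:

```systemverilog
// Hypothetical write sequence; the number of items is itself randomizable.
class bus_write_seq extends uvm_sequence #(bus_item);
  `uvm_object_utils(bus_write_seq)

  rand int unsigned num_items;
  constraint c_len { num_items inside {[1:20]}; }

  function new(string name = "bus_write_seq");
    super.new(name);
  endfunction

  task body();
    repeat (num_items) begin
      // Randomize a bus_item under an inline constraint and send it
      // to the sequencer, which forwards it to the driver.
      `uvm_do_with(req, { write == 1'b1; addr inside {[0:255]}; })
    end
  endtask
endclass
```

The sequence itself is randomized before it is started, so the test can control the length and shape of the generated traffic without editing the sequence code.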

 

  3. Building and Running Reusable Test-Benches (Test/Test-bench Separation)

The UVM includes well-defined build flows for creating reusable verification environments. Moreover, it includes configuration mechanisms that allow customizing the runtime behavior without modifying the original implementation. This is beneficial when creating a test-bench for a design with different IPs, interfaces or processors.
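A sketch of how this configuration mechanism might be used is shown below; the component paths and field names (such as "env.sequencer" and "num_items") are hypothetical, and the point is that the test layers new settings onto the environment without editing it:

```systemverilog
// Hypothetical test that reconfigures the environment at build time.
class short_test extends uvm_test;
  `uvm_component_utils(short_test)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // Shorten the stimulus for this run; a sequence or sequencer would
    // read this knob back with uvm_config_db#(int)::get().
    uvm_config_db#(int)::set(this, "env.sequencer", "num_items", 10);
    // Disable functional coverage collection for this particular test.
    uvm_config_db#(bit)::set(this, "env.coverage", "enable", 1'b0);
  endfunction
endclass
```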

 

  4. Coverage Model Design and Checking Practices

The UVM includes the best-known practices for incorporating functional coverage, in addition to protocol and data checks, into a reusable Universal Verification Component (UVC).
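As a sketch, a coverage collector for the hypothetical bus_item might subscribe to the monitor's analysis port and sample a covergroup for every observed transaction:

```systemverilog
// Hypothetical coverage collector; uvm_subscriber provides the
// analysis_export that a monitor's analysis port connects to.
class bus_coverage extends uvm_subscriber #(bus_item);
  `uvm_component_utils(bus_coverage)

  bus_item tr;

  covergroup bus_cg;
    cp_write : coverpoint tr.write;
    cp_addr  : coverpoint tr.addr {
      bins low  = {[0:127]};
      bins high = {[128:255]};
    }
    wr_x_addr : cross cp_write, cp_addr;
  endgroup

  function new(string name, uvm_component parent);
    super.new(name, parent);
    bus_cg = new();
  endfunction

  // Called for every transaction the monitor publishes.
  function void write(bus_item t);
    tr = t;
    bus_cg.sample();
  endfunction
endclass
```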

 

  5. User Example

The UVM library and user guide include a golden example, based on an understandable, yet complete, protocol called the UBus.


A UVM test-bench is composed of UVCs. Each UVC is an encapsulated, ready-to-use, configurable verification component for an interface protocol, a sub-module or a full system. A UVC consists of a sequencer and a driver for stimulating the design, a monitor for observing the pin-level activity, and a scoreboard for checking. It can optionally contain a coverage collector. Consequently, UVM enables the verification process to be divided into three different levels, as shown in Figure 3.


Figure 3 The different development levels of a UVM environment
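To make the composition concrete, the sketch below shows how such an environment might assemble the pieces; bus_driver, bus_monitor and bus_scoreboard are assumed user classes (extending uvm_driver, uvm_monitor and a subscriber-style checker respectively), not part of the UVM library:

```systemverilog
// Hypothetical UVC-style environment wiring stimulus, monitoring,
// coverage and checking together.
class bus_env extends uvm_env;
  `uvm_component_utils(bus_env)

  uvm_sequencer #(bus_item) sequencer;
  bus_driver                driver;     // assumed to extend uvm_driver #(bus_item)
  bus_monitor               monitor;    // assumed to publish bus_item on analysis port "ap"
  bus_coverage              coverage;   // optional collector from the earlier sketch
  bus_scoreboard            scoreboard; // assumed subscriber-style checker

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    sequencer  = uvm_sequencer#(bus_item)::type_id::create("sequencer", this);
    driver     = bus_driver::type_id::create("driver", this);
    monitor    = bus_monitor::type_id::create("monitor", this);
    coverage   = bus_coverage::type_id::create("coverage", this);
    scoreboard = bus_scoreboard::type_id::create("scoreboard", this);
  endfunction

  function void connect_phase(uvm_phase phase);
    driver.seq_item_port.connect(sequencer.seq_item_export);
    monitor.ap.connect(coverage.analysis_export);
    monitor.ap.connect(scoreboard.analysis_export);
  endfunction
endclass
```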

Moreover, UVM provides a framework for achieving coverage-driven verification (CDV), as shown in Figure 4. CDV combines automatic test generation, self-checking test-benches and coverage metrics. It cuts the effort and time spent creating hundreds of directed tests and ensures thorough verification through up-front goal setting.

Figure 4 Constrained-Random Verification Flow (Source: Mentor Graphics Verification Academy)
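Putting the earlier sketches together, a coverage-driven test might look like the following; the simulation is launched with run_test(), the constrained-random sequence generates stimulus, and the coverage collector and scoreboard measure and check the run:

```systemverilog
// Hypothetical top-level test tying the sketches above together.
class random_test extends uvm_test;
  `uvm_component_utils(random_test)

  bus_env env;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    env = bus_env::type_id::create("env", this);
  endfunction

  task run_phase(uvm_phase phase);
    bus_write_seq seq = bus_write_seq::type_id::create("seq");
    phase.raise_objection(this);
    if (!seq.randomize())
      `uvm_error("RND", "Sequence randomization failed")
    seq.start(env.sequencer);
    phase.drop_objection(this);
  endtask
endclass

// Typically started from the top module: initial run_test("random_test");
```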

“Verisity is strong in verification automation and hardware acceleration. Add that to our strengths in simulation and in-circuit emulation, we can offer a more complete solution for customers,” – Adolph Hunter, group director of corporate communications at Cadence.

Conclusion

 

The advancements in VLSI design techniques and methodologies have created a huge gap between design and verification capabilities. This gap has increased product cost and time to market while limiting design capabilities. Consequently, efforts have been exerted over the past 15 years to create and enhance verification methodologies and techniques to reduce this gap. These efforts led to the development of the Universal Verification Methodology (UVM).

 

UVM provides many useful utilities and capabilities to verification engineers, but the question remains: "Will UVM be sufficient to face the ever-growing design complexity?" Nevertheless, the fact that it was developed through the collaboration of the three big EDA vendors, and that it is supported and adopted by Accellera as an open standard, indicates that it will be supported for a long time to come.

References

  1. “A new vision of ‘scalable’ verification” – EE Times, 2013

  2. SIA Roadmap, 2001

  3. “Improving Shareholder Value by Separating Verification from Design” – edaForum04, 2004

  4. UVM Community, Accellera

  5. Mentor Graphics Verification Academy

  6. Cadence buys Verisity for $285 million – EE Times, 2005

 

Mustafa Khairallah is a Verification Engineer at Boost Valley for Engineering Services. He is currently a Master's student at Ain Shams University, Electronics and Communications Department. Mustafa is a graduate of Alexandria University, Electronics and Communications Department, class of 2013, with a grade of Distinction with Honors, and has one published research paper.
