Design and Verification Need a Closer Relationship

Gabe Moretti, Senior Editor

Design and verification are often mentioned in the same sentence, yet too often they remain separate disciplines. During my time in engineering, more than 15 years ago, we had two teams: one developed the design and the other verified the implementation. At one point many EDA companies even kept the teams geographically apart: design and development were done in the USA, while testing was done in India or Eastern Europe.

It is clear today that such an approach is not optimal, and yet too frequently one sees verification teams staffed with junior engineers and organizationally separated from the development team. This approach was borrowed from the classical manufacturing flow of the industrial revolution. Testing a product was, and in many cases still is, a required step performed between manufacturing and shipping. Things were built and then tested; doing otherwise was inconceivable.

With the advent of software products it became obvious that tests could be performed before the entire product was finished. Modularization was the key that enabled testing at different stages of product development, but in general teams still waited until a module was finished before verifying it.

The next step in improved efficiency was taken by the verification teams. A team would begin to develop the test structure and policy from module specifications, so overlap was achieved and total schedules were shortened. But the flow was still unidirectional: verification teams obtained information from the development team and acted accordingly. It is time for another change in methodology.

Just as the industry is striving to attain true hardware/software co-development, so it should aim to obtain design/verification co-execution.

Figure 1: Concurrent design/verification eliminates the feedback loop, increasing efficiency and QoR

Industry feedback

Michael Sanie, senior director of verification marketing at Synopsys, noted: “The biggest verification challenge today is to deal effectively with systemic complexity; the use of software, design for low power, increasing analog design, and bigger logic designs are all blurring the boundaries between existing disciplines. This requires engineers to master a set of skills across formerly separate domains, which requires more up-front planning. It is not enough to know how to design something; engineers must also understand how to verify that it works correctly. What complicates this further are the market dynamics and demand-driven complexities: the confluence of time-to-market pressures, increasing software/app content, and varied use models. It is no longer sufficient for teams to optimize an SoC architecture based on design trade-offs. Now, they must also understand how to ensure an efficient verification process and earlier software bring-up.”

Michael further pointed out: “The ‘hows’ are not easy and are working themselves out as the industry walks through this journey. Nevertheless, the dynamics that remain the same are twofold:

- Though specialization will continue (and deepen further), more up-front participation from all ‘specialists’ will be required, if only to make sure one team’s interests are not in gross opposition to another’s.

- Native integration of tools and the creation of combined workflows will continue to drive user productivity and performance.”

Jin Zhang is vice president of marketing and customer relations at Oski Technology. Since Oski provides formal verification products, she analyzed the problem from a formal point of view.

“Formal verification is a white-box verification technique, which means formal engineers need a good understanding of the internals of a design in order to verify it effectively. As a result, formal engineers and RTL designers naturally have a much tighter working relationship than other disciplines do.

First, a sound verification methodology should allow equal contribution from all effective techniques, which includes leveraging the exhaustiveness of formal to sign-off on design blocks that are harder to verify with simulation. The block partition between formal and simulation should be clean to simplify the effort on both ends. To achieve that, formal engineers should participate in the architectural planning and exploration stage of design development in order to help influence decisions regarding design partition and block interface. A well-partitioned design with a clean interface will make the decision on where to apply formal, as well as the actual formal verification tasks, much easier.

Once design blocks are selected for formal, formal engineers will need some time from designers to understand the design under test. Often, this is done through reading a design spec, but some interaction with designers is needed to clarify certain situations.

In typical formal verification projects, the verification team needs a few hours of the designers’ time to get a good understanding of the design. This exercise often turns out to be beneficial for the designers as well: it gives them the opportunity to think about the interfaces between design blocks, improve the design spec and, sometimes, fix issues in the design or the spec.”
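To make the exhaustiveness argument concrete, here is a minimal Python sketch of what formal state-space exploration does that simulation cannot: it visits every reachable state of a block and checks a property in all of them. The FIFO occupancy model and the invariant are hypothetical illustrations, not Oski’s methodology or any tool’s actual engine.

```python
# Minimal sketch: exhaustive state-space exploration of a tiny FIFO model,
# in the spirit of formal verification's exhaustiveness.
# The model and property are hypothetical, for illustration only.
from collections import deque

DEPTH = 4

def step(count, push, pop):
    """One clock cycle of a 4-entry FIFO controller: returns next occupancy."""
    if push and not pop and count < DEPTH:
        return count + 1
    if pop and not push and count > 0:
        return count - 1
    return count

def invariant(count):
    # The property a formal tool would prove: occupancy stays in bounds.
    return 0 <= count <= DEPTH

def explore():
    seen = {0}                       # reset state: empty FIFO
    frontier = deque([0])
    while frontier:
        state = frontier.popleft()
        for push in (0, 1):          # every input combination, every cycle
            for pop in (0, 1):
                nxt = step(state, push, pop)
                assert invariant(nxt), f"violation in state {nxt}"
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    print(f"Proved invariant over all {len(seen)} reachable states")

explore()
```

A constrained-random simulation of the same block would sample only some of these input sequences; the breadth-first walk visits them all, which is why the clean block boundaries Jin Zhang describes make the technique tractable.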

Larry Lapides, vice president of sales at Imperas, looks at the development and verification needs of software, a sector of increasing importance in SoC design.

“The future of system design demands more integrated and concurrent hardware/software/systems development and verification, especially with regard to virtual platforms, a unifying technology and methodology. There is an inevitable upward curve toward hardware and software co-design and verification, but we have not yet reached the point where it is well supported, understood, or widely adopted.

Here are a few observations on developing embedded software:

1. As software complexity increases exponentially, companies need to adopt better ways to address problems; existing methods will eventually no longer be sufficient. (VDC researchers found that the size of embedded code bases is growing at roughly twice the speed of the embedded developer community.)

2. One serious failure changes everything. Think of car braking systems, avionics communications, or steering electronics. Think of the challenges we increasingly face in the security arena, which include all levels of software as well as the underlying hardware.

3. There is a lesson to be learned from SoC design verification: a structured methodology provides predictable execution and measurable reduction of risk. Specifically, simulation is necessary, but not sufficient. Advanced methodology is needed on top of simulation. (This is how Verisity’s Specman users were able to achieve over twice the industry rate for first silicon success.)

The embedded software development domain needs to adopt a more formalized approach. One key technology and methodology for embedded software and systems design is the use of virtual platforms (high-level software simulation).

What we and our customers have seen regarding virtual platform solutions is this: as systems become more complex, organizations are turning to modeling and simulation tools to improve their software development environments. Virtual platform solutions are being adopted as a mechanism to improve system quality and to accelerate software development and testing. Notably, engineering teams whose projects align with Agile and Continuous Integration (CI) product development methodologies are more likely to use virtual platform solutions.

So teams that need to build complex, high-quality products on shorter schedules need to adopt virtual-platform-based solutions.”
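For readers unfamiliar with the term, a virtual platform is essentially a fast software model of the hardware that firmware can run against. The following Python sketch shows the idea with a made-up three-instruction CPU and a hypothetical memory-mapped UART register (not an Imperas model or API): software is executed and checked entirely in simulation, the kind of test a Continuous Integration job can run on every commit.

```python
# Minimal sketch of a virtual platform: an instruction-accurate CPU loop plus
# a memory-mapped UART model, so firmware logic can be tested before silicon.
# The ISA and register map are invented for illustration.

UART_TX = 0x1000  # hypothetical memory-mapped transmit register

class Bus:
    """Routes CPU stores either to RAM or to peripheral models."""
    def __init__(self):
        self.ram = {}
        self.uart_log = []
    def store(self, addr, value):
        if addr == UART_TX:
            self.uart_log.append(chr(value))   # peripheral side effect
        else:
            self.ram[addr] = value
    def load(self, addr):
        return self.ram.get(addr, 0)

class Cpu:
    """Executes a tiny program image made of LI, ST, and HALT instructions."""
    def __init__(self, bus, program):
        self.bus, self.program, self.pc, self.reg = bus, program, 0, 0
    def run(self):
        while True:
            op, *args = self.program[self.pc]
            self.pc += 1
            if op == "LI":       # load immediate into the register
                self.reg = args[0]
            elif op == "ST":     # store the register to a bus address
                self.bus.store(args[0], self.reg)
            elif op == "HALT":
                return

# "Firmware" that prints "Hi" through the UART, runnable with no hardware.
firmware = [("LI", ord("H")), ("ST", UART_TX),
            ("LI", ord("i")), ("ST", UART_TX), ("HALT",)]
bus = Bus()
Cpu(bus, firmware).run()
assert "".join(bus.uart_log) == "Hi"   # the kind of check a CI job would run
print("virtual platform test passed")
```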

The use of FPGAs in designs is increasing, thanks to the growing capabilities of FPGA devices. Yet although entire microcontroller systems can now be implemented in a single FPGA, Bart Connolly, senior FAE at Blue Pearl Software, believes that verification of FPGA designs has not yet reached a mature state. Bart describes the status as follows.

“Modern ASIC/FPGA verification flows rely to varying degrees on several types of tools: simulation, emulation, static checking, and formal analysis using assertions. All of these techniques have their advantages and disadvantages. FPGA testing on the bench offers the advantage of verifying in the real system environment. However, it can be harder to get visibility, and finding simple problems can be more time consuming than with simulation and lint. It also covers only the error conditions that the system software exposes, without uncovering the conditions that pseudo-random simulation vectors would. On the other hand, setting up a simulation environment takes more time and effort than FPGA designers may be accustomed to.

As an example, structural analysis (RTL lint) can look through the design for unconnected nets. Often an unconnected net is intentional, and it gets optimized away in synthesis. To judge each case, a user needs to understand the context of the unconnected net. It may make sense to filter a port or IP name out of the message list. Sometimes further analysis of the logic upstream or downstream from the unconnected net is needed: if an undriven signal feeds logic that is actually used, it will propagate unintended values. A user needs to go through many of these messages quickly and determine whether they must be fixed or can be ignored.

Today’s lint analyzes the code and builds structures to look for situations that could be problems. There is the usual language checking, to see that the RTL code fits the language spec and is synthesizable, and searching for coding constructs that can cause problems. Other examples of problems that can be found at this early stage are “if” without “else” coding, checking for correct reset methodology (asynchronous or synchronous), clock identification, and unsynchronized clock domain crossing (CDC) identification.”
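As a concrete illustration of the structural analysis Bart describes, the short Python sketch below implements one lint-style rule: flagging declared wires that are never driven. A production lint tool elaborates the full design rather than pattern-matching source text, so the regex approach and the sample RTL here are illustrative only.

```python
# Minimal sketch of one structural lint rule: flag declared wires that are
# never driven (assigned, or connected to an instance port). Real lint tools
# elaborate the design; this source-level regex pass is only illustrative.
import re

def undriven_wires(verilog_src: str):
    declared = set(re.findall(r"\bwire\s+(?:\[[^\]]*\]\s*)?(\w+)", verilog_src))
    driven = set(re.findall(r"\bassign\s+(\w+)", verilog_src))
    # Heuristic: treat named port connections .port(sig) as potential drivers.
    driven |= set(re.findall(r"\.\w+\s*\(\s*(\w+)\s*\)", verilog_src))
    return sorted(declared - driven)

rtl = """
module top(input a, input b, output y);
  wire ab;
  wire unused_net;          // never driven: lint should flag this
  assign ab = a & b;
  assign y  = ab;
endmodule
"""
for net in undriven_wires(rtl):
    print(f"warning: wire '{net}' is declared but never driven")
```

The real work, as Bart notes, is in triaging such messages: deciding which warnings reflect design intent and which hide a genuine undriven-logic bug.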

There are obvious avenues for improving verification, Bart states.

“FPGA verification planning needs to look beyond on-the-bench system testing and begin considering techniques that make the process more efficient and offer better coverage. FPGA design teams whose priorities favor short design cycles will use a different mix of verification techniques than ASIC teams. In an FPGA verification flow, lint, or static RTL checking, offers good bang for the buck compared with techniques such as OVM/UVM and static formal analysis.”

Conclusion

As Michael Sanie of Synopsys said, the “how” differs according to the nature of the project, but co-executing design and verification remains key to improving a project team’s efficiency and its quality of results.

Lately much has been said about applying Agile methods to hardware development. Concurrent design and verification could be seen as a variation of Agile suited to hardware. I propose a single team responsible for both design and verification, in which each member both develops hardware and verifies the work of other members. In this way all project knowledge stays within one team, each member exercises both development and verification skills, and the project truly belongs to the team rather than to an amorphous, likely geographically dispersed group with diverse organizational goals.

