
Verification of Configurable Processor Cores


Presentation Transcript


  1. Verification of Configurable Processor Cores. Marinés Puig-Medina, Gülbin Ezer, Pavlos Konas. Design Automation Conference, 2000, pages 426-431. Presenter: Peter, 2000/11/06

  2. What’s the problem? • A verification methodology for configurable processor cores. • The simulation-based approach uses directed diagnostics and pseudo-random program generators. • A configurable and extensible test-bench supports SOC verification. • Coverage analysis is provided.

  3. Introduction • The processor core should contain only the necessary functionality (defining and incorporating new instructions) so that it consumes little power, occupies a small area, and achieves high performance. (Tensilica) • A robust and flexible methodology for verifying the processor (covering both architectural and micro-architectural testing) is needed.

  4. Configurable processor • Xtensa: enables configurability, minimizes code size, reduces power, and maximizes performance. • The processor generator produces: RTL code and a test-bench; a C compiler, an assembler, a linker, a debugger, a code profiler, and an instruction-set simulator (ISS).
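To make the notion of configurability concrete, the input to such a generator can be pictured as a set of options. The option names below are hypothetical illustrations, not Tensilica's actual generator interface:

```python
# Hypothetical Xtensa-style configuration (illustrative option names
# only; not Tensilica's actual generator interface).
processor_config = {
    "endianness": "little",
    "interrupts": 4,       # number of interrupt lines
    "mac16": True,         # include a multiply-accumulate unit
    "icache_kb": 8,        # instruction-cache size
    "dcache_kb": 8,        # data-cache size
}

# From such a description, the generator emits RTL plus a test-bench,
# and a matching toolchain: compiler, assembler, linker, debugger,
# profiler, and ISS, all specialized for this configuration.
```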

  5. Functional verification

  6. Test program generation • Test programs are generated with Perl scripts (a sketch follows below). • AVP (architectural verification program): tests the execution of each instruction in the ISA. • MVP (micro-architectural verification program): tests features of the Xtensa implementation. • Random test programs.
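The paper's generators are Perl scripts; the following Python sketch only illustrates the shape of a pseudo-random test-program generator. The instruction and register names are made up, not the real Xtensa ISA:

```python
import random

# Minimal sketch of a pseudo-random test-program generator in the
# spirit of the paper's Perl scripts (illustrative opcodes/registers,
# not the real Xtensa ISA).
INSTRUCTIONS = ["add", "sub", "and", "or", "xor"]
REGISTERS = [f"a{i}" for i in range(16)]

def random_test_program(n_instructions=20, seed=None):
    rng = random.Random(seed)  # seeded for reproducible diagnostics
    lines = []
    for _ in range(n_instructions):
        op = rng.choice(INSTRUCTIONS)
        rd, rs, rt = (rng.choice(REGISTERS) for _ in range(3))
        lines.append(f"\t{op} {rd}, {rs}, {rt}")
    return "\n".join(lines)

print(random_test_program(seed=42))
```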

  7. The examples

  8. Co-simulation (1) • The comparison process is implemented in Vera, an HVL from Synopsys Inc. • There are three major advantages: • it allows fine-grain checking of the processor state during simulation; • it avoids having to construct comprehensive self-checking diagnostics, which is considerably harder; • it stops the simulation at, or near, the cycle where the problem appears.
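The comparison can be pictured as stepping both models in lock step and diffing their architectural state. This Python sketch stands in for the paper's Vera code; step_rtl and step_iss are assumed hooks that advance each model by one instruction and return its state as a dict:

```python
# Sketch of lock-step co-simulation checking, standing in for the
# paper's Vera implementation. step_rtl() and step_iss() are
# hypothetical hooks returning architectural state (pc, registers, ...)
# after each retired instruction, or None when the program ends.
def cosimulate(step_rtl, step_iss, max_steps=100_000):
    for step in range(max_steps):
        rtl_state = step_rtl()
        iss_state = step_iss()
        if rtl_state is None or iss_state is None:
            break  # one model finished the test program
        for field, iss_value in iss_state.items():
            if rtl_state.get(field) != iss_value:
                # Fine-grain check: stop at (or near) the cycle
                # where the mismatch appears.
                raise AssertionError(
                    f"step {step}: {field} RTL={rtl_state.get(field)} "
                    f"ISS={iss_value}"
                )
```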

  9. Co-simulation (2) • The biggest challenge: finding appropriate synchronization points between models at different levels of abstraction. • In Xtensa, the interrupt latency cannot be reproduced by the ISS model. • Comparisons are masked off when the processor state is architecturally undefined (illustrated in the sketch below).
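One way to realize such masking is to skip the undefined fields at each synchronization point. The interface below is a hypothetical continuation of the previous sketch, not the paper's Vera code:

```python
# Sketch of masked state comparison: fields named in `undefined` are
# skipped for the current synchronization point. When a field counts
# as undefined is configuration-specific.
def compare_states(rtl_state, iss_state, undefined=frozenset()):
    mismatches = []
    for field, iss_value in iss_state.items():
        if field in undefined:
            continue  # architecturally undefined: do not compare
        if rtl_state.get(field) != iss_value:
            mismatches.append((field, rtl_state.get(field), iss_value))
    return mismatches

# Example: right after reset, suppose register a2 is undefined.
rtl = {"pc": 0x40000000, "a2": 0, "a3": 7}
iss = {"pc": 0x40000000, "a2": 5, "a3": 7}
assert compare_states(rtl, iss, undefined={"a2"}) == []
```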

  10. The test-bench

  11. Coverage • ISS monitors (written in Perl) check architectural-level coverage (a sketch follows below). • Vera monitors check RTL state and micro-architectural features. • “HDLScore” (a program-based coverage tool) and Vera FSM monitors are also used.
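An architectural-level monitor can be as simple as a counter over executed opcodes. The paper's monitors are Perl scripts; this Python sketch, with an invented ISA list, only illustrates the idea:

```python
from collections import Counter

# Sketch of an architectural-coverage monitor in the spirit of the
# paper's Perl ISS monitors: count executed opcodes and report ISA
# opcodes never exercised (the ISA list here is illustrative).
ISA = {"add", "sub", "and", "or", "xor", "l32i", "s32i", "branch"}

class OpcodeCoverage:
    def __init__(self):
        self.counts = Counter()

    def on_instruction(self, opcode):
        self.counts[opcode] += 1  # called once per retired instruction

    def uncovered(self):
        return ISA - set(self.counts)

monitor = OpcodeCoverage()
for op in ["add", "add", "xor", "l32i"]:
    monitor.on_instruction(op)
print(sorted(monitor.uncovered()))  # opcodes never exercised
```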

  12. The examples (1) • Proc1: uses only part of the available options. • Proc2: represents a maximum configuration. • Proc3: a randomly generated configuration.

  13. The examples (2)

  14. Conclusion (1) • Presents a methodology for generating AVPs and MVPs (Perl scripts). • Outlines the coverage-analysis methodology (based on Vera). • The authors are working on expanding the coverage-analysis framework and the random diagnostic test-program generator.

  15. Conclusion (2) • Measuring coverage is only useful if the results of the analysis are conveyed back to the verification and design teams and used to improve the verification process. • Coverage tools: Perl, Vera (Synopsys), Verification Navigator (TransEDA).
