
Marcelo Lubaszewski, Lecture 4 - Testing, PPGC - UFRGS 2005/I


Presentation Transcript


  1. CMP238: Projeto e Teste de Sistemas VLSI. Marcelo Lubaszewski. Lecture 4 - Testing. PPGC - UFRGS, 2005/I

  2. Lecture 4 – Testability Measures and Test Pattern Generation
  • Testability
    • Purpose, origins
    • Analysis, measures and computation
    • Summary
  • Automatic test pattern generation
    • Structural vs. functional test
    • Definitions
    • Types of algorithms
    • Summary

  3. Purpose
  • Need approximate measure of:
    • Difficulty of setting internal circuit lines to 0 or 1 by setting primary circuit inputs
    • Difficulty of observing internal circuit lines by observing primary outputs
  • Uses:
    • Analysis of difficulty of testing internal circuit parts – redesign or add special test hardware
    • Guidance for algorithms computing test patterns – avoid using hard-to-control lines
    • Estimation of fault coverage
    • Estimation of test vector length

  4. Origins
  • Control theory
  • Rutman 1972 -- First definition of controllability
  • Goldstein 1979 -- SCOAP
    • First definition of observability
    • First elegant formulation
    • First efficient algorithm to compute controllability and observability
  • Parker & McCluskey 1975 -- Definition of probabilistic controllability
  • Brglez 1984 -- COP
    • First probabilistic measures
  • Seth, Pan & Agrawal 1985 -- PREDICT
    • First exact probabilistic measures

  5. Testability Analysis
  • Involves circuit topological analysis, but no test vectors and no search algorithm
  • Static analysis
  • Linear computational complexity
  • Otherwise, it is pointless – might as well use automatic test-pattern generation and calculate:
    • Exact fault coverage
    • Exact test vectors

  6. Types of Measures
  • SCOAP – Sandia Controllability and Observability Analysis Program
  • Combinational measures:
    • CC0 – Difficulty of setting circuit line to logic 0
    • CC1 – Difficulty of setting circuit line to logic 1
    • CO – Difficulty of observing a circuit line
  • Sequential measures – analogous:
    • SC0
    • SC1
    • SO

  7. Range of SCOAP Measures
  • Controllabilities – 1 (easiest) to infinity (hardest)
  • Observabilities – 0 (easiest) to infinity (hardest)
  • Combinational measures:
    • Roughly proportional to the number of circuit lines that must be set to control or observe a given line
  • Sequential measures:
    • Roughly proportional to the number of times a flip-flop must be clocked to control or observe a given line

  8. Goldstein's SCOAP Measures
  • AND gate O/P 0-controllability:
    output_controllability = min (input 0-controllabilities) + 1
  • AND gate O/P 1-controllability:
    output_controllability = Σ (input 1-controllabilities) + 1
  • XOR gate O/P controllability:
    output_controllability = min (controllabilities of each input set) + 1
  • Fanout stem observability:
    Σ or min (some or all fanout branch observabilities)
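
To make the gate equations concrete, here is a minimal Python sketch of these SCOAP rules; the function names and the explicit two-input XOR formula are illustrative assumptions, not taken from the slides.

```python
# Minimal sketch of Goldstein's SCOAP combinational rules. Function names
# are illustrative; controllabilities count from 1, observabilities from 0.

def and_cc0(input_cc0s):
    # AND output is 0 if ANY input is 0: take the cheapest input to control.
    return min(input_cc0s) + 1

def and_cc1(input_cc1s):
    # AND output is 1 only if ALL inputs are 1: sum all input costs.
    return sum(input_cc1s) + 1

def xor2_cc(cc0_a, cc1_a, cc0_b, cc1_b):
    # 2-input XOR: output 1 needs differing inputs, output 0 needs equal inputs;
    # take the cheapest input combination ("input set") in each case.
    cc1 = min(cc1_a + cc0_b, cc0_a + cc1_b) + 1
    cc0 = min(cc0_a + cc0_b, cc1_a + cc1_b) + 1
    return cc0, cc1

def and_input_co(output_co, other_input_cc1s):
    # To observe an AND input: observe the output and hold the other inputs at 1.
    return output_co + sum(other_input_cc1s) + 1

def stem_co(branch_cos):
    # Fanout stem: the min-branch variant used later in the computation slide.
    return min(branch_cos)
```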

  9. Controllability Examples

  10. More Controllability Examples

  11. Controllability Through Level 0. Circled numbers give the level number; line values are shown as (CC0, CC1).

  12. Controllability Through Level 2

  13. Final Combinational Controllability

  14. Observability Examples. To observe a gate input: observe the output and make the other input values non-controlling.

  15. More Observability Examples. To observe a fanout stem: observe it through the branch with the best observability.

  16. Combinational Observability for Level 1. The number in the square box is the level from the primary outputs (POs); line annotations are (CC0, CC1) CO.

  17. Combinational Observabilities for Level 2

  18. Final Combinational Observabilities

  19. Testability Computation
  1. For all PIs, CC0 = CC1 = 1 and SC0 = SC1 = 0
  2. For all other nodes, CC0 = CC1 = SC0 = SC1 = ∞
  3. Work from PIs to POs, using the CC and SC equations to get controllabilities; iterate on loops until SC stabilizes (convergence is guaranteed)
  4. For all POs, set CO = SO = 0
  5. Work from POs to PIs, using CO, SO and the controllabilities to get observabilities
  6. Fanout stem: (CO, SO) = min over the branches of (CO, SO)
  7. If a CC or SC (CO or SO) remains ∞, that node is uncontrollable (unobservable)
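
As an illustration of this two-pass procedure, the sketch below runs it on a purely combinational example, so the sequential terms (SC, SO) and the loop iteration drop out; the tiny netlist, its gate set and all names are assumptions made up for the example.

```python
# Hedged sketch of the two-pass SCOAP computation on a small combinational
# netlist (no flip-flops). Gates are listed in topological order PIs -> POs.
INF = float('inf')

netlist = [
    ('AND', 'd', ['a', 'b']),
    ('NOT', 'e', ['c']),
    ('OR',  'f', ['d', 'e']),
]
primary_inputs = ['a', 'b', 'c']
primary_outputs = ['f']

CC0 = {s: INF for g in netlist for s in [g[1], *g[2]]}
CC1 = dict(CC0)
CO = dict(CC0)

# Pass 1: controllabilities, from PIs toward POs
for pi in primary_inputs:
    CC0[pi] = CC1[pi] = 1
for typ, out, ins in netlist:
    if typ == 'AND':
        CC0[out] = min(CC0[i] for i in ins) + 1
        CC1[out] = sum(CC1[i] for i in ins) + 1
    elif typ == 'OR':
        CC1[out] = min(CC1[i] for i in ins) + 1
        CC0[out] = sum(CC0[i] for i in ins) + 1
    elif typ == 'NOT':
        CC0[out], CC1[out] = CC1[ins[0]] + 1, CC0[ins[0]] + 1

# Pass 2: observabilities, from POs back toward PIs
for po in primary_outputs:
    CO[po] = 0
for typ, out, ins in reversed(netlist):
    for i in ins:
        others = [j for j in ins if j != i]
        if typ == 'AND':   # hold the other inputs at their non-controlling value 1
            CO[i] = min(CO[i], CO[out] + sum(CC1[j] for j in others) + 1)
        elif typ == 'OR':  # hold the other inputs at 0
            CO[i] = min(CO[i], CO[out] + sum(CC0[j] for j in others) + 1)
        elif typ == 'NOT':
            CO[i] = min(CO[i], CO[out] + 1)

print(CC0, CC1, CO, sep='\n')
```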

  20. Summary
  • Testability approximately measures:
    • Difficulty of setting circuit lines to 0 or 1
    • Difficulty of observing internal circuit lines
  • Uses:
    • Analysis of the difficulty of testing internal circuit parts
    • Redesign circuit hardware or add special test hardware where measures show bad controllability or observability
    • Guidance for algorithms computing test patterns – avoid using hard-to-control lines
    • Estimation of fault coverage – 3-5% error
    • Estimation of test vector length

  21. Functional vs. Structural ATPG
  • Functional ATPG – generate a complete set of tests for circuit input-output combinations
  • 129 inputs, 65 outputs:
    • 2^129 = 680,564,733,841,876,926,926,749,214,863,536,422,912 patterns
    • Using a 1 GHz ATE, this would take 2.15 x 10^22 years

  22. Sum and Carry Circuits

  23. Functional vs. Structural (Cont'd)
  • Structural test:
    • No redundant adder hardware, 64 bit slices
    • Each with 27 faults (using fault equivalence)
    • At most 64 x 27 = 1728 faults (tests)
    • Takes 0.000001728 s on a 1 GHz ATE
  • Designer gives a small set of functional tests – augment with structural tests to boost coverage to 98+ %
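
A quick back-of-the-envelope check of the figures on this and the previous slide, assuming one pattern is applied per cycle of a 1 GHz ATE:

```python
# Verify the functional vs. structural numbers quoted in the slides.
patterns = 2 ** 129                       # exhaustive functional test set
seconds = patterns / 1e9                  # 10^9 patterns per second at 1 GHz
years = seconds / (3600 * 24 * 365)
print(f"{patterns} patterns, ~{years:.2e} years")            # ~2.16e+22 years

faults = 64 * 27                          # 64 bit slices x 27 collapsed faults
print(f"{faults} structural tests, {faults / 1e9:.9f} s")    # 1728 tests, 0.000001728 s
```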

  24. Definition of Automatic Test-Pattern Generator
  • Operations on digital hardware:
    • Inject a fault into the circuit modeled in the computer
    • Use various ways to activate and propagate the fault effect through the hardware to a circuit output
    • The output flips from the expected to the faulty signal
  • Test generation cost – fault-dependent or not
  • Quality of the generated test – fault coverage (fault simulation)
  • Test application cost – test time, memory requirements

  25. TG Types
  • Exhaustive – cheap generation, high FC, expensive application
  • Fault-oriented (deterministic) – expensive generation, possibly high FC, cheaper application; reduction of generation costs
  • Random (pseudo-random) – cheap generation, low FC, moderately expensive application

  26. Exhaustive Algorithm
  • For an n-input circuit, generate all 2^n input patterns
  • Infeasible, unless the circuit is partitioned into cones of logic with at most 15 inputs
  • Perform exhaustive ATPG for each cone
  • Misses faults that require specific activation patterns for multiple cones to be tested
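
A minimal sketch of the enumeration step for one cone, assuming the cone is small enough (roughly 15 inputs or fewer) for 2^n patterns to be tractable; the function name is illustrative.

```python
# Minimal sketch: enumerate all 2^n input patterns for one cone of logic.
from itertools import product

def exhaustive_patterns(num_inputs):
    # Yield every input combination as a tuple of 0/1 values.
    yield from product((0, 1), repeat=num_inputs)

# Example: the 2^3 = 8 patterns for a 3-input cone
for pattern in exhaustive_patterns(3):
    print(pattern)
```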

  27. Random-Pattern Generation • Flow chart for method • Use to get tests for 60-80% of faults, then switch to D-algorithm or other ATPG for rest
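
A hedged sketch of this random-pattern phase; since the slides do not name a fault simulator, one is passed in as a callable, and the coverage target and other names are illustrative assumptions.

```python
import random

def random_phase(num_inputs, fault_list, simulate, target_coverage=0.7, max_patterns=10_000):
    """Generate random patterns until roughly 60-80% fault coverage, then hand
    the remaining faults to a deterministic ATPG (e.g. the D-algorithm).
    `simulate(pattern)` must return the set of faults the pattern detects."""
    detected, tests = set(), []
    while len(detected) / len(fault_list) < target_coverage and len(tests) < max_patterns:
        pattern = tuple(random.randint(0, 1) for _ in range(num_inputs))
        newly_detected = simulate(pattern) - detected
        if newly_detected:                  # keep only patterns that detect something new
            tests.append(pattern)
            detected |= newly_detected
    remaining = set(fault_list) - detected  # targets for the deterministic phase
    return tests, remaining
```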

  28. Path Sensitization Method • Fault Sensitization (activation) • Fault Propagation • Line Justification

  29. Path Sensitization Method
  • Fault: l s-a-v
  • Activation: set l to v' (the complement of the stuck-at value)
  • Propagation: find a path from l to a primary output that keeps the faulty value
  • Justification: set the primary inputs so as to produce the line values required for activation and propagation

  30. Composite Logic Values
  • Consider the line value for the original AND the faulty circuit
  • v/vf = original/faulty
  • Symbols D and D' (Roth, 1966):
    • D = 1/0
    • D' = 0/1
    • 0 = 0/0
    • 1 = 1/1

  31. Operations on Composite Values • D' + 0 = 0/1 + 0/0 = 0/1 = D'
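
A small Python sketch of this composite-value algebra, applying each Boolean operator to the good and faulty values component-wise; the fifth value X (unknown) used by the full D-algorithm is omitted, and all names are illustrative.

```python
# Composite values as (good-circuit value, faulty-circuit value) pairs.
ZERO, ONE, D, D_BAR = (0, 0), (1, 1), (1, 0), (0, 1)   # D = 1/0, D' = 0/1

def comp_and(a, b):
    # Apply AND to the good and faulty circuits independently.
    return (a[0] & b[0], a[1] & b[1])

def comp_or(a, b):
    return (a[0] | b[0], a[1] | b[1])

def comp_not(a):
    return (1 - a[0], 1 - a[1])

# The slide's example: D' + 0 = 0/1 + 0/0 = 0/1 = D'
assert comp_or(D_BAR, ZERO) == D_BAR
# A D propagates through an AND gate when the other input is non-controlling (1)...
assert comp_and(D, ONE) == D
# ...and is blocked when the other input is controlling (0)
assert comp_and(D, ZERO) == ZERO
```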

  32. Path Sensitization Method • Propagation (figure: circuit annotated with composite values)

  33. Path Sensitization Method • Propagation: try path f – h – k – L (figure: composite values along the path)

  34. Path Sensitization Method • Propagation: try path f – h – k – L (figure: composite values along the path)

  35. Path Sensitization Method • Justification: try path f – h – k – L; blocked at j, since there is no way to justify the 1 on i (figure: conflicting value assignments)

  36. Path Sensitization Method • Justification: try path f – h – k – L; blocked at j, since there is no way to justify the 1 on i (figure: conflicting value assignments)

  37. Path Sensitization Method • Backtracking! (figure: assignments reset to X)

  38. Path Sensitization Method • Try another propagation path: g – i – j – k – L (figure: composite values along the new path)

  39. Path Sensitization Method • Try another propagation path: g – i – j – k – L (figure: composite values along the new path)

  40. Path Sensitization Method • Try another propagation path: g – i – j – k – L (figure: completed assignments along the new path)

  41. Major Combinational Automatic Test-Pattern Generation Algorithms
  • D-Algorithm (Roth) -- 1966
  • PODEM (Goel) -- 1981
  • FAN (Fujiwara and Shimono) -- 1983

  42. Sequential Circuit ATPG: Time-Frame Expansion • Problem of sequential circuit ATPG • Time-frame expansion

  43. Example of Sequential Circuit

  44. Sequential Circuits
  • A sequential circuit has memory in addition to combinational logic.
  • A test for a fault in a sequential circuit is a sequence of vectors, which:
    • Initializes the circuit to a known state
    • Activates the fault, and
    • Propagates the fault effect to a primary output
  • Methods of sequential circuit ATPG:
    • Time-frame expansion methods
    • Simulation-based methods

  45. Extended D-Algorithm
  1. Pick a target fault f.
  2. Create a copy of the combinational logic and call it time-frame 0.
  3. Generate a test for f using the D-algorithm in time-frame 0.
  4. When the fault effect is propagated to the DFFs, continue fault propagation in the next time-frame.
  5. When there are values required at the DFFs, continue the justification in the previous time-frame.
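
A high-level Python sketch of this control flow under strong assumptions: the combinational engine, the frame factory and the result fields are placeholders passed in by the caller, so only the step structure mirrors the list above.

```python
def extended_d_algorithm(fault, make_frame, d_algorithm, max_frames=8):
    """Time-frame expansion around a combinational D-algorithm engine (sketch).
    `make_frame()` returns a fresh copy of the combinational logic;
    `d_algorithm(frame, objective)` returns an object exposing
    .effects_at_dffs, .required_dff_values, and fills in frame.pi_vector."""
    frames = {0: make_frame()}                       # step 2: time-frame 0
    res0 = d_algorithm(frames[0], fault)             # step 3: combinational test for f

    # Step 4: fault effect captured by DFFs -> keep propagating in later frames
    t, effects = 0, res0.effects_at_dffs
    while effects and t < max_frames:
        t += 1
        frames[t] = make_frame()
        effects = d_algorithm(frames[t], effects).effects_at_dffs

    # Step 5: values required at DFFs -> justify them in earlier frames
    t, needs = 0, res0.required_dff_values
    while needs and -t < max_frames:
        t -= 1
        frames[t] = make_frame()
        needs = d_algorithm(frames[t], needs).required_dff_values

    # The test is the sequence of primary-input vectors, earliest frame first
    return [frames[k].pi_vector for k in sorted(frames)]
```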

  46. Example for Extended D-Algorithm

  47. Example: Step 1

  48. Example: Step 2

  49. Example: Step 3

  50. Summary
  • Hierarchical ATPG – 9 times speedup (Min)
    • Handles adders, comparators, MUXes
  • Advances over the D-algorithm
  • Results of 40 years of research – mature; methods include:
    • Path sensitization
    • Simulation-based
    • Boolean satisfiability and neural networks
    • Genetic algorithms
