
Page 1:

Raimund Ubar

Tallinn Technical University, Estonia

raiub@pld.ttu.ee, www.ttu.ee/~raiub/

Stockholm, May 19, 2003

Testing Strategies for NoC

Page 2:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure
• Conclusions

Page 3:

Introduction

• The reliability of electronic systems is no longer a topic limited to critical applications such as the military, aerospace and nuclear industries, where failures may have catastrophic consequences

• Electronic systems are becoming ubiquitous – their reliability issues are present in all types of consumer applications

• Adequate testing of electronic products is a must

Page 4:

Introduction

• The complexity of systems, new failure models and modern technologies create the need for more efficient test methods

• In the middle of the 1990s the core-based SoC concept evolved, bringing new strategies and standards dedicated to SoC test

• Today the design methodology is moving towards the NoC approach – the presence of a regular communication structure requires new dedicated methods to test it

Page 5:

Introduction

Dependability: Reliability, Security, Safety

Design for testability: Test, Diagnosis, BIST, Fault Diagnosis, Fault-Tolerance

"There is no security on the earth, there is only opportunity." – Douglas MacArthur (General)

Page 6:

Introduction – Test Tools

[Diagram: test tools flow. The system model is the input to the test tools – test generation, fault simulation (producing a fault table) and fault diagnosis. The generated test is applied to the system in a test experiment; the test result either gives a go/no-go verdict or, via fault diagnosis, a located defect.]

Page 7:

Introduction – Test Tasks

Fault Diagnosis and Test Generation as direct and reverse mathematical tasks:

dy = F(x1, ..., xn) ⊕ F(x1 ⊕ dx1, ..., xn ⊕ dxn)

dy = F(X, dX)

Direct task:

Test generation: dX, dy = 1 given, X = ?

Reverse task:

Fault diagnosis: X, dy given, dX = ?

Fault simulation: X, dy = 1 given, dxk = ?

Fault Simulation is a special case of fault diagnosis
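To make the direct/reverse framing concrete, here is a minimal sketch (not from the original slides) that brute-forces the relation dy = F(X) ⊕ F*(X) for a small, hypothetical example function and fault; solving dy = 1 for X is test generation, and evaluating dy for a given X is fault simulation.

from itertools import product

# Hypothetical example: F(x1, x2, x3) = x1*x2 + x3, with the fault "x3 stuck-at-0".
def F(x):
    x1, x2, x3 = x
    return (x1 and x2) or x3

def F_faulty(x):                      # the same function with x3 forced to 0
    x1, x2, _ = x
    return F((x1, x2, 0))

def dy(x):                            # dy = F(X) XOR F*(X): 1 = the fault is visible
    return int(bool(F(x)) != bool(F_faulty(x)))

# Test generation (solve dy = 1 for X) and fault simulation (evaluate dy for a given X):
tests = [x for x in product([0, 1], repeat=3) if dy(x) == 1]
print(tests)          # -> [(0, 0, 1), (0, 1, 1), (1, 0, 1)]
print(dy((1, 1, 0)))  # -> 0: the pattern 110 does not detect this fault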

Page 8:

Introduction – Fault Diagnosis

Fault localization by fault tables:

        F1 F2 F3 F4 F5 F6 F7 |  E1 E2 E3
  T1     0  1  1  0  0  0  0 |   0  0  1
  T2     1  0  0  1  0  0  0 |   0  1  0
  T3     1  1  0  1  0  1  0 |   0  1  0
  T4     0  1  0  0  1  0  0 |   1  0  1
  T5     0  0  1  0  1  1  0 |   1  0  1
  T6     0  0  1  0  0  1  1 |   0  0  0

Experiment E1: fault F5 located. Experiment E2: faults F1 and F4 are not distinguishable. Experiment E3: no match, diagnosis not possible.
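A hedged sketch of how such a fault table can be used as a fault dictionary in software; the column vectors below are taken from the table above.

# Fault table from the slide: one column (over tests T1..T6) per fault.
fault_table = {
    "F1": "011000", "F2": "101100", "F3": "100011", "F4": "011000",
    "F5": "000110", "F6": "001011", "F7": "000001",
}

def diagnose(outcome):
    """Return the faults whose detection column matches the observed
    pass/fail vector of the six test experiments."""
    return [f for f, column in fault_table.items() if column == outcome]

print(diagnose("000110"))  # E1 -> ['F5']          fault F5 located
print(diagnose("011000"))  # E2 -> ['F1', 'F4']    F1 and F4 not distinguishable
print(diagnose("100110"))  # E3 -> []              no match, diagnosis not possible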

Page 9:

Test as the Quality Problem

Quality policy: yield (Y), defect level, testing and design for testability are linked.

P – probability of a defect, n – number of defects

Y = (1 – P)^n – probability of producing a good product (yield)

DL = 1 – Y^(1 – T) – defect level, where T is the fault coverage of the test
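A small numeric sketch of these relations (the Williams–Brown model); the probability, defect count and coverage values plugged in are purely illustrative.

def production_yield(p, n):
    """Y = (1 - P)^n: probability that none of the n possible defects occurs."""
    return (1.0 - p) ** n

def defect_level(Y, T):
    """DL = 1 - Y^(1 - T): fraction of shipped parts that are still defective
    when the applied test has fault coverage T (0 <= T <= 1)."""
    return 1.0 - Y ** (1.0 - T)

Y = production_yield(p=1e-4, n=5000)          # illustrative numbers, Y ~ 0.61
for T in (0.90, 0.99, 0.999):
    print(T, round(defect_level(Y, T), 5))    # the defect level shrinks as coverage grows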

Page 10:

How Much to Test?

[Plot: cost versus test/quality level from 0% to 100%. The cost of testing rises with quality, the cost of the faults falls, and their sum – the cost of quality – has a minimum at the optimum test/quality point.]

How to succeed? Try hard! How to fail? Try too hard! (From American wisdom)

Conclusion:

"The problem of testing can only be contained, not solved." – T. Williams

Page 11:

How Much to Test?

Paradox: 2^64 input patterns (!) for a 32-bit accumulator will not be enough.

A short can change the circuit into a sequential one, and because of that you will need 2^65 input patterns.

Paradox: Mathematicians calculated that exhaustive testing of the Intel 8080 would need 37 (!) years. The manufacturer did it in 10 seconds. The majority of the functions will never be activated during the lifetime of the system.

Time can be your best friend or your worst enemy (Ray Charles)

[Circuit sketch: two AND gates with inputs x1, x2, x3 and output y. Fault-free: Y = F(x1, x2, x3). A bridging fault creates a feedback loop with state q, so that Y = F(x1, x2, x3, q).]

Page 12:

How to Generate a Good Test?

Paradox: To generate a test for a block in a system, the computer needed 2 days and 2 nights. An engineer did it by hand in 15 minutes. So, why computers?

The best place to start is with a good title. Then build a song around it. (Wisdom of country music)

[Figure: a 16-bit counter buried in a "sea of gates" system; exercising it functionally requires a sequence of 2^16 bits.]

Page 13:

Complexity vs. Quality

Problems:
• Traditional low-level test generation and fault simulation methods and tools for digital systems have lost their importance because of complexity reasons
• The traditional Stuck-at Fault (SAF) model does not guarantee the quality for deep-submicron technologies
• How to improve test quality at the increasing complexity of today's systems?

Two main trends:
– Defect-oriented test and
– High-level modelling

• Both trends are caused by the increasing complexities of systems based on deep-submicron technologies

Page 14:

Towards Solutions

• The complexity problems in testing digital systems are handled by raising the abstraction level from the gate level to the register-transfer level (RTL), instruction set architecture (ISA) or behavioral levels
– But this moves us even further away from the real life of defects (!)

• To handle defects in circuits implemented in deep-submicron technologies, new defect-oriented fault models and defect-oriented test methods should be used
– But this increases the complexity even more (!)

• A promising compromise and solution is to combine the hierarchical approach with defect orientation

Page 15:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 16:

Fault and defect modeling

Defects, errors and faults:
• An instance of incorrect operation of the system being tested is referred to as an error
• The causes of the observed errors may be design errors or physical faults – defects
• Physical faults do not allow a direct mathematical treatment of testing and diagnosis
• The solution is to deal with fault models

[Diagram: a defect in a component of the system is modeled as a fault; its manifestation at the system level is an error.]

Page 17:

Transistor Level Defects

Transistor-level defects:
• Stuck-at-1, Stuck-at-0
• Broken (change of the function)
• Bridging
• Stuck-open (new state)
• Stuck-on (change of the function)
• Stuck-off (change of the function)
• Short (change of the function)

The SAF model is not able to cover all the transistor-level defects.

How to model transistor defects?

Page 18:

Mapping Transistor Defects to Logic Level

[Circuit: a complex CMOS gate with inputs x1 ... x5 and output y; a short connects two internal nodes of the gate.]

Function: y = x1x2x3 ∨ x4x5

Faulty function: yd = (x1 ∨ x4)(x2x3 ∨ x5)

Generic function with defect: y* = ¬d·y ∨ d·yd

Defect variable d: d = 0 – defect d is missing, d = 1 – defect d is present

A transistor fault causes a change in the logic function that is not representable by the SAF model.

Mapping the physical defect onto the logic level is done by solving the equation ∂y*/∂d = 1.

Page 19:

Mapping Transistor Faults to Logic Level

[Circuit: the same complex gate with inputs x1 ... x5, output y and an internal short.]

Function: y = x1x2x3 ∨ x4x5

Faulty function: yd = (x1 ∨ x4)(x2x3 ∨ x5)

Generic function with defect: y* = ¬d·y ∨ d·yd

Test calculation by the Boolean derivative:

∂y*/∂d = y ⊕ yd = (x1x2x3 ∨ x4x5) ⊕ ((x1 ∨ x4)(x2x3 ∨ x5)) = 1

Every solution of this equation is a test pattern that detects the defect d.
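For a small component the equation ∂y*/∂d = y ⊕ yd = 1 can be solved by plain enumeration. The sketch below does that for the example above; note that the two functions are the forms reconstructed here and should be treated as an assumed example, not as the authoritative original.

from itertools import product

def y(x1, x2, x3, x4, x5):            # fault-free function (reconstructed form)
    return (x1 and x2 and x3) or (x4 and x5)

def y_d(x1, x2, x3, x4, x5):          # function of the gate with the short present
    return (x1 or x4) and ((x2 and x3) or x5)

# Every input vector with dy*/dd = y XOR yd = 1 is a test pattern for the defect.
tests = [v for v in product([0, 1], repeat=5)
         if bool(y(*v)) != bool(y_d(*v))]
print(len(tests), "patterns detect the short, e.g.", tests[:3])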

Page 20:

Why Boolean Derivatives?

Given: y = x1x2x3 ∨ x4x5, yd = (x1 ∨ x4)(x2x3 ∨ x5), and the generic function with defect y* = ¬d·y ∨ d·yd

Distinguishing function:

∂y*/∂d = y ⊕ yd = (x1x2x3 ∨ x4x5) ⊕ ((x1 ∨ x4)(x2x3 ∨ x5)) = 1

BD-based approach: using the properties of Boolean derivatives, the procedure of solving this equation becomes easier.

Page 21:

Functional Fault vs. Stuck-at Fault

A full 100% stuck-at-fault test is not able to detect the short:

Function: y = x1x2x3 ∨ x4x5        Functional fault: yd = (x1 ∨ x4)(x2x3 ∨ x5)

 No | Full SAF-test (x1 x2 x3 x4 x5) | Test for the defect (x1 x2 x3 x4 x5)
  1 |  1 1 1 0 -                     |  1 0 - 0 1
  2 |  0 - - 1 1                     |  1 - 0 0 1
  3 |  0 1 1 0 1                     |  0 1 1 1 0
  4 |  1 0 1 1 0                     |
  5 |  1 1 0 0 -                     |

The full SAF test does not cover any of the patterns able to detect the given transistor defect.

Page 22:

Defect coverage for 100% Stuck-at Test

Results:

• the difference between stuck-at fault and physical defect coverages reduces when the complexity of the circuit increases (C2 is more complex than C1)

• the difference between stuck-at fault and physical defect coverages is higher when the defect probabilities are taken into account compared to the traditional method where all faults are assumed to have the same probability

 Circuit | Probabilistic defect coverage, % (Tmin / Tmax) | Denumerable defect coverage, % (Tmin / Tmax)
 C1      | 66,68 / 72,01                                  | 81,00 / 83,00
 C2      | 70,99 / 77,05                                  | 84,29 / 84,76

Page 23:

Generalization: Functional Fault Model

Constraints calculation:

[Figure: a component F(x1, x2, ..., xn) with output y and a defect d inside it, characterized by the logical constraints W^d.]

Component with defect: y* = F*(x1, x2, ..., xn, d) = ¬d·F ∨ d·F^d   (fault-free F / faulty F^d)

Logical constraints: W^d : ∂y*/∂d = 1, where d = 1 if the defect is present

Fault model: (dy, W^d), or with several alternative constraints (dy, {W^d_k})

Page 24:

Functional Fault Model for Stuck-ON

Stuck-on:

[Circuit: a CMOS NOR gate with inputs x1, x2, output Y, supply VDD and ground VSS; one pMOS transistor is stuck-on.]

For the input "10" a conducting path exists through both the pull-up and pull-down networks, and the output voltage is set by the voltage divider

V_Y = V_DD · R_N / (R_N + R_P)

 x1 x2 |  y | yd
  0  0 |  1 |  1
  0  1 |  0 |  0
  1  0 |  0 |  Z: V_Y / I_DDQ
  1  1 |  0 |  0

y* = ¬d·(¬x1¬x2) ∨ d·(¬x1¬x2 ∨ x1¬x2·Z)

Condition of potentially detecting the fault: W^d : ∂y*/∂d = x1·¬x2·Z = 1

(apply the pattern "10" and observe the intermediate output voltage V_Y or measure the quiescent current I_DDQ)

Page 25:

Functional Fault Model for Stuck-Open

Stuck-off (open):

[Circuit: a CMOS NOR gate with inputs x1, x2 and output Y; one nMOS transistor is stuck-open.]

For the input "10" there is no conducting path to the output, so the output keeps its previous value – the defect introduces a new (memory) state, and a test sequence is needed: 00, 10.

 x1 x2 |  y | yd
  0  0 |  1 |  1
  0  1 |  0 |  0
  1  0 |  0 |  y'
  1  1 |  0 |  0

y* = ¬d·(¬x1¬x2) ∨ d·(¬x1¬x2 ∨ x1¬x2·y')

Condition of detecting the fault: W^d : ∂y*/∂d = x1·¬x2·y' = 1   (y' is the previous value of the output)

Test sequence:
 t | x1 x2 | y
 1 |  0  0 | 1
 2 |  1  0 | 1   (the faulty output keeps 1, the fault-free value is 0)

Page 26:

Functional Fault Model for Shorts

Example: bridging fault between leads xk and xl (wired-AND model):

[Figure: leads xk and xl are bridged by the defect d; on lead xk the faulty value is x*k = f(xk, xl, d).]

x*k = ¬d·xk ∨ d·(xk·xl)

W^d : ∂x*k/∂d = xk ⊕ xk·xl = xk·¬xl = 1

The condition means that in order to detect the short between leads xk and xl on the lead xk we have to assign to xk the value 1 and to xl the value 0.

Page 27:

Functional Fault Model for Sequential Shorts

Example: a short between leads xk and xl changes the combinational circuit into a sequential one.

[Figure: a network of AND gates with inputs x1, x2, x3 and output y; the bridging fault creates a feedback loop, and the equivalent faulty circuit contains an extra gate fed by the previous output value y'.]

The generic function y* = ¬d·y ∨ d·yd now depends on the previous output value y', so the constraints W^d : ∂y*/∂d = 1 are sequential and a two-pattern test is needed:

 t | x1 x2 x3 | y
 1 |  0  1  0 |
 2 |  1  1  1 | 1

Page 28:

First Step to Quality

How to improve the test quality at the increasing complexity of systems?

First step to the solution: the functional fault model was introduced as a means for mapping physical defects from the transistor or layout level to the logic level.

[Diagram: a bridging fault in a low-level component is mapped through the constraints W^F_k and W^S_k (component and environment) onto a functional fault of the component at the high level of the system.]

Page 29:

Faults and Test Generation Hierarchy

[Diagram: fault and test generation hierarchy. The levels gate – circuit (network of gates) – module – system (network of modules) are linked by the functional and structural approaches; the tests F_ki, F_k, F generated at each level come with the constraints W^d_ki, W^F_ki, W^S_ki, W^F_k, W^S_k between a component at a lower level and its environment at the higher level.]

Interpretation of W^F_k:
– as a test on the lower level
– as a functional fault on the higher level

Page 30:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 31:

Hierarchical Test Generation

• In high-level symbolic test generation the test properties of components are often described in the form of fault-propagation modes

• These modes will usually contain:
– a list of control signals such that the data on the input lines is reproduced without logic transformation at the output lines – an I-path, or
– a list of control signals that provide a one-to-one mapping between data inputs and data outputs – an F-path

• The I-paths and F-paths constitute connections that can be used to propagate test vectors from input ports (or any controllable points) to the inputs of the Module Under Test (MUT) and to propagate the test response to an output port (or any observable points)

• In the hierarchical approach, top-down and bottom-up strategies can be distinguished

Page 32:

Hierarchical Test Generation Approaches

Bottom-up approach:
• Pre-calculated tests for components generated at the low level are assembled at a higher level
• It fits well with the uniform hierarchical approach to test, which covers both component testing and communication network testing
• However, the bottom-up algorithms ignore the incompleteness problem
• The constraints imposed by other modules and/or the network structure may prevent the local test solutions from being assembled into a global test
• The approach would work well only if the corresponding testability demands were fulfilled

[Figure: a module inside a system with ports A, B, C, D. Local test: A = a.x, B = f'(D), C = c.x, with a, c, D fixed and x free.]

Page 33:

Hierarchical Test Generation Approaches

• Top-down approach – solves the test generation problem by deriving environmental constraints for low-level solutions
• This method is more flexible, since it does not narrow the search for the global test solution to pregenerated patterns for the system modules
• The method is of little use when the system is still under development in a top-down fashion, or when "canned" local tests for modules or cores have to be applied

Top-down approach:

[Figure: the same module inside a system; symbolic global test: A = a'.x, D' = d'.x, C = c'.x, with a', c', D' fixed and x free.]

Page 34:

Basics of Theory for Test and Diagnostics

Two basic tasks:

1. Which test patterns are needed to detect a fault (or all faults)?
2. Which faults are detected by a given test (or by all tests)?

[Diagram: from the gate level through multipliers and ALUs up to the system level. Boolean differential algebra works only at the logic level; Decision Diagrams work at the logic level and at higher levels.]

Page 35:

Two trends:
• high-level modeling – to cope with complexity
• low-level modeling – to cope with physical defects, to reach higher accuracy

Hierarchical Diagnostic Modeling

[Diagram: physical defect analysis at the defect / complex-gate level (Boolean differential algebra, BDDs), gate-level fault analysis and simulation at the module level, and high-level fault analysis and simulation at the system level (high-level DDs); a functional fault activated at the low level is detected as a fault at the high level.]

Page 36:

Binary Decision Diagrams

[Figure: a functional BDD with nodes labelled x1 ... x7 and terminal nodes 0 and 1, representing a function y(x1, ..., x7).]

Simulation: tracing the path determined by an input vector leads to a terminal node (0 or 1), giving the simulated value of y; for the example vector on the slide, y = 1.

Boolean derivative and test generation: the derivative ∂y/∂xk and the corresponding test patterns are obtained by activating paths through the node xk in the same diagram.

Page 37:

Low-Level Test Generation on SSBDDs

Test generation for a bridging fault – a bridge between leads 73 and 6, represented as the functional fault (dx7, W^d):

[Figure: a macro built of AND gates with inputs 1 ... 7 (input 7 branching into leads 71, 72, 73) and output y, its SSBDD, and the network containing the defect W^d.]

1. Solve the constraint W^d: x6 = 0, x7 = 1
2. Activate a path through node 71 in the SSBDD: path to 71: x1 = 1, x2 = 1; path from 71: x5 = 0

Test pattern:
 1 2 3 4 5 6 7 | y
 1 1 - - 0 0 1 | 1

Page 38:

Test Generation on High Level DDs

[Figure: an RTL data path with input IN, registers R1 and R2, an adder (+) and a multiplier (*), multiplexers M1, M2, M3 and control signals y1, y2, y3, y4, together with the high-level decision diagram for R2. The terminal nodes of the DD are R1 + R2, IN + R2, R1 * R2, IN * R2, R1, IN and R2.]

High-level test generation with DDs: conformity test – the control function y3 is tested by activating multiple paths in a single DD.

Test program:
– Control: for D = 0, 1, 2, 3 apply y1 y2 y3 y4 = 0 0 D 2
– Data: a solution where the values of R1 + R2, IN, R1 and R1 * R2 are all different

Page 39:

Hierarchical Test Generation on DDs

[Figure: the same RTL data path and decision diagram for R2 as on the previous slide.]

Hierarchical test generation with DDs: scanning test – the data function R1 * R2 is tested by activating a single path in the DD.

Test program:
– Control: y1 y2 y3 y4 = x 0 3 2
– Data: all specified pairs of (R1, R2) – the low-level test data

Page 40:

Test Generation for RTL Cores

[Figure: an RTL core with inputs A, B, C, registers R1, R2, R3, an operation unit F, output Y and control signals y1, y2, y3, together with its decision-diagram model.]

Transparency functions on Decision Diagrams (high-level path activation on the DDs):
– Y = C when y3 = 2 and R'3 = 0: C is to be tested (propagated to the output)
– R1 = B when y1 = 2 and R'3 = 0: R1 is to be justified (loaded from the input B)

Page 41:

Test Generation for RTL Cores

High-level test generation example – symbolic test sequence:

[Table: four time steps t = 1 ... 4 listing q', the control signals y1 y2 y3, the data inputs A, B, C, the registers R1, R2, R3 and the output Y. The justified data values D1 and D2 are applied to A and B, the fault effect D appears in R3 and C, and it is propagated to Y at step 4 (q' = 4).]

[Time diagram: the fault is manifested at time t; the constraints (y3 = 2, R'2 = 0, y2 = 0, y1 = 2, R'3 = 0, A = D1, B = D2, C = D) are justified backwards over the time frames t-1, t-2, t-3, and the fault effect is propagated to the output.]

Page 42:

Test Generation for Processor Cores

High-Level DDs for a microprocessor (example). Instruction set:

I1:  MVI A,D   A ← IN
I2:  MOV R,A   R ← A
I3:  MOV M,R   OUT ← R
I4:  MOV M,A   OUT ← A
I5:  MOV R,M   R ← IN
I6:  MOV A,M   A ← IN
I7:  ADD R     A ← A + R
I8:  ORA R     A ← A ∨ R
I9:  ANA R     A ← A ∧ R
I10: CMA       A ← ¬A

DD model of the microprocessor:
– OUT: for I3, OUT ← R; for I4, OUT ← A
– R: for I2, R ← A; for I5, R ← IN; for I1, I3, I4, I6–I10, R keeps its value
– A: for I1, I6, A ← IN; for I2–I5, A keeps its value; for I7, A ← A + R; for I8, A ← A ∨ R; for I9, A ← A ∧ R; for I10, A ← ¬A

Page 43:

Test Generation for Processor Cores

High-Level DD-based structure of the microprocessor (example):

[Figure: the DD model of the previous slide drawn as a structure – the instruction variable I selects, for each of the variables OUT, R and A, which of the source expressions (IN, A, R, A + R, A ∨ R, A ∧ R, ¬A) is assigned.]

Page 44:

Test Generation for Processor Cores

[The DD model of the microprocessor as on the previous slides.]

Scanning test program for the adder:

Instruction sequence T = I5 (loads R), I1 (loads A), I7, I4, for all needed pairs of (A, R).

[Time diagram: at t-3 the value IN(1) is loaded into R by I5 (Load), at t-2 the value IN(2) is loaded into A by I1 (Load), at t-1 the instruction under test I7 computes A ← A + R (Test), and at t the result is driven to OUT by I4 (Observation).]

Page 45:

Test Generation for Processor Cores

[The DD model of the microprocessor as on the previous slides.]

Conformity test program for the decoder:

Instruction sequence T = I5, I1, D, I4 for all D ∈ {I1, ..., I10}, at given A, R, IN.

Data generation: IN = 110, A = 101, R = 110, giving the function values
– I1, I6: IN = 110
– I2, I3, I4, I5: A = 101
– I7: A + R = 1011
– I8: A ∨ R = 111
– I9: A ∧ R = 100
– I10: ¬A = 010

The data IN, A, R are generated so that the values of all the functions are different.

Page 46:

DECIDER: Hierarchical ATPG

Modules or subcircuits are represented as word-level DD structures.

[Tool flow of the hierarchical ATPG: the RTL model (VHDL) and the FU library (VHDL) are synthesized – via logic synthesis scripts and Design Compiler (Synopsys Inc.) – into gate-level descriptions; SSBDD synthesis turns these into SSBDD models of the FUs (the FU library as DDs), while RTL DD synthesis produces the RTL DD model; the hierarchical ATPG uses both models to generate the test patterns.]

Page 47:

ATPG: Experimental Results

Reference ATPGs:
– HITEC – T.M. Niermann, J.H. Patel, EDAC, 1991
– GATEST – E.M. Rudnick et al., DAC, 1994

TTU:
– DET/RAND – hierarchical deterministic-random ATPG
– GENETIC – gate-level ATPG based on genetic algorithms

 Circuit | Gates | Faults | States | HITEC %FC / time,s | GATEST %FC / time,s | DET/RAND %FC / time,s | GENETIC %FC / time,s
 gcd     |   227 |    844 |      8 | 89.3 / 196         | 92.2 / 90           | 92.2 / 3.4            | 93.0 / 702
 Mult    |  1058 |   3915 |      8 | 63.5 / 2487        | 77.3 / 3027         | 79.4 / 13.6           | 80.5 / 19886
 Diffeq  |  4195 |  15386 |      6 | 95.1 / >4h         | 96.0 / 4280         | 96.0 / 80.0           | 97.9 / 53540
 Huffm   |  2100 |   2816 |     21 | 12.5 / 16200       | 27.6 / 3553         | 12.5 / 8460           | 52.8 / >10h

Page 48:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 49:

Built-In Self-Test

• Motivations for BIST:
– Need for cost-efficient testing
– Doubts about the stuck-at fault model
– Increasing difficulties with TPG (Test Pattern Generation)
– Growing volume of test pattern data
– Cost of ATE (Automatic Test Equipment)
– Test application time
– Gap between tester and UUT (Unit Under Test) speeds

• Drawbacks of BIST:
– Additional pins and silicon area needed
– Decreased reliability due to increased silicon area
– Performance impact due to additional circuitry
– Additional design time and cost

Page 50:

BIST Techniques

• BIST techniques are classified as:
– on-line BIST – includes concurrent and nonconcurrent techniques
– off-line BIST – includes functional and structural approaches

• On-line BIST – testing occurs during normal functional operation
– Concurrent on-line BIST – testing occurs simultaneously with the normal operation mode; usually coding techniques or duplication and comparison are used
– Nonconcurrent on-line BIST – testing is carried out while the system is in an idle state, often by executing diagnostic software or firmware routines

• Off-line BIST – the system is not in its normal working mode; usually on-chip test generators and output response analyzers or microdiagnostic routines are used
– Functional off-line BIST is based on a functional description of the Component Under Test (CUT) and uses functional high-level fault models
– Structural off-line BIST is based on the structure of the CUT and uses structural fault models (e.g. SAF)

Page 51:

Built-In Self-Test

System-on-Chip testing:

[Figure: a SoC containing CPU, SRAM, ROM, DRAM, MPEG, UDL and peripheral component interconnect blocks; the core under test sits inside a wrapper, and test access mechanisms connect the test pattern source to the core and the core to the response sink.]

Test architecture components:
• Test pattern source & sink
• Test Access Mechanism
• Core test wrapper

Solutions:
• Off-chip solution – need for external ATE
• Combined solution – mostly on-chip, ATE needed for control
• On-chip solution – BIST

Page 52:

Built-In Self-Test

Embedded tester for testing multiple cores:

[Figure: a SoC with cores C880, C1355, C1908, C2670 and C3540, each equipped with BIST; the embedded tester (tester memory and test controller) drives them through the test access mechanism.]

Page 53:

Built-In Self-Test

[Figure: the BIST control unit, the test pattern generator (TPG) and the test response analyzer (TRA) around the circuitry under test (CUT).]

• BIST components:
– Test pattern generator (TPG)
– Test response analyzer (TRA)

• TPG & TRA are usually implemented as linear feedback shift registers (LFSR)

• Two widespread schemes:
– test-per-scan
– test-per-clock

Page 54:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 55:

LFSR: Pseudorandom Test Generation

[Figure: an LFSR generating test patterns for the CUT and an LFSR compacting its responses; the LFSR stages X0, X1, ..., Xn are connected through the feedback coefficients h0, h1, ..., hn.]

• Uses special LFSR registers

• Several proposals:
– BILBO
– CSTP

• Main characteristics of an LFSR:
– polynomial
– initial state
– test length

Page 56:

Pseudorandom Test Generation

LFSR – Linear Feedback Shift Register

Polynomial: P(x) = 1 + x^3 + x^4

[Figures: the standard (external-XOR) LFSR and the modular (internal-XOR) LFSR implementing this polynomial, with stages corresponding to 1, x, x^2, x^3, x^4.]
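A minimal software model of a maximal-length 4-bit LFSR for P(x) = 1 + x^3 + x^4 (standard/Fibonacci form); this is a sketch for illustration, not a description of the exact register drawn on the slide.

def lfsr_states(seed=0b0001, length=15):
    """4-bit maximal-length LFSR for P(x) = 1 + x^3 + x^4.
    Feedback: new bit = bit0 XOR bit3, i.e. a_t = a_(t-1) + a_(t-4) over GF(2)."""
    state, states = seed, []
    for _ in range(length):
        states.append(state)
        feedback = (state ^ (state >> 3)) & 1     # XOR of the two tapped stages
        state = ((state << 1) | feedback) & 0xF   # shift left, insert feedback bit
    return states

print([format(s, "04b") for s in lfsr_states()])  # all 15 non-zero states, then repeats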

Page 57:

Pseudorandom Test Length

[Plot: fault coverage versus time for pseudorandom testing – the coverage rises quickly at the beginning and then saturates.]

Problems:
• Very long test application time
• Low fault coverage
• Area overhead
• Additional delay

Possible solutions:
• Combining pseudorandom test with deterministic test
– Multiple seeds
– Bit flipping
• Hybrid BIST

The main motivations for using random patterns are:
– low generation cost
– high initial efficiency

Page 58: Technical University Tallinn, ESTONIA 1 Raimund Ubar Tallinn Technical University Estonia raiub@pld.ttu.ee  raiub/ Stockholm, May 19, 2003 Testing

Technical University Tallinn, ESTONIA58

BIST: Weighted pseudorandom test

Hardware implementation of the weight generator:

[Figure: LFSR outputs are combined by AND gates into signals with weights 1/2, 1/4, 1/8 and 1/16; a multiplexer controlled by the weight-select lines picks the desired weighted value and feeds it to Scan-IN.]
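A behavioral sketch of the same idea in software: ANDing 1–4 equiprobable bits yields '1'-probabilities of 1/2, 1/4, 1/8 and 1/16, and a multiplexer (here a dictionary lookup) selects the desired weight. The helper names are illustrative only.

import random

WEIGHTS = {0: 1, 1: 2, 2: 3, 3: 4}      # weight select -> number of ANDed bits

def weighted_bit(weight_select):
    """AND together 1..4 equiprobable bits (stand-ins for LFSR stages),
    giving a 1 with probability 1/2, 1/4, 1/8 or 1/16."""
    out = 1
    for _ in range(WEIGHTS[weight_select]):
        out &= random.getrandbits(1)
    return out

for sel, expected in enumerate((0.5, 0.25, 0.125, 0.0625)):
    ones = sum(weighted_bit(sel) for _ in range(100000)) / 100000
    print(sel, expected, round(ones, 3))          # measured weight tracks the target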

Page 59:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 60:

BIST: Response Compression

1. Parity checking: P(R) = r1 ⊕ r2 ⊕ ... ⊕ rm – the parity of the whole response stream, accumulated bit by bit as Pi = Pi−1 ⊕ ri

2. Ones counting: P(R) = Σ ri – the number of 1s in the response stream, accumulated by a counter

3. Zeros counting: P(R) = Σ (1 − ri) – the number of 0s in the response stream

[Figure: the UUT receives the test T and its response bits ri are fed into the compactor – an XOR accumulator or a counter.]
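A compact software sketch of the three compaction schemes, applied to an arbitrary (made-up) response bit stream.

def parity_signature(response):
    p = 0
    for r in response:            # P(R) = r1 XOR r2 XOR ... XOR rm
        p ^= r
    return p

def ones_count(response):
    return sum(response)          # number of 1s in the response stream

def zeros_count(response):
    return len(response) - sum(response)

response = [1, 0, 1, 1, 0, 0, 1, 0]           # made-up UUT response stream
print(parity_signature(response), ones_count(response), zeros_count(response))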

Page 61:

Signature Analyser

Polynomial: P(x) = 1 + x^3 + x^4

[Figures: the standard and the modular LFSR used as a signature analyser; the response string of the UUT is shifted into the LFSR.]

The response is compacted by the LFSR. The content of the LFSR after the test is called the signature.

Page 62:

Signature Analysis

In signature testing we mean the use of CRC encoding as the data compressor G(x) and the use of the remainder R(x) as the signature of the test response string P(x) from the UUT.

The signature is the CRC code word:

P(x) / G(x) = Q(x) + R(x) / G(x)

Example: P(x) = x^7 + x^3 + x,  G(x) = x^5 + x^3 + x + 1

  1 0 0 0 1 0 1 0 : 1 0 1 0 1 1 = 1 0 1        Q(x) = x^2 + 1
  1 0 1 0 1 1
  ---------------
  0 0 1 0 0 1 1 0
      1 0 1 0 1 1
  ---------------
  0 0 0 0 1 1 0 1                               R(x) = x^3 + x^2 + 1  (the signature)

Page 63:

Signature Analysis

P(x) = x^7 + x^3 + x,  G(x) = x^5 + x^3 + x + 1,  Q(x) = x^2 + 1,  R(x) = x^3 + x^2 + 1 – the signature (see the previous slide)

The division process can be mechanized using an LFSR:
• the divisor polynomial G(x) is defined by the feedback connections
• each shift creates x^5, which is replaced by x^5 = x^3 + x + 1 (since G(x) = 0)

[Figure: a 5-stage LFSR with cells x^0, x^1, x^2, x^3, x^4 and serial input IN; the response string (IN: 01010001) is shifted into the LFSR, which afterwards holds the remainder R(x).]
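The polynomial division that the LFSR mechanizes can be reproduced in a few lines of software; the sketch below recomputes the slide's example and is only an illustration of the arithmetic, not of the hardware.

def crc_remainder(p_bits, g_bits):
    """Division of P(x) by G(x) over GF(2); both arguments are coefficient
    strings with the highest-degree bit first. Returns the remainder R(x)."""
    p, g = int(p_bits, 2), int(g_bits, 2)
    shift = p.bit_length() - g.bit_length()
    while shift >= 0:
        if (p >> (shift + g.bit_length() - 1)) & 1:   # leading coefficient is 1
            p ^= g << shift                           # subtract (XOR) G(x) * x^shift
        shift -= 1
    return format(p, "0{}b".format(len(g_bits) - 1))

# Slide example: P(x) = x^7 + x^3 + x, G(x) = x^5 + x^3 + x + 1
print(crc_remainder("10001010", "101011"))   # -> 01101, i.e. R(x) = x^3 + x^2 + 1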

Page 64:

Signature Analysis

Aliasing:

[Figure: the response string of length L from the UUT is compacted by a signature analyzer (SA) with N stages, N << L; the 2^L possible responses are mapped onto only 2^N possible signatures, so a faulty response may map onto the same signature as the correct one.]

L – test length, N – number of stages in the signature analyzer

Page 65:

Signature Analysis

Aliasing:

L – test length, N – number of stages in the signature analyzer
2^L – number of different possible responses

No aliasing is possible for those strings with L − N leading zeros, since they are represented by polynomials of degree N − 1 that are not divisible by the characteristic polynomial of the LFSR. There are 2^(L−N) such strings.

Probability of aliasing:

P = (2^(L−N) − 1) / (2^L − 1) ≈ 2^(−N)   for L >> 1

Page 66:

LFSR: Signature Analyser

[Figure: a 4-stage LFSR (flip-flops FF with taps 1, x, x^2, x^3, x^4) connected to the UUT; the register produces the test patterns when generating tests and holds the signature when analyzing the test responses.]

Page 67:

LFSR: Signature Analyser

Parallel Signature Analyzer:

[Figures: a serial signature analyzer, where the UUT response enters the modular LFSR at a single point, and a parallel (multiple-input) signature analyzer, where several UUT outputs are XORed into different stages of the same LFSR.]

Page 68:

Signature Analysis

Signature calculation for multiple outputs:

[Figure: an LFSR test pattern generator drives the combinational circuit; the circuit outputs are fed – directly or through a multiplexer – into the LFSR signature analyzer.]

Technical University Tallinn, ESTONIA69

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 70:

BIST Components

General architecture of BIST:

[Figure: the BIST control unit, the test pattern generator (TPG) and the test response analyzer (TRA) around the circuitry under test (CUT).]

• BIST components:
– Test pattern generator (TPG)
– Test response analyzer (TRA)
– BIST controller

• A part of the system (the hardcore) must be operational to execute a self-test

• At minimum the hardcore usually includes power, ground and clock circuitry

• The hardcore should be tested by external test equipment, or it should be designed self-testable by using various forms of redundancy

Page 71:

BIST: Test per Scan

[Figure: the test pattern generator feeds several scan paths through the CUT; the test response analyzer collects the responses under BIST control.]

• Assumes an existing scan architecture

• Drawback:
– Long test application time

Initial test set: T1: 1100, T2: 1010, T3: 0101, T4: 1001

Test application: 1100 T 1010 T 0101 T 1001 T (each pattern is shifted in and then applied with one capture clock T)

Number of clocks = 4 x 4 + 4 = 20

Page 72:

BIST: Test per Clock

• Initial test set: T1: 1100, T2: 1010, T3: 0101, T4: 1001

• Test application: a new pattern is applied on every clock; the four patterns are embedded, in the order T1, T4, T3, T2, in one 8-clock sequence

• Number of clocks = 8

[Figure: the combinational circuit under test fed by the scan-path register, which is updated every clock.]
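The clock counts quoted on this and the previous slide follow directly from the two schemes; the sketch below merely reproduces the arithmetic, assuming a 4-bit scan chain and the four patterns listed above.

def test_per_scan_clocks(n_patterns, scan_len):
    # each pattern is shifted into the scan chain (scan_len clocks) and then
    # applied/captured with one more clock: 4 patterns -> 4 x 4 + 4 = 20 clocks
    return n_patterns * (scan_len + 1)

print(test_per_scan_clocks(4, 4))   # -> 20
# In test-per-clock mode a new pattern is applied on every clock, so the same
# four 4-bit patterns fit into the 8-clock register sequence shown above.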

Page 73:

BIST Architectures

Test per clock:

BILBO – Built-In Logic Block Observer:
[Figure: an LFSR test pattern generator drives the combinational circuit and a second LFSR collects the signature.]

CSTP – Circular Self-Test Path:
[Figure: one register works simultaneously as test pattern generator and signature analyser, closed into a loop around the combinational circuit.]

Page 74:

BILBO

Working modes:

 B1 B2 | mode
  0  0 | Reset
  0  1 | Normal mode
  1  0 | Scan mode
  1  1 | Test mode

Testing modes:
– CC1: LFSR 1 – TPG, LFSR 2 – SA
– CC2: LFSR 2 – TPG, LFSR 1 – SA

[Figure: two BILBO registers, LFSR 1 and LFSR 2, surround the combinational blocks CC1 and CC2 and are controlled by B1, B2.]

Page 75:

Circular Self-Test

[Figure: circular self-test – the flip-flops attached to the circuit under test are chained into a ring that both supplies the test stimuli and compacts the responses.]

Page 76:

Circular Self-Test Path

[Figure: a circular self-test path threading several CSTP register segments and combinational blocks (CC); registers (R) outside the path are not part of the loop.]

Page 77:

BIST Architectures

STUMPS – Self-Testing Unit using MISR and Parallel Shift register sequence generator:
[Figure: the test pattern generator feeds parallel scan chains R1 ... Rn through the combinational blocks CC1 ... CCn; the responses are compacted by a MISR.]

LOCST – LSSD On-Chip Self-Test:
[Figure: an on-chip TPG and SA are connected through boundary-scan cells (BS) and a scan path through the CUT; a test controller with serial input/output (SI, SO) reports the error indication.]

Page 78:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 79:

Store-and-Generate test architecture

• The ROM contains test patterns for hard-to-test faults
• Each pattern Pk in the ROM serves as an initial state (seed) of the LFSR for test pattern generation (TPG)
• Counter 1 counts the number of pseudorandom patterns generated starting from Pk
• After this cycle is finished, Counter 2 is incremented to read the next pattern Pk+1

[Figure: Counter 2 addresses (ADR) the ROM and triggers the read (RD); the pattern read out initializes the TPG, which drives the UUT; Counter 1 controls the cycle length (CL).]

Page 80:

Hybrid Built-In Self-Test

[Figure: hybrid BIST architecture – inside the SoC, the core under test is driven by a PRPG and observed by a MISR; a ROM holds the stored deterministic patterns, and a BIST controller sequences the test.]

• The hybrid test set contains a limited number of pseudorandom and deterministic vectors

• Pseudorandom test vectors can be generated either by hardware or by software

• The pseudorandom test is improved by a stored test set which is specially generated to shorten the on-line pseudorandom test cycle and to target the random-resistant faults

• The problem is to find a trade-off between the on-line generated pseudorandom test and the stored test

Page 81:

Optimization of Hybrid BIST

Cost curves for BIST:

[Plot: the total cost C_TOTAL is the sum of the cost of the pseudorandom test patterns C_GEN (growing with the number k of applied patterns) and the cost of the stored test C_MEM (proportional to the number of remaining faults r_NOT(k) after applying k pseudorandom patterns); its minimum defines the optimal pseudorandom test length L_OPT out of the total length L.]

   k   | r_DET(k) | r_NOT(k) | FC(k)  | t(k)
     1 |   155    |   839    | 15.6%  | 104
     2 |    76    |   763    | 23.2%  | 104
     3 |    65    |   698    | 29.8%  | 100
     4 |    90    |   608    | 38.8%  | 101
     5 |    44    |   564    | 43.3%  |  99
    10 |   104    |   421    | 57.6%  |  95
    20 |    44    |   311    | 68.7%  |  87
    50 |    51    |   218    | 78.1%  |  74
   100 |    16    |   145    | 85.4%  |  52
   200 |    18    |   114    | 88.5%  |  41
   411 |    31    |    70    | 93.0%  |  26
   954 |    18    |    28    | 97.2%  |  12
  1560 |     8    |    16    | 98.4%  |   7
  2153 |    11    |     5    | 99.5%  |   3
  3449 |     2    |     3    | 99.7%  |   2
  4519 |     2    |     1    | 99.9%  |   1
  4520 |     1    |     0    | 100.0% |   0
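A hedged sketch of the trade-off behind these cost curves: the total cost combines the cost of generating k pseudorandom patterns on-line with the cost of storing deterministic top-up patterns for the faults still undetected after k patterns. The cost weights and the coverage data below are invented for illustration (only loosely shaped like the r_NOT(k) column above).

# remaining[k]: faults still undetected after k pseudorandom patterns (illustrative)
remaining = {100: 145, 200: 114, 411: 70, 954: 28, 1560: 16, 2153: 5, 4520: 0}

def total_cost(k, c_gen_per_pattern=1.0, c_mem_per_fault=20.0):
    """COST_TOTAL(k) = C_GEN(k) + C_MEM(k) for a pseudorandom prefix of length k."""
    return c_gen_per_pattern * k + c_mem_per_fault * remaining[k]

for k in sorted(remaining):
    print(k, total_cost(k))
print("optimal switch point:", min(remaining, key=total_cost))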

Page 82:

Hybrid BIST for Multiple Cores

Embedded tester for testing multiple cores:

[Figure: the same architecture as on Page 52 – the cores C880, C1355, C1908, C2670 and C3540 with their BIST blocks, the test access mechanism, the test controller and the tester memory.]

Page 83:

Hybrid BIST for Multiple Cores

Page 84:

Multi-Core Hybrid BIST Optimization

Cost functions for hybrid BIST and iterative optimization:

[Figure: for each core k the cost of the pseudorandom part COST_P,k and of the deterministic part COST_D,k add up to the total cost COST_T,k; the iteration adjusts the per-core test lengths and selects the solution j* that minimizes the total test cost.]

Page 85:

Optimized Multi-Core Hybrid BIST

Pseudorandom test is carried out in parallel, deterministic test - sequentially

Page 86:

Software Based BIST

To reduce the hardware overhead cost in the BIST applications the hardware LFSR can be replaced by software, which is especially attractive to test SoCs, because of the availability of computing resources directly in the system (a typical SoC usually contains at least one processor core)

Software-based test generation:

[Figure: the SoC's CPU core runs the TPG routine for cores j, j+1, ...; the ROM stores for each core its LFSR configuration and pattern count (e.g. LFSR1: 001010010101010011, N1: 275; LFSR2: 110101011010110101, N2: 900), and the emulation loop is simply: load(LFSRj); for (i = 0; i < Nj; i++) ... end;]

The TPG software is the same for all cores and is stored as a single copy. All characteristics of the LFSR are specific to each core and are stored in the ROM; they are loaded upon request. For each additional core, only the BIST characteristics of this core have to be stored.
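A minimal software rendering of the idea in the slide's pseudocode – one shared LFSR-emulation routine, with only a small per-core configuration read from ROM; all configuration values below are placeholders, not the contents of any real ROM.

CORE_CONFIG = {                     # per-core BIST data kept in ROM (placeholder values)
    "core_j":  {"taps": 0b10100, "seed": 0b00101, "n_patterns": 275},
    "core_j1": {"taps": 0b11000, "seed": 0b01101, "n_patterns": 900},
}

def software_lfsr(taps, seed, n_patterns):
    """One shared TPG routine (a Galois-style LFSR emulated in software);
    only the small per-core configuration differs, as described above."""
    state = seed
    for _ in range(n_patterns):
        yield state                 # scan this pattern into the core under test
        lsb = state & 1
        state >>= 1
        if lsb:
            state ^= taps           # apply the feedback taps

for core, cfg in CORE_CONFIG.items():
    patterns = list(software_lfsr(**cfg))
    print(core, len(patterns), format(patterns[0], "05b"))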

Page 87:

Functional Self-Test

• Traditional BIST solutions use special hardware for pattern generation on chip; this may introduce area overhead and performance degradation

• New methods have been proposed which exploit specific functional units like arithmetic blocks or processor cores for on-chip test generation

• It has been shown that adders can be used as test generators for pseudorandom and deterministic patterns

• Today, there is no general method how to use arbitrary functional units for built-in test generation

Page 88:

Broadcasting Test Patterns in BIST

Concept of test pattern sharing via novel scan structure – to reduce the test application time:

[Figure: in the traditional single-scan design, each CUT has its own separately fed scan chain; in the broadcast test architecture the same scan input drives the scan chains of CUT 1 and CUT 2 simultaneously.]

While one module is tested by its test patterns, the same test patterns can be applied simultaneously to other modules in the manner of pseudorandom testing

Page 89:

Broadcasting Test Patterns in BIST

Examples of connection possibilities in Broadcasting BIST:

[Figure: two ways of connecting the shared scan chain to CUT 1 and CUT 2 – j-to-j connections (scan cell j drives input j of every CUT) and random connections.]

Page 90:

Broadcasting Test Patterns in BIST

Scan configurations in Broadcasting BIST:

[Figure: the broadcast Scan-In feeds CUT 1 ... CUT n; the responses are compacted either by one common MISR or by individual, multiple MISRs (MISR 1 ... MISR n) before Scan-Out.]

Page 91:

OUTLINE

• Introduction: how much to test?
• Defect modeling
• Hierarchical approaches to test generation
• Built-in self-test
• Stimuli generation in BIST
• Response compaction and signature analyzers
• BIST architectures
• Hybrid BIST
• P1500 Standard for SoC and NoC testing
• Testing the communication infrastructure

Page 92:

IEEE P1500 standard for core test

• The following components are generally required to test embedded cores:
– a source for the application of test stimuli and a sink for observing the responses
– Test Access Mechanisms (TAM) to move the test data from the source to the core inputs and from the core outputs to the sink
– a wrapper around the embedded core

[Figure: the test pattern source feeds the wrapped embedded core through a TAM, and another TAM carries the responses to the test response sink.]

Page 93:

IEEE P1500 standard for core test

• The two most important components of the P1500 standard are:
– the Core Test Language (CTL) and
– the scalable core test architecture

• Core Test Language:
– Its purpose is to standardize the transfer of core test knowledge
– The CTL file of a core must be supplied by the core provider
– This file contains information on how to
• instantiate a wrapper,
• map core ports to wrapper ports,
• and reuse core test data

Page 94:

IEEE P1500 standard for core test

Core test architecture:
• It standardizes only the wrapper and the interface between the wrapper and the TAM, called the Wrapper Interface Port (WIP)
• The P1500 TAM interface and wrapper can be viewed as an extension of IEEE Std. 1149.1, since
– the 1149.1 TAP controller is a P1500-compliant TAM interface,
– and the boundary-scan register is a P1500-compliant wrapper
• The wrapper contains
– an instruction register (WIR),
– a wrapper boundary register consisting of wrapper cells,
– a bypass register and some additional logic
• The wrapper has to allow normal functional operation of the core, plus it has to include a 1-bit serial TAM
• In addition to the serial test access, parallel TAMs may be used

Page 95:

IEEE P1500 standard for core test

[Figure: a system chip with cores 1 ... n, each inside a P1500 wrapper containing a WIR; the source/sink (stimuli/responses) may be off-chip or on-chip and is connected through a user-defined test access mechanism (TAM) to the wrappers' parallel ports (WPI/WPO); the P1500 Wrapper Interface Port (WIP) with the serial ports WSI/WSO provides serial access, while the cores keep their functional inputs/outputs.]

Page 96:

Testing the Communication Infrastructure

• Consider a mesh-like NoC topology consisting of:
– switches (routers),
– wire connections between them, and
– slots for SoC resources, also referred to as tiles

• Other topological architectures, e.g. honeycomb and torus, may be implemented; their choice depends on the constraints for low power, area, speed and testability

• A resource can be a processor, a memory, an ASIC core, etc.

• The network switch contains buffers, or queues, for the incoming data and the selection logic that determines the output direction to which the data is passed (the upward, downward, leftward and rightward neighbours)

Page 97:

Testing the Communication Infrastructure

• Useful knowledge for testing NoC network structures can be obtained from the interconnect testing of other regular topological structures

• The test of wires and switches is to some extent analogous to testing of interconnects of an FPGA

• A switch in a mesh-like communication structure can be tested by using only three different configurations

Page 98:

Testing the Communication Infrastructure

• Arbitrary shorts and opens in an n-bit bus can be tested with ⌈log2(n)⌉ test patterns

• When testing the NoC interconnects we can regard different paths through the interconnect structures as one single concatenated bus

• Assuming we have a NoC whose mesh consists of m x m switches, we can view the test paths through the matrix as one wide bus of 2mn wires

[Figure: the concatenated-bus concept – the m x m switch matrix with 2m n-bit buses entering and leaving it is viewed as a single 2mn-bit bus.]

Page 99:

Testing the Communication Infrastructure

• The stuck-at-0 and stuck-at-1 faults are modeled as shorts to Vdd and ground

• Thus we need two extra wires, which makes the total bit-width of the bus 2mn + 2 wires

• From the above facts we find that 3·⌈log2(2mn + 2)⌉ test patterns are needed in order to test the switches and the wiring in the NoC

[Figure: the same concatenated-bus view of the m x m matrix with 2m buses.]

Page 100:

Testing the Communication Infrastructure

Counting-sequence test on a bus (example with 8 codes and 3 test patterns):

 Wire | code over test patterns 1–3
  0   |  0 0 0
  1   |  0 0 1
  2   |  0 1 0
  3   |  0 1 1
  4   |  1 0 0
  5   |  1 0 1
  6   |  1 1 0
  7   |  1 1 1

Detected faults: stuck-at-1, stuck-at-0, all opens and shorts. 6 wires tested (the all-0 and all-1 codes are reserved for the two extra wires).

3·⌈log2(2mn + 2)⌉ test patterns are needed.
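A sketch of the counting-sequence idea behind the ⌈log2⌉ bound: each wire receives the binary code of its index (one bit per test pattern), so any two shorted wires see different sequences and every wire sees both a 0 and a 1; repeating this for the three switch configurations gives the 3·⌈log2(2mn + 2)⌉ figure. The code below, including its assumption that the all-0 and all-1 codes are reserved for the two extra wires, is only an illustration.

from math import ceil, log2

def interconnect_test_patterns(n_wires):
    """Counting-sequence test for a bus of n_wires: wire i gets the binary code
    of i + 1, one bit per pattern; the all-0 and all-1 codes are left for the
    two extra wires modelling shorts to ground and Vdd, so every real wire
    sees both values and any two wires see different sequences."""
    width = ceil(log2(n_wires + 2))               # patterns per configuration
    codes = [format(i + 1, f"0{width}b") for i in range(n_wires)]
    return ["".join(code[p] for code in codes) for p in range(width)]

for pattern in interconnect_test_patterns(6):     # the slide's 6 tested wires
    print(pattern)                                # 3 patterns across the 6 wires

m, n = 4, 16                                      # example: 4 x 4 mesh, 16-bit links
print(3 * ceil(log2(2 * m * n + 2)), "patterns for the switches and wiring")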

Page 101:

Conclusions

• Defect-oriented and hierarchical test generation approaches as promising trends in the deep-submicron era were discussed

• Due to the fact that BIST allows at-speed testing and simplifies test access to embedded cores, it has become a popular technique for testing the cores in SoC

• With properly designed BIST, the cost of the added test hardware will be more than balanced by the benefits in terms of:
– reliability, and
– reduced maintenance cost

• Useful knowledge for testing NoC network structures can be obtained from the interconnect testing of other regular topological structures

Page 102:

Conclusions of our Research Experience

Who is a test engineer?

The test engineer is the man who is able to program a broken computer