
Test and Test Equipment
July 2011, San Francisco, California

Dave Armstrong

ITRS 2011 Test and Test Equipment – San Francisco, CA


2011 Test Team

Akitoshi Nishimura, Amit Majumdar, Anne Gattiker, Atul Goel, Bill Price, Brion Keller, Burnie West, Calvin Cheung, Chris Portelli-Hale, Dave Armstrong, Dennis Conti, Erik Volkerink, Francois-Fabien Ferhani, Frank Poehl, Hirofumi Tsuboshita, Hiroki Ikeda, Hisao Horibe, Jerry Mcbride, Jody Van Horn, Kazumi Hatayama, Ken Lanier, Ken Taoka, Ken-ichi Anzou, Khushru Chhor, Masaaki Namba, Masahiro Kanase, Michio Maekawa, Mike Bienek, Mike Peng Li, Mike Rodgers, Nilanjan Mukherjee, Paul Roddy, Peter Maxwell, Phil Nigh, Prasad Mantri, Rene Segers, Rob Aitken, Roger Barth, Rohit Kapur, Sanjiv Taneja, Satoru Takeda, Sejang Oh, Shawn Fetterolf, Shoji Iwasaki, Stefan Eichenberger, Steve Comen, Steve Tilden, Steven Slupsky, Takairo Nagata, Takuya Kobayashi, Tetsuo Tada, Ulrich Schoettmer, Wendy Chen, Yasuo Sato, Yervant Zorian, Yi Cai



2011 Changes
  • New Section on 3D Device Test Challenges
  • Updated Adaptive Testing section
  • Logic / DFT
    • Major rewrite of this section, thanks to the addition of new team members representing the three major EDA vendors
  • Numerous other changes to the specialty-device information
  • Test Cost
    • Test cost survey completed that quantifies the industry view
  • Other updates will be published for the Logic, Consumer/SOC, RF, and Analog sections.




Previous Data

More challenges in the future

Test Cost Components

  • NRE
    • DFT design and validation
    • Test development
  • Device
    • Die area increase
    • Yield loss
  • Work Cell
    • Building
    • People
    • Consumables
    • DUT Interface
    • Test Equipment
    • Handling Tools
    • Factory Automation

[Figure: test-cell flow. Untested units enter the test cell and are dispositioned as good units, reject units, false-pass units, or false-fail units.]
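The cost components above can be combined into a simple per-unit cost-of-test estimate: NRE amortized over volume, plus per-unit device and work-cell costs. A minimal sketch; the component names follow the slide, but the amortization model and every numeric value are illustrative assumptions, not roadmap data:

```python
# Illustrative cost-of-test model. All numbers below are made up.

def cost_of_test_per_unit(nre_cost, volume, device_cost_per_unit,
                          workcell_cost_per_hour, units_per_hour):
    """Per-unit test cost: amortized NRE + device cost + work-cell time."""
    nre_per_unit = nre_cost / volume
    workcell_per_unit = workcell_cost_per_hour / units_per_hour
    return nre_per_unit + device_cost_per_unit + workcell_per_unit

# Example: $500k NRE (DFT design/validation + test development) over 10M units,
# $0.02/unit device cost (die-area increase + yield loss), and a work cell
# (building, people, consumables, equipment) at $180/hour testing 3600 units/hour.
cost = cost_of_test_per_unit(500_000, 10_000_000, 0.02, 180.0, 3600)
print(f"${cost:.3f} per unit")  # $0.120 per unit
```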



3D Technology Adds Many Challenges

[Figure: 3D-stack test flow. Each die type carries its own NRE (DFT design and validation, test development) and device costs (die area increase, yield loss). Probably-good units from multiple test cells feed die stacking; smart manufacturing and pass/fail analysis disposition the stack into good units, rejected units, false-pass units, false-fail units, and good die in a failing stack.]
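The "good die in a failing stack" outcome is the key economic change with stacking: even with accurate die-level test, stack yield is the product of the per-die yields, and good dice are scrapped whenever they land in a failing stack. A minimal sketch of that arithmetic; the yield figures are illustrative assumptions, not ITRS data:

```python
# Stacking N dice: the stack is good only if every die is good.

def stack_yield(die_yields):
    """Probability that a stack of independently yielding dice is good."""
    y = 1.0
    for dy in die_yields:
        y *= dy
    return y

def good_die_lost_fraction(die_yields):
    """Fraction of good dice consumed that end up in failing stacks."""
    ys = stack_yield(die_yields)
    avg_good_per_stack = sum(die_yields)        # expected good dice per stack built
    good_in_good_stacks = ys * len(die_yields)  # every die is good when the stack passes
    return 1 - good_in_good_stacks / avg_good_per_stack

# Example: a 4-die stack where each die tests 98% good.
yields = [0.98] * 4
print(f"stack yield: {stack_yield(yields):.3f}")          # 0.922
print(f"good die lost: {good_die_lost_fraction(yields):.1%}")  # 5.9%
```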



Adaptive Test Flow

[Figure: adaptive test flow. FAB data (ETest, optical inspection, other inline data, assembly/build data) feeds a flow of Wafer Probe → Assembly Operations → Burn-in → Final Test → Stack / Card / System Test → Field Operation; this includes test operations at any level of assembly. Each insertion runs “RT A/O” (Real-Time Analysis & Optimization) and feeds “PTAD” (Post-Test Analysis & Dispositioning), both connected to databases and automated data analysis. This may include multiple databases; “analysis” includes capabilities like post-test statistical analysis, dynamic routings, and feedforward data. Further inputs: fab data, design data, business data, and customer specs.]
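The flow above can be read as a pipeline in which every insertion applies real-time analysis before testing and post-test dispositioning after, against a shared data store. A minimal sketch of that structure; the stage names come from the slide, while the data model and the routing/analysis logic are illustrative assumptions:

```python
# Adaptive test flow skeleton: each insertion runs RT A/O (Real-Time
# Analysis & Optimization) before test and PTAD (Post-Test Analysis &
# Dispositioning) after, against a shared database.

STAGES = ["Wafer Probe", "Assembly Operations", "Burn-in",
          "Final Test", "Stack / Card / System Test"]

def rt_analysis_optimization(stage, database):
    # Illustrative: tighten guardbands once enough history exists.
    history = database.get(stage, [])
    return {"guardband": 1.0 if len(history) < 100 else 0.9}

def ptad(stage, result, database):
    # Illustrative: disposition on raw pass/fail; real PTAD adds
    # statistical screens and dynamic routing.
    return "continue" if result["pass"] else "reject"

def run_flow(unit, database):
    for stage in STAGES:
        limits = rt_analysis_optimization(stage, database)   # tune test conditions
        result = {"stage": stage, "limits": limits, "pass": True}  # placeholder test
        database.setdefault(stage, []).append(result)        # feed the shared database
        if ptad(stage, result, database) == "reject":
            return "reject"
    return "ship"

print(run_flow({"id": "unit-001"}, {}))  # prints "ship"
```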

2011 Drivers

(Legend: Unchanged / Revised / New / Drop)

  • Device trends
    • Increasing device interface bandwidth and data rates
    • Increasing device integration (SoC, SiP, MCP, 3D packaging)
    • Integration of emerging and non-digital CMOS technologies
    • Complex package electrical and mechanical characteristics
    • Device characteristics beyond the deterministic stimulus/response model
    • 3-dimensional silicon: multi-die and multi-layer
    • Multiple power modes and multiple time domains
    • Fault-tolerant architectures and protocols
  • Test process complexity
    • Device customization / configuration during the test process
    • “Distributed test” to maintain cost scaling
    • Feedback data for tuning manufacturing
    • Adaptive test and Feedback data
    • Higher Order Dimensionality of test conditions
    • Concurrent test within a DUT
    • Maintaining unit level test traceability



2011 Drivers (2)

(Legend: Unchanged / Revised / New / Drop)

  • Economic Scaling of Test
    • Physical and economic limits of packaged test parallelism
    • Test data volume and feedback data volume
    • Effective limit for speed difference of HVM ATE versus DUT
    • Managing interface hardware and (test) socket costs
    • Trade-off between the cost of test and the cost of quality
    • Balancing General Purpose Equipment vs. Multiple Insertions for System Test and BIST



(Legend: Unchanged / Revised / New / Drop)

2011 Difficult Challenges
  • Cost of Test and Overall Equipment Efficiency
    • Progress made in terms of test time, capital cost, multisite test
    • Continued innovation in DFT, Concurrent Test, Balancing DPM vs. Cost
    • Gains in some cases are now limited by Overall Equipment Efficiency
  • Test Development as a Gate to Volume Production (Time to Market)
    • Increasing device complexity driving more complex test development.
    • Complexity also driven by the diversity of different types of device interfaces on a single chip.
  • Potential yield losses
    • Tester inaccuracies (timing, voltage, current, temperature control, etc.)
    • Over testing (e.g., delay faults on non-functional paths)
    • Mechanical damage during the testing process
    • Defects in test-only circuitry, or spec failures in a test mode (e.g., BIST, power, noise)
    • Some IDDQ-only failures
    • Faulty repairs of normally repairable circuits
    • Decisions made on overly aggressive statistical post-processing
    • Multi-die stacks / TSV
    • Power management issues


2011 Difficult Challenges (2)

(Legend: Unchanged / Revised / New / Drop)

  • Detecting Systemic Defects
    • Testing for local non-uniformities, not just hard defects
    • Detecting symptoms and effects of line width variations, finite dopant distributions, systemic process defects
  • Screening for reliability
    • Effectiveness and Implementation of burn-in, IDDQ, and Vstress testing
    • Screening of multiple power down modes and binning based on power requirements
    • Detection of erratic, non-deterministic, and intermittent device behavior


2011 Future Opportunities

(Legend: Unchanged / Revised / New / Drop)

  • Test Program Automation
    • Automatic generation of an entire test program.
    • Tester independent test programming language.
    • Mixed-signal devices remain a test-programming challenge
  • Scan Diagnosis in the Presence of Compression
  • Simulation and Modeling
    • Seamless integration of simulation & modeling into the testing process.
    • A move to a higher level of abstraction with Protocol Aware test resources.
    • Focused test generation based on layout, modeling, and fed back fabrication data.
  • Convergence of Test and System Reliability Solutions
    • Re-use of test collateral in different environments (ATE, Burn-in, System, Field)



Summary
  • Stacked devices change many things for test.
    • The methods and approaches seem available.
    • Considerable work remains to implement them.
  • Adaptive testing is becoming a standard approach
    • Significant test data accumulation, distribution, and analysis challenges.
  • Ongoing changes to the RF, Analog, and Specialty devices.
  • Many more details to be published in the final document.

Adaptive Test Database Architecture Example

  • Local database: for real-time data analysis and actions; resident on the tester or in the test cell. Latency: <1 second. Retention: “hours”.
  • Database: data availability for production, such as lot setup or dispositioning. Latency: “minutes”. Retention: “hours to days”.
  • Large database: for long-term storage. Latency: “minutes”. Retention: “months”, with longer-term retrieval options (data is available “forever”).
  • World-wide, cross-company databases.
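The tiers above differ mainly in latency and retention. A minimal sketch that encodes the slide's tiering as data and picks the first tier meeting a request; the tier names and figures follow the slide, but the code structure and the concrete numbers standing in for "hours" or "months" are illustrative assumptions:

```python
# Tiered adaptive-test data stores, per the slide: a tester-local database
# for real-time actions, a production database for lot setup and
# dispositioning, and a large database for long-term storage.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    latency_s: float    # worst-case read latency, seconds
    retention_h: float  # how long data stays resident, hours

TIERS = [
    Tier("tester-local", latency_s=1,  retention_h=12),            # "<1 second", "hours"
    Tier("production",   latency_s=60, retention_h=72),            # "minutes", "hours to days"
    Tier("long-term",    latency_s=60, retention_h=24 * 30 * 6),   # "minutes", "months"
]

def tier_for(max_latency_s, min_retention_h):
    """First (smallest) tier satisfying both latency and retention needs."""
    for t in TIERS:
        if t.latency_s <= max_latency_s and t.retention_h >= min_retention_h:
            return t.name
    raise LookupError("no tier satisfies the request")

print(tier_for(max_latency_s=1, min_retention_h=1))      # tester-local (real-time action)
print(tier_for(max_latency_s=300, min_retention_h=500))  # long-term (post-hoc analysis)
```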

3D Device Testing Challenges
  • Test Access
      • Die level Access
      • Die in Stack Access
  • Test Flow/Cost/Resources
  • Heterogeneous Die in the Stack
      • Die in Stack Testing
      • Die to Die Interactions
  • Debug/Diagnosis
  • DFT
  • Test data management, distribution, and security
  • Power Implications
SOC / Logic Update
  • Takes the device roadmap data and calculates:
      • Fault expectations both inside and outside the various cores, using multiple fault models.
      • Required test pattern lengths under five different assumptions:
          • Flat test patterns
          • Tests implemented taking advantage of the circuit hierarchy
          • Tests implemented using compressed flat patterns
          • Tests implemented using compressed hierarchical test patterns
          • Tests implemented using a low-power scan test approach
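The five assumptions above act roughly as multiplicative scale factors on a flat pattern count. A minimal sketch of that calculation; the factor values are purely illustrative assumptions, not the roadmap's numbers, which are derived from fault counts and compression ratios per technology node:

```python
# Test-pattern-length estimates under the five assumptions.
# Every scale factor below is an illustrative placeholder.

ASSUMPTIONS = {
    "flat": 1.0,                      # flat test patterns
    "hierarchical": 0.4,              # exploit circuit hierarchy
    "compressed_flat": 0.01,          # e.g., ~100x scan compression
    "compressed_hierarchical": 0.004, # compression plus hierarchy
    "low_power_scan": 1.5,            # low-power scan needs extra patterns
}

def pattern_lengths(flat_pattern_count):
    """Estimated pattern count under each assumption."""
    return {name: int(flat_pattern_count * f) for name, f in ASSUMPTIONS.items()}

for name, count in pattern_lengths(10_000_000).items():
    print(f"{name:26s} {count:>12,}")
```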
LCD Device Probing Challenge Overcome

A new probe-needle arrangement (4 layers + 4 layers = 8 layers) could provide a solution as LCD driver probe pads continue to narrow.

Higher Site Count Camera Chips

[Figure: lens geometry for single-site vs. four-site camera-chip test. Chief-ray angle: tan(θ) = (D/2)/EPD, where D is the pupil diameter; F-number = EPD/D. Labels: chief ray, EPD, max/min/chief ray angles, four sites vs. single site.]
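The geometry determines the chief-ray angle seen at the outer sites when several camera chips share one illumination setup; the relations tan(θ) = (D/2)/EPD and F-number = EPD/D are from the slide. A minimal numeric sketch, with the pupil dimensions as illustrative assumptions:

```python
# Chief-ray angle per the slide's geometry: tan(theta) = (D/2) / EPD,
# with D the pupil diameter; F-number = EPD / D.
import math

def chief_ray_angle_deg(pupil_diameter_mm, pupil_distance_mm):
    """Half-cone chief-ray angle in degrees."""
    return math.degrees(math.atan((pupil_diameter_mm / 2) / pupil_distance_mm))

def f_number(pupil_diameter_mm, pupil_distance_mm):
    return pupil_distance_mm / pupil_diameter_mm

# Illustrative numbers: a 10 mm pupil at 20 mm distance.
D, EPD = 10.0, 20.0
print(f"chief-ray angle: {chief_ray_angle_deg(D, EPD):.1f} deg, f/{f_number(D, EPD):.1f}")
# chief-ray angle: 14.0 deg, f/2.0
```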
MEMS Sensors for Handheld Devices

Gyros, accelerometers, e-compasses, and pressure sensors: expect 10% yearly growth.