
UCSB Module LT Testing



Presentation Transcript


  1. UCSB Module LT Testing

  2. Data Taking
  • Scenarios used
    • Standard LT scenario (night.lt)
      • 1 thermal cycle, tests at 20 °C, -20 °C, 20 °C
    • Qualification scenario (3day-1daycold.lt)
      • 1 day thermal cycling, 1 day at -20 °C, 1 day thermal cycling
  • No verification/fault finding done by UCSB
    • Patrick has been analyzing the data for us, telling us if runs are OK
    • Not a good idea for production: there would be lag times during which we would have to find modules and re-test them
    • We need the tools to do this semi-automatically
  • No comparison of LT data to ARCS data done yet
    • We currently take a noise and LED run on the ARCS to confirm nothing changed; if anything does, we look at the root file
    • We definitely need a tool to compare ARCS and LT data to check for differences and confirm good data

  3. [Pictures: second sensor which failed — Vienna, Strasbourg]

  4. Pre-series II sensors: optical inspection
  • Systematic optical inspection done, but only for the first 10 sensors tested (corresponding to 4 different batches)
    • All of them were fine before the test (not touched since they left STM)
    • After the test: stains and mainly dots seen for ALL of them (see next) ⇒ clear ageing effect again!

  5. [Pictures: sensor ID 34572225, which failed the test — only the most terrible pictures shown]

  6. Database
  • Currently we are doing nothing at UCSB with the database: no xml generation, validation, or uploading
    • Again, Patrick has been saving us over the short term
  • Clearly, we need to do this on site as well
    • Preferably in a semi-automatic way which could also check the quality of the data
  • We need to develop tools, similar to the gantry database tools, which will automatically move the root files to a common area, generate the xml, check the xml quality, and finally upload the data
    • I believe that the Defect Analyzer package Patrick has recently been using could do most of the work
    • We will need to write a tool for checking the xml quality

  7. Equipment
  • We are short 1 PAACB from running 10 slots
    • Does FNAL have a spare (or 2, if they have more)?
  • So far, our +5 V LV supply has failed/overheated and we have blown two fuses on the Vienna box power supply
    • We have spares of both
  • After replacing the backplane and using extensions, 2 slots are not working
    • We need help debugging these 2 slots, and a set of procedures for debugging such failures in the future

  8. Data Handling: Short Term
  • Patrick will show FNAL/UCSB how to use LTmacro and the Defect Analyzer
    • Need volunteers at FNAL/UCSB to run the programs
  • Check the data visually
    • Need a macro to make all the plots needed (I-t especially)
  • Manually check xml files until we believe the results
  • Manually update the data

  9. Data Handling: Long Term
  • Need a tool to move root files to the analysis area (if necessary)
  • Need to tune/confirm LTmacro and the Defect Analyzer
    • Hopefully Patrick
  • Need a tool to automatically run LTmacro on the files in the area
    • Need a volunteer
  • Need a tool to automatically run the Defect Analyzer
    • Need a volunteer
  • Need a tool to confirm the data in the xml file is good
    • Need a volunteer at both sites
  • Need a tool to upload the xml files
    • Need a volunteer
  • Need a tool to compare ARCS and LT data
    • Either compare macro outputs or bad channel flags in the database
    • I believe comparing bad channel flags in the database will be easier to maintain over production
    • Again, another volunteer
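The preferred comparison (bad channel flags rather than macro outputs) reduces to a set difference, which is why it should be easy to maintain. A minimal sketch, assuming the two flag sets have already been fetched from the database as channel-number sets:

```python
# Sketch of the ARCS-vs-LT comparison via bad-channel flags.
# In practice the two input sets would come from the database;
# here they are plain Python sets for illustration.
def compare_bad_channels(arcs_bad, lt_bad):
    """Report channels flagged bad in one test but not the other."""
    return {
        "new_in_lt": sorted(lt_bad - arcs_bad),     # went bad during LT
        "only_in_arcs": sorted(arcs_bad - lt_bad),  # not reproduced in LT
        "common": sorted(arcs_bad & lt_bad),        # flagged in both
    }
```

Channels in `new_in_lt` are the interesting ones: they would indicate damage or drift during the long-term test, so a non-empty list would flag the module for re-testing.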
