1. MRO Archive Readiness and End to End Testing
Edward Guinness and Susan Slavney, Washington University
Presented to PDS Management Council, August 2, 2006

2. Data ARchive Working Group
• The DARWG has been chaired by Carl Kloss, JPL, who is retiring. Brian Hammer, JPL, will be the new DARWG Chair.
• Document status:
  • Archive Plan V1.1 was signed in January 2006.
  • ICDs: all signed except the ONC–Imaging ICD, which is in progress.
• SIS documents and peer reviews:
  • CRISM: EDR/RDR review is complete; the documents are complete but being revised.
  • HiRISE: EDR review is complete; the RDR SIS is in progress.
  • SHARAD: EDR/RDR review is in progress.
  • CTX/MARCI: EDR SISs are in draft form, not yet reviewed.
  • ONC: will be "safed" according to an agreement with the Imaging Node (TBD).
  • MCS: EDR and RDR SISs are ready for peer review.
  • ACC: SIS documents are in progress, not yet reviewed.
• The local data dictionary was submitted to the Engineering Node (EN); it will be revised as a result of the E2E tests.

3. End-To-End Test Summary
• Purpose: to exercise MRO science data flow from the instrument teams through PDS.
• Approach: conduct a series of engineering tests to identify and correct problems before the first PDS delivery in June 2007.
• Test participants:
  • HiRISE → Imaging Node → Engineering Node
  • CTX/MARCI → Imaging Node → Engineering Node
  • CRISM → Geosciences Node → Engineering Node
  • SHARAD → Geosciences Node → Engineering Node
  • MCS → Atmospheres Node → Engineering Node

4. Test Plan: Four Phases
• Test 1, May 2006 (completed)
  • Team delivers one EDR to the PDS node using any agreed delivery method.
  • Node validates the product against the EDR SIS and PDS standards (a label-check sketch follows this slide).
• Test 2, July 2006 (in progress)
  • Team delivers a complete archive volume with at least one EDR generated by operational software, using the planned delivery mechanism.
  • Node validates the archive volume against the EDR SIS, Archive SIS, and PDS standards.
  • Team verifies availability of the product at the node through a PDS catalog search.
• Test 3, October 2006
  • EDR test: same as Test 2, but with one day's worth of EDRs.
  • RDR test: same as Test 1, but with one RDR.
  • Both EDRs and RDRs are to be assembled into archive volumes.
• Test 4, February 2007
  • Same as Test 3, but with seven days' worth of EDRs and RDRs.
• An additional test will be held if needed in April or May 2007.
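The label validation step above is performed by the nodes' own tools; as a rough illustration only, the sketch below checks a detached PDS3-style label (KEYWORD = VALUE lines) for a hypothetical set of required keywords. The keyword list and file handling are assumptions for illustration, not the actual SIS requirements or the PDS validation software.

    import sys

    # Hypothetical required-keyword list; the real list comes from each
    # instrument's EDR SIS and the PDS Standards Reference.
    REQUIRED_KEYWORDS = {
        "PDS_VERSION_ID",
        "RECORD_TYPE",
        "DATA_SET_ID",
        "PRODUCT_ID",
        "INSTRUMENT_ID",
        "START_TIME",
    }

    def parse_label(path):
        """Collect KEYWORD = VALUE pairs from a PDS3-style label, stopping at END."""
        keywords = {}
        with open(path, "r", errors="replace") as f:
            for line in f:
                line = line.strip()
                if line == "END":
                    break
                if "=" in line:
                    key, _, value = line.partition("=")
                    keywords[key.strip()] = value.strip()
        return keywords

    def validate(path):
        """Report any required keywords missing from the label at 'path'."""
        missing = sorted(REQUIRED_KEYWORDS - parse_label(path).keys())
        for key in missing:
            print(f"DR candidate: {path}: missing required keyword {key}")
        return not missing

    if __name__ == "__main__":
        results = [validate(p) for p in sys.argv[1:]]
        sys.exit(0 if all(results) else 1)

A check of this kind only catches missing keywords; the real validation also verifies value types, standard values, and data object structure against the SIS.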

5. Test Status
• Test 1, May 2006
  • Test completed.
  • Preliminary report issued with several minor Discrepancy Reports (DRs).
  • Lien issued: no CTX/MARCI testing.
• Test 2, July 2006
  • Test period completed.
  • Report in progress.

6. Test Details: CRISM → Geosciences
• Test 1 – Success – Test 1 requirements met.
  • Also satisfied the CRISM Test 2 requirements.
  • Data transferred: ~350 EDR products = ~2300 files = 700 MB.
  • EDRs validated against PDS standards and the EDR SIS.
  • Results: 1 DR on the PDS data dictionary, 2 DRs on minor label errors.
  • DRs were corrected in Test 2.
• Test 2 – Success – Test 2 requirements met.
  • Data transferred: same as Test 1 plus a few corrected files.
  • EDR archive assembled at Geosciences using inputs from the CRISM team.
  • EDRs validated against PDS standards and the EDR SISs.
  • Archive validated against PDS standards and the Archive SIS (an archive structure sketch follows this slide).
  • Results: 1 DR on the PDS data dictionary, 5 DRs on minor label errors.
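Archive-volume validation covers much more than directory layout, but as a minimal illustration the sketch below verifies the standard top-level structure of a PDS3 archive volume. AAREADME.TXT and VOLDESC.CAT are required at the volume root; the directory set checked here is typical rather than universal, so treat this as an assumption-laden sketch, not the Geosciences Node's validator.

    import os
    import sys

    # Standard top-level contents of a PDS3 archive volume. The directory
    # names are typical but can vary by mission (an assumption here).
    REQUIRED_FILES = ["AAREADME.TXT", "VOLDESC.CAT"]
    TYPICAL_DIRS = ["CATALOG", "INDEX", "DATA"]

    def check_volume(root):
        """Return a list of structural problems found at the volume root."""
        problems = []
        for name in REQUIRED_FILES:
            if not os.path.isfile(os.path.join(root, name)):
                problems.append(f"missing required file {name}")
        for name in TYPICAL_DIRS:
            if not os.path.isdir(os.path.join(root, name)):
                problems.append(f"missing expected directory {name}")
        return problems

    if __name__ == "__main__":
        root = sys.argv[1] if len(sys.argv) > 1 else "."
        for problem in check_volume(root):
            print(f"DR candidate: {root}: {problem}")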

7. Test Details: SHARAD → Geosciences
• Test 1 – Success – Test 1 requirements met.
  • Data transferred: two hand-generated EDRs (10 files, 13 MB).
  • Transferred via FTP from ASDC in Italy to Geosciences.
  • EDRs validated against PDS standards and the EDR SIS.
  • Results: 1 DR on the PDS data dictionary, 1 DR on minor label errors.
  • DRs were corrected in Test 2.
• Test 2 – Success – Test 2 requirements met.
  • Data transferred: 225 EDRs = 660 files = 2.95 GB.
  • Products were generated by the SHARAD processing software.
  • Files were transferred from ASDC to Geosciences using the planned daily automated FTP-based transfer system (a transfer sketch follows this slide).
  • EDRs validated against PDS standards and the EDR SISs.
  • Archive validated against PDS standards and the Archive SIS.
  • Results: 1 DR on the PDS data dictionary, 3 DRs on minor label errors.
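The slide above mentions a daily automated FTP-based transfer from ASDC to Geosciences. The sketch below shows what such a daily pull might look like; the host name, directory layout, and anonymous login are hypothetical, since the actual transfer system is not described in detail here.

    import ftplib
    import os
    from datetime import date

    # Hypothetical host and layout: one remote directory of EDRs per day.
    FTP_HOST = "ftp.example.org"
    REMOTE_DIR = f"/edr/{date.today():%Y%m%d}"
    LOCAL_DIR = os.path.join("incoming", f"{date.today():%Y%m%d}")

    def pull_daily_edrs():
        """Fetch every file in today's remote EDR directory."""
        os.makedirs(LOCAL_DIR, exist_ok=True)
        with ftplib.FTP(FTP_HOST) as ftp:
            ftp.login()  # anonymous login; a real system would authenticate
            ftp.cwd(REMOTE_DIR)
            for name in ftp.nlst():
                local_path = os.path.join(LOCAL_DIR, name)
                with open(local_path, "wb") as out:
                    ftp.retrbinary(f"RETR {name}", out.write)
                print(f"retrieved {name} -> {local_path}")

    if __name__ == "__main__":
        pull_daily_edrs()

In practice a transfer system like this would also verify file sizes or checksums after the pull, so that truncated transfers surface as DRs rather than as bad archive products downstream.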

8. Test Details: HiRISE → Imaging
• Test 1 – Success – Test 1 requirements met.
  • Used automatically generated EDRs from the HiRISE team.
  • Data transferred: 28 automatically generated EDRs = ~500 MB.
  • Files transferred from HiROC to the PDS Imaging Node using the product and profile servers.
  • EDRs validated against PDS standards and the EDR SIS.
  • Results: no AIs generated.
• Test 2 – Success – Test 2 requirements met.
  • Used automatically generated EDRs from the HiRISE team.
  • Files transferred from HiROC to the PDS Imaging Node using the product and profile servers.
  • Data transferred: 28 EDRs = ~500 MB.
  • EDRs validated against PDS standards and the EDR SISs.
  • Archive validated against PDS standards and the Archive SIS.
  • Results: no AIs generated.

9. Test Details: CTX/MARCI → Imaging
• Test 1 – Failure – Test 1 not conducted.
  • Instrument team was not ready.
• Test 2 – Success – Test 2 minimal requirements met.
  • Used automatically generated EDRs from the instrument team.
  • Files transferred from MSSS to the Imaging Node using FTP.
  • Data transferred: 3 EDRs = 300 MB.
  • EDRs validated against PDS standards and the EDR SISs.
  • Results: the test was minimally successful, but the peer review of the data and SISs has not been conducted, and no draft Archive SIS has been produced.

10. Test Details: MCS → Atmospheres
• Test 1 – Completed – Report TBD.
• Test 2 – Completed – Report TBD.

11. Test Details: Engineering Node
• Test 2 – Success – Test 2 requirements met.
  • Catalog files ingested: 25 total (MISSION.CAT, INSTHOST.CAT, CRISM: 5, SHARAD: 5, CTX: 3, MARCI: 3, HiRISE: 4, MCS: 3); a parsing sketch follows this slide.
  • Resource links ingested: 7.
  • Release objects ingested: 7.
  • MRO test subscriptions: 4 (MCS: 1, SHARAD: 2, CRISM: 1).
  • PDS Data Search capability verified for 6 data sets.
  • Provided an updated MRO Local Data Dictionary.
• Results:
  • 3 DRs on incomplete subscriptions.
  • 4 DRs on incomplete data access verification.
  • 1 DR on the EN test bed: data set catalog files need to be re-ingested to display "Abstract Description" values.
• Lesson learned:
  • This was the first use of the test bed; the experience gained will help facilitate end-to-end tests for future missions.
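The catalog files ingested above (MISSION.CAT, INSTHOST.CAT, and the instrument .CAT files) are PDS3 ODL text. As a rough illustration of their structure, the sketch below reads simple one-line KEYWORD = VALUE assignments; real catalog files also contain multi-line quoted values and nested OBJECT groups, which the EN's actual ingestion tools handle.

    def read_catalog(path):
        """Collect simple KEYWORD = VALUE pairs from a PDS3 catalog file,
        preserving order and repeats (e.g., OBJECT/END_OBJECT markers).
        Multi-line quoted values are beyond this sketch."""
        pairs = []
        with open(path, "r", errors="replace") as f:
            for line in f:
                line = line.strip()
                if line == "END":
                    break
                if "=" in line and not line.startswith("/*"):
                    key, _, value = line.partition("=")
                    pairs.append((key.strip(), value.strip().strip('"')))
        return pairs

    if __name__ == "__main__":
        # MISSION.CAT is cited on the slide above; the path is illustrative.
        for key, value in read_catalog("MISSION.CAT"):
            print(f"{key} = {value}")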

12. Conclusion
• Lessons learned:
  • Tests are useful for identifying problems before the first delivery.
  • Tests help push instrument teams to finish their SISs.
  • Tests can highlight problems in instrument processing software in time to make corrections before production begins.
• We recommend that future missions consider conducting E2E tests.
