
MRO End-to-End Test Status

Ray Arvidson and Keith Bennett

November 29, 2006

End-To-End Test Summary
  • Purpose: Exercise MRO science data flow from instrument team archives to posting of PDS archives
  • Approach: Series of delivery tests to identify and correct problems before first delivery to PDS in June of ’07
  • Tests involve HiRISE, CRISM, SHARAD, MCS, CTX, and MARCI MRO Instrument Teams
  • Tests involve Geosciences, Atmospheres, Imaging, and Engineering Nodes
Test Plan – Four Test Phases
  • Test 1 – May ’06
    • Instrument teams provided a single EDR
    • Transferred to the PDS node via an agreed upon mechanism
    • Validated EDR against PDS standards and appropriate SISs
  • Test 2 – July ’06
    • EDR Archive with single EDR generated using planned instrument processes
    • Archive generated as planned (either by instrument team or PDS node)
    • Delivered via planned delivery mechanism
    • Validated EDR and Archive against PDS standards and appropriate SISs
    • Verified availability through PDS catalog search system
  • Test 3 – Oct ’06
    • EDRs - Same as test 2 except with 1 day’s worth of EDRs
    • RDRs – Same as test 1 with a single RDR
    • Both EDRs and RDRs assembled into archive volumes
  • Test 4 – Feb ’07
    • Same as test 3 except with 7 days’ worth of EDRs and RDRs
  • Additional test if needed in ~April/May ‘07
Test Status
  • Test 1 – May ’06
    • Test successfully completed
    • Final report issued
    • Several minor discrepancy reports (DRs) issued
      • All closed in Test 2
    • Lien issued – No CTX/MARCI testing because instrument team was not ready
      • Closed in Test 2
  • Test 2 – July ’06
    • Test successfully completed
    • Several minor discrepancy reports issued
      • All closed in Test 3
    • Preliminary report issued
Test 3 (Oct ’06) Status
  • CRISM/Geosciences
    • Successfully Completed
    • Several Minor Discrepancy Reports related to Labels
    • Some closed, the rest are expected to be closed in test 4
  • SHARAD/Geosciences
    • Successfully Completed
    • Several Minor Discrepancy Reports related to Labels
    • Minor issues with labels (expected since RDR SIS still in peer review)
    • Some closed, the rest are expected to be closed in test 4
  • HiRISE/Imaging
    • Successfully Completed
    • Minor issues with labels (expected since RDR SIS still in peer review)
  • MCS/Atmospheres
    • Successful (not quite finished reviewing all data but no issues seen to date)
  • CTX/MARCI/Imaging
    • Status Pending
Errors and Problems Encountered: Labels Not Meeting PDS Standards or SISs
  • Example:

    SOURCE_PRODUCT_ID = {
      "HRL00002794_00_DF089S_EDR0"
    }

  • Many of these errors were actually in the Instrument teams’ software
  • Correction 1: instrument team updated software
  • Correction 2: Sometimes the SIS was changed instead
  • All errors found in tests 1 and 2 were corrected by test 3
  • Only a few of these types of errors were found in test 3 and are expected to be fixed in test 4
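The label errors above are the kind a simple keyword-level check can catch before delivery. As a minimal sketch (not the mission's actual tooling), the checker below matches each `KEYWORD = VALUE` line of a label against a pattern a SIS might require; the pattern and label text are illustrative, not taken from the real MRO SISs.

```python
import re

# Hypothetical SIS requirement: SOURCE_PRODUCT_ID must be a single
# quoted identifier of uppercase letters, digits, and underscores.
SIS_PATTERNS = {
    "SOURCE_PRODUCT_ID": re.compile(r'^"[A-Z0-9_]+"$'),
}

def check_label(label_text):
    """Return (keyword, value) pairs whose values violate the SIS patterns."""
    problems = []
    for line in label_text.splitlines():
        if "=" not in line:
            continue
        keyword, _, value = (part.strip() for part in line.partition("="))
        pattern = SIS_PATTERNS.get(keyword)
        if pattern and not pattern.match(value):
            problems.append((keyword, value))
    return problems

good = 'SOURCE_PRODUCT_ID = "HRL00002794_00_DF089S_EDR0"'
bad = 'SOURCE_PRODUCT_ID = {"HRL00002794_00_DF089S_EDR0" =}'
print(check_label(good))  # [] - value matches the pattern
print(check_label(bad))   # malformed value is flagged
```

In practice such checks would be driven by the full data dictionary and SIS, but even a small pattern table catches software-generated formatting errors early.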
Errors and Problems Encountered: Label References
  • Label referencing a non-existent file
    • Example from SHARAD:

Line 45 – referenced file:

^PROCESSED_ECHO_TABLE = "R_0083201_001_SS05_700_A.DAT"

The actual file is "R_0083201_001_SS05_700_A000.DAT"

  • Correction 1: instrument team updated software
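A check for this class of error is straightforward to automate: every `^POINTER = "FILE"` statement in a label must name a file that actually exists next to the label. The sketch below is illustrative (file names are the SHARAD example from the slide; the function is not the project's actual validator).

```python
import re
import tempfile
from pathlib import Path

# Matches PDS-style file pointers such as:
#   ^PROCESSED_ECHO_TABLE = "R_0083201_001_SS05_700_A.DAT"
POINTER_RE = re.compile(r'^\^(\w+)\s*=\s*"([^"]+)"')

def missing_pointer_targets(label_path):
    """Return (pointer, file) pairs whose target file is absent on disk."""
    label_path = Path(label_path)
    missing = []
    for line in label_path.read_text().splitlines():
        m = POINTER_RE.match(line.strip())
        if m and not (label_path.parent / m.group(2)).exists():
            missing.append((m.group(1), m.group(2)))
    return missing

# Demo: the delivered file has the A000 suffix, but the label omits it.
tmp = Path(tempfile.mkdtemp())
(tmp / "R_0083201_001_SS05_700_A000.DAT").touch()
(tmp / "test.lbl").write_text(
    '^PROCESSED_ECHO_TABLE = "R_0083201_001_SS05_700_A.DAT"\n')
result = missing_pointer_targets(tmp / "test.lbl")
print(result)  # the mismatched reference is reported
```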
Errors and Problems Encountered: Data Dictionary Errors
  • Several new keywords or values were missing from Data Dictionary, despite generation of MRO Local Data Dictionary
    • Some new keywords or values were not entered by the time of the test
    • Discrepancies between SIS keywords and label keywords
    • Correction 1: Timely and accurate data dictionary updates reduced errors by test 3
    • Correction 2: Updates of SISs or automatic label generation software by Instrument teams reduced keyword errors
  • Highlights current problems with managing local data dictionaries!
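Conceptually, the dictionary check reduces to set membership: a label keyword must appear in either the planetary data dictionary or the mission's local dictionary. The keyword sets below are made up for illustration and do not reflect the real PDS or MRO dictionaries.

```python
# Hypothetical dictionary contents, for illustration only.
PDS_DICTIONARY = {"PDS_VERSION_ID", "RECORD_TYPE", "TARGET_NAME"}
MRO_LOCAL_DICTIONARY = {"MRO:OBSERVATION_NUMBER"}

def undefined_keywords(label_keywords):
    """Return label keywords defined in neither dictionary, in label order."""
    known = PDS_DICTIONARY | MRO_LOCAL_DICTIONARY
    return [kw for kw in label_keywords if kw not in known]

label = ["PDS_VERSION_ID", "MRO:OBSERVATION_NUMBER", "MRO:NEW_KEYWORD"]
print(undefined_keywords(label))  # ['MRO:NEW_KEYWORD']
```

The hard part in the MRO tests was not the check itself but keeping the local dictionary updated in step with the SISs and label-generation software, which is why timely dictionary updates were listed as a correction.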
Errors and Problems Encountered: LVTool Errors
  • Label Validation Tool (LVTool) sometimes failed to correctly handle valid keywords/values, although none prevented LVTool from running to completion.
  • Example:

“LVTool reports an error on a BIT_COLUMN when an ITEMS field is included. LVTool indicates a BITS field is needed, but the BITS field is optional when there is an ITEMS field (as per PDS Standards).”

  • Correction 1: Some errors corrected with updated version of LVTool
  • Correction 2: Some have been deferred to the new Validation Tool
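The corrected rule from the quoted discrepancy report is small enough to state directly in code. This is a sketch of the rule's logic only (the buggy LVTool behavior demanded BITS unconditionally); the field names follow the DR, but the function is not LVTool itself.

```python
def bit_column_errors(bit_column):
    """Validate one BIT_COLUMN object, given as a dict of its ODL fields.

    Per the PDS standard cited in the DR, BITS is required only when the
    BIT_COLUMN has no ITEMS field.
    """
    errors = []
    if "BITS" not in bit_column and "ITEMS" not in bit_column:
        errors.append("BIT_COLUMN requires BITS when ITEMS is absent")
    return errors

print(bit_column_errors({"NAME": "FLAGS", "ITEMS": 8}))  # [] - valid
print(bit_column_errors({"NAME": "FLAGS"}))              # error reported
```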
Data Transfer and Access Results
  • No data transfer / access errors
  • Data Transfer/Access Examples
    • CRISM – Test 3 – ~7GB via Data Brick
    • SHARAD – Test 2 – ~3GB via FTP from Italy
    • HiRISE – Test 3 – ~8GB validated via remote access to Data Node
    • CTX/MARCI – Test 2 – 300MB
    • MCS – Test 3 – 630MB via FTP
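The deliveries above arrived without transfer or access errors. One common way such bulk deliveries are verified (a hedged sketch, not necessarily the MRO procedure) is a checksum manifest: the sender ships a list of expected digests, and the receiver recomputes them over the delivered files. File names and manifest format here are illustrative.

```python
import hashlib

def md5_of(data: bytes) -> str:
    """MD5 hex digest of a byte string."""
    return hashlib.md5(data).hexdigest()

def verify_manifest(manifest, files):
    """Return names whose delivered bytes are missing or fail the checksum.

    manifest: {name: expected_md5}; files: {name: delivered bytes}.
    """
    return [name for name, digest in manifest.items()
            if md5_of(files.get(name, b"")) != digest]

# Demo: one file delivered intact, one file missing from the delivery.
files = {"EDR_001.DAT": b"test data"}
manifest = {"EDR_001.DAT": md5_of(b"test data"),
            "EDR_002.DAT": md5_of(b"other data")}
print(verify_manifest(manifest, files))  # ['EDR_002.DAT']
```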
PDS Data Flow and Web Access Results
  • Updated PDS catalog with MRO test data
  • Demonstrated PDS web access to MRO catalog data and science data at the nodes
  • No major problems
MRO Test Summary
  • Next Step - Test 4 – February ’07
  • Objectives:
    • Exercise data transfer
    • Exercise data validation
    • Reduce potential first-delivery problems
    • Useful for pushing instrument teams to finish SISs
Future PDS End-2-End Plans
  • Missions planning to do E2E testing:
    • Phoenix
    • MESSENGER
    • LRO
    • MSL
Key Lessons for the Future
  • E2E reduces first-delivery problems
  • Test timing is crucial
    • Too early and products/SISs not ready
    • Too late and they interfere with operations
  • Test goals need to be clear:
    • Test products (production, validation, etc.)
    • Test handling (delivery, archive assembly, publication)
  • Number of tests depends on test goals and product/instrument complexity
PDS Lessons and Questions for Future E2E Tests
  • Improve Inter-PDS test communication
    • MRO E2E inter-PDS communication was often slow and relied too much on the mission
    • Suggest having PDS-Only E2E telecons outside mission archive working groups
  • Discussion Topics:
    • Does PDS or the mission drive the tests?
    • Who is the customer of the test results?
    • What is the role of PDS test coordinator?
    • Should there be a common definition of what constitutes a Discrepancy Report?