
Parallel Tomography



Presentation Transcript


  1. Parallel Tomography Shava Smallen SC99

  2. What are the Computational Challenges? • Quick turnaround time • Resource availability and utilization • Network performance • Coallocation • Transparent execution • Single login • Remote data access • Security

  3. GTOMO • Developed by a collaboration of NCMIR researchers and computer scientists to address the computational challenges of telescience by leveraging distributed resources • GTOMO is an embarrassingly parallel implementation of tomography.

  4. GTOMO Description • projections are preprocessed into sinograms • each sinogram is individually processed into a slice • [Figure: pipeline from projections to sinograms to slices]
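The pipeline on this slide can be sketched as a small Python program. Because each sinogram produces its slice independently, the reconstruction step parallelizes trivially. The function names and toy arithmetic here are illustrative assumptions, not GTOMO's actual API.

```python
# Sketch of GTOMO's embarrassingly parallel structure (illustrative only):
# projections are preprocessed into sinograms, then each sinogram is
# reconstructed into a slice with no coordination between tasks.
from concurrent.futures import ThreadPoolExecutor

def preprocess(projection):
    # stand-in for the projection -> sinogram preprocessing step
    return [x * 2 for x in projection]

def reconstruct_slice(sinogram):
    # stand-in for the sinogram -> slice reconstruction step
    return sum(sinogram)

def gtomo_run(projections, workers=4):
    sinograms = [preprocess(p) for p in projections]
    # each call to reconstruct_slice is independent, so they can run in parallel
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reconstruct_slice, sinograms))
```

`pool.map` preserves input order, so slice i always corresponds to sinogram i regardless of which worker computed it.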

  5. GTOMO Architecture • [Figure: a driver distributes work to ptomo processes; a reader and a writer move data between the ptomo processes and disk; solid lines = data flow, dashed lines = control] • Off-line work queue scheduling
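The driver/ptomo relationship on the architecture slide is a classic work queue: the driver fills a queue of sinograms, and each ptomo worker pulls the next item as soon as it finishes its previous one, so faster workers naturally process more items. A minimal sketch (not GTOMO's actual code):

```python
# Minimal work-queue scheduling sketch: workers drain a shared queue,
# self-balancing the load without a central assignment plan.
import queue
import threading

def work_queue_run(tasks, n_workers, process):
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()  # grab the next unit of work
            except queue.Empty:
                return              # queue drained: this worker is done
            r = process(t)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return sorted(results)  # completion order is nondeterministic
```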

  6. Grid Enabled • GTOMO is implemented using components of the Globus toolkit • distributed resources • single login • security • Uses AppLeS to achieve performance • coallocation of workstations and immediately available supercomputer nodes

  7. AppLeS = Application Level Scheduling • AppLeS + application = self-scheduling application • scheduling decisions based on • dynamic information • available from Network Weather Service (NWS) • static application and system information • Methodology • select sets of resources • plan possible schedules for each set of feasible resources • predict the performance for each schedule • implement best predicted schedule on selected infrastructure
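The AppLeS methodology above (select resource sets, plan a schedule for each, predict its performance, run the best prediction) can be sketched as a small selection loop. The cost model below is a hypothetical stand-in for what AppLeS would derive from NWS measurements and static system information:

```python
# Illustrative AppLeS-style schedule selection: enumerate feasible
# resource sets, predict a completion time for each, pick the minimum.
# The prediction formula and resource fields are assumptions for this
# sketch, not the real AppLeS model.
def predict_time(n_tasks, resource_set):
    # predicted makespan: work divided over aggregate speed,
    # plus a fixed per-resource startup cost
    speed = sum(r["speed"] for r in resource_set)
    return n_tasks / speed + 0.5 * len(resource_set)

def best_schedule(n_tasks, candidate_sets):
    # implement the schedule with the best predicted performance
    return min(candidate_sets, key=lambda s: predict_time(n_tasks, s))
```

Note how the startup cost term makes the choice workload-dependent: for a small job a single fast resource can beat a large set whose startup overhead outweighs the added speed.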

  8. AppLeS for GTOMO • Resource selection • NCMIR interactive workstations • NPACI supercomputer time • We have developed a scheduler which coallocates program execution over workstations and immediately available supercomputer nodes for improved execution performance

  9. Resource Selection • Strategy: • submit GTOMO to available workstations • use dynamic information available from the supercomputer’s batch scheduler to determine a job request which will be started immediately • available on Maui Scheduler • Utilizes computational resources available to a typical research lab

  10. Preliminary Experiment Results • Resources • 6 workstations available at Parallel Computation Laboratory (PCL) at UCSD • immediately available nodes on SDSC SP-2 (128 nodes) • Maui scheduler exports the number of immediately available nodes • e.g. 5 nodes available for the next 30 mins; 10 nodes available for the next 10 mins
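The Maui scheduler's exported information on this slide takes the form of offers like "5 nodes for the next 30 mins" or "10 nodes for the next 10 mins". One simple way to use it, sketched below under the assumption of a uniform slices-per-node-minute rate, is to request the offer that completes the most work within its window:

```python
# Illustrative selection among immediately available node offers.
# An offer is a (nodes, minutes) tuple as exported by the batch scheduler;
# the work-rate model is an assumption for this sketch.
def pick_offer(offers, slices_per_node_min=1.0):
    # choose the offer maximizing total node-minutes of work
    return max(offers, key=lambda o: o[0] * o[1] * slices_per_node_min)
```

With the slide's example, 5 nodes x 30 mins = 150 node-minutes beats 10 nodes x 10 mins = 100, so the smaller but longer-lived allocation wins under this model.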

  11. Allocation Strategies/Experiment Setup • 4 strategies compared: • SP2Immed/WS: workstations and immediately available SP-2 nodes • WS: workstations only • SP2Immed: immediately available SP-2 nodes only • SP2Queue(n): traditional batch queue submit using n nodes • experiments performed in production environment • ran experiments in sets, each set contains all strategies • e.g. SP2Immed, SP2Immed/WS, WS, SP2Queue(8) • within a set, experiments ran back-to-back

  12. Experiment Results (8 nodes on SP-2)

  13. Experiment Results (16 nodes on SP-2)

  14. Next Steps • Develop contention model to address network overloading which includes • NWS bandwidth measurements • network capacity information • Expansion of platform • reservations (e.g. GARA scheduled resources) • S3 • On-line tomography (NPACI Telescience Alpha Project)

  15. People • AppLeS: (http://apples.ucsd.edu) • Shava Smallen, Jim Hayes, Fran Berman, Rich Wolski, Walfredo Cirne • NCMIR: (http://www-ncmir.ucsd.edu) • Mark Ellisman, Marty Hadida-Hassan, Jaime Frey • Globus: (http://www.globus.org) • Carl Kesselman, Mei-Hui Su • ssmallen@cs.ucsd.edu
