HEP and NP SciDAC projects: Key ideas presented in the SciDAC II white papers

This presentation summarizes key ideas from the SciDAC II white papers: the importance of continued code development, collaboration among scientists, support for code users, and the need for computational experts. It also highlights recommendations for a follow-on program, multidisciplinary collaboration, verification and validation (V&V) efforts, embedded personnel, and an end-to-end computational science infrastructure.


Presentation Transcript


  1. HEP and NP SciDAC projects: Key ideas presented in the SciDAC II white papers (Robert D. Ryne)

  2. Projects
  • Lattice QCD
  • Supernova modeling
  • Particle Physics Data Grid
  • Accelerator modeling

  3. National Computational Infrastructure for Lattice Gauge Theory
  • “The SciDAC Program has been enormously successful in producing community codes for terascale computing.”
  • “It has done so by providing the support needed to enable apps scientists to work on such codes, and by encouraging collaborations among apps scientists, applied mathematicians and computer scientists in their development.”
  • “We strongly recommend that this very successful approach be continued.”
  • “…it is important that these codes continue to evolve, since once codes become stagnant, they quickly become obsolete.”
  • “It is also important that codes be properly maintained and ported to new platforms, and that appropriate support be provided for the code users.”
  • “…we strongly recommend that DOE provide hardware with the capability and capacity to meet the needs of the SciDAC apps areas, either as part of an extension of the SciDAC Program, or as part of a separate program.”

  4. National Computational Infrastructure for Lattice Gauge Theory, cont.
  • “While recognizing that the bulk of computing resources for areas other than LQCD will be provided by commercial supercomputers located at DOE centers, [the LQCD project] has clearly demonstrated that for our field, designing hardware and software that specifically takes into account the structure of the computation is highly advantageous… We expect this approach to be useful in some other fields, and we urge that work in this direction by us and by others be supported in an extended SciDAC program.”
  • Future plans include:
  • Continued software development of the QCD API
  • Collaboration with TOPS on a multigrid algorithm for Dirac operator inversion (a toy sketch of the two-grid idea follows this slide)
  • Building a unified user environment; running the BNL, FNAL, and JLab compute facilities as a single meta-facility (using grid tools developed by groups such as PPDG)
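
The production lattice-QCD work referred to here (adaptive multigrid for the Dirac operator, developed with TOPS) is far more elaborate than anything that fits on a slide, but the core two-grid idea it builds on can be illustrated on a toy problem: smooth the error, restrict the residual to a coarse grid, solve there, interpolate the correction back, and smooth again. The following minimal Python sketch applies that cycle to a 1D Laplacian standing in for the lattice operator; every function name and parameter is hypothetical and illustrative, and none of it belongs to the actual QCD API.

```python
import numpy as np

def laplacian_1d(n):
    # Toy stand-in for the lattice operator: 1D discrete Laplacian,
    # Dirichlet boundaries. The real target is the Wilson-Dirac matrix.
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def prolongation(n_c):
    # Linear-interpolation prolongation P: coarse (n_c) -> fine (2*n_c + 1).
    n_f = 2 * n_c + 1
    P = np.zeros((n_f, n_c))
    for i in range(n_c):
        P[2 * i, i] += 0.5        # left fine neighbour
        P[2 * i + 1, i] = 1.0     # coincident fine point
        P[2 * i + 2, i] += 0.5    # right fine neighbour
    return P

def jacobi(A, x, b, sweeps, omega=2.0 / 3.0):
    # Weighted-Jacobi smoother: damps high-frequency error components.
    d = np.diag(A)
    for _ in range(sweeps):
        x = x + omega * (b - A @ x) / d
    return x

def two_grid_solve(A, b, cycles=30, pre=3, post=3):
    # Two-grid cycle: pre-smooth, restrict the residual, solve the
    # coarse problem exactly, interpolate the correction, post-smooth.
    n_c = (len(b) - 1) // 2
    P = prolongation(n_c)
    R = 0.5 * P.T                  # full-weighting restriction
    A_c = R @ A @ P                # Galerkin coarse-grid operator
    x = np.zeros_like(b)
    for _ in range(cycles):
        x = jacobi(A, x, b, pre)
        e_c = np.linalg.solve(A_c, R @ (b - A @ x))
        x = x + P @ e_c            # coarse-grid correction
        x = jacobi(A, x, b, post)
        if np.linalg.norm(b - A @ x) < 1e-10 * np.linalg.norm(b):
            break
    return x

if __name__ == "__main__":
    n = 127                        # fine grid: 2 * 63 + 1 points
    A = laplacian_1d(n)
    b = np.random.default_rng(0).standard_normal(n)
    x = two_grid_solve(A, b)
    print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

The attraction of multigrid for this application is mesh-independent convergence: the number of cycles stays roughly constant as the lattice is refined, whereas plain Krylov solvers slow down dramatically on the ill-conditioned Dirac systems that arise at light quark masses.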

  5. Supernova Modeling: TSI recommendations
  • “We strongly recommend a 5-year SciDAC follow-on program.”
  • SAPP/ISIC comments:
  • “We envision that our primary mode of multidisciplinary collaboration will be through SAPP.” [lift the 25% SAPP funding cap]
  • “…we also envision significant continued interaction with the ISICs.”
  • “successes…often the result of combined efforts of SAPP & ISIC teams.”
  • “We imagine SAPP & ISICs as equal partners in the next SciDAC round.”
  • “The next generation of SciDAC should focus on apps that can benefit from computational capabilities at the 100+ TFLOP level.”
  • Verification & Validation (V&V) comments:
  • “DOE/SC should demand rigorous V&V from its application teams”
  • “demand that work that has the imprimatur of SciDAC funding should manifest and maintain rigorous and continuing V&V efforts, including publication of results of V&V test problems… insists that apps teams publish numerical details sufficient to allow replication of the results by other researchers…”

  6. Supernova Modeling: TSI recommendations, cont.
  • “…in order for the apps teams to fully leverage the efforts of the ISIC teams, the apps must have sufficient embedded personnel.”
  • Software development by the ISICs should be performed in close collaboration with the applications. “We have had limited success in incorporating products that were developed and then ‘thrown over the fence.’”
  • Take a long-term view of software developed by apps, SAPP, and ISICs, beginning with R&D and continuing through deployment and maintenance.
  • “…we must establish an end-to-end computational science infrastructure to enable the scientific workflow, from simulation data generation and storage, data movement, data analysis, and scientific visualization.”

  7. Supernova Modeling: SNSC summary
  • “Having the support of people with computational expertise is essential to our (and SciDAC’s) goals.”
  • “The collaboration works best when there is a sustained effort in which the computer scientists become involved with the science goals (joint publications are a good indicator). Geographical proximity also helps.”
  • “There is a real need in the community for a special breed of computational *scientist* …trained in the intricacies of writing, optimizing, maintaining, and running large codes.”
  • “One of the major goals of SciDAC II should be the training of this next generation of specialists.”
  • “As the codes mature… ISICs in optimization, data management, and visualization (sorely needed and still absent) will also be beneficial.”

  8. HEP Collaboratories Future
  • HEP and ASCR have jointly funded the Particle Physics Data Grid (PPDG) program.
  • PPDG works closely with the NSF-funded GriPhyN and iVDGL projects to create a working Grid environment for HEP and NP.
  • Among the results are Grid3 (late 2004), the start of the Open Science Grid (now), and ongoing vigorous CS-HENP collaboration.
  • PPDG-OSG already supports massive simulations for the LHC (simulation is not I/O intensive and is run by small teams).
  • Major challenges must be addressed to support data analysis (I/O intensive, many users).
  • Grids are central to the computing for the future of HENP.
  • Strong encouragement from NSF MPS to propose continued work building on the successes of PPDG/GriPhyN/iVDGL.

  9. Accelerator Science and Technology (AST) project
  • Excerpts from the draft white paper:
  • “SciDAC has been an incredibly successful program. A key reason is that it encouraged collaboration and supported the formation of [multidisciplinary teams].”
  • “While the SciDAC AST focused mainly on terascale problems, under the new project the focus will range from the terascale to the petascale depending on the problem.”
  • “Given the time required to develop major software packages as found under SciDAC, we are proposing an initial 5-year project duration…”
  • “…just as SciDAC itself is a coordinated multi-program office activity, so too is the project that we have proposed in this white paper: a coordinated, multi-program office accelerator modeling initiative that builds upon the success of the SciDAC AST project.”

  10. Accelerator Science and Technology (AST) project, cont.
  • Future management structure:
  • Through community discussions and meetings with DOE/SC Associate Directors and program managers, a follow-on to the AST project is being formulated.
  • Coordination of accelerator modeling activities across the offices of DOE/SC is essential to make the most efficient use of SC resources.
  • Details are being worked out (e.g., one proposal or multiple proposals).
  • The community will formulate a science-driven plan; coordination details will be decided by the Associate Directors and their program managers.
