
Presented to the HEPAP AARD Subpanel December 21, 2005

Accelerator Modeling: Present capabilities, future prospects, and applications to the HEP Program (with emphasis on SciDAC). Robert D. Ryne Lawrence Berkeley National Laboratory with contributions from Kwok Ko (SLAC) and Warren Mori (UCLA).



  1. Accelerator Modeling: Present capabilities, future prospects, and applications to the HEP Program (with emphasis on SciDAC). Robert D. Ryne, Lawrence Berkeley National Laboratory, with contributions from Kwok Ko (SLAC) and Warren Mori (UCLA). Presented to the HEPAP AARD Subpanel, December 21, 2005.

  2. SciDAC Accelerator Science & Technology (AST) Project: Overview • Goals: develop a new generation of parallel accelerator modeling codes to solve the most challenging and important problems in 21st-century accelerator science & technology; apply the codes to improve existing machines, design future facilities, and help develop advanced accelerator concepts • Sponsored by DOE/SC HEP in collaboration with ASCR • Primary customer: DOE/SC, primarily its HEP program, also NP programs; codes have also been applied to BES projects • Funding: $1.8M/yr (HEP), $0.8M/yr (ASCR/SAPP) • Strong leveraging from SciDAC ISICs • Duration: currently in 5th (final) year • Participants: Labs: LBNL, SLAC, FNAL, BNL, LANL, SNL; Universities: UCLA, USC, UC Davis, RPI, Stanford; Industry: Tech-X Corp.

  3. SciDAC Accelerator Science & Technology (AST) Project: Overview (cont.) • Management: K. Ko and R. Ryne, co-PIs • Senior management team: K. Ko, R. Ryne, W. Mori, E. Ng • Oversight and reviews by DOE/HEP program managers: Vicky White, Irwin Gaines, Craig Tull (present) • The project must advance HEP programs (R. Staffin) through synergistic collaboration with ASCR that advances the state of the art in advanced scientific computing (M. Strayer)

  4. SciDAC AST Overview: Focus Areas • Organized into 3 focus areas: • Beam Dynamics (BD), R. Ryne • Electromagnetics (EM), K. Ko • Advanced Accelerators (AA), W. Mori • All supported by SciDAC Integrated Software Infrastructure Centers (ISICs) and ASCR Scientific Application Partnership Program (SAPP) • Most funding goes to BD and EM; AA is very highly leveraged

  5. Why do we need SciDAC? • Why can't our community do code development by ourselves, as we have done in the past? • Why can't it be done simply as an activity tied to accelerator projects? • Why can't our community follow "business as usual"?

  6. Computational Issues • Large scale: simulations approaching a billion particles and mesh points • Huge data sets: advanced data management & visualization needed • Extremely complex 3D geometry (EM codes) • Complicated hardware with multiple levels of memory hierarchy, > 100K processors • Parallel issues: load balancing, parallel sparse linear solvers, parallel Poisson solvers, particle/field managers
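One of the parallel issues above, load balancing, can be illustrated with a minimal sketch. This is not AST project code, and it is a deliberately simplified static scheme: the greedy longest-processing-time rule, which assigns the largest remaining block of work (e.g., particle count per spatial block) to the least-loaded processor.

```python
# Sketch only: static load balancing by the greedy
# longest-processing-time rule.
import heapq

def balance(loads, nproc):
    """Assign work items to processors, always giving the next-largest
    item to the least-loaded processor. Returns one list of item
    indices per processor."""
    heap = [(0.0, p) for p in range(nproc)]   # (current load, processor id)
    heapq.heapify(heap)
    assignment = [[] for _ in range(nproc)]
    # Place the largest blocks first so small ones can fill the gaps.
    for i in sorted(range(len(loads)), key=lambda i: -loads[i]):
        load, p = heapq.heappop(heap)
        assignment[p].append(i)
        heapq.heappush(heap, (load + loads[i], p))
    return assignment
```

Production beam-dynamics codes rebalance dynamically as particles migrate between domains; this static pass only illustrates the underlying assignment problem.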

  7. Close collaboration with ASCR researchers (ISICs, SAPP) is essential • A hallmark of the SciDAC project is that it is built upon collaboration of application and computational scientists with mathematicians, computer scientists, parallel performance experts, visualization specialists, and other IT experts. • The AST project collaborates with several ISICs: TOPS (Terascale Optimal PDE Solvers), APDEC (Applied Partial Differential Equations Center), TSTT (Terascale Simulation Tools & Technologies), PERC (Performance Evaluation Research Center)

  8. Overview of the 3 focus areas • Beam Dynamics (BD) • Electromagnetic Modeling (EM) • Advanced Accelerators (AA)

  9. Overview of the 3 focus areas • Beam Dynamics (BD) • Electromagnetic Modeling (EM) • Advanced Accelerators (AA)

  10. SciDAC Codes: Beam Dynamics • Set of parallel, 3D multi-physics codes for modeling beam dynamics in linacs, rings, and colliders • IMPACT suite: includes 2 PIC codes (s-based, t-based); mainly for electron and ion linacs • BeamBeam3D: strong-weak, strong-strong, multi-slice, multi-bunch, multi-IP, head-on, crossing-angle, long-range • MaryLie/IMPACT: hybrid application combining MaryLie + IMPACT • Synergia: multi-language, extensible framework; hybrid application involving portions of IMPACT + MXYZPTLK • Langevin3D: particle code for solving the Fokker-Planck equation from first principles
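As a rough illustration of what a Langevin code integrates, here is a sketch of the Euler-Maruyama scheme for dv = -γ v dt + sqrt(2D) dW. The constant drag γ and diffusion D used here are hand-picked stand-ins; a first-principles code like Langevin3D computes these coefficients self-consistently from the evolving beam distribution.

```python
# Sketch only (not Langevin3D): Euler-Maruyama integration of a
# Langevin equation with fixed drag and diffusion coefficients.
import numpy as np

def langevin(v, gamma, D, dt, steps, rng):
    """Evolve velocities under dv = -gamma*v*dt + sqrt(2*D)*dW."""
    for _ in range(steps):
        v = v - gamma * v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(v.shape)
    return v
```

In the stationary state the velocity variance approaches D/γ (fluctuation-dissipation), which gives a quick sanity check on the integrator.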

  11. IMPACT suite becoming widely used: > 300 email contacts in FY05, > 100 already in FY06. Users include RAL, PSI, GSI, KEK, SLAC, LBNL, LANL, Tech-X, FNAL, ANL, ORNL, MSU, BNL, JLab, Cornell, and NIU.

  12. SciDAC code development involves large, multidisciplinary teams. Example: MaryLie/IMPACT code

  13. Development, reuse, and synthesis of code components. Examples: Synergia, e-cloud capability

  14. New algorithms and methodologies are key. Examples: (1) high aspect ratio Poisson solver; (2) self-consistent Langevin/Fokker-Planck. Error in the computed electric field of a Gaussian distribution of charge (x=1mm and y=500mm): even using a grid size of 64x8192, the standard method is less accurate than the Integrated Green Function method on a 64x64 grid. Also shown: the first-ever 3D self-consistent Langevin/Fokker-Planck simulation (self-consistent diffusion coefficients vs. velocity, compared with the Spitzer approximation).
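The idea behind the Integrated Green Function method can be sketched in one dimension (an illustrative analogue, not the 2D/3D solver used in the project): instead of sampling the Green function at cell centers, integrate it analytically over each source cell, which makes the Green-function convolution exact for piecewise-constant charge density.

```python
# 1D sketch of the Integrated Green Function (IGF) idea. For
# phi'' = -rho the free-space Green function is G(x) = -|x|/2; the IGF
# kernel is its analytic integral over one source cell of width h.
def igf_kernel(x, h):
    """Integral of G over a cell of width h whose center is a distance x away."""
    if abs(x) < h / 2:
        return -(x * x + h * h / 4) / 2   # field point inside the cell
    return -h * abs(x) / 2                # field point outside the cell

def potential(centers, rho, h, x):
    """Potential at x from piecewise-constant density rho on the cells."""
    return sum(r * igf_kernel(x - c, h) for c, r in zip(centers, rho))
```

For a uniform slab of charge the IGF potential then matches the analytic result at the grid points exactly, whereas a point-sampled kernel degrades on coarse or high-aspect-ratio grids.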

  15. SciDAC beam dynamics applications benefit DOE/SC programs, especially HEP • Beam-beam simulation of Tevatron, PEP-II, LHC, RHIC • ILC damping rings (space charge, wigglers) • FNAL Booster losses • CERN PS benchmark study • RIA driver linac modeling • SNS linac modeling • LCLS photoinjector modeling • CERN SPL (proposed proton driver) design • J-PARC commissioning • Publications: 23 refereed papers since 2001 (including 5 Phys. Rev. Lett., 10 PRST-AB, 4 NIM-A, 2 J. Comp. Phys., Computer Physics Comm.), plus numerous conference proceedings papers • USPAS course on computational methods in beam dynamics

  16. Examples: Collider modeling using BeamBeam3D • LHC beam-beam simulation (νx1 = νx2 = νy1 = νy2 = 0.31, ξ0 = –0.0034) • PEP-II luminosity calculation shows the importance of multi-slice modeling (J. Qiang; Y. Cai, SLAC; K. Ohmi, KEK) • First-ever 1M-particle, 1M-turn strong-strong beam-beam simulation (J. Qiang, LBNL) • Parameter studies of antiproton lifetime in the Tevatron • Code scalability depends strongly on the parallelization methodology (J. Qiang)
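For context on what a beam-beam code must evaluate, here is the standard closed-form kick from a round Gaussian opposing beam in the weak-strong limit. This is a textbook formula, not BeamBeam3D itself, which computes the fields self-consistently for the general (flat-beam, multi-slice, strong-strong) cases listed above; N is the opposing bunch population, r0 the classical particle radius, gamma the Lorentz factor, and sigma the transverse rms beam size.

```python
# Illustrative sketch: weak-strong beam-beam kick from a round
# Gaussian beam, Delta r' = -(2 N r0 / gamma) (1 - exp(-r^2/2sigma^2)) / r.
import math

def beam_beam_kick(x, y, N, r0, gamma, sigma):
    """Angular kick (dx', dy') on a test particle at (x, y)."""
    r2 = x * x + y * y
    if r2 == 0.0:
        return (0.0, 0.0)   # field vanishes on axis by symmetry
    f = -(2 * N * r0 / gamma) * (1 - math.exp(-r2 / (2 * sigma**2))) / r2
    return (f * x, f * y)
```

For r much smaller than sigma the kick is linear, dx' ≈ -(N r0 / (gamma sigma^2)) x, which is what sets the linear beam-beam tune shift.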

  17. ILC damping ring modeling using ML/I Results of MaryLie/IMPACT simulations of an ILC “dog-bone” damping ring (DR) design showing space-charge induced emittance growth using different space-charge models. Space charge is important for the ILC DR in spite of the high energy because of the combination of small emittance and large (16 km) circumference. Top (nonlinear space charge model): the beam exhibits small emittance growth. Bottom (linear space charge model): the beam exhibits exponential growth due to a synchro-betatron resonance. The instability is a numerical artifact caused by the simplified (linear) space-charge model. (M. Venturini, LBNL)

  18. FNAL booster modeling using Synergia FNAL booster simulation results using Synergia showing the merging of 5 microbunches. SciDAC team members are working closely with experimentalists at the booster to help understand and improve machine performance. (P. Spentzouris and J. Amundson, FNAL; J. Qiang and R. Ryne, LBNL)

  19. Beam Dynamics under SciDAC 2 (HEP program) • Support/maintain/extend successful codes developed under SciDAC 1 (BD, EM, AA) • Develop new capabilities to meet HEP priorities: LHC, ILC, Tevatron, PEP-II, FNAL main injector, booster, proton driver • Self-consistent 3D simulation of: e-cloud, e-cooling, IBS, CSR • Start-to-end modeling with all relevant physical effects • Enable parallel, multi-particle beam dynamics design & optimization • Performance and scalability optimization on platforms up to the petascale (available by the end of the decade) • Couple parallel beam dynamics codes to commissioning, operations, and beam experiments

  20. Overview of the 3 focus areas • Beam Dynamics (BD) • Electromagnetic Modeling (EM) • Advanced Accelerators (AA)

  21. SciDAC AST – Electromagnetics Under SciDAC AST, the Advanced Computations Dept. @ SLAC is in charge of the Electromagnetics component to: • Develop a comprehensive suite of parallel electromagnetic codes for the design and analysis of accelerators (Ron's talk) • Apply the new simulation capability to accelerator projects across SC, including those in HEP, NP, and BES (Ron's talk) • Advance computational science to enable terascale computing through ISICs/SAPP collaborations (this talk)

  22. ACD's ISICs/SAPP Collaborations ACD is working with the TOPS, TSTT, and PERC ISICs as well as SAPP researchers on 6 computational science projects involving 3 national labs and 6 universities. • Parallel Meshing – TSTT (Sandia, U Wisconsin/PhD thesis) • Adaptive Mesh Refinement – TSTT (RPI) • Eigensolvers – TOPS (LBNL), SAPP (Stanford/PhD thesis, UC Davis) • Shape Optimization – TOPS (UT Austin, Columbia, LBNL), TSTT (Sandia, U Wisconsin) • Visualization – SAPP (UC Davis/PhD thesis) • Parallel Performance – PERC (LBNL, LLNL)

  23. Parallel Meshing & Adaptive Mesh Refinement • Parallel meshing is needed for generating large meshes to model multiple cavities in the ILC superstructure & cryomodule • Adaptive mesh refinement improves the accuracy & convergence of frequency and wall-loss (Q) calculations (example: RIA RFQ)

  24. Eigensolvers & Shape Optimization • A complex eigensolver (e.g., implicitly restarted Arnoldi, SOAR) for treating external coupling is essential for computing HOM damping in ILC cavities; Omega3P handles lossless and lossy materials, periodic structures, and external coupling • Shape optimization replaces the manual, iterative process of designing cavities with specific goals subject to constraints, via a self-consistent loop of geometric modeling, meshing, Omega3P solves, and sensitivity analysis
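To show the kind of targeted eigenproblem involved, here is a toy sketch of shift-invert iteration, which picks out the mode whose eigenvalue lies nearest a chosen shift (e.g., a target frequency). The 5x5 matrix in the test is a hypothetical stand-in for a discretized cavity operator; production solvers use implicitly restarted Arnoldi on huge sparse matrices with sparse factorizations, never a dense explicit inverse.

```python
# Toy sketch: shift-invert power iteration for the eigenvalue of a
# symmetric matrix A closest to a target shift sigma.
import numpy as np

def nearest_eigenpair(A, sigma, iters=200):
    """Eigenpair of symmetric A with eigenvalue closest to sigma."""
    n = A.shape[0]
    # Power iteration on (A - sigma I)^-1 amplifies the mode whose
    # eigenvalue lies nearest sigma.
    B = np.linalg.inv(A - sigma * np.eye(n))
    v = np.ones(n)
    for _ in range(iters):
        v = B @ v
        v /= np.linalg.norm(v)
    lam = v @ A @ v          # Rayleigh quotient recovers the eigenvalue
    return lam, v
```

Krylov variants (Lanczos/Arnoldi) build a subspace instead of a single vector, but the shift-invert spectral transformation is the same mechanism.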

  25. Visualization & Parallel Performance • Visualization is critical to mode analysis in complex 3D cavities, e.g., mode rotation effects • Parallel performance studies (solve & postprocess breakdown, communication patterns) are needed to maximize code efficiency and optimize use of computing resources

  26. Proposed Projects for SciDAC 2 SLAC will develop the NEXT level of simulation tools for NEXT-generation SC accelerators (ILC, LHC, RIA, SNS) by continuing to advance computational science in collaboration with the ISICs/SAPP component of SciDAC • Parallel adaptive h-p-q refinement, where h is the mesh size, p the order of the FE basis, and q the order of the geometry model • Parallel shape optimization (goals with constraints) and prediction (cavity deformations from HOM measurements) • Parallel particle simulation on unstructured grids for accurate device modeling (RF guns, klystrons) • Integrated electromagnetic/thermal/mechanical modeling for complete design and engineering of cavities • A parallel, interactive visualization cluster for mode analysis and particle simulations

  27. Overview of the 3 focus areas • Beam Dynamics (BD) • Electromagnetic Modeling (EM) • Advanced Accelerators (AA)

  28. Recent advances in modeling advanced accelerators: plasma-based acceleration and e-clouds W. B. Mori, C. Huang, W. Lu, M. Zhou, M. Tzoufras, F. S. Tsung, V. K. Decyk (UCLA); D. Bruhwiler, J. Cary, P. Messmer, D. A. Dimitrov, C. Nieter (Tech-X); T. Katsouleas, S. Deng, A. Ghalam (USC); E. Esarey, C. Geddes (LBNL); J. H. Cooley, T. M. Antonsen (U. Maryland)

  29. Accomplishments and highlights: Code development • Four independent high-fidelity particle-based codes • OSIRIS: fully explicit PIC • VORPAL: fully explicit PIC + ponderomotive guiding center • QuickPIC: quasi-static PIC + ponderomotive guiding center • UPIC: framework for rapid construction of new codes (FFT-based); QuickPIC is built on UPIC • Each code or framework is fully parallelized, with dynamic load balancing and particle sorting; each production code has ionization packages for added realism • Effort was made to make the codes scale to 1000+ processors • Highly leveraged
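The basic explicit PIC cycle that these codes generalize is deposit, field solve, gather, push. The sketch below is an electrostatic 1D version in normalized units (plasma frequency = 1), purely illustrative; OSIRIS and VORPAL are fully electromagnetic and 3D.

```python
# Minimal 1D electrostatic PIC sketch: unit-mass electrons (charge -1)
# in a neutralizing ion background on a periodic domain of length L.
import numpy as np

def pic_step(x, v, L, ng, dt):
    """One kick-drift step of the PIC cycle on an ng-cell grid."""
    h = L / ng
    n = len(x)
    # deposit: cloud-in-cell (linear) weighting of electron density
    g = x / h
    i = np.floor(g).astype(int) % ng
    w = g - np.floor(g)
    ne = np.zeros(ng)
    np.add.at(ne, i, 1 - w)
    np.add.at(ne, (i + 1) % ng, w)
    ne *= ng / n                         # normalize: mean density = 1
    rho = 1.0 - ne                       # ions (+1) minus electrons
    # field solve: dE/dx = rho, periodic, via FFT
    k = 2 * np.pi * np.fft.fftfreq(ng, d=h)
    rho_k = np.fft.fft(rho)
    E_k = np.zeros_like(rho_k)
    E_k[1:] = rho_k[1:] / (1j * k[1:])   # k = 0 drops (overall neutrality)
    E = np.fft.ifft(E_k).real
    # gather + push: interpolate E to particles, kick, then drift
    Ep = (1 - w) * E[i] + w * E[(i + 1) % ng]
    v = v - Ep * dt                      # charge/mass = -1
    x = (x + v * dt) % L
    return x, v
```

Initializing a cold plasma with a small sinusoidal displacement makes the particles oscillate at the (normalized) plasma frequency, a standard correctness check for this cycle.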

  30. Full PIC: OSIRIS and VORPAL • Successfully applied to various LWFA and PWFA problems, e.g., colliding laser pulses, particle beams, self-ionized particle-beam wakes, and 3D LWFA simulations • Scale well to thousands of processors

  31. Quasi-static PIC: QuickPIC Code features: • Based on the UPIC parallel object-oriented plasma simulation framework Model features: • Highly efficient quasi-static model for beam drivers • Ponderomotive guiding center + envelope model for laser drivers • Can be 100+ times faster than conventional PIC with no loss in accuracy • ADK model for field ionization Applications: • Simulations for PWFA experiments E157/162/164/164X/167 • Study of the electron cloud effect in the LHC • Plasma afterburner design (afterburner hosing, E164X)

  32. Recent highlights: LWFA simulations using full PIC • Phys. Rev. Lett. by Tsung et al. (September 2004), in which a peak energy of 0.8 GeV and a mono-energetic beam with a central energy of 280 MeV were reported in full-scale 3D PIC simulations • 3 Nature papers (September 2004) in which mono-energetic electron beams with energy near 100 MeV were measured; supporting PIC simulations were presented • SciDAC members were collaborators on two of these Nature publications, and the SciDAC codes OSIRIS and VORPAL were used (VORPAL result on the cover)

  33. Modeling the self-ionized PWFA experiment (E164X, located in the FFTB) with QuickPIC: experiment vs. QuickPIC simulation

  34. Afterburner simulation: 0.5 TeV to 1 TeV in 28 meters (snapshots at s = 0 m and s = 28.19 m) • Simulation done with QuickPIC in 5,000 node-hours • A full PIC run would have taken 5,000,000 node-hours!

  35. Vision for the future: SciDAC 2: High-fidelity modeling of 0.1 to 1 TeV plasma accelerator stages • Physics goals: • A) Modeling 1 to 10 GeV plasma accelerator stages: predicting and designing near-term experiments • B) Extend plasma accelerator stages to the 250 GeV to 1 TeV range: understand physics & scaling laws • C) Use plasma codes to definitively model e-cloud physics: 30 minutes of beam circulation time in the LHC; ILC damping ring • Software goals: • A) Add pipelining to QuickPIC: allow QuickPIC to scale to 1000's of processors • B) Add self-trapped particles to QuickPIC and the ponderomotive guiding center VORPAL packages • C) Improve numerical dispersion* in OSIRIS and VORPAL • D) Scale OSIRIS, VORPAL, and QuickPIC to 10,000+ processors • E) Merge reduced models and full models • F) Add circular and elliptical pipes* to QuickPIC and UPIC for e-cloud • G) Add mesh refinement* to QuickPIC, OSIRIS, VORPAL, and UPIC • H) Develop better data analysis and visualization tools for complicated phase-space data** • *Working with the APDEC ISIC • **Working with the visualization center

  36. In Conclusion… • Q: What is the scope of our research in regard to HEP short/medium/long-range applications? • A: It is mainly short/medium • Capabilities have been developed and codes applied to: • Short term: PEP-II, Tevatron, FNAL Booster, LHC • Medium term: ILC • Long term: exploration of advanced accelerator concepts (these activities are highly leveraged and represent 10% of the SciDAC AST budget)

  37. Final remarks • Future HEP facilities will cost ~$0.5B to ~$10B • High end modeling is crucial to • Optimize designs • Reduce cost • Reduce risk • Given the magnitude of the investment in the facility, the $1.8M investment in SciDAC is tiny, but the tools are essential • Laser/plasma systems are extraordinarily complex • High fidelity modeling, used in concert with theory & experiment, is essential to understand the physics and help realize the promise of advanced accelerator concepts

  38. Acronyms used • SciDAC: Scientific Discovery through Advanced Computing • AST: SciDAC Accelerator Science & Technology project • ASCR: Office of Advanced Scientific Computing Research • BD: Beam Dynamics activities of SciDAC AST • EM: Electromagnetics activities of SciDAC AST • AA: Advanced Accelerator activities of SciDAC AST • ISIC: SciDAC Integrated Software Infrastructure Center • TOPS: Terascale Optimal PDE Solvers center • TSTT: Terascale Simulation Tools and Technologies center • APDEC: Applied Partial Differential Equations center • PERC: Performance Evaluation Research Center • SAPP: Scientific Application Partnership Program (ASCR-supported researchers affiliated with specific SciDAC projects)
