
DOE HEP Physics Program Review June 14-16, 2005 @SLAC


Presentation Transcript


  1. Advanced Computations Department DOE HEP Physics Program Review June 14-16, 2005 @SLAC Kwok Ko * Work supported by U.S. DOE ASCR & HEP Divisions under contract DE-AC02-76SF00515

  2. ACD Mission Formed in 2000 to focus on high-performance computing, with the mission to: • Develop new simulation capability to support accelerator R&D at SLAC & accelerator facilities across SC, • Advance computational science to enable ultra-scale computing on SC’s flagship computers (NERSC, ORNL), • Share resources with the community and educate/train future computational scientists. Support: Base program, SciDAC, accelerator projects, SBIR + others. Personnel: 15 people / 13 FTE (5 computational physicists, 7 computer scientists, 2 graduate students, 1 admin/technical assistant). Output: 3 PhD theses, 5 papers, 3 reports, 30 talks/posters (2003-05)

  3. ACD R&D Overview [Slide diagram] ACD’s three thrusts (Accelerator Modeling, Computational Mathematics, Computing Technologies) connect parallel code development and modeling/simulation to: high performance computing (NERSC, ORNL), SciDAC computational-science partners (LBNL, LLNL, SNL, Stanford, UCD, RPI, CMU, Columbia, UWisconsin), accelerator labs (SLAC, FNAL, ANL, Jlab, MIT, DESY, KEK, PSI), and SBIR (STAR Inc).

  4. Elements of Computational Science Large-scale electromagnetic modeling is enabled by advancing all elements through SciDAC collaborations: CAD/meshing, partitioning, refinement, solvers, analysis, performance optimization, and visualization. (Example: NLC cell design.)

  5. SciDAC ESS Team “Electromagnetic Systems Simulation” SLAC/ACD: Accelerator Modeling: K. Ko, V. Ivanov, A. Kabel, Z. Li, C. Ng, L. Xiao, A. Candel (PSI); Computational Mathematics: L. Lee, L. Ge, E. Prudencio, S. Chen (Stanford); Computing Technologies: N. Folwell, G. Schussman, R. Uplenchwar, A. Guetz (Stanford). ISICs (TSTT, TOPS, PERC) and SAPP partners: LLNL: L. Diachin, D. Brown, D. Quinlan, R. Vuduc; SNL: P. Knupp, K. Devine, L. Fisk, J. Kraftcheck; LBNL: E. Ng, W. Gao, X. Li, C. Yang, P. Husbands, A. Pinar, D. Bailey, D. Gunter; UCD: K. Ma, H. Yu, Z. Bai; RPI: M. Shephard, A. Brewer, E. Seol; UWisconsin: T. Tautges, H. Kim; Stanford: G. Golub; Columbia: D. Keyes; CMU: O. Ghattas, V. Akcelik.

  6. Parallel Code Development “Unstructured Grid and Parallel Computing” Electromagnetics (SciDAC funded), on generalized Yee grid and finite-element discretizations: Omega3P (frequency-domain mode calculation), S3P (scattering-matrix evaluation), Tau3P/T3P (time-domain simulation with excitations), Track3P (particle tracking with surface physics), V3D (visualization/animation of meshes, particles & fields). Beam Dynamics (SLAC supported): TraFiC4 (CSR), weak-strong and strong-strong beam-beam.

  7. Achievements in Accelerator Science (Electromagnetics & Beam Dynamics)

  8. NLC DDS Wakefields Omega3P/Tau3P computed the long-range wakefields in the 55-cell Damped Detuned Structure (DDS) to verify the NLC design’s wakefield suppression by damping and detuning. Omega3P obtains the wakefield as a sum over eigenmodes; Tau3P computes it by direct beam excitation.
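The damping-and-detuning idea lends itself to a small numerical illustration. The sketch below is not Omega3P/Tau3P output: it evaluates the standard circuit-model form of the long-range transverse wake, W(t) = Σn kn sin(ωn t) exp(−ωn t / 2Q), for 55 hypothetical dipole modes whose frequencies follow a Gaussian detuning distribution. All numbers (15 GHz band, 2.5% spread, Q = 1000) are made up for illustration.

```python
# Why "detuning" suppresses the long-range wake: spreading the mode
# frequencies makes the sin(w_n t) terms decohere a few ns behind the
# bunch, while damping (finite Q) mops up the residue.
import numpy as np
from scipy.stats import norm

def wake(t, freqs, Q=1000.0):
    """|Transverse wake| at times t from equal-strength dipole modes:
    W(t) = sum_n sin(w_n t) * exp(-w_n t / (2 Q))."""
    w = 2 * np.pi * np.asarray(freqs)
    t = np.atleast_1d(t)[:, None]
    return np.abs(np.sum(np.sin(w * t) * np.exp(-w * t / (2 * Q)), axis=1))

f0 = 15e9  # X-band-ish dipole band (illustrative value)
# 55 mode frequencies following a Gaussian detuning distribution
# (2.5% rms), built deterministically from Gaussian quantiles.
z = norm.ppf((np.arange(55) + 0.5) / 55)
detuned = f0 * (1 + 0.025 * z)
undetuned = np.full(55, f0)

ts = np.linspace(1.8e-9, 2.2e-9, 101)  # a few ns behind the bunch
rms_det = np.sqrt(np.mean(wake(ts, detuned) ** 2))
rms_und = np.sqrt(np.mean(wake(ts, undetuned) ** 2))
# rms_det is far below rms_und: the detuned sum has decohered.
```

In the real DDS the discrete mode spectrum would eventually recohere, which is why damping is combined with detuning, exactly the design point the slide verifies.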

  9. NLC Dark Current Dark current pulses were simulated for the first time in a 30-cell X-band structure with Track3P and compared with data. The simulation shows an increase in dark current during the pulse risetime due to field enhancement from dispersive effects. (Track3P dark-current simulation: red = primary particles, green = secondary particles; dark current shown at 3 pulse risetimes: 10, 15 and 20 nsec, compared with data.)

  10. ILC Cavity Design An international collaboration (DESY, KEK, SLAC, FNAL, Jlab) is working on a Low-Loss cavity (23% lower cryogenic loss) as a viable option for the ILC linac. SLAC is calculating the HOM damping & multipacting for the DESY and KEK designs. ILC LL 9-cell Cavity Design

  11. ILC Cavity HOM Damping Omega3P is being used to calculate the Qext of dipole modes in the DESY and KEK LL cavity designs. (Figure: partitioned mesh of the LL cavity complex.)

  12. PEP-II Vertex Bellows Damping Omega3P was used to study the effectiveness of ceramic tiles mounted on the bellows convolution to damp localized modes that contribute to HOM heating of the bellows. Bellows modes can be damped to very low Qs (~20-50). (Figures: PEP-II vertex bellows, bellows mode, ceramic tile absorber, dielectric loss.)

  13. LCLS RF Gun Cavity Design ACD provided the dimensions for the LCLS RF gun cavity that meet two important requirements: • minimized dipole and quadrupole fields via a racetrack dual-feed coupler design, • reduced pulse heating by rounding of the z coupling iris. A new parallel Particle-In-Cell (PIC) capability is being developed in T3P for self-consistent modeling of RF guns needed for the LCLS upgrade, future light sources and FELs. (Figure: quadrupole field component (βr)/mm.)

  14. LCLS CSR Effects LCLS bunch compressor (with P. Emma): predict FEL performance (slice gain length, slice saturation power) in the self-consistent Coherent Synchrotron Radiation (CSR) regime for different compressor settings. Coherent edge radiation: a field-viewer module for TraFiC4 allows study of the spatial & temporal behaviour of the detector signal.

  15. Tevatron Beam-Beam Simulation Tevatron (with Y. Cai and T. Sen): calculate actual lifetimes and lifetime signatures for the machine at injection and collision for different machine parameters. A new version of the parallel beam-beam framework PLIBB: • allows billions of particle-turns, • resolves ~100 h lifetime (collision case!), • handles chromaticity exactly, • strong-strong being integrated. Example results: low particle loss rates at collision; lifetime enhancement with lowered chromaticity.
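As a generic illustration of the weak-strong model underlying such tracking (PLIBB’s actual code and API are not shown here), the sketch below applies the textbook round-Gaussian beam-beam kick once per turn on top of a linear one-turn map. The function names and parameter values are illustrative, not PLIBB’s.

```python
# Weak-strong model: the strong beam is frozen as a round Gaussian charge
# distribution; each weak-beam particle receives one kick per turn.
import numpy as np

def beambeam_kick(x, y, k=1e-3, sigma=1.0):
    """Round-Gaussian kick: dr' = -k (1 - exp(-r^2 / 2 sigma^2)) / r,
    with k = 2 N r0 / gamma; linear for r << sigma, ~1/r for r >> sigma."""
    r2 = x * x + y * y
    if r2 == 0.0:
        return 0.0, 0.0
    f = -k * (1.0 - np.exp(-r2 / (2.0 * sigma ** 2))) / r2
    return f * x, f * y

def track(x, px, turns, tune=0.31, k=1e-3, sigma=1.0):
    """Track one weak-beam particle: linear rotation + beam-beam kick."""
    c, s = np.cos(2 * np.pi * tune), np.sin(2 * np.pi * tune)
    for _ in range(turns):
        x, px = c * x + s * px, -s * x + c * px  # linear one-turn map
        dpx, _ = beambeam_kick(x, 0.0, k, sigma)
        px += dpx
    return x, px
```

PLIBB then scales this kind of per-particle kernel to billions of particle-turns in parallel and accumulates the loss statistics from which lifetimes are extracted.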

  16. PSI Cyclotron HOM Analysis First-ever eigenmode analysis of an entire ring cyclotron, carried out as part of PhD research (L. Stingelin) to investigate the beam-cavity interactions in the existing machine and a future upgrade. (Figures: cavity, vacuum chamber, and the newly found mixed modes.)

  17. Advances in Computational Science (SciDAC)

  18. Parallel Meshing (SNL, UWisconsin) To model multiple ILC cavities, a parallel meshing capability has been developed in collaboration with SNL and UWisconsin (PhD thesis). It generates VERY LARGE meshes directly on the supercomputer, overcoming the memory limitation of desktops. (Figure: mesh partitioned across processors 1-4.)

  19. Eigensolvers (LBL, UCDavis, Stanford) With LBL, UCD and Stanford, a comprehensive capability has been under development for solving large, complex RF cavities to accuracies previously not possible. Omega3P handles lossless and lossy materials, periodic structures, and external coupling, using ISIL with refinement, ESIL, Implicitly Restarted Arnoldi, SOAR and a self-consistent loop, built on Krylov subspace methods, domain-specific preconditioners and sparse direct solvers (WSMP, MUMPS, SuperLU). The parallel eigensolver Omega3P has been successfully applied to numerous accelerator cavities and beamline components.
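The class of problem Omega3P targets can be illustrated in a few lines. The sketch below is not Omega3P (a parallel production code): it solves a toy sparse generalized eigenproblem K x = λ M x with SciPy’s shift-invert Lanczos, the same strategy such solvers use to reach interior cavity modes near a target frequency. The matrices are a 1D Laplacian stand-in, not a cavity discretization.

```python
# Generalized eigenproblem K x = lam M x, solved near an interior shift
# sigma: with shift-invert, the eigenvalues closest to sigma converge
# fastest -- the key to computing interior cavity modes.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
e = np.ones(n - 1)
K = sp.diags([-e, 2.0 * np.ones(n), -e], [-1, 0, 1], format="csc")  # "stiffness"
M = sp.identity(n, format="csc")                                    # "mass"

sigma = 0.5  # target shift: find the 4 eigenvalues nearest it
vals, vecs = spla.eigsh(K, k=4, M=M, sigma=sigma, which="LM")

# Exact eigenvalues of this K are 4 sin^2(j pi / (2 (n + 1))).
j = np.arange(1, n + 1)
exact = 4.0 * np.sin(j * np.pi / (2 * (n + 1))) ** 2
nearest = np.sort(exact[np.argsort(np.abs(exact - sigma))[:4]])
ok = np.allclose(np.sort(vals), nearest, atol=1e-8)
```

In Omega3P the same shift-invert step requires the parallel sparse direct solvers (WSMP, MUMPS, SuperLU) named on the slide, since the shifted matrix must be factored at scale.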

  20. Mesh Refinement (RPI) In modeling RIA’s RFQs, Adaptive Mesh Refinement (AMR) with Omega3P provided accuracy gains of 10 in frequency and 2 in wall-loss calculations over standard codes, while using a fraction of the CPU time of the case without AMR. AMR speeds up convergence, thereby minimizing computing resources; more accurate f and Q predictions reduce the number of tuners and the tuning range, and allow for better cooling design. (Figures: wall loss on AMR mesh; frequency and Q0 convergence.)
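Why refinement sharpens frequency predictions can be seen on a toy problem. The sketch below is not Omega3P or its AMR: it uniformly h-refines a 1D model eigenproblem −u'' = λu on [0, π] (exact lowest eigenvalue 1) with linear finite elements, whose eigenvalue error is O(h²), so halving h cuts the error by roughly 4. AMR aims for the same gain while refining only where an error indicator is large, which is why it needs far less CPU time.

```python
# Second-order eigenvalue convergence under h-refinement, linear FEM
# with consistent mass matrix, Dirichlet BCs on [0, pi].
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def lowest_eig(n):
    """Lowest FEM eigenvalue with n interior nodes on [0, pi]."""
    h = np.pi / (n + 1)
    e = np.ones(n - 1)
    K = sp.diags([-e, 2.0 * np.ones(n), -e], [-1, 0, 1], format="csc") / h
    M = sp.diags([e, 4.0 * np.ones(n), e], [-1, 0, 1], format="csc") * (h / 6.0)
    return spla.eigsh(K, k=1, M=M, sigma=0.0, which="LM")[0][0]

err_coarse = abs(lowest_eig(20) - 1.0)
err_fine = abs(lowest_eig(40) - 1.0)
ratio = err_coarse / err_fine  # ~ (h_coarse / h_fine)^2, close to 4
```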

  21. Shape Optimization (CMU, SNL, LBNL) An ongoing SciDAC project is developing a parallel shape-optimization tool to replace the existing manual process of optimizing a cavity design by direct computation. The optimization loop couples the geometric model, meshing, Omega3P and sensitivity analysis (re-meshing is needed only for discrete sensitivity). The capability requires expertise from SciDAC’s ISICs.

  22. Visualization (UCDavis) New graphics tools for rendering LARGE, multi-stream, 3D unstructured data have been developed, and a dedicated visualization cluster will soon be installed, both to support accelerator analysis, for example the mode rotation (in space and time) exhibited by the two polarizations of a damped dipole mode in an ILC cavity.

  23. Dissemination • HEP/SBIR: STAR Inc and ACD are developing GUIs to interface to SLAC’s parallel codes, which are in use at e.g. FNAL and KEK. These codes can potentially replace commercial software (MAFIA, HFSS) at DOE sites, saving over $1 million per year in leases. • USPAS: SciDAC codes and capabilities are shared regularly with the community via the USPAS course “Computational Methods in Electromagnetism”, sponsored by Cornell University and held in Ithaca, NY, June 20 - July 1, 2005. http://uspas.fnal.gov/

  24. Education/Training PhDs completed in ACD: Yong Sun, SCCM, Stanford University, March 2003, “The Filter Algorithm for Solving Large-Scale Eigenvalue Problems from Accelerator Simulations”; Greg Schussman, Computer Science, UCDavis, December 2003, “Interactive and Perceptively Enhanced Visualization of Large, Complex Line-based Datasets”; Lukas Stingelin, Physics, Ecole Polytechnique Lausanne, December 2004, “Beam-cavity Interactions in High Power Cyclotrons”. PhDs in progress: Adam Guetz, ICME, Stanford University; Sheng Chen, ICME, Stanford University. Summer interns: grad/undergrad.

  25. ACD Goals • Continue to support accelerator science across SC • Continue SciDAC collaborations in computational science • Get involved in astroparticle physics & photon science: ILC BPM & wakefields in the LCLS undulator; ILC LL cavity & cryomodule; XFEL SC RF gun; MIT PBG cavity for the Jlab 12 GeV upgrade
