
Beam Dynamics Overview



Presentation Transcript


  1. Beam Dynamics Overview. Robert D. Ryne, COMPASS all-hands meeting, Sept 17-18, 2007, Fermilab

  2. Overall Goals • Develop advanced beam dynamics capability to meet the mission needs of DOE/SC HEP, NP, and BES accelerator projects • Develop reusable software components to produce a comprehensive, scalable (to petascale), lasting accelerator modeling capability for present and future accelerator projects

  3. COMPASS Beam Dynamics physics areas fall mainly in 7 categories • Space-charge • Beam-beam • Multi-species • Beam-environment • Optics, errors, feedback • High brightness e-beams, radiation • IBS

  4. Space Charge • Maintain SciDAC1 solvers, port/optimize for SciDAC2 platforms • Develop/incorporate new solvers, working with math/CS partners, to meet new requirements (boundary conditions, etc.) • See math/CS talks Tuesday

  5. Beam-beam effects • Codes used: BeamBeam3D, Lifetime, Nimzovitch

  6. BeamBeam3D
  • Developed by Ji Qiang
  • Multiple models (strong-strong, weak-strong)
  • Multi-slice (finite bunch length effects)
  • New algorithm: shifted Green function efficiently treats long-range parasitic collisions
  • Particle-based decomposition (perfect load balance)
  • Lorentz boost handles crossing-angle collisions
  • Multi-IP collisions, varying phase advance, ...
  • Arbitrary closed-orbit separation (static or time-dependent)
  • Applied to Tevatron, LHC, PEP-II, KEK-B, RHIC, RHIC/LARP
  [Figure: RHIC beam-beam growth vs. x, y]
  • Strong collaboration, code development by Stern et al. at FNAL
  • Fourier 3D solver validated with observed synchro-betatron modes
  • Resistive-wall impedance model growth rate matches predictions
  • Chromaticity with coupled-motion maps and impedance matches predictions
  • Arbitrary bunch collision patterns with measured Tevatron optics and helix incorporated
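
The shifted-Green-function idea mentioned above can be illustrated with a small open-boundary Poisson solve: the Green function is evaluated at displacements offset by the beam-beam separation, so the potential of one beam lands directly on a grid centered on the other, transversely displaced beam. The sketch below is a generic 2-D illustration of that idea in normalized units; it is not BeamBeam3D's actual solver or API.

```python
import numpy as np

def shifted_green_potential(rho, h, shift):
    """Open-boundary 2D Poisson solve via FFT convolution on a doubled
    (Hockney) grid, with the Green function evaluated at displacements
    offset by `shift`.  A minimal sketch of the shifted-Green-function
    idea used for long-range parasitic collisions; normalized units."""
    nx, ny = rho.shape

    def disp(n):
        # displacements for a zero-padded circular convolution:
        # indices 0..n-1 are +d, indices n..2n-1 are the wrapped negatives
        m = np.arange(2 * n)
        return np.where(m < n, m, m - 2 * n) * h

    X, Y = np.meshgrid(disp(nx) + shift[0], disp(ny) + shift[1], indexing="ij")
    r2 = X ** 2 + Y ** 2
    r2 = np.where(r2 == 0.0, (0.5 * h) ** 2, r2)     # regularize r = 0
    G = -np.log(r2) / (4.0 * np.pi)                  # 2D free-space Green function
    rho_pad = np.zeros((2 * nx, 2 * ny))
    rho_pad[:nx, :ny] = rho
    phi = np.fft.ifft2(np.fft.fft2(G) * np.fft.fft2(rho_pad)).real * h * h
    return phi[:nx, :ny]     # potential on the grid shifted by `shift`
```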

  7. BeamBeam3D code validation comparing with VEPP-II data (E. Stern, A. Valisev, FNAL; J. Qiang, LBNL)

  8. Sequence of frames from a BeamBeam3D simulation of a collision at the Tevatron @ 200x nominal intensity (E. Stern, FNAL)

  9. Lifetime, Nimzovitch • Developed by Andreas Kabel • LIFETIME application • Uses PLIBB to calculate lifetimes in storage rings; applied to Tevatron, RHIC current wire experiment, LHC • NIMZOVITCH • Strong-strong beam-beam code optimized for large numbers of bunches/IPs/parasitic crossings

  10. Beam-beam plans • BeamBeam3D: • Implement wire compensation model • Implement rotating beam colliding with crossing angle, test on LHC luminosity monitor • Implement full nonlinear symplectic tracking • Implement quantum effects, test, and perform high-resolution simulation of ILC beam-beam interactions • Incorporate solver into multi-physics framework • Use to investigate antiproton intensity limits in the Tevatron, the growth of multi-bunch modes, and electron-cooling beam-beam compensation operation

  11. Beam-beam plans, cont. • Nimzovitch: • Make Nimzovitch go away by reformulating under enhanced PLIBB: • more physics (IBS, noise, imperfections) • low noise PIC • enforce symplectic correctness in 3D • beamline parallelization for multi-bunch calculations • Lifetime: • Apply PLIBB w/ IBS module to RHIC, experimental validation • Strong-strong module: apply to LHC multi-bunch effects

  12. Multi-species effects
  • Main emphasis on electron-cloud
  • 2 approaches:
  • Full 3D using WARP/POSINST (A. Friedman, D. Grote, J.-L. Vay, M. Furman, et al.)
  • Quasi-static using QuickPIC (W. Mori, V. Decyk, T. Katsouleas, et al.)
  [Diagram: electron-cloud buildup in a positive-ion beam pipe; legend: i+ = ion, e- = electron, g = gas, plus symbols for photon and instability]
  Electron sources:
  • Ionization of background gas and desorbed gas
  • Ion-induced emission from expelled ions hitting the vacuum wall and from beam halo scraping
  • Secondary emission from electron-wall collisions
  • Photo-emission from synchrotron radiation (HEP)

  13. Calculating the e-cloud effects in the ILC DR wiggler is an immense numerical challenge
  • 3D fields and dynamics
  • Self-consistent (beam and electrons)
  • Large range of spatial scales, transverse and longitudinal (sets resolution, hence memory requirement)
  • Must resolve the beam, but 3000 x 3000 x 6000 ~ 10^10-cell mesh!
  • Huge number of timesteps required: Δt set so that an electron traverses <1 cell per timestep
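
As a quick check of the mesh-size claim above, the cell count and a rough field-storage estimate can be worked out directly; the bytes-per-cell figure below is an assumption for illustration only and does not come from the slide.

```python
# Back-of-the-envelope check of the ILC DR wiggler e-cloud mesh quoted above.
nx, ny, nz = 3000, 3000, 6000
cells = nx * ny * nz                    # ~5.4e10 cells, i.e. order 10^10
bytes_per_cell = 4 * 8                  # assume 4 double-precision field quantities per cell
print(f"cells  = {cells:.1e}")
print(f"memory = {cells * bytes_per_cell / 2**40:.1f} TiB (assumed 32 B/cell)")
```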

  14. E-cloud modeling using WARP-POSINST • 3D fields and dynamics • Fully self-consistent • Realistic boundary conditions • Detailed electron generation models (POSINST, including energy spectrum) • Drift-Lorentz electron "mover" (correct space charge without resolving the cyclotron orbit) • Mesh refinement (spatial resolution only where needed -- essential!) • Velocity sub-cycling (small Δt only for particles that need it) • Parallelized

  15. QuickPIC uses a quasi-static model; under certain circumstances it agrees well with the self-consistent model but is orders of magnitude faster
  [Diagram: a 2-D slab of electrons swept in s through a 3-D beam moving through a lattice of quads, drifts, and bends]
  A 2-D slab of electrons (macroparticles) is stepped backward (with small time steps) through the beam field, and the 2-D electron fields are stacked in a 3-D array that is used to push the 3-D beam ions (with large time steps) using maps (as in HEADTAIL-CERN) or leap-frog (as in QUICKPIC-UCLA/USC). 100x improvement with "no" loss in accuracy.
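
The quasi-static loop just described can be sketched as below. The three physics routines are passed in as hypothetical placeholders (they are not QuickPIC or HEADTAIL functions): the electron slab is swept through the beam slices with a small step while the 2-D fields are stacked, and the stacked 3-D field array then pushes the beam with a large step.

```python
import numpy as np

def quasi_static_step(beam_slices, slab, dt_slab, dt_beam,
                      solve_fields_2d, push_slab, push_beam):
    """One quasi-static 'beam step'.  The callables are user-supplied
    placeholders: solve_fields_2d(slice, slab) -> 2-D field array,
    push_slab(slab, fields, dt) -> slab, push_beam(slices, fields, dt) -> slices."""
    stacked = []
    # Sweep the 2-D electron slab backward through the beam, head to tail,
    # with the small electron time step.
    for beam_slice in beam_slices:
        fields_2d = solve_fields_2d(beam_slice, slab)
        slab = push_slab(slab, fields_2d, dt_slab)
        stacked.append(fields_2d)
    # Stack the 2-D electron fields into a 3-D array and use it to push the
    # beam with the (much larger) beam time step, e.g. via maps or leap-frog.
    fields_3d = np.stack(stacked)
    return push_beam(beam_slices, fields_3d, dt_beam), slab
```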

  16. QuickPIC: Pipelining

  17. Calculation in a boosted frame provides a x10^n speedup*
  [Figure: hose instability of a proton bunch through a given e- cloud; proton bunch radius vs. z; electron streamlines]
  • Hose instability of a proton bunch
  • Proton energy: γ = 500 in the lab
  • L = 5 km, continuous focusing
  • Code: WARP (particle-in-cell)
  • Speedup x1000; CPU time: lab frame >2 weeks, frame with γ² = 512 <30 min
  *J.-L. Vay, PRL 98, 130405 (2007)
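
A quick arithmetic check that the wall-clock times quoted above are consistent with the quoted x1000 speedup:

```python
# The quoted x1000 speedup applied to the lab-frame run time (">2 weeks")
# should land near the quoted boosted-frame run time ("<30 min").
speedup = 1000.0
lab_minutes = 2 * 7 * 24 * 60               # two weeks, in minutes
print(f"{lab_minutes / speedup:.0f} min")   # ~20 min, consistent with "<30 min"
```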

  18. WARP-POSINST plans • Code needs development to perform runs in the boosted frame and to cover the complete set of e-cloud-related physics: • implementation of the magnetoinductive ("Darwin") model, or a reduced version of it if sufficient, • implement an interface linking zones of 3-D PIC simulations to zones of MAPS transport in between, • upgrade diagnostics to allow for results given in a frame different from the calculation frame, • implement self-consistent generation and tracking of photo-electrons, based on Monte-Carlo methods, • implement adaptive macro-particle management (reduction/coalescence), • upgrade parallel decomposition from 1-D to 2-D/3-D.

  19. Beam-environment interactions • Maintain SciDAC1 wakefield modules, port/optimize for SciDAC2 platforms • Implement circuit model for time-dep beam loading effects • Fully self-consistent calculation using VORPAL

  20. Optics, errors, feedback • Maintain existing optics libraries that are used in the BD framework, port/optimize for SciDAC2 platforms • Extend multi-bunch capabilities • Implement models for dynamically changing quantities (e.g. jitter), machine errors, and feedback systems

  21. High brightness electron beam dynamics • Codes used: • Elegant • IMPACT (collaboration with synergistic high brightness e-beam activities at LBNL funded by other sources) • Essential goal of this work is to support LCLS commissioning, operation, and optimization with fast, high-fidelity modeling tools • Large scale computing essential for detailed study of the microbunching instability

  22. Elegant: status and limitations
  • CSR and longitudinal space charge parallelized in elegant
  • Limited to ~60 million particles presently
  • With ~1.5 billion particles we could look at modulations on the 1 mm level relevant to proposed laser/undulator beam heaters
  • Addressing I/O and memory management issues related to this
  • Present fast CSR algorithms are 1-D simplifications
  • Existing 3-D algorithms are coarse-grained, time-consuming
  • No standardized, accepted tools exist for transferring information between various accelerator codes (elegant, IMPACT) and radiation modeling codes (GINGER, GENESIS, SPUR)
  • No way to take a snapshot of an existing FEL, simulate it, then compare simulated and real diagnostics
  • LCLS is already reporting [Frisch, PAC07] unexplained effects with very short bunches in the first compressor
  • Must be able to optimize to match a selection of diagnostics, then extrapolate to other diagnostics

  23. FERMI FEL microbunching instability simulated with elegant
  [Figure: FERMI FEL layout with BLS, BC1, and BC2]
  Tiny initial density modulations build up in bunch compression systems due to CSR and space charge. Gain increases to ~2000-fold down to 25 µm modulation. Can't presently go shorter than this!

  24. IMPACT: status and limitations • Successfully used to perform 1B macroparticle simulations of Fermi FEL linac • Limitations: 1D CSR model, difficult to use for design optimization, simple matrix description of RF elements, not fully integrated with FEL codes

  25. 1B-particle simulation of microbunching in the FERMI FEL linac using IMPACT
  [Figure: final longitudinal phase space distribution using 10M and 1B particles (initial 15 keV energy spread, 2 BCs)]

  26. Summary of ANL's tasks/plans • Finish parallelization of elegant • Develop accepted, robust interfaces among the suite of codes involved in FEL modeling • IMPACT (gun and linac modeling) • elegant (accelerator modeling and optimization) • GENESIS and GINGER (FEL modeling) • Develop integrated graphical user interface to provide on-demand, high-fidelity modeling of data and experiments • Selection of codes, algorithms, detail level • Utilizes data drawn from the control system • Utilizes high-performance computing resources • Develop optimizer based on a genetic algorithm to provide guidance on FEL performance improvement.
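
The last bullet mentions a genetic-algorithm optimizer. The sketch below shows the generic shape of such a loop (tournament selection, uniform crossover, Gaussian mutation, elitism) on an arbitrary user-supplied objective; it is a hypothetical illustration, not the planned ANL optimizer.

```python
import numpy as np

def genetic_minimize(objective, bounds, pop_size=40, generations=100, seed=0):
    """Minimal real-coded genetic algorithm over a box given by `bounds`
    (list of (lo, hi) per parameter).  Generic illustration only."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        fit = np.array([objective(p) for p in pop])

        def parent():
            i, j = rng.integers(pop_size, size=2)        # tournament of two
            return pop[i] if fit[i] < fit[j] else pop[j]

        new_pop = [pop[np.argmin(fit)].copy()]           # elitism: keep the best
        while len(new_pop) < pop_size:
            a, b = parent(), parent()
            child = np.where(rng.random(len(bounds)) < 0.5, a, b)  # uniform crossover
            child = child + rng.normal(0.0, 0.05 * (hi - lo))      # Gaussian mutation
            new_pop.append(np.clip(child, lo, hi))
        pop = np.array(new_pop)
    fit = np.array([objective(p) for p in pop])
    return pop[np.argmin(fit)]

# Example: minimize a toy 2-parameter objective (stand-in for an FEL figure of merit).
print(genetic_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                       bounds=[(-5, 5), (-5, 5)]))
```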

  27. IMPACT development plans for high brightness e-beam (funded by LDRD and other non-SciDAC projects) • Develop and implement interfaces for start-to-undulator parallel simulation • Fully self-consistent CSR (difficult!) • Automatic beam steering • Integration with optimization tool • Incorporate nonlinear model of RF beamline elements

  28. Frameworks • Synergia/SciDAC1 • IMPACT suite • MaryLie/IMPACT • UPIC • PLIBB

  29. Synergia/SciDAC1 • See next talk

  30. IMPACT • A code suite (linac design, 3D rms code, 2 parallel PIC tracking codes) developed under SciDAC1 • Includes IMPACT-Z and IMPACT-T 3D parallel PIC codes • Applicable to electron and ion accelerators • Recent enhancements: • Cathode emission model; cathode image effects • Energy binning for large ΔE • Multi-charge state capability (RIA) • SW and TW structures • wakefields • 1D CSR • IMPACT-T now widely used for photoinjector modeling • BNL e-cooling project, Cornell ERL, FNAL/A0, LBNL/APEX, ANL, JLAB, SLAC/LCLS, Fermi@elettra [Figure: emission from a nano-needle tip]

  31. MaryLie/IMPACT (ML/I) • Hybrid code combining MaryLie beam optics with IMPACT parallel PIC + new capabilities • Embeds operator splitting for all thick elements • Allows mixed MaryLie and MAD input • New software modules (wakefields, soft-edge magnet models, ...) add functionality • Performance optimization (R. Gerber, NERSC staff) • Multiple uses all within the same code: • Particle tracking, envelope tracking, map production, map analysis, lattice functions, fitting • User manual and example suite • Contributions from many people from many disciplines (follows the SciDAC model) [Figure: damping ring simulation with ML/I]
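
The slide above notes that operator splitting is embedded for all thick elements. The sketch below shows the generic second-order "map-kick-map" split that this family of codes uses; the external-field map and space-charge kick are passed in as hypothetical placeholders, not ML/I routines.

```python
def track_thick_element(particles, length, n_slices, external_map, sc_kick):
    """Second-order (Strang) operator splitting of a thick element:
    half-step external map, full space-charge kick, half-step external map,
    repeated over n_slices.  The two callables are placeholders."""
    ds = length / n_slices
    for _ in range(n_slices):
        particles = external_map(particles, ds / 2.0)   # external fields, half step
        particles = sc_kick(particles, ds)              # collective kick, full step
        particles = external_map(particles, ds / 2.0)   # external fields, half step
    return particles
```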

  32. PLIBB particle dynamics framework • Developed at SLAC (A. Kabel) • a general-purpose C++ framework for high-speed, parallel tracking studies • fast and easily extensible through compile-time polymorphism • easily applied: MAD{X,8} beamline parsers & manipulators • physics: magnetic elements, cavities, wakefields, beam-beam • analysis: statistics, differential algebra, collective quantities

  33. UPIC Framework for Parallel PIC • Developed by V. Decyk • Layered, Fortran-based, but can be called from C/C++ • Goals: • Rapid construction of new parallel PIC codes from trusted components • High-accuracy testbed for evaluating and verifying PIC algorithms • Supports: • Multiple plasma models: electrostatic, Darwin, electromagnetic • Multiple boundary conditions: periodic, Dirichlet, Neumann, open • Multiple levels of accuracy: linear, quadratic, gridless • Multiple programming paradigms: procedural, object-oriented • Multiple parallel models: threads, message-passing • Used in QuickPIC and other applications
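
To make the component structure concrete, here is a minimal 1-D electrostatic, periodic PIC step (deposit, field solve, gather, push) of the kind such a framework assembles from reusable pieces. It is a generic textbook sketch in normalized units, not UPIC's layered Fortran API.

```python
import numpy as np

def pic_step(x, v, ng, L, dt):
    """One 1-D electrostatic PIC step: linear (cloud-in-cell) weighting,
    periodic boundaries, normalized units (q = m = eps0 = 1)."""
    dx = L / ng
    # 1. deposit charge on the grid with linear weighting
    g = x / dx
    i = np.floor(g).astype(int) % ng
    w = g - np.floor(g)
    rho = np.zeros(ng)
    np.add.at(rho, i, (1.0 - w) / dx)
    np.add.at(rho, (i + 1) % ng, w / dx)
    rho -= rho.mean()                      # neutralizing background
    # 2. periodic field solve in Fourier space: dE/dx = rho  =>  E_k = rho_k / (i k)
    k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
    k[0] = 1.0                             # dummy value; the k = 0 mode is zeroed below
    E_k = np.fft.fft(rho) / (1j * k)
    E_k[0] = 0.0
    E = np.fft.ifft(E_k).real
    # 3. gather the field at the particles and push (leap-frog style)
    Ep = (1.0 - w) * E[i] + w * E[(i + 1) % ng]
    v = v + Ep * dt
    x = (x + v * dt) % L
    return x, v
```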

  34. Frameworks: Plans • See next talk
