Computing at the High Energy Physics Lab at FIT

Presentation Transcript


  1. Computing at the High Energy Physics Lab at FIT Patrick Ford, Jen Helsby, Richard Hoch, David Pena; Dr. Hohlmann, Dr. Mitra

  2. Current Projects • Cluster Computing - HEP’s computer cluster • Grid Computing - Getting the cluster on the Open Science Grid • Simulations of Particles Through Matter - Using Geant4 to model cosmic ray muons traveling through different media • Reconstruction Algorithms - Developing algorithms to reconstruct muons’ passage through matter

  3. Cluster Computing • A popular high-performance computing solution • A computer cluster is a group of tightly coupled computers that work together so closely that in many respects they can be viewed as a single computer • Computing clusters make up over half of the top 500 most powerful computers in the world (Pictured: System X at Virginia Tech, 12.5 teraflops [1])

  4. HEP Computer Cluster • Equipment loaned by the University of Florida • Started with 10 dual-CPU Intel Pentium 1.0 GHz servers • One server for the front-end, nine for nodes • Uses network-attached storage (NAS) • Cascaded switches for expandability and redundancy

  5. Cluster Topology

  6. Current and Future Status • The original front-end is still in use, but the cluster has expanded to 30 nodes • A high-end managed switch serves as the hub of the network and cascades to unmanaged switches with 10 nodes each • Future expansion will include high-end compute nodes, a ~10 TB NAS, and a better front-end (Pictured: MAGNUM XV3045 NAS [2])

  7. HEP Computer Cluster (Pictured: newest nodes; NAS 1 and 2)

  8. Rocks • Open-source Linux cluster distribution • Enables end users to easily build computational clusters [3]

  9. Rocks Kickstart Graph

  10. Network-Attached Storage • Also uses Rocks • Uses RAID 5 - Faster writing: in a three-drive array, each drive needs to write only about 1/3 of the data - Efficiency increases as the number of drives increases - Fault tolerance: if any one drive fails, the data on it can be reconstructed from the data and parity on the other two drives
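
The parity behind this recovery is just a byte-wise XOR across the stripe; the sketch below illustrates it for a three-drive layout (a minimal Python sketch with made-up data, not code from the cluster):

```python
# Minimal sketch of RAID 5 style parity for a three-drive array (illustrative only).
# Real RAID 5 also rotates the parity block across the drives stripe by stripe.

def xor_blocks(block_a: bytes, block_b: bytes) -> bytes:
    """Byte-wise XOR of two equally sized blocks."""
    return bytes(a ^ b for a, b in zip(block_a, block_b))

# Two data blocks that would live on drives 1 and 2; parity goes on drive 3.
data1 = b"muon hits, run 01"
data2 = b"muon hits, run 02"
parity = xor_blocks(data1, data2)

# If the drive holding data2 fails, XOR of the survivors recovers it.
recovered = xor_blocks(data1, parity)
assert recovered == data2
```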

  11. Condor • Software that enables us to distribute the load of a computing task over all the CPUs in the cluster • This type of software is called a batch job system • Well suited for grid computing, as it is able to submit jobs to machines located all over the world
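
A job is handed to Condor through a submit description file; below is a minimal, hypothetical sketch (the executable and file names are illustrative, not taken from the slides):

```
# Hypothetical Condor submit description file.
# The vanilla universe runs an ordinary, unmodified executable.
universe   = vanilla
executable = simulate_muons
arguments  = --run $(Process)
output     = muons_$(Process).out
error      = muons_$(Process).err
log        = muons.log
# Queue ten independent jobs; $(Process) runs from 0 to 9.
queue 10
```

The file is handed to condor_submit, and Condor matches each queued job to an idle machine in the pool.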

  12. Grid Computing • A collection of networks, software, and computers intended for shared use by organizations of people • Resources are managed by a grid • Users run applications as needed without worrying about where the computers are • Well-suited to organizations that consist of a large number of geographically distributed members, all working on a common project, and who require shared computing resources in order to accomplish their work

  13. Grid Layers I. Network layer - Underlying connectivity II. The grid's resources - data storage, databases, software repositories, and even sensors III. The middleware, or "brains" of the grid - does all the work to connect users' jobs to computing resources IV. Application layer - diverse layer, as it includes virtually any program an end user wishes to run

  14. Open Science Grid • A distributed computing infrastructure that is used for large-scale scientific work • Used by many universities, laboratories, and software developers • Backed by the NSF and the U.S. Department of Energy's Office of Science • The OSG Consortium builds and operates the OSG project, with the goal of giving scientists from many fields access to shared resources worldwide

  15. Science on OSG • Scientists from many fields use OSG: particle and nuclear physics, astrophysics, bioinformatics, gravitational-wave science and computer science collaborations [4]

  16. Getting On OSG • Need the third layer, the middleware • OSG’s middleware is based on the Virtual Data Toolkit (VDT) • Packages are installed with a tool called Pacman • We first installed the Integration Test Bed (ITB) client and then the Compute Element (CE) package

  17. Getting On OSG (cont.) • Interfacing Globus and Condor • Installing additional packages: Managed Fork, MonALISA, other monitoring services • Getting personal and host certificates, and the Certificate Authority (CA) list • Testing and debugging the install • Registering with OSG
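
One common way to sanity-check such an installation is to run a trivial job through each jobmanager; the commands below are a hedged sketch using standard pre-WS Globus tools, with a placeholder hostname rather than the actual compute element:

```
# Create a short-lived proxy from the personal certificate.
grid-proxy-init

# Run a trivial job through the fork jobmanager on the compute element.
globus-job-run ce.example.edu/jobmanager-fork /bin/hostname

# Do the same through the Condor jobmanager to exercise the batch system.
globus-job-run ce.example.edu/jobmanager-condor /bin/hostname
```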

  18. Success… partially (Pictured: the Integration Test Bed map)

  19. Particle Simulations • Geant4 provides a toolkit that enables modeling of many different particles through matter • Much data can be extracted from these simulations • Our focus is on the simulation of cosmic ray muons traveling through different media. Why?

  20. Muon Tomography • Outgrowth of muon and proton radiography • Provides a new way to detect threats such as nuclear weapons or fissionable material, and other terrorist threats (artillery shells, IEDs, etc.) • Why muons?

  21. Why Muons? • Muons are relatively massive elementary particles that travel at relativistic speeds, so they can penetrate tens of meters into rock and other matter before attenuating through absorption or deflection • All naturally occurring muons on Earth are due to cosmic rays • Sea-level flux is about one muon per cm² per minute • Muons are deflected by Coulomb scattering, which depends on the atomic number (Z) of the material • Benefits over other techniques: muons are more penetrating than gamma rays, there is no extra radiation dose, and there are fewer false alarms [5]
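
As a rough scale check on that rate (a small sketch; the 1 m × 1 m detector area is an assumption for illustration, not a number from the slides):

```python
# Back-of-the-envelope muon rate from the quoted sea-level flux of ~1 muon/cm^2/minute.
flux_per_cm2_per_min = 1.0
detector_area_cm2 = 100 * 100        # assumed 1 m x 1 m detector plane

muons_per_minute = flux_per_cm2_per_min * detector_area_cm2
muons_per_second = muons_per_minute / 60.0
print(f"~{muons_per_minute:.0f} muons/min, ~{muons_per_second:.0f} muons/s")
# Roughly 10,000 muons per minute, i.e. about 170 per second.
```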

  22. How Does It Work?

  23. Geant4 • Free tool that runs on Windows, Linux, and Mac OS X • The current version is written in C++; earlier versions were written in Fortran • Developed and maintained by the Geant4 collaboration, which has over 100 members worldwide [6]

  24. Scope of Geant4 • the geometry of the system (e.g. a box) • the materials involved (e.g. Pb, U, etc.) • the fundamental particles of interest (e.g. electrons, muons, etc.) • the physics processes governing particle interactions • the generation of event data • the storage of events and tracks • the visualization of the detector and particle trajectories • the capture and analysis of simulation data at different levels of detail and refinement

  25. First Scenario • Created a 50×50×50 cm³ lead block in an argon atmosphere • Bombarded it with 3 GeV muons • Interactions included: for muons and electrons, ionization, knock-on electrons, and multiple scattering; for photons, absorption via the photoelectric effect, Compton scattering, and pair production
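
A run like this can be steered with a short Geant4 macro; the sketch below uses the standard particle-gun UI commands and assumes the geometry and physics list are already defined in the application code (the beam direction and event count are illustrative):

```
# Illustrative Geant4 macro: fire 3 GeV muons straight down at the lead block.
/gun/particle mu-
/gun/energy 3 GeV
/gun/direction 0 0 -1
/run/beamOn 1000
```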

  26. Second Scenario • Interfaced the Cosmic RaY (CRY) generator to simulate cosmic ray muons • Used the same lead box as the material to detect, but added detectors made of G10 material • Blue - positively charged muons • Red - negatively charged muons • Green - photons

  27. Future Scenarios • Adding more detectors • Simulating a truck carrying plywood • Hidden in the cargo area and in the engine block are small blocks of uranium • Problems: - Small amounts of high-Z material are harder to detect - The engine block itself contains high-Z material, so multiple scattering will occur there as well

  28. Reconstruction Algorithms • Reconstruction algorithms produce a 3D image from the projections • For muon tomography, two have been prominently used: - Point of Closest Approach (POCA) - Maximum Likelihood • We implemented the POCA algorithm

  29. Point of Closest Approach • Take two lines L1(s) = P0 + s·u and L2(t) = Q0 + t·v • Let w(s,t) = L1(s) - L2(t) • L1 and L2 are closest where |w(s,t)| is a minimum; there w is perpendicular to both lines, so w·u = 0 and w·v = 0 • Substituting w = w0 + s·u - t·v, where w0 = P0 - Q0, gives two linear equations: • (u·u)s - (u·v)t = -u·w0 and (u·v)s - (v·v)t = -v·w0 • Setting a = u·u, b = u·v, c = v·v, d = u·w0, and e = v·w0 and solving for s and t gives: • s = (be - cd)/(ac - b²) and t = (ae - bd)/(ac - b²)
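
A minimal Python sketch of these formulas (not the lab's actual implementation); it returns the midpoint of the shortest segment between the incoming and outgoing tracks, which POCA takes as the estimated scattering point:

```python
import numpy as np

def poca(p0, u, q0, v):
    """Closest approach of lines L1(s) = p0 + s*u and L2(t) = q0 + t*v."""
    p0, u, q0, v = (np.asarray(x, dtype=float) for x in (p0, u, q0, v))
    w0 = p0 - q0
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w0, v @ w0
    denom = a * c - b * b
    if np.isclose(denom, 0.0):
        # Lines are (nearly) parallel: fix s = 0 and minimize over t only.
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    closest_on_l1 = p0 + s * u
    closest_on_l2 = q0 + t * v
    # Midpoint of the shortest segment = estimated scattering point.
    return 0.5 * (closest_on_l1 + closest_on_l2)

# Illustrative incoming and outgoing muon track points and directions.
print(poca([0.0, 0.0, 100.0], [0.0, 0.1, -1.0],
           [5.0, 12.0, -100.0], [0.05, 0.12, -1.0]))
```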

  30. Using POCA to find Scatter Points

  31. 3D Imaging • We also have the scattering angle of each muon • A large scattering angle indicates a high-Z object • When plotted, points are assigned colors according to their scattering angle [7]
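
The scattering angle itself follows from the incoming and outgoing track directions; a small sketch consistent with the POCA code above (the direction vectors are illustrative):

```python
import numpy as np

def scattering_angle(u_in, u_out):
    """Angle in radians between incoming and outgoing track directions."""
    u_in, u_out = np.asarray(u_in, float), np.asarray(u_out, float)
    cos_theta = (u_in @ u_out) / (np.linalg.norm(u_in) * np.linalg.norm(u_out))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Larger angles suggest higher-Z material near the POCA point.
theta = scattering_angle([0.0, 0.0, -1.0], [0.05, 0.02, -1.0])
print(np.degrees(theta))
```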

  32. Future Work • Expanding the Cluster - High-end nodes, NAS, and front-end • Becoming a fully functional site on OSG • Modeling more detailed scenarios using Geant4 • Improving POCA, and implementing a Maximum Likelihood algorithm • Using real world data

  33. References • [1] http://www.vt.edu/spotlight/20060806_systemx.php • [2] http://www.cybertronpc.com/ • [3] http://www.rocksclusters.org/wordpress/ • [4] http://www.opensciencegrid.org/Science_on_the_OSG/Currently_Running_Applications • [5] http://en.wikipedia.org/wiki/Muons • [6] http://geant4.web.cern.ch/geant4/ • [7] http://math.lanl.gov/Research/Publications/Docs/borozdin-2004-information.pdf
