
Virtual Server Self-Service Provisioning




Presentation Transcript


  1. Virtual Server Self-Service Provisioning • Juraj Sucik, System Architect, CERN

  2. CERN

  3. What does « CERN » stand for? C E R N • 1952: Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research) • 1954: Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research), also known as the European Laboratory for Particle Physics

  4. The largest particle physics lab in the world • People: 2415 staff, 730 fellows and associates, 200 students, 9133 users, 2000 external firm personnel • Annual budget in 2007: 982 MCHF (610 MEUR), plus external funding for experiments • Twenty Member States: Austria, Belgium, Bulgaria, Czech Republic, Denmark, Finland, France, Germany, Greece, Italy, Hungary, Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, United Kingdom • Eight Observer States: European Commission, USA, Russian Federation, India, Israel, Japan, Turkey, UNESCO

  5. Do fundamental research • By answering questions like the structure of matter… (4th–5th century BC, end of the 19th century, beginning of the 20th century, 1960s)

  6. Checking existing theories: the standard model • LEPTONS: electron neutrino, electron; muon neutrino, muon; tau neutrino, tau • QUARKS (ordinary matter): up, down; charm, strange; top, bottom • 4 forces: gluons (strong force), photons (electromagnetic force), bosons (weak force), gravitons (gravity) • Images: www.particlezoo.net

  7. Answering fundamental questions… • Higgs boson: how do particles acquire mass? Newton could not explain it; neither can we… • What is 96% of the Universe made of? We can only see 4% of its estimated mass! • Why isn't there antimatter in the Universe? Nature should be symmetric… • What was the state of matter just after the « Big Bang »? Travelling back to the earliest instants of the Universe would help…

  8. Bringing nations together and educating • Hundreds of physics institutes • Half of the world's particle physicists • Biggest international scientific collaboration • Various student programmes • Over 100 countries

  9. By accelerating and colliding objects…

  10. At incredible levels of energy! • E = mc²

  11. The largest particle accelerators • 17-mile (27 km) long tunnel • Thousands of superconducting magnets • Ultra-high vacuum: 10x emptier than on the Moon • Coldest place in the Universe: −271 °C • In safe conditions!

  12. The biggest and most sophisticated detectors • Cathedrals of science 100 m underground • 600 million collisions per second detected by hundreds of millions of sensors • Thousands of collaborators for each detector • In safe conditions!

  13. Practical applications: the World Wide Web • Developed in the framework of the LHC project in 1989! • Freely given to the world!

  14. Practical applications: cancer treatment • Used for both the detection and treatment of cancer • PET scans • Hadron therapy

  15. Practical applications: detectors • Scanning trucks in less than one hour without unloading them!

  16. Practical applications: using the Grid • Ultra-high-speed processing of satellite imagery in the case of natural disasters

  17. And of course… some Nobel prizes! • Georges Charpak: “for his invention and development of particle detectors, in particular the multiwire proportional chamber” • Carlo Rubbia (with Simon van der Meer): “for their decisive contributions to the large project, which led to the discovery of the field particles W and Z, communicators of weak interaction”

  18. GRID

  19. IT Infrastructure at CERN • General purpose computing environment • Administrative computing services • Physics and engineering computing • Consolidation, coordination and standardization of computing activities • Physics applications (e.g., for data acquisition & offline analysis) • Accelerator design and operations

  20. LHC Data Every Year • 40 million collisions per second • After filtering, 100 collisions of interest per second • > 1 Megabyte of data digitized per collision → recording rate > 1 Gigabyte/sec • 10¹⁰ collisions recorded each year → stored data > 15 Petabytes/year • Analysis requires computing power equivalent to ~100,000 of today's fastest processors
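A quick back-of-the-envelope check of the yearly volume quoted above (a rough sketch; the 1.5 MB average event size is an assumption chosen to be consistent with the "> 1 Megabyte per collision" figure):

```python
# Sanity-check the yearly LHC data volume from the slide's figures.
collisions_per_year = 1e10        # ~10^10 collisions recorded per year
avg_mb_per_collision = 1.5        # assumed average event size (> 1 MB)

# MB -> PB: 1 PB = 10^9 MB
stored_pb = collisions_per_year * avg_mb_per_collision / 1e9
print(stored_pb)  # 15.0, matching the "> 15 Petabytes / year" figure
```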

  21. Computing power available at CERN • High-throughput computing based on reliable “commodity” technology • More than 35,000 CPUs in about 6,000 boxes (Linux) • 14 Petabytes on 14,000 drives (NAS disk storage) • 34 Petabytes on 45,000 tape slots with 170 high-speed drives • Nowhere near enough!
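The storage figures above imply roughly a terabyte per device, typical for the era; a quick calculation (using the decimal convention 1 PB = 1000 TB):

```python
# Average capacity per device implied by the slide's storage figures.
disk_tb_per_drive = 14 * 1000 / 14_000   # 14 PB over 14,000 disk drives
tape_tb_per_slot = 34 * 1000 / 45_000    # 34 PB over 45,000 tape slots

print(disk_tb_per_drive)           # 1.0 TB per disk drive
print(round(tape_tb_per_slot, 2))  # 0.76 TB per tape slot
```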

  22. Computing for LHC • Problem: even with the Computer Centre upgrade, CERN can provide only a fraction of the necessary resources • Solution: data centres, which were isolated in the past, will be connected, uniting the computing resources of particle physicists worldwide • Users of CERN: Europe: 267 institutes, 4603 users; outside Europe: 208 institutes, 1632 users

  23. What is the Grid? • The World Wide Web provides seamless access to information that is stored in many millions of different geographical locations • In contrast, the Grid is an emerging infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe

  24. The most extensive scientific computing grid • 15 Petabytes (15 million GB) of data every year • 100,000 processors • 200 computer centres around the planet • Should run 100 million jobs • Used by 5000 scientists in 500 institutes

  25. Virtualization

  26. Why virtual? • Steady flow of requests for dedicated servers in the CERN computer centre • Excellent network connectivity • Reliable power supply, cooling • 24x365 monitoring with operator’s presence • Daily tape backup • Use the hardware without owning the responsibility (maintenance, procurement) • Focus on application without sharing the resources • Improve the CPU utilization of grid nodes • Optimize TCO

  27. Infrastructure as a Service (IaaS) • Physical servers: extra effort to procure and maintain HW, delivery time of several weeks, lack of flexibility, not easy to adapt to dynamic demand patterns • Virtual servers: ready in ~ minutes, highly flexible, efficient capacity planning

  28. Experience since 2006 • Server Self Service Center (S3C) • Choose your server from a set of predefined images • Take resources from the pool of available HW • Available within minutes

  29. Requirements have evolved • New requirements identified • Flexibility of resource allocation • Higher performance • High-availability model adapted to customers • Larger scale • Efficient management Source: Gartner (August 2008)

  30. Why Hyper-V? • A built-in component of the operating system • Creates powerful VMs • 64-bit support for guests • Linux support • High availability • Quick migration • Manageability • High performance, reliability, security • VHD compatibility

  31. Why SCVMM? • One solution to centrally manage all virtual infrastructure • Windows PowerShell API • V2V and P2V capabilities • Web portal • Intelligent placement • Library • Templates • Delegated management roles • Job history • Support for highly available VMs • VM migration

  32. Hyper-V Infrastructure

  33. System Architecture • CERN Virtual Infrastructure web interface • SOAP services • Microsoft System Center Virtual Machine Manager: Admin Console, Windows PowerShell, DB • Application management, OS maintenance, backups • Connected over the LAN
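To make the flow concrete, here is a minimal sketch of the kind of SOAP request a self-service web portal might send to a provisioning service layer. The operation name, element names, and values below are hypothetical illustrations of the pattern (portal → SOAP service → PowerShell → SCVMM), not CERN's actual API:

```python
# Build a hypothetical SOAP envelope for a VM provisioning request.
# Operation and parameter names are illustrative only.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_provision_request(vm_name: str, template: str, owner: str) -> str:
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, "CreateVirtualMachine")  # hypothetical operation
    for tag, value in [("Name", vm_name), ("Template", template), ("Owner", owner)]:
        ET.SubElement(op, tag).text = value
    return ET.tostring(envelope, encoding="unicode")

xml_request = build_provision_request("lxvm042", "SLC5-64bit", "jsucik")
print(xml_request)
```

On the server side, a service like this would typically translate the request into SCVMM PowerShell calls against a library template.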

  34. Web Portal

  35. Challenges • Console access from Linux • Missing .NET API for SCVMM • Time sync issues in guests

  36. Experiences • Cost efficient customized cloud computing infrastructure • Maintenance with limited downtime • Disaster recovery of VMs within minutes • Improved performance compared to Virtual Server 2005 Source: http://blogs.msdn.com/modonovan

  37. Experiences • 172 running virtual machines • 17 templates • Scalable to a large number of VMs • Expiration handling • Green computing
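The "expiration handling" idea can be sketched in a few lines: each VM carries an expiry date, and machines past their date are flagged for retirement or owner notification. The record shape below is a hypothetical illustration, not the actual SCVMM data model:

```python
# Minimal sketch of VM expiration handling, assuming each VM record
# stores an expiry date (hypothetical data model).
from dataclasses import dataclass
from datetime import date

@dataclass
class VirtualMachine:
    name: str
    expires: date

def expired(vms, today):
    """Return the names of VMs whose expiry date has passed."""
    return [vm.name for vm in vms if vm.expires < today]

vms = [VirtualMachine("webtest01", date(2009, 1, 31)),
       VirtualMachine("oraapp02", date(2010, 6, 30))]
print(expired(vms, date(2009, 6, 1)))  # ['webtest01']
```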

  38. Real life use cases • Video streaming for LHC First Beam Day: 6 virtual machines needed for ~1 week • Terminal servers for engineering apps: terminal servers installed with an older version of the apps • Oracle Application Servers

  39. Real life use cases • CERN Media Archive • CERN Alerter web server: physical server with 2x CPU, 4 GB RAM; upgrade necessary because of an OS driver issue; virtual server set up “on demand” with resources limited to 1x CPU, 2 GB RAM • Physics analysis running in VMs • Etc.

  40. Future work • Upgrade to Hyper-V 2.0 & SCVMM 2008 R2 • Use the new “Cluster Shared Volume” feature • Use the new “Rapid Provisioning” feature • VDI functionality

  41. Conclusion • Innovative physics research laboratory • Pushing the latest technology to its limits • Moving services to the cloud • Visit our websites: Information: www.cern.ch CERN TV: www.youtube.com/cern Recruitment: www.cern.ch/jobs
