
U Oklahoma and the Open Science Grid






Presentation Transcript


  1. U Oklahoma and the Open Science Grid Henry Neeman, Horst Severini, Chris Franklin, Josh Alexander University of Oklahoma Condor Week 2008, University of Wisconsin, Wednesday April 30 2008

  2. Outline • Context: OU Hardware • OU and the Open Science Grid • Oklahoma Cyberinfrastructure Initiative • OU NSF CI-TEAM Project

  3. Context: OU Hardware

  4. Pentium4 Xeon Cluster • 1,024 Pentium4 Xeon CPUs (“Irwindale” 3.2 GHz EM64T) • 2,176 GB RAM (800 MHz FSB) • 23,000 GB disk (NFS + Lustre) • InfiniBand & Gigabit Ethernet • OS: Red Hat Enterprise Linux 4 • Peak speed: 6,553.6 GFLOPs (1,024 CPUs × 3.2 GHz × 2 flops/cycle) topdawg.oscer.ou.edu

  5. Pentium4 Xeon Cluster • Debuted June 2005: #54 worldwide, #9 among US universities, #4 excluding the big 4 NSF centers • Currently: #289 worldwide, #29 among US universities, #20 excluding the big 4 NSF centers topdawg.oscer.ou.edu www.top500.org

  6. Condor Pool @ OU OU IT has deployed a large Condor pool (775 desktop PCs in dozens of labs around campus). In terms of Condorized PCs, OU’s pool provides a huge amount of computing power – more than OSCER’s big cluster: • if OU were a state, we’d be the 12th largest state in the US; • if OU were a country, we’d be the 10th largest country in the world (other than “unknown”). We’ve also seen empirically that lab PCs are available for Condor jobs about 80% of the time.
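A quick way to gauge how much of a pool like this is actually available is to ask its central manager for a slot summary with condor_status. A minimal sketch; the hostname is hypothetical, so substitute your own pool's central manager:

    # Summarize machine slots across the pool: Total / Owner / Claimed / Unclaimed
    # (condor.example.edu is a placeholder for the pool's central manager)
    condor_status -pool condor.example.edu -total

Comparing the Unclaimed count against the total over time is one simple way to arrive at an availability figure like the 80% quoted above.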

  7. Proposed New Hardware The following information is public. • RFP: issued March 19, closed April 3 • Architecture described in the RFP: • quad core, dual socket, x86-64 • ~5 times as fast as topdawg (similar compute node count) • ~4 times as much RAM (probably 1333 MHz FSB) • ~4 times as much disk (~100 TB) • high performance interconnect (probably IB) plus GigE • Red Hat Enterprise Linux 5 (assumed) • OU Board of Regents meets May 8-9.

  8. OU and the Open Science Grid

  9. OU and the Open Science Grid • Currently, OU’s relationship with the OSG primarily benefits High Energy Physics: • D0 project • ATLAS project • DOSAR project: cross-disciplinary grid organization with members in other OSG VOs (including D0 and ATLAS) • We have 5 OSG resources: • OSCER’s large cluster (topdawg) – general purpose • OSCER’s Condor pool (currently D0 only) – general purpose • OUHEP’s Tier2 cluster – dedicated to HEP projects • OUHEP’s desktop cluster – dedicated to HEP projects • OUHEP’s OSG Integration Testbed (ITB): 8 nodes, used to test new pre-production OSG releases. We recently installed OSG 0.9.0 and bestman-xrootd (a Storage Resource Manager) as part of the integration effort.
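Jobs typically reach OSG resources like these through Condor-G's grid universe. Below is a minimal submit-file sketch, not OU's actual setup: the gatekeeper hostname and executable name are hypothetical.

    # Condor-G submit description file: route one job to an OSG
    # compute element via its Globus GT2 gatekeeper (hostname made up)
    universe      = grid
    grid_resource = gt2 gate.example.edu/jobmanager-condor
    executable    = analyze
    output        = analyze.out
    error         = analyze.err
    log           = analyze.log
    queue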

  10. OU and D0

  11. OU D0 Breakdown • OSCER’s big cluster (topdawg): 8,020,250 events (6th in the US), 0.66 TB • OSCER Condor pool: 6,024,000 events (6th in the US), 0.49 TB • Dedicated OU HEP Tier3 cluster: 2,472,250 events (9th in the US), 0.16 TB • Total: 16,516,500 events, 1.31 TB Notes: • Without OSCER’s Condor pool, OU would be #4. • Without OSCER’s cluster, OU would be #6. • Without OU HEP’s dedicated Tier3 cluster, OU would still be #2.

  12. OU and ATLAS http://gratia-osg.fnal.gov:8880/gratia-reporting/ Note: A buggy version of Gratia (the OSG accounting service) ran on OU’s resources until 4/3/2008.

  13. OU: First in the World OU was the first institution in the world to simultaneously run ATLAS and D0 grid production jobs on a general-purpose, multi-user cluster. Most grid production jobs run on dedicated clusters that are reserved for one or the other of these projects, or on Condor pools.
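The slide doesn't say how topdawg was configured to let the two experiments share it, but one generic way to do this in Condor is group accounting with quotas. A purely illustrative central-manager sketch; the group names and numbers are made up:

    # condor_config on the central manager: give each experiment
    # a fixed share of the machines (quotas are hypothetical)
    GROUP_NAMES = group_atlas, group_d0
    GROUP_QUOTA_group_atlas = 512
    GROUP_QUOTA_group_d0    = 512
    # allow a group to use the other's idle share
    GROUP_AUTOREGROUP = True

Each job then opts into its group by adding a line such as +AccountingGroup = "group_d0.username" to its submit file.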

  14. OU’s Collaborations OU plays key roles in: • Oklahoma Center for High Energy Physics (OCHEP): a collaboration between OU, Oklahoma State U and Langston U (an HBCU), funded by a Dept of Energy EPSCoR grant • ATLAS Southwest Tier2: OU, Langston U, U Texas Arlington • DOSAR (Distributed Organization for Scientific and Academic Research): OU, Langston U, U Arizona, Iowa State, Kansas State, U Kansas, Louisiana Tech, Louisiana State, Rice, U Mississippi, U Texas Arlington, Universidade Estadual Paulista (Brazil), Cinvestav (Mexico)

  15. OU Helps with Condor OU has helped set up Windows/coLinux/Fedora Condor pools at: • Oklahoma State U • U Texas Arlington
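Desktop pools like these normally protect each PC's owner with policy expressions in the machine's condor_config. A minimal sketch along the lines of Condor's stock desktop policy; the thresholds here are made-up examples:

    # Start a job only after the owner has been away 15 minutes
    # and the machine is otherwise idle
    START    = KeyboardIdle > (15 * 60) && LoadAvg < 0.3
    # Suspend the job the moment the owner comes back
    SUSPEND  = KeyboardIdle < 60
    CONTINUE = KeyboardIdle > (15 * 60)
    # Evict a job that has been suspended for more than 10 minutes
    PREEMPT  = (Activity == "Suspended") && ((CurrentTime - EnteredCurrentActivity) > 600)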

  16. Oklahoma Cyberinfrastructure Initiative

  17. OK Cyberinfrastructure Initiative • Oklahoma is an EPSCoR state. • Oklahoma recently submitted an NSF EPSCoR Research Infrastructure Improvement (RII) proposal (up to $15M). • This year, for the first time, all NSF EPSCoR RII proposals MUST include a statewide cyberinfrastructure plan. • Oklahoma’s plan – the Oklahoma Cyberinfrastructure Initiative (OCII) – involves: • all academic institutions in the state being eligible to sign up for free use of OU’s and Oklahoma State U’s centrally-owned CI resources; • other kinds of institutions (government, NGO, commercial) being eligible to use them, though not necessarily for free. • OCII includes building a Condor flock between OU (775 PCs) and OSU (~300 PCs); a configuration sketch follows below. We’ve already helped OSU set up their Condor pool; they just need to roll out the deployment, and then we’ll be able to use it for HEP/OSG.
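Flocking lets jobs submitted in one pool overflow into another without merging the two pools. A minimal configuration sketch, with hypothetical hostnames standing in for the real OU and OSU central managers:

    # On OU submit machines: if jobs sit idle locally, try OSU's pool
    FLOCK_TO = cm.osu.example.edu
    # On OSU's central manager: accept flocked jobs from OU's schedd
    FLOCK_FROM = schedd.ou.example.edu
    # (the accepting pool must also grant those machines write access,
    #  e.g. via HOSTALLOW_WRITE / ALLOW_WRITE)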

  18. OU’s NSF CI-TEAM Project

  19. OU’s NSF CI-TEAM Project OU recently received a grant from the National Science Foundation’s Cyberinfrastructure Training, Education, Advancement, and Mentoring for Our 21st Century Workforce (CI-TEAM) program. Objectives: • Provide Condor resources to the national community • Teach users to use Condor and sysadmins to deploy and administer it • Teach bioinformatics students to use BLAST over Condor

  20. OU NSF CI-TEAM Project: Cyberinfrastructure Education for Bioinformatics and Beyond
  Objectives: • teach students and faculty to use FREE Condor middleware, stealing computing time on idle PCs; • teach system administrators to deploy and maintain Condor on PCs; • teach bioinformatics students to use BLAST on Condor (see the sketch below); • provide Condor Cyberinfrastructure to the national community (FREE).
  OU will provide: • Condor pool of 775 desktop PCs (already part of the Open Science Grid); • Supercomputing in Plain English workshops via videoconferencing; • Cyberinfrastructure rounds (consulting) via videoconferencing; • drop-in CDs for installing full-featured Condor on a Windows PC (Cyberinfrastructure for FREE); • sysadmin consulting for installing and maintaining Condor on desktop PCs.
  OU’s team includes High School, Minority Serving, 2-year, 4-year, and masters-granting institutions; 18 of the 32 institutions are in 8 EPSCoR states (AR, DE, KS, ND, NE, NM, OK, WV).
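Running BLAST over Condor usually means fanning a batch of query files out across the pool from a single submit file. A minimal sketch, assuming NCBI's blastall command and hypothetical file names (query_0.fasta through query_99.fasta, plus a local copy of the nr protein database):

    # Run 100 BLAST searches, one query file per job
    universe    = vanilla
    executable  = blastall
    arguments   = -p blastp -d nr -i query_$(Process).fasta -o hits_$(Process).txt
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    # ship the query and the protein-database index files to each node
    transfer_input_files    = query_$(Process).fasta, nr.phr, nr.pin, nr.psq
    log         = blast.log
    queue 100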

  21. OU NSF CI-TEAM Project
  Participants at OU (29 faculty/staff in 16 depts):
  • Information Technology – OSCER: Neeman (PI)
  • College of Arts & Sciences – Botany & Microbiology: Conway, Wren; Chemistry & Biochemistry: Roe (Co-PI), Wheeler; Mathematics: White; Physics & Astronomy: Kao, Severini (Co-PI), Skubic, Strauss; Zoology: Ray
  • College of Earth & Energy – Sarkeys Energy Center: Chesnokov
  • College of Engineering – Aerospace & Mechanical Engr: Striz; Chemical, Biological & Materials Engr: Papavassiliou; Civil Engr & Environmental Science: Vieux; Computer Science: Dhall, Fagg, Hougen, Lakshmivarahan, McGovern, Radhakrishnan; Electrical & Computer Engr: Cruz, Todd, Yeary, Yu; Industrial Engr: Trafalis
  • OU Health Sciences Center, Oklahoma City – Biochemistry & Molecular Biology: Zlotnick; Radiological Sciences: Wu (Co-PI); Surgery: Gusev
  Participants at other institutions (62 faculty/staff at 31 institutions in 18 states):
  • California State U Pomona (masters-granting, minority serving): Lee
  • Colorado State U: Kalkhan
  • Contra Costa College (CA, 2-year, minority serving): Murphy
  • Delaware State U (masters, EPSCoR): Lin, Mulik, Multnovic, Pokrajac, Rasamny
  • Earlham College (IN, bachelors): Peck
  • East Central U (OK, masters, EPSCoR): Crittell, Ferdinand, Myers, Walker, Weirick, Williams
  • Emporia State U (KS, masters-granting, EPSCoR): Ballester, Pheatt
  • Harvard U (MA): King
  • Kansas State U (EPSCoR): Andresen, Monaco
  • Langston U (OK, masters, minority serving, EPSCoR): Snow, Tadesse
  • Longwood U (VA, masters): Talaiver
  • Marshall U (WV, masters, EPSCoR): Richards
  • Navajo Technical College (NM, 2-year, tribal, EPSCoR): Ribble
  • Oklahoma Baptist U (bachelors, EPSCoR): Chen, Jett, Jordan
  • Oklahoma Medical Research Foundation (EPSCoR): Wren
  • Oklahoma School of Science & Mathematics (high school, EPSCoR): Samadzadeh
  • Purdue U (IN): Chaubey
  • Riverside Community College (CA, 2-year): Smith
  • St. Cloud State University (MN, masters): J. Herath, S. Herath, Guster
  • St. Gregory’s U (OK, 4-year, EPSCoR): Meyer
  • Southwestern Oklahoma State U (masters, EPSCoR, tribal): Linder, Moseley, Pereira
  • Syracuse U (NY): Stanton
  • Texas A&M U-Corpus Christi (masters): Scherger
  • U Arkansas Fayetteville (EPSCoR): Apon
  • U Arkansas Little Rock (masters, EPSCoR): Hall, Jennings, Ramaswamy
  • U Central Oklahoma (masters-granting, EPSCoR): Lemley, Wilson
  • U Illinois Urbana-Champaign: Wang
  • U Kansas (EPSCoR): Bishop, Cheung, Harris, Ryan
  • U Nebraska-Lincoln (EPSCoR): Swanson
  • U North Dakota (EPSCoR): Bergstrom, Hoffman, Majidi, Moreno, Peterson, Simmons, Wiggen, Zhou
  • U Northern Iowa (masters-granting): Gray

  22. Okla. Supercomputing Symposium Tue Oct 7 2008 @ OU Over 225 registrations already! Over 150 in the first day, over 200 in the first week, over 225 in the first month.
  Keynote speakers: • 2003: Peter Freeman, NSF Computer & Information Science & Engineering Assistant Director • 2004: Sangtae Kim, NSF Shared Cyberinfrastructure Division Director • 2005: Walt Brooks, NASA Advanced Supercomputing Division Director • 2006: Dan Atkins, Head of NSF’s Office of Cyberinfrastructure • 2007: Jay Boisseau, Director, Texas Advanced Computing Center, U Texas Austin • 2008: José Munoz, Deputy Office Director/Senior Scientific Advisor, NSF Office of Cyberinfrastructure
  FREE! Parallel Computing Workshop Mon Oct 6 @ OU FREE! Symposium Tue Oct 7 @ OU http://symposium2008.oscer.ou.edu/

  23. To Learn More about OSCER http://www.oscer.ou.edu/

  24. Thanks for your attention! Questions?
