
Site Reports March 2010



Presentation Transcript


  1. Site Reports March 2010

  2. Members (22 contributing, 13 participating)
  Contributing Members: 1 University of Alabama at Birmingham, 2 University of Alabama in Huntsville, 3 University of Arkansas, 4 University of Florida, 5 George Mason University, 6 University of Georgia, 7 Georgia State University, 8 University of Kentucky, 9 Louisiana State University, 10 Louisiana Tech University, 11 University of Maryland, 12 University of Miami, 13 University of Michigan ATLAS Computing, 14 Norfolk State University, 15 University of North Carolina, Charlotte, 16 Old Dominion University, 17 University of South Carolina, 18 SURA, 19 Texas A&M University, 20 Texas Advanced Computing Center, 21 Texas Tech University, 22 University of Virginia
  Participating Members: 1 University of Central Florida, 2 Clemson University, 3 Great Plains Network, 4 Hampton University, 5 Kennesaw State University, 6 Mississippi Center for SuperComputing, 7 North Carolina A&T State University, 8 North Carolina State University, 9 Rochester Institute of Technology, 10 University of Southern California, 11 Stephen F. Austin State University, 12 Vanderbilt University, 13 Wake Forest University

  3. Institute for Simulation & Training / University of Central Florida • Dr. Brian Goldiez • http://webstokes.ist.ucf.edu/ • Activities • New funding policy ($0.05/CPU-hour for faculty and $0.70/CPU-hour for commercial use) • Bulk purchase: $5,000 for 200,000 compute hours ($0.025/compute hour) • Added 4 new blades from the UCF Physics department • Plans for next 6 months: submit proposals for adding Tesla GPUs; find new users within UCF
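The bulk-purchase rate quoted on the UCF slide follows directly from the numbers given; a minimal arithmetic sketch, using only the rates stated above:

```python
# Check of the UCF compute-hour pricing quoted on the slide above.
FACULTY_RATE = 0.05      # $ per CPU-hour, faculty rate
COMMERCIAL_RATE = 0.70   # $ per CPU-hour, commercial rate
BULK_COST = 5_000        # $ for the bulk block
BULK_HOURS = 200_000     # CPU-hours in the bulk block

bulk_rate = BULK_COST / BULK_HOURS
print(f"Bulk rate: ${bulk_rate:.3f}/CPU-hour")                            # $0.025
print(f"Discount vs. faculty rate: {1 - bulk_rate / FACULTY_RATE:.0%}")    # 50%
print(f"Discount vs. commercial rate: {1 - bulk_rate / COMMERCIAL_RATE:.0%}")  # 96%
```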

  4. University of Georgia • Voting Lead or Primary Contact: David Matthews-Morgan (DMM) • Resource Operational Support Point-of-Contact: Jason Stone (as soon as a resource is available) • Website URL: rcc.uga.edu • Activities • Resources: have worked with TTU to get an AIX resource on SURAgrid • Applications: none • Other: DMM serving on the SGC • Plans for Next 6 months • Get a Linux cluster on SURAgrid • Get researchers to run applications on SURAgrid resources

  5. Georgia State University • Voting Lead or Primary Contact: Art Vandenberg • Resource Operational Support Point-of-Contact • Help@gsu.edu - Georgia State HELP ticket system (log, monitor, resolve/escalate) • Victor Bolet - technical contact, janus.gsu.edu (GSU IBM Power5+, p575) • Kelly Robinson - technical contact, octans.gsu.edu (IBM Cluster 1350) • http://www.gsu.edu/ist/acs/ (research computing) • Activities • ACSrocks2 (ROCKS cluster, in service since Nov. 2008; decommissioned 3/2010) • SURAgrid Governance Committee • GSU IT restructure (Support/Technical/Operations model) • NSF MRI-R2 proposal (SURA team, VCL) - declined by NSF • NSF CI-TEAM proposal (SURA team, VCL) - due April 27, 2010 • Operational SURAgrid, production improvement • Visualization Wall – being installed; SURAgrid options?

  6. HPC@LSU • Voting Lead: Honggao Liu, honggao@cct.lsu.edu, 225-578-0289 • Resource Operational Support Point-of-Contact: sys-help@loni.org – the HPC@LSU help desk • Website: http://www.hpc.lsu.edu • Activities • A 20% set-aside on Pelican provides 675,000 processor-hours on an IBM P4/P5/P5+ 575 • The machine is out of warranty and 1 node has failed; it is likely to be decommissioned/replaced this year • Have lost 2 of 3 people with portal experience; a hiring freeze presents support problems • Investigating MPI programming in a grid environment
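A rough check of what the Pelican set-aside implies, assuming the 675,000 processor-hour figure is an annual allocation (the slide does not state the period):

```python
# Rough check of the Pelican 20% set-aside quoted above (annual allocation assumed).
SET_ASIDE_HOURS = 675_000      # processor-hours contributed to SURAgrid
SET_ASIDE_FRACTION = 0.20      # fraction of the machine set aside
HOURS_PER_YEAR = 365 * 24      # 8760

total_capacity = SET_ASIDE_HOURS / SET_ASIDE_FRACTION
busy_processors = SET_ASIDE_HOURS / HOURS_PER_YEAR
print(f"Implied total capacity: {total_capacity:,.0f} processor-hours/year")
print(f"Set-aside is equivalent to ~{busy_processors:.0f} processors busy year-round")
```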

  7. Voting Lead: David McNabb, mcnabb@umd.edu • Resource Operational Support: David McNabb • Website URL: http://www.oit.umd.edu/units/tss/ • Activities • None in the last 6 months • Plans for Next 6 months • Restructuring and planning

  8. University of Miami Center for Computational Science • Voting Lead: Nick Tsinoremas, Director • Technical/Operational Contact: Joel Zysman, Director of HPC, UM-CCS • Website: http://www.ccs.miami.edu/ • Activities: • New 5,000-core Linux cluster, 10 Gb backbone, 75 TB BlueArc storage, LSF scheduler • Data analysis and visualization cloud (100 TB) • Storage cloud (500 TB) for both structured and unstructured data • Next-generation sequencing, chemoinformatics, MD simulations, WRF, HYCOM, ROMS, CCMS, CFD and GFD codes • About 6-7 FTEs for HPC and related clouds • Plans: • Focus on GPU and CUDA; add another 500 TB of storage

  9. Voting Lead: Eduardo A. Socolovsky, Bioinformatics and Computational Science Coordinator, Center for Biotechnology and Biomedical Sciences (CBBS) • Operational Support: Ken Detweiler, Consultant Systems Administrator, CBBS, and NASA LaRC • Website URL: http://www.nsu.edu/biotechnology/ • Activities: • Resources: Condor campus grid and a 56-core cluster (managed by OIT) • Applications on the campus grid: • 1. Processing of (distributed) micro-array data to determine expressed genes and ontological information • 2. Virtual screening of small compounds for computationally aided drug discovery using (distributed) fingerprint data • 3. Gromacs for the Materials Sciences project • Applications on the cluster: • BioHPC under Microsoft's HPC 2008 • Center for Materials Sciences applications (Linux-based cluster management) • CBBS staff: Eduardo A. Socolovsky and Ken Detweiler • OIT staff: Andrea DiFabio, Chief Information Security Officer, OIT, NSU • Plans for next 6 months: • Establishment of a cooperation agreement with the National Institute of Agricultural Technology (INTA), Argentina, for application 1 and other data mining and computational applications, during a scheduled visit to INTA in May • Development and deployment of application 1, in collaboration with INTA researcher Armando Taie, during his visit in August/September 2010
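As a rough illustration of how a per-chip workload like application 1 on the NSU slide above would fan out over a Condor campus grid, a minimal sketch that generates a classic Condor submit description and hands it to condor_submit. The script name, file paths, and data layout (analyze_chip.py, data/*.cel) are hypothetical, not taken from the report:

```python
"""Sketch only: one Condor job per raw micro-array file on a campus grid."""
import glob
import subprocess

chips = sorted(glob.glob("data/*.cel"))   # hypothetical raw micro-array files

lines = [
    "universe = vanilla",
    "executable = analyze_chip.py",       # hypothetical per-chip analysis script
    "should_transfer_files = YES",
    "when_to_transfer_output = ON_EXIT",
    "log = microarray.log",
]
for i, chip in enumerate(chips):
    lines += [
        f"arguments = {chip}",
        f"transfer_input_files = {chip}",
        f"output = results/chip_{i}.out",
        f"error  = results/chip_{i}.err",
        "queue",                           # one job per chip file
    ]

with open("microarray.sub", "w") as f:
    f.write("\n".join(lines) + "\n")

# Submit the whole batch to the campus Condor pool.
subprocess.run(["condor_submit", "microarray.sub"], check=True)
```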

  10. Voting Lead, SGC member: Mike Sachon • Resource Operational Support Point-of-Contact: Amit Kumar, ahkumar@odu.edu • Web site: https://wiki.hpc.odu.edu • Activities and resources: • SURAgrid resource: Mileva.hpc.odu.edu, a 17-node Sun X2100 M2 Rocks cluster • Current project to implement a local VCL for research and instruction (2 IBM E-chassis BladeCenters with 21 HS22 blades) • Unable to fill a vacant position, which is limiting activity

  11. Rochester Institute of Technology • Primary Point of Contact: Gurcharan S. Khanna • Website: http://rc.rit.edu • Activities • TeraGrid Champions Program • RIT Global Collaboration Grid • Haiti earthquake disaster multispectral imagery capture and transfer over Internet2 • Resources • Large-memory SMP (80 GB RAM, 8 cores) for shared use • Large-memory SMP (130 GB RAM, 32 cores) with SGE • HPC cluster (128 cores, 4 nodes, 256 GB RAM) • Condor pool (up to 1,000 cores of various kinds)

  12. Rochester Institute of Technology • Applications (areas of emphasis/new initiatives) • High-performance file systems (ZFS/NFS “thumpers”) • System monitoring/reporting tools (“stat-a-tat”) • Staffing • Computational Scientist for HPC support (advertised) • Research Systems Administrator (permanent) • Assistant Systems Administrator (temporary) • Collaboration Technologist, Manager (temporary) • Collaboration Technologists (students)

  13. Rochester Institute of Technology • Plans for Next 6 months • GPU clusters • 10 GigE to the desktop • Internet2 Meeting Panel on Haiti Imagery Transfer

  14. Voting Lead: Gary Crane, gcrane@sura.org, 315-597-1459 Staff Support: Linda Akli, akli@sura.org, 202-408-2417 Websites: www.suragrid.org; www.sura.org/programs/it.html Activities for Past 6 Months: • Community Building: Member, SGC (Gary Crane); Member, SURAgrid Membership Development Subcommittee (Linda Akli); Facilitated SURAgrid Monthly Call; Produced SURAgrid Newsletter; Coordinated SURAgrid All Hands Meeting Planning; Provided liaison with SURA IT Committee and other SURA programs • Contributed Resources: SURAgrid Website; Conference Bridge • Operational Support: Facilitated working groups; managed mailing lists; posted materials to SURAgrid Website • Research Applications: Deployment facilitation - no new applications deployed during this period • Outreach: Campus Site visits; collaborations with other programs (CI-Days Consortium, MAGIC, NSF Campus Bridging Task Force, OSG, TeraGrid, VCL); SC09 Signage; Participated in the development of the white paper entitled “Enabling and Sustaining Campus-to-Campus Cyberinfrastructure” and submitted it to two NSF Workshops (HPC Center Sustainability at Cornell and Campus Bridging Networking and Data-centric Issues at Indiana) • External Funding: Facilitate community review of funding opportunities - no proposals submitted during this period • Corporate Partnerships: Managed IBM and Microsoft partnerships

  15. Voting Lead: Gary Crane, gcrane@sura.org, 315-597-1459 Staff Support: Linda Akli, akli@sura.org, 202-408-2417 Websites: www.suragrid.org; http://www.sura.org/programs/it.html Plans for Next 6 months: • Community Building: Coordinate the 2010 membership renewal and SGC election; Facilitate the SURAgrid Fall All Hands Meeting; Produce SURAgrid newsletters; Continued participation on the SGC and Membership Development Subcommittee • Contributed Resources: No change • Operational Support: Continued facilitation of working groups; Maintenance of mailing lists; Incremental improvements to the SURAgrid website and continued posting of new materials; Facilitate operational improvements per the SGC plan • Research Applications: Continued deployment facilitation • Outreach: Continued coordination with other programs (CI-Days Consortium, MAGIC, NSF Networking and Campus Bridging Task Force, OSG, TeraGrid, VCL); Presence at caBIG and TG10 meetings • External Funding: OSG-SURAgrid Integration Project; VCL • Corporate Partnerships: Manage corporate partnerships

  16. Steve Johnson, Voting Lead and Primary Contact • Resources: Calclab <admin@math.tamu.edu> and Hydra <help@sc.tamu.edu> (3+ years) • http://calclab.math.tamu.edu, http://sc.tamu.edu • Activities • Frontiers in Reality of Schubert Calculus: computed 1,687,481,630 instances of 703 Schubert problems, consuming 954.843 GHz-years • Evaluating a ticket-tracking system • Next 6 months: bring FRSC to all of SURAgrid
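The GHz-years unit on the TAMU slide normalizes CPU time by clock rate, so the same workload corresponds to fewer core-years on faster cores. A minimal conversion sketch; the 2.0 GHz core speed is chosen purely for illustration and is not stated on the slide:

```python
# Illustrative conversion of the GHz-years figure quoted above.
GHZ_YEARS = 954.843      # figure from the slide
CLOCK_GHZ = 2.0          # assumed core clock rate, for illustration only

core_years = GHZ_YEARS / CLOCK_GHZ
core_hours = core_years * 365 * 24
print(f"{GHZ_YEARS} GHz-years ~= {core_years:.0f} core-years at {CLOCK_GHZ} GHz")
print(f"  i.e. ~{core_hours:,.0f} core-hours of compute")
```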

  17. Texas Tech University • Voting Lead: Philip Smith, philip.w.smith@ttu.edu, 806.742.4350 • Website URL: http://www.hpcc.ttu.edu • SURAgrid activities for the past 6 months • Community Building: Member, SGC; Member, Access Management WG; Member, Strategic Grid Relationships WG • Contributed Resources: Bringing Weland.hpcc.ttu.edu online, a 128-processor Nehalem-based cluster capable of running regional grid jobs and jobs on the local campus Condor grid; Jerry Perez is the lead for this project, and a demo is being developed to demonstrate operability. Dr. Ravi Vadapalli and Dr. Alan Sill are the lead consultants for the TIGRE/SURAgrid stack installation. Dr. Sill is the AuthN/AuthZ lead, providing certificates for local users and machine certificates. Will coordinate with Art Vandenberg and Victor Bolet to demonstrate SURAgrid grid computing between GSU and TTU. • Operational Support: Jerry Perez leads the packaged stack and its updates and maintenance; the SURAgrid stack was updated with the latest software as of January 2010. Dr. Alan Sill leads grid deployment and troubleshooting through scheduled weekly office-hours meetings, where the packaged stack was discussed. AIX support is still being pursued. • Research Applications: Exploring and testing documentation through new application deployments with other SURAgrid members; a roadmap document was produced to test the efficacy of the documentation and the packaged stack. • Outreach: Jerry Perez presented “Deploying Campus and Regional Grids” at a workshop for SURAgrid member institution Norfolk State University in Norfolk, Virginia; Dr. Eduardo Socolovsky hosted and moderated the workshop. A campus grid was deployed, and jobs are now being authored to run on that system. • SURAgrid activities under consideration for the next 6 months: • Create a path for migrating local TTU production work to SURAgrid resources; after thorough testing and creation of a user guide for local users, run some production on SURAgrid resources on a regular basis • Bring more production work to run on SURAgrid resources • Give more workshops on deploying and using campus/regional grids
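As a rough illustration of the GSU-TTU cross-site demo mentioned above, a minimal sketch driving the standard pre-WS Globus Toolkit client commands (grid-proxy-init, globus-job-run) from Python. It assumes a Globus client install and a valid user certificate; the gatekeeper hostnames are placeholders, not actual SURAgrid endpoints:

```python
"""Sketch only: run a trivial job at two grid sites to confirm AuthN/AuthZ works."""
import subprocess

# Placeholder gatekeeper contact strings for the two sites in the demo.
SITES = [
    "gatekeeper.example-ttu.edu/jobmanager-fork",
    "gatekeeper.example-gsu.edu/jobmanager-fork",
]

# Create a short-lived proxy credential from the user's grid certificate.
subprocess.run(["grid-proxy-init", "-valid", "12:00"], check=True)

# Submit a trivial job at each site and echo which host actually ran it.
for contact in SITES:
    result = subprocess.run(
        ["globus-job-run", contact, "/bin/hostname"],
        check=True, capture_output=True, text=True,
    )
    print(f"{contact}: {result.stdout.strip()}")
```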

  18. Wake Forest University • Primary Contact: Tim Miller • Resource Operational Support Point-of-Contact: almost, almost there • Website URL: www.deac.wfu.edu • Staffing: 1½ FTEs for all WFU HPC activities

  19. Activities • Resources • Repurposed and condensed 16 32-bit nodes into a 4 GB virtualization “farm” • Allocated 1 TB of storage for VMs • Next 6 months • Goal is to provide an infrastructure on which either standard VMs or bundled/staged-in VMs can run • Ideally, the resource would be cloud-like • Hardware has been burned in and tested, and is ready for some hypervisor research (do we use VMware? VirtualBox? KVM?) • Goal is to have some resources online by summer • No real usage goals right now; this is a “build it and hopefully they will come” effort
