
Florida Cyberinfrastructure Development: SS ERCA


Presentation Transcript


  1. Florida Cyberinfrastructure Development: SSERCA — www.sserca.org. Paul Avery, University of Florida, avery@phys.ufl.edu. Fall Internet2 Meeting, Raleigh, NC, October 3, 2011.

  2. SSERCA Mission: … to further the development of a state-wide computational science infrastructure of advanced scientific computing, communications, and education resources by promoting cooperation among Florida’s universities.

  3. Sunshine State Education and Research Computing Alliance
  What SSERCA is
  • An alliance of 5 universities: FSU, UCF, UF, UM, USF
  What SSERCA does
  • Helps interface researchers to cyberinfrastructure
  • Provides a web catalog of research projects
  • Offers joint CPU and storage resources
  • Serves as a vehicle for collaboration
  How SSERCA communicates
  • Regular meetings since July 2010
  • Quarterly “Summits” with Directors/CIOs (3 in 2011)

  4. Motivation: Maximize the collective impact of Florida’s scientific and IT assets
  • Scientists
  • Computing facilities
  • Data storage systems
  • Specialized research instruments
  • State-wide optical network (FLR)

  5. FLR: 20-Gbps network

  6. SSERCA Goals
  Build and leverage cyberinfrastructure
  • … for research across public and private universities within Florida
  Provide researchers access
  • … to advanced computational and data services & expertise
  Collaborate effectively
  • … for federal grant opportunities (NSF panels look for pre-existing collaboration)

  7. Specific Deliverables
  Catalog high-end resources
  • Internal use: track assets and progress related to linking assets
  • Showcase STEM resources: attract positive attention to Florida
  Build and support collaborative infrastructure
  • Data sharing
  • Cycle sharing
  Three well-defined research projects

  8. New Florida Award
  “New Florida” funds (Board of Governors)
  • UF: $200K
  • FSU: $150K
  • USF: $100K
  Internal matching funds (university)
  • UF: $200K
  • FSU: $150K
  • USF: $140K

  9. Dedicated SSERCA Staff
  Distributed dedicated staff
  • Florida State University: 1.0 FTE
  • University of Florida: ~1 FTE
  • University of South Florida: 1.0 FTE
  • University of Central Florida: fractional
  • University of Miami: fractional
  All employed by the HPC centers at their respective universities

  10. Web Updates
  • Internal-facing wiki
  • Meeting planning
  • Grant writing & coordination

  11. Five Collaborative Activities
  • Data sharing
  • CPU sharing
  • CMS experiment at CERN
  • CryoEM
  • Coupled ocean-atmosphere models

  12. (1) Data Sharing
  Based on the Lustre wide-area file system
  • Shared files appear as local
  • Experience and expertise at UF and FSU
  Leverages the 20-Gbps FLR infrastructure
  Leverages the ExTENCI project (NSF funded, 2010–12)
  • Addition of Kerberos-based security
  • PSC expertise and support
  Many performance tests completed & ongoing
  • File movement between UF, FSU, FIU, PSC
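The wide-area data-sharing model above can be sketched as an ordinary Lustre client mount: a remote campus mounts the file system served from another site over FLR, and shared files then appear as local paths. This is a minimal illustration, not the deployed configuration; the hostnames, file-system name (`sserca`), and mount point below are hypothetical.

```
# Hypothetical sketch of a wide-area Lustre client mount.
# The management server (MGS) sits at one campus (here, imagined at UF);
# a client at another campus mounts the same file system over FLR.
mount -t lustre mgs.hpc.ufl.example@tcp:/sserca /mnt/sserca

# Once mounted, shared files appear as local files:
ls /mnt/sserca
```

With the Kerberos-based security added under ExTENCI, clients would additionally need valid Kerberos credentials before the mount is authorized; the exact security flavor is configured on the Lustre servers.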

  13. (2) CPU Sharing
  Leverages the Open Science Grid (NSF/DOE funded)
  • “Campus Grids” project (multi-university)
  • Weekly calls: good support & expertise
  Successful Condor tests “flocking” jobs from FIU to UF
  Further tests to be carried out in fall 2011
  Initial resource at FSU (3Leaf donation)
  • 96 nodes (1,152 cores), 6 GB/core
  • Other resources expected (modulo power/cooling availability)
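Condor “flocking” as described above is driven by a pair of configuration settings: the submitting pool lists the remote central manager it may forward idle jobs to, and the executing pool lists which remote managers it accepts jobs from. A minimal sketch of the FIU-to-UF direction, with hypothetical hostnames (the actual machines are not named in the slides):

```
# Hypothetical condor_config fragments for flocking FIU jobs to UF.

# Submit side (FIU): forward jobs that cannot run locally to UF's pool.
FLOCK_TO = cm.hpc.ufl.example

# Execute side (UF): accept flocked jobs from FIU's central manager.
FLOCK_FROM = cm.hpc.fiu.example
ALLOW_WRITE = $(ALLOW_WRITE), $(FLOCK_FROM)
```

Jobs submitted at FIU then overflow transparently to UF when FIU’s resources are busy, which matches the test described on the slide.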

  14. (3) CMS Experiment @ LHC (UF, FSU, FIU, FIT)

  15. (4) CryoEM (FSU, UF)

  16. (5) Coupled Ocean-Atmosphere Models (FSU, USF, UM)
