1 / 24

“Greek Research and Technology Network: Status and Updates”

Dr. Ognjen Prnjat, European and Regional eInfrastructure Management, Greek Research and Technology Network, on behalf of the GRNET Technical Department. “Greek Research and Technology Network: Status and Updates”. eAge 2012, Dubai.


Presentation Transcript


  1. Dr. Ognjen Prnjat, European and Regional eInfrastructure Management, Greek Research and Technology Network, on behalf of the GRNET Technical Department. “Greek Research and Technology Network: Status and Updates”. eAge 2012, Dubai

  2. GRNET mission • GRNET is a state-owned, non-profit company operating under the supervision of the Ministry of Education (General Secretariat of Research & Technology) • Its main mission is to provide high-quality electronic infrastructure services to the Greek academic and research institutions: • National and international connectivity services • Distributed Computing Infrastructure services (computing, storage, visualisation) • A supporting activity is the promotion and dissemination of the use of ICT in the public and private sectors towards an e-Government, e-Learning and e-Business environment • Main sources of funding are the Operational Programme for the Information Society, the Ministry of Economy and Finance, and EC projects • GRNET is certified to ISO 9001:2000 for project management

  3. Pan-EU e-Infrastructures vision • The Research Network infrastructure provides fast interconnection and advanced services among Research and Education institutes of different countries • The Research Distributed Computing Infrastructure provides a distributed environment for sharing computing power, storage, instruments and databases through the appropriate software (middleware) in order to solve complex application problems • This integrated networking & DCI environment is called electronic infrastructure (eInfrastructure), allowing new methods of global collaborative research, often referred to as electronic science (eScience) • GRNET was one of the first NRENs in Europe to expand its services to grid and computing in general; infrastructure-oriented and application-neutral [Slide diagram: Network Infrastructure + Distributed Computing Infrastructure → e-Science Collaborations]

  4. GRNET main networking tasks • Interconnects universities, research centers and academic organizations (>150), plus primary and secondary schools (15,000) • 500,000 end-users • Continuously upgrades the national backbone (dark fiber, Nx10 Gbps), institution access links (1 or 10 Gbps per institution) and the international backbone (currently at 4x10 Gbps) • Operates the GR Internet Exchange (GR-IX), peering of Greek commercial Internet Service Providers at 10 Gbps each • Cooperates with Greek and international research and academic institutions on the development of innovative networking services

  5. GRNET network evolution: from GRNET2 to GRNET3 • GRNET2: 2.5 Gbps leased lambdas, 20M€, 2000-2005 • GRNET3: dark-fibre based, 10 Gbps capable, 30M€, 2005-2008

  6. GRNET network evolution: GRNET3 • >50 PoPs, 9,000 km of fiber (IRU) • MANs in Attiki & Thessaloniki, DF loops in 33 cities • Single-mode fiber pair, 15-year IRUs • Availability >99% • 1 GbE interconnection over DF to the closest IP router • 10 GbE links for the “Power Users”

  7. GRNET network evolution: from GRNET3 to GRNET4 • GRNET3: dark fibres, 10 Gbps capable, 30M€, 2005-2008 • GRNET4: equipment upgrades, 8M€, 2012-2015 • Service Oriented Design • Optical Services Layer: physical layer connectivity • Carrier Services Layer: Carrier Ethernet (MPLS) interconnection • IP Services Layer: IP interconnection among GRNET customers and the rest of the Internet • 40/100 Gbps wavelengths based on PM-QPSK modulation

  8. HELLASGRID (HG) infrastructure http://www.hellasgrid.gr/infrastructure • HG-01 cluster (pilot phase): • @Demokritos - Athens • 64 CPUs, 10 TB FC SAN, 12 TB Tape Library, gLite middleware • HG02-HG07 clusters (HG project): • Athens (NDC/EKT, Min EDU, IASA), Thessaloniki (AUTH), Crete (ICS-FORTH), Patras (CTI) • ~1200 cores • ~40 TB total raw SAN storage capacity • ~80 TB Tape Library • 6 extra sites offered by Greek universities/research institutes (FORTH-ICS, AUTH, IASA, UoA, UoI, UoI-HEPLAB, UPatras) • ~600 cores and 200 TB of storage • Total funding ~2M€

  9. HELLASGRID applications • HECTOR: Enabling Microarray Experiments over the Hellenic Grid Infrastructure • GRISSOM Platform: Grids for In Silico Systems Biology and Medicine • Evaluating the impact of climate change on European air quality • Density Functional Theory Calculations on Atmospheric Degradation Reactions of Fluorinated Propenes • Investigating the nature of explosive percolation transition • First-principles studies on traditional and emerging materials

  10. High Performance Computing • Goal is the development of a national HPC infrastructure that will join the PRACE Tier-1 European infrastructure • Budget 3.5M€ • Procurement and installation of the HPC infrastructure • Operation and provision of support services • Aiming for at least a ~150 Tflops system • Hosted in GRNET’s existing Datacenter • Support for a wide range of scientific disciplines: • Biomedicine, Bioinformatics, Computational Engineering, Physics, Meteorology, Climatology, Seismology, Computational Chemistry etc. • Based on the results of the HellasHPC feasibility study • National survey among 29 academic and research institutes (2010) • Collected requirements from 200 scientific applications developed by 162 research teams in various scientific domains

  11. Cloud for R&E: the process • Vision: flexible, production-quality cloud services for the Greek R&E community • Rationale: • A step beyond Grid in terms of flexibility and availability • Economies of scale for the community; addresses understaffing, poor service and low maintenance • Minimizes the investment in equipment and support contracts • Know-how from NOC “service boxes” (Vima service) as well as Grid • Policy background: existing MoU in place for Grid computing, expanding to HPC as well • Strategy: • Technical workshops and requirements-capture meetings • Gradual offering of services, starting with storage, moving to VM-on-demand, IaaS, and then SaaS • Paving the way to the public sector • Funding: in the context of the GRNET4 project (2.2M€ DCs; 4.5M€ s/w and services)

  12. Cloud for R&E: Why? • STUDENT: It gives me the opportunity to test different kinds of software on a machine that I no longer need after my work is done • PROFESSOR: It makes it possible for me to deploy PC labs without having to worry about specific hardware or physical space. It makes me capable of providing machines to my students for a scheduled amount of time. It gives me storage space to upload content and to share data with my students, or to access it through my virtual hardware • RESEARCHER: It enables me to run experiments in many different environments and network topologies which I can provision easily, quickly and dynamically. I can have persistent or volatile machines according to my needs. I can also upload (besides my files) my own Images and launch Virtual Machines from them

  13. ~okeanos service • ~okeanos is set to deliver Infrastructure as a Service: • Compute (Virtual Machines) • Network (Virtual Networks: private Ethernets (L2) or public IPv4/6) • Storage (Virtual Disks) • Alpha2: from March 2012 - 2,000 VMs, ~1,000 alpha users • Beta: December 2012 • Target group: GRNET’s customers • direct: IT depts of connected institutions • indirect: university students, researchers in academia • Users manage resources over • a simple, elegant UI, or • a REST API, for full programmatic control
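The REST API mentioned above can be driven with plain HTTP calls; slide 19 notes that the platform speaks the OpenStack Compute API v1.1. As a minimal sketch, the snippet below only composes the headers and JSON body of an OpenStack-Compute-style “create server” request; the endpoint URL and token are hypothetical placeholders, not real ~okeanos values.

```python
import json

# Placeholder endpoint and token -- real values would come from the user's
# ~okeanos account; this only illustrates the request shape.
API_URL = "https://cloud.example.org/api/v1.1/servers"  # hypothetical
TOKEN = "0000-example-token"                            # hypothetical

def build_create_server_request(name, flavor_id, image_id):
    """Compose HTTP headers and JSON body for an OpenStack-style VM create."""
    headers = {
        "X-Auth-Token": TOKEN,
        "Content-Type": "application/json",
    }
    body = {"server": {"name": name,
                       "flavorRef": flavor_id,
                       "imageRef": image_id}}
    return headers, json.dumps(body)

headers, payload = build_create_server_request("demo-vm", "1", "debian-base")
print(payload)
```

An actual call would POST `payload` with `headers` to the servers endpoint; the point here is only that a VM is fully described by a name, a flavor reference and an image reference.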

  14. ~okeanos service • Compute: Virtual Machines • Network: Virtual Ethernets • Storage: Virtual Disks • Security: Virtual Firewalls

  15. ~okeanos service [Slide figure: 5x, 2x, 8x, 1x]

  16. ~okeanos service • Compute: Cyclades • Files: Pithos+ • Images: Plankton • Identity: Astakos • Volumes: Archipelago • Accounting/Billing: Aquarium

  17. ~okeanos: Design • Commercial IaaS vs own IaaS • Commercial IaaS: • Amazon EC2 is not an end-user service • Need to develop custom UI and AAI layers • Vendor lock-in • Unsuitable for IT depts (persistent, long-term servers, custom networking requirements) • Own IaaS: gain know-how, build on it, reuse it for own services

  18. ~okeanos: Software Stack • REST API • Synnefo: multiple users, multiple resources • Ganeti: multiple VMs on a cluster • KVM: single VM

  19. ~okeanos: Platform design [Architecture diagram: Web Client, CLI Client and Web Client 2 (user@home, admin@home) reach the Synnefo cloud management software in the GRNET datacenter over the OpenStack Compute API v1.1 and a GRNET-proprietary API; Synnefo drives Google Ganeti on Debian/KVM virtual hardware, with direct out-of-band access]

  20. Cloud storage - Pithos • Online storage for the entire Greek academic and research community • 50 or 100 GB/user; files, groups • Access via web browsers, native test apps, iPhone, Android • Open source implementation and API (REST) • v2 of the API, compatible with OpenStack object storage • Also used in ~okeanos: • Stores custom and user-created images • Plankton provides an image registry over Pithos
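Since the Pithos v2 API is described above as compatible with OpenStack object storage, an upload is just an authenticated PUT of an object into a container. The sketch below only composes the URL and headers for such a request; the service URL, account name and token are hypothetical placeholders.

```python
# Placeholder service URL and account -- for illustration only, not the
# real Pithos endpoint.
PITHOS_URL = "https://storage.example.org/v1"  # hypothetical
ACCOUNT = "user@example.org"                   # hypothetical

def build_upload_request(container, object_name, data: bytes):
    """Compose URL and headers for PUTting one object into a container."""
    url = f"{PITHOS_URL}/{ACCOUNT}/{container}/{object_name}"
    headers = {
        "X-Auth-Token": "0000-example-token",  # hypothetical
        "Content-Length": str(len(data)),
        "Content-Type": "application/octet-stream",
    }
    return url, headers

url, headers = build_upload_request("documents", "notes.txt", b"hello pithos")
print(url)
```

The same URL scheme (account/container/object) is what lets Plankton layer an image registry on top: a registered image is simply an object stored in Pithos.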

  21. ~okeanos: stats • 1,000 alpha users • 2,000 VMs (Windows, Ubuntu, Debian, in order of usage) • Storage: 20% disk usage; average 20 GB/VM • Approx. 1 minute startup time per VM • Scalable to thousands (Pithos to 10k) • Per-VM or per-GB billing to be implemented
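The slide only says per-VM or per-GB billing is still to be implemented, so as a hedged sketch of what such a charging model might compute, the snippet below combines a per-VM-hour and a per-GB-month rate; both rates and the function name are made-up placeholders, not GRNET prices.

```python
# Hypothetical rates in arbitrary currency units -- placeholders only.
RATE_PER_VM_HOUR = 0.05   # per VM-hour
RATE_PER_GB_MONTH = 0.02  # per GB-month of storage

def monthly_charge(vm_hours, storage_gb):
    """Combine compute-time and storage charges for one billing month."""
    return vm_hours * RATE_PER_VM_HOUR + storage_gb * RATE_PER_GB_MONTH

# e.g. one VM running a full 30-day month with the average 20 GB disk
print(round(monthly_charge(30 * 24, 20), 2))
```

Aquarium (slide 16) is the component that would host this kind of accounting/billing logic.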

  22. Cloud Infrastructure: Data Centers • Data Center 1 (NDC) • Status: production operation • Location: National Documentation Center, Athens, Greece • Purpose: central PoP of GRNET; connection to GEANT, GR-IX; Cloud and Grid services • Number of racks: 16 • Total power: 110 kW with N+1 redundancy • Data Center 2 (MINEDU) • Status: production operation • Location: Ministry of Education, Athens, Greece • Purpose: Cloud/HPC services • Number of racks: 28 • Total power: 450 kW with N+1 redundancy • Data Center 3 (LOUROS) • Status: design phase • Location: Hydroelectric Plant of Louros • Purpose: GRNET disaster recovery for cloud/HPC services; green data center, PUE < 1.3 • Number of racks: 14 • Total power: 250 kW with N+1 redundancy
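The PUE < 1.3 target for the LOUROS site is the standard Power Usage Effectiveness ratio: total facility power divided by the power that actually reaches IT equipment (the rest goes to cooling, distribution losses, etc.). The figures in the sketch below are illustrative placeholders, not measured values for any GRNET site.

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Illustrative only: if ~200 kW of a 250 kW facility reached IT equipment,
# PUE would be 1.25, inside the < 1.3 design target.
print(pue(250, 200))
```

A PUE of exactly 1.0 would mean every watt drawn by the facility reaches the IT load, which is why values close to 1.3 are considered efficient, and free cooling at a hydroelectric plant helps push the ratio down.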

  23. Cloud – MINEDU snapshots

  24. Thank you!
