
Recent Advancements of National Grid Infrastructure in Poland (PL-Grid)



  1. Recent Advancements of National Grid Infrastructure in Poland (PL-Grid) Jacek Kitowski, ACC Cyfronet AGH, Krakow, Poland www.plgrid.pl/en Cracow Grid Workshop – CGW'10, 11-13.10.2010, Krakow, Poland

  2. Acknowledgements • ACC Cyfronet AGH • Kazimierz Wiatr • Marian Bubak • Łukasz Dutka • Karol Krawentek • Andrzej Oziębło • Maciej Malawski • Zofia Mosurska • Robert Pająk • Marcin Radecki • Renata Słota • Mariusz Sterzel • Tomasz Szepieniec • Agnieszka Szymańska • Teresa Ozga • ICM • Piotr Bała • Maciej Filocha • PCSS • Norbert Meyer • Bartek Palak • Krzysztof Kurowski • Tomasz Piontek • Dawid Szejnfeld • Paweł Wolniewicz • WCSS • Jerzy Janyszek • Bartłomiej Balcerek • TASK • Rafał Tylman • Mścislaw Nakonieczny

  3. Outline • General Introduction to the PL-Grid Project: motivation, consortium and objectives; PL-Grid and EGI; organization of the project • Current Status and Advancements: status of hardware infrastructure; Operational Centre – tasks and services; e-Infrastructure software, software packages and users' tools; training and user support; security aspects • Conclusions – Future work

  4. PL-Grid Consortium Consortium creation – agreement signed in January 2007, in response to the requirements of Polish scientists and to ongoing Grid activities in Europe within the framework of EGI_DS. Consortium members – five Polish supercomputing and networking centres: Academic Computer Centre CYFRONET AGH, Krakow (coordinator); Interdisciplinary Centre for Mathematical and Computational Modelling, Warsaw University; Poznan Supercomputing and Networking Centre; Academic Computer Centre, Gdansk; Wroclaw Centre for Networking and Supercomputing. The PL-Grid Project proposal was funded by the European Regional Development Fund as part of the Innovative Economy Program on March 2, 2009. Result: the first working NGI in Europe within the framework of EGI.eu

  5. Rationale behind the PL-Grid Consortium [Figure: TOP500 list – June 2010] The Consortium consists of five Polish High Performance Computing Centres representing Communities, coordinated by ACC Cyfronet • Participation in international and national projects: 35+ international FP5, FP6 and FP7 projects on Grids (~50% common) (~45 currently, including EDA); 15+ Polish projects (~50% common) • Polish scientific communities: ~75% of highly rated Polish publications in 5 Communities • Computational resources: Top500 list • European/worldwide integration activities: EGEE I-III, EGI_DS, EGI, e-IRG, PRACE, DEISA, OMII, EU Unit F3 „Research Infrastructure” experts, representatives in the EGI Council, members of the EGI Executive Board • National network infrastructure available: Pionier National Project

  6. PL-Grid Project – Basic Data The Project is co-funded by the European Regional Development Fund as part of the Innovative Economy Program. Total budget: 83 M PLN (~21 M EUR): personnel cost 27 M PLN (~7 M EUR), equipment cost 33 M PLN (~8 M EUR), other cost 23 M PLN (~6 M EUR). Funding from the EC: 68 M PLN (~17 M EUR). Project duration: 1 January 2009 – 31 December 2011. Beneficiary: Academic Computer Centre Cyfronet AGH, Krakow, Poland. Contract number: POIG.02.03.00-00-007/08. Project website: www.plgrid.pl/en

  7. Main Objectives of PL-Grid [Figure: layered architecture – applications running on domain Grids (Advanced Service Platforms), on top of the Grid infrastructure (Grid services), built from PL-Grid clusters, high performance computers and data repositories, connected by the National Computer Network PIONIER] • Polish Grid is developing a common base infrastructure, compatible and interoperable with European and worldwide Grids • Capacity to construct specialized, domain Grid systems, including services and tools focused on specific types of applications • This approach should enable efficient use of available financial resources • Plans for HPC and Scalability Computing, with more focus on domain-specific Grids • Offer for the users: computing power 215 Tflops; storage 2500 TBytes; support from PL-Grid staff on using advanced Grid tools; support on porting legacy codes to the Grid environment; support on designing applications for the PL-Grid environment

  8. PL-Grid Building Blocks [Figure: layered view – users on top of a Grid Application Programming Interface; Grid portals and development tools; virtual organizations and security systems; Grid services (UNICORE/DEISA, LCG/gLite/EGEE, basic Grid services, other Grid systems); Grid resources – distributed computational resources, distributed data repositories and the national computer network] PL-Grid software comprises: • user tools (portals, systems for application management and monitoring, result visualization and other purposes, compatible with the lower-layer software used in PL-Grid) • software libraries • virtual organization systems: certificates, accounting, security, dynamic VO management • data management systems: metadata catalogues, replica management, file transfer • resource management systems: job management; monitoring of applications, grid services and infrastructure; license management; local resource management Three Grid structures are maintained: production, research, development/testing

  9. PL-Grid – first operational NGI in Europe EGI.eu – a European organization being developed to coordinate the European Grid Infrastructure, based on the federation of individual National Grid Initiatives (NGI), to support a multi-disciplinary user community. On 30 March 2010, Poland – as the first country in Europe – initiated the functioning of its National Grid Initiative (NGI) as an integration activity in the framework of the European Grid Infrastructure. PL-Grid tasks in EGI (planned/fixed): • Grid operation and oversight of the e-Infrastructure • Coordination of resource allocation and of brokering support for VOs from NGIs • Computational Chemistry – organization and management of the Computational Chemistry and Material Science and Technology Specialized Support Centre and EGI liaisons • Development of unified middleware via the European Middleware Initiative • Scientific application porting, especially concerning the UNICORE architecture, within the Application Porting SSC

  10. Organization of the PL-Grid project [Figure: work-package structure – P1 Project Management (coordination, strategic planning, dissemination), P2 Planning and Development of Infrastructure, P3 Operations Center, P4 Grid Software and Users Tools Development, P5 Support for Various Domain Grids (training), P6 Security Center; with links to EGEE and DEISA]

  11. Current Status and Advancements

  12. Status of Hardware Infrastructure Current status (October 2010): • Computational power: CYFRONET 29.3 Tflops, ICM 13.1 Tflops, PCSS 23.2 Tflops – total 65.6 Tflops • Disk storage: CYFRONET 313 TBytes, ICM 433 TBytes, PCSS 300 TBytes – total 1046 TBytes Plans until the end of 2010: 185 Tflops, 1900 TBytes. Plans until the end of the Project (end of 2011): 215 Tflops, 2500 TBytes

  13. Operational Center – now in operation • Tasks: coordination of operations; management and accounting; collaboration with EGI and PRACE/DEISA; analysis of users' requirements for operational issues • Running infrastructure for: production, developers, research • Future considerations: computational cloud, data cloud, internal and external clouds, virtualization aspects

  14. Services of the Operational Center for Users The Operational Center aims at facilitating access to the infrastructure by simplifying procedures and deploying useful tools: • System for registration and management of PL-Grid user accounts, available at https://konto.plgrid.pl/ – requires an entry in the Polish „People of Science” database or confirmation by a scientific tutor • Grid access to PL-Grid resources: 5 centers – gLite, 1 center – UNICORE (see the sketch after this slide) • Local access to the queue system of the „zeus” cluster in ACC CYFRONET AGH • Possibility to apply for a grid certificate on-line • Application for access to computational services in other centers • Helpdesk system in PL-Grid: enables reporting and tracking issues; available at https://helpdesk.plgrid.pl (for registered users); access also by e-mail: helpdesk@plgrid.pl; manual: https://wiki.plgrid.pl/doku.php?id=pakiet5:publiczne:podrecznik_uzytkownika_pl-grid → System Pomocy Helpdesk
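To make the gLite path concrete, here is a minimal sketch (in Python) of how a registered user might submit a batch job from a gLite User Interface host. The voms-proxy-init and glite-wms-job-submit commands are standard gLite CLI tools; the VO name vo.plgrid.pl and the JDL contents are illustrative assumptions, not taken from the slides.

```python
import subprocess
import tempfile

# Minimal JDL (Job Description Language) description; the executable
# and sandbox settings are placeholders for a real workload.
JDL = """Executable    = "/bin/hostname";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
"""

def submit_glite_job(vo: str = "vo.plgrid.pl") -> None:
    """Create a VOMS proxy and submit one job via the gLite WMS.

    Assumes a gLite User Interface host; the VO name is an assumption,
    check your PL-Grid account details for the actual one.
    """
    with tempfile.NamedTemporaryFile("w", suffix=".jdl", delete=False) as f:
        f.write(JDL)
        jdl_path = f.name
    # Obtain short-lived VOMS credentials (prompts for the key passphrase).
    subprocess.run(["voms-proxy-init", "--voms", vo], check=True)
    # Submit the job; -a delegates the user's proxy automatically.
    subprocess.run(["glite-wms-job-submit", "-a", jdl_path], check=True)

if __name__ == "__main__":
    submit_glite_job()
```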

  15. Services of the Operational Center for Users (cont'd) The Operational Center ensures the proper functioning of the infrastructure for PL-Grid users by pro-active monitoring of the following infrastructure elements: • availability of the infrastructure services • software packages supported by PL-Grid It also provides conformity of the PL-Grid and European (EGI) infrastructures with respect to: software, operational procedures, security procedures. Ongoing work: • advanced work on the „PL-Grid grants” idea • integration of the presentation of per-user resource-usage data • work on the provision of an integrated user portal

  16. Development of the e-Infrastructure Software and the Users' Tools • Close cooperation of 8 programming and testing groups, about 20 people • Installation and provision, for testing purposes, of gLite, UNICORE and QosCosGrid • About 30 different configurations of virtual machines with installed software, used for the development and testing of user tools – the choice of technology has been made • Functional, conformity and efficiency tests of selected packages of the research software, performed in order to deploy and support the new tools and services at the production level

  17. Development of the e-Infrastructure Software and the Users' Tools (cont'd) • Direct contact with new users on the basis of a survey, available at: www.plgrid.pl/ankieta • Requirements of Polish users (results of ~190 surveys) considered in the new applications, tools and services developed and tested in the framework of Package 4 • Large group of users cooperating with the software and tools team: • Department of Chemistry of the Jagiellonian University • Department of Bioinformatics and Telemedicine of the Collegium Medicum of the Jagiellonian University • Adam Mickiewicz University • Poznan University of Technology • Wrocław University of Technology • administrators of the computing centers • …

  18. Development of the e-Infrastructure Software and the Users' Tools (cont'd) • Test versions of tools for users and systems administrators: Grid Resource Bazaar, mobile access to the infrastructure, new security modules • Extension of the GridSpace2 platform with a set of new functions, support for new scripting languages and integration with new grid services • Integration of the Migrating Desktop and g-Eclipse tools with various PL-Grid middleware services • Extension and deployment of FiVO – a new tool for VO management and monitoring • Performance and functional tests of the QosCosGrid middleware service and its integration with the gLite and UNICORE infrastructures at the queue-system level • Implementation and provision of advanced graphical interfaces, visualization, and task and data management for selected user applications via the Vine Toolkit • Integration of selected tools and web applications with the Liferay portal framework and the Nagios monitoring system

  19. Grid Resource Bazaar Providing resources to users with the required quality of service (required = specified in a Service Level Agreement; a toy model of such an agreement is sketched after the Bazaar slides below). [Figure: Bazaar mediating between RU – Resource Users and RP – Resource Providers]

  20. Grid Resource Bazaar (cont'd) [Figure: resource-allocation-related operation model]

  21. Grid Resource Bazaar (cont’d) http://grid.cyfronet.pl/bazaar
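As promised above, a toy Python model of what a Bazaar-negotiated SLA entry might record; the fields and values are assumptions for illustration, not the actual Bazaar schema.

```python
from dataclasses import dataclass

@dataclass
class ServiceLevelAgreement:
    """Toy SLA record between a Resource User and a Resource Provider."""
    resource_user: str      # RU, e.g. a virtual organization
    resource_provider: str  # RP, e.g. a computing centre
    cpu_core_hours: int     # agreed computing capacity
    storage_tb: float       # agreed storage capacity
    valid_until: str        # expiry date (ISO format)

# Hypothetical agreement for a VO running on one of the centres.
sla = ServiceLevelAgreement("vo.example", "CYFRONET", 500_000, 50.0, "2011-12-31")
print(sla)
```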

  22. GridSpace2 Experiment Workbench in the PL-Grid Virtual Laboratory • Use of distributed computational resources and data repositories • High-level tools offered to the user for in-silico experiments

  23. GridSpace2 – working with the Experiment Workbench The PL-Grid Virtual Laboratory provides an Experiment Workbench – a Web 2.0 interface supporting the development and execution of in-silico experiments (a sample snippet follows below). Working with the Experiment Workbench: • Open the workbench in your browser (https://wl.plgrid.pl) • Log into one of the available servers with your PL-Grid account • Start your usual work – your files are already there. Your code snippets may play the role of scripts (in Python, Perl, Ruby etc.), bash commands or input to external applications (such as Gaussian or Gnuplot) • All the files you generate may be viewed with visualization tools (such as Jmol) • Save your work – the experiment is now visible among your other files • Share the experiment with other members of your research team
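For flavour, a hypothetical Python snippet of the kind the workbench can run as one experiment step; the file names are illustrative only. It reads a data series assumed to be produced by an earlier step and writes a result a later visualization step (e.g. Gnuplot) could pick up.

```python
# Hypothetical GridSpace2 experiment step: summarize a data series.
values = []
with open("measurements.dat") as infile:   # assumed output of a previous step
    for line in infile:
        line = line.strip()
        if line and not line.startswith("#"):
            values.append(float(line.split()[0]))

mean = sum(values) / len(values)
print(f"{len(values)} samples, mean = {mean:.4f}")

with open("mean.dat", "w") as outfile:     # input for a later plotting step
    outfile.write(f"{mean}\n")
```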

  24. GridSpace2 – sample experiment in the ViroLab environment http://gs2.cyfronet.pl/ http://www.virolab.org • Patient's data: medical examination; HIV genetic sequence put into a database • In-silico experiment: collect HIV genetic sequences from the database; perform sequence matching; calculate virus resistance (a toy sketch of the last two steps follows below)
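The last two steps of that pipeline can be illustrated with a deliberately toy Python sketch; the reference fragment, mutation table and weights below are invented and stand in for ViroLab's actual services and rule sets.

```python
# Toy illustration of sequence matching and resistance scoring.
REFERENCE = "MKVLW"  # invented reference protein fragment

# Invented weights for "known" resistance mutations:
# (reference residue, position, observed residue) -> contribution.
RESISTANCE_WEIGHTS = {("K", 2, "R"): 0.6, ("W", 5, "F"): 0.3}

def mutations(sequence: str) -> list[tuple[str, int, str]]:
    """List (reference_aa, position, observed_aa) differences."""
    return [(ref, i + 1, obs)
            for i, (ref, obs) in enumerate(zip(REFERENCE, sequence))
            if ref != obs]

def resistance_score(sequence: str) -> float:
    """Sum the weights of recognized mutations (0.0 = fully susceptible)."""
    return sum(RESISTANCE_WEIGHTS.get(m, 0.0) for m in mutations(sequence))

print(resistance_score("MRVLF"))  # -> 0.9 for this toy sequence
```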

  25. Migrating Desktop Platform – the Framework for Grid Applications Main product features: • simple integration with grid applications • easy job defining, submission, monitoring and visualization of results • support for batch and interactive jobs • handling of sequential and parallel applications • intuitive management of grid data • easily extendable framework Description: • The Migrating Desktop Platform is a powerful and flexible user interface to Grid resources that gives a transparent user work environment and easy access to resources and network file systems, independently of the system version and hardware • It allows the user to run applications and tools, manage data files, and store personal settings independently of the location or the terminal [Figure: Migrating Desktop main window]

  26. Migrating Desktop Platform – the Framework for Grid Applications (cont'd) Application support: • The key feature: the possibility of easily adding various tools and applications and supporting different visualization formats • The Migrating Desktop offers a framework that can be easily extended on the basis of a set of well-defined plug-ins used for: accessing data, defining job parameters, pre-processing job parameters, and visualization of job results (a conceptual sketch of these roles follows below) • The open architecture of the Migrating Desktop speeds up the application-integration process and makes the product significantly more flexible than specialized tools (e.g. portals) designed only for a specific application [Figure: job defining and visualization of results – examples of Migrating Desktop plug-ins]
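A conceptual sketch of those four plug-in roles, written in Python for brevity although the Migrating Desktop itself is a Java application; every class and method name here is invented for illustration.

```python
# Invented interface mirroring the four plug-in roles named above.
class ApplicationPlugin:
    """One per integrated application; the framework calls these hooks."""

    def access_data(self, source: str) -> bytes:
        """Fetch input data for the job, e.g. from a grid file system."""
        raise NotImplementedError

    def define_job_parameters(self) -> dict:
        """Describe the form fields the user fills in for a job."""
        raise NotImplementedError

    def preprocess_job_parameters(self, params: dict) -> dict:
        """Validate and transform the user's input before submission."""
        raise NotImplementedError

    def visualize_results(self, output_files: list[str]) -> None:
        """Render job results (plots, images, ...) inside the desktop."""
        raise NotImplementedError
```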

  27. g-Eclipse – to Access the Power of the Grid Benefits from using g-Eclipse: • Grid application users will benefit from desktop-like access to Grid resources • Grid operators and resource providers will be able to reduce the time-to-service by using the Grid management and Grid site configuration tools • Grid application developers will reduce the time-to-market for new Grid applications by accelerating the development and deployment cycle Features: • The g-Eclipse framework provides tools to customize Grid users' applications, to manage remote resources and to support the development cycle of new Grid applications • The framework consists of general Grid workbench tools that can be extended for many different Grid and Cloud middlewares (such as gLite, UNICORE, Globus Toolkit, Amazon Cloud)

  28. g-Eclipse – to Access the Power of the Grid (cont'd) g-Eclipse elements (functionalities): • Grid Virtual Organisation management allows the dynamic creation and management of Virtual Organisations and their resources • Grid job management supports Grid users and developers in creating and managing Grid jobs independently of the middleware • Grid file management provides access to, and management of, local and remote resources, seamlessly integrated into the Eclipse Workbench • Grid application deployment supports Grid application developers and users with the deployment of their applications on the Grid • Grid visualisation tools offer functionalities that allow the visualisation of scientific and numerical calculations • Grid workflow builder supports the creation and management of middleware-independent workflows by use of a provided graphical editor • Grid command console is a command-line interface to access the Grid g-Eclipse provides API and User Interface modules for all of the above functionalities.

  29. FiVO/QStorMan toolkit – the goals Providing PL-Grid users with an appropriate quality of access to storage systems by the following means: • Definition of non-functional requirements for the storage systems, e.g. the expected transfer rate of read/write operations or parameters related to the expected availability or security level of the resource • Monitoring of storage-system parameters, e.g. of a distributed file system • Data-localization management based on the provided requirements and the current state of the environment QStorMan is a part of the Framework for Intelligent Virtual Organizations (FiVO)

  30. FiVO/QStorMan toolkit – architecture and components Components of the toolkit: • Portal – a graphical user interface for defining requirements for storage systems • GOM (Grid Organizational Memory) – a knowledge base storing a semantic description of the storage systems along with the defined requirements • SMED – a knowledge-supported monitoring system • SES (Storage Element Selection) library – a programming library for selecting a storage resource based on the provided requirements and the current state of the environment (a toy sketch of this selection follows below) • SE – Storage Element
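As referenced above, a toy Python sketch of the kind of choice an SES-style library makes: pick a storage element whose monitored parameters satisfy the user's non-functional requirements. The element names, fields and thresholds are invented; in the real toolkit the monitored state would come from SMED and the requirements from GOM.

```python
# Invented snapshot of monitored storage elements.
STORAGE_ELEMENTS = [
    {"name": "se01", "write_mb_s": 120.0, "availability": 0.999},
    {"name": "se02", "write_mb_s": 450.0, "availability": 0.950},
]

def select_storage(min_write_mb_s: float, min_availability: float) -> str:
    """Return the fastest storage element meeting both requirements."""
    candidates = [se for se in STORAGE_ELEMENTS
                  if se["write_mb_s"] >= min_write_mb_s
                  and se["availability"] >= min_availability]
    if not candidates:
        raise LookupError("no storage element satisfies the requirements")
    return max(candidates, key=lambda se: se["write_mb_s"])["name"]

print(select_storage(min_write_mb_s=100.0, min_availability=0.99))  # -> se01
```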

  31. QCG – QosCosGrid Middleware Description: • The QosCosGrid (QCG) middleware is an integrated e-infrastructure offering advanced job and resource management capabilities to deliver supercomputer-like performance and structure to end users • By connecting many computing clusters together, QosCosGrid offers easy-to-use mapping, execution and monitoring capabilities for a variety of applications, such as parameter sweeps, workflows, MPI or hybrid MPI-OpenMP [Figure: QCG architecture]

  32. QCG – QosCosGrid Middleware (cont'd) Benefits: • for end users – efficient and secure access to dynamically discovered computational resources located in the PL-Grid environment • for cluster and Grid administrators – a great opportunity to share and use their clusters more efficiently, by scheduling local and Grid end users' applications according to defined policies and constraints Several portal-based and mobile applications have been developed and integrated with the QCG infrastructure to give users intuitive, experiment- and domain-oriented access to resources (a toy sketch of parameter-sweep preparation follows below). [Figure: QCG mobile client]
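As referenced above, a toy Python sketch of preparing a parameter sweep of the sort QCG can then map onto clusters; the job-description format here is invented plain text, not QCG's actual format.

```python
import pathlib

# Invented single-parameter job template; a real description would name
# the application, resources and data staging expected by the middleware.
TEMPLATE = "executable=simulate\narguments=--temperature {t}\n"

def prepare_sweep(temperatures, outdir: str = "sweep") -> None:
    """Write one plain-text job description per temperature value."""
    path = pathlib.Path(outdir)
    path.mkdir(exist_ok=True)
    for t in temperatures:
        (path / f"job_{t}.txt").write_text(TEMPLATE.format(t=t))

prepare_sweep([280, 300, 320])  # creates sweep/job_280.txt, job_300.txt, ...
```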

  33. Vine Toolkit – Distributed Resources at Your Fingertips for Developers and End Users General description: • The Vine Toolkit was designed as an environment to facilitate the development and integration of web-based applications with HPC resources, Grid services and various existing large-scale computing infrastructures managed by Grid middleware such as gLite, UNICORE, Globus, QosCosGrid and GRIA • It is possible to create a computational web-based Science Gateway using a modular structure and existing Vine components Main features: • Single Sign-On – users log in once and use all resources • support for different grid infrastructures • advanced web applications (integration with Adobe Flex / BlazeDS technology) • extensible, module-based open architecture based on plugins • many applications included „out-of-the-box”, such as: Job Manager, File Manager, Resource Manager, Credential Manager, Certificate Manager, GSI-SSHTerm applet • support for scientific applications: ABINIT, Quantum Espresso (and others…) • integration with the Liferay and GridSphere portal environments installed on a Tomcat server

  34. Vine Toolkit – Distributed Resources at Your Fingertips for Developers and End Users (cont'd) For whom: • web application developers – a base for building advanced computational scientific portal gateways and applications, with many ready-to-use components, an integration layer for portal frameworks and an advanced installation engine • administrators of grid infrastructures – possibility of deploying a web portal „out-of-the-box” to access an existing grid infrastructure • end users – many ready-to-use web applications for grid computations and data management in the grid infrastructure • scientists – dedicated web applications for scientific software from different scientific domains [Figure: example of the advanced web interface in the Nano-Science Gateway based on the Vine Toolkit]

  35. Scientific software packages Access to software packages is provided to users through gLite and UNICORE. Examples of available packages in various fields: • biology: AutoDock, BLAST, ClustalW2, CPMD, Gromacs, NAMD • quantum chemistry: ADF, CFOUR, Dalton, GAMESS, Gaussian, Molcas, Molpro, MOPAC, NWChem, OpenBabel, Siesta, TURBOMOLE • physics: ANSYS FLUENT, Meep • numerical computations and simulation: Mathematica, MATLAB • other: Blender, POV-Ray Users may report their expectations to us through a survey available at: http://www.plgrid.pl/ankieta A system for testing the software packages in the grid environment has been prepared and deployed; the correctness of functioning of the packages is monitored automatically in each of the centers

  36. Training and Users' Support • Basic training on access to the PL-Grid infrastructure through gLite and UNICORE conducted in all centers participating in the project – in Gdańsk / Kraków / Poznań / Warszawa / Wrocław • More advanced training started • Similar (free) training may be conducted in other centers, if necessary • eLearning courses offered through the Blackboard system (available for registered users of the PL-Grid infrastructure) • Helpdesk system implemented: a novel support system for people using the Project resources; it provides technical support and organizes ongoing user support by experts (maintenance of trouble tickets); tickets may be created by sending an e-mail to helpdesk@plgrid.pl or through the online system available at https://helpdesk.plgrid.pl

  37. Security in PL-Grid • Provision of two CAs – PKI certification centers – for grid users • Design and implementation of the SimpleCA system, facilitating users' obtaining of PKI certificates and their usage • Design and implementation of a secure configuration of the infrastructure, in conformity with the most current security standards • Design of a system monitoring the conformance of the configuration deployed in the centers with the security policy • Creation of a group of security experts, in order to continuously monitor the environment, react immediately to incidents, and support users and administrators • Prototype version of a system correlating information about attacks on the infrastructure (ACARM-ng) • Audits of applications crucial for grid security

  38. Conclusions • Good and promising partial results of the PL-Grid Project • Justified expectation of Project completion according to its goals and requirements • Generic services and generic applications in development • Further development needed, as identified currently, mainly on Domain Specific Grids – requested by the users' communities • Capacity to organize future development thanks to: • expertise and experience • the strong scientific potential of the users' communities represented by the PL-Grid Partners • wide international cooperation of the Consortium and individual Partners, good recognition worldwide • good managerial capacity

  39. http://www.plgrid.pl/en
