
National Research Grid Initiative (NAREGI) Project


Presentation Transcript


  1. Cluster and Computational Grids for Scientific Computing, Sept. 27-29, 2004. National Research Grid Initiative (NAREGI) Project. Kenichi Miura, Ph.D., Project Leader, NAREGI Project; Professor, National Institute of Informatics. September 26, 2004

  2. Grid-Related Projects in Japan:
     • Information Technology Based Lab (ITBL)
     • Super-SINET (NII)
     • Campus Grid (Prof. Matsuoka, Titech)
     • National Research Grid Initiative (NAREGI)
     • Grid Technology Research Center (Mr. Sekiguchi, AIST)
     • BioGrid (Prof. Shimojo, Osaka-U)
     • VizGrid (Prof. Matsuzawa, JAIST)
     • Japan Virtual Observatory (JVO)

  3. National Research Grid Initiative (NAREGI) Project: Overview
     • A new R&D project funded by MEXT (FY2003-FY2007)
     • 2B Yen (~$17M) budget in FY2003 and FY2004
     • One of the Japanese Government's Grid Computing Projects
     • Collaboration of national labs, universities and industry in the R&D activities (IT and Nano-science applications)
     • Testbed for Grid Middleware Development (FY2003)
     MEXT: Ministry of Education, Culture, Sports, Science and Technology

  4. National Research Grid Initiative (NAREGI) Project: Goals
     (1) To develop a Grid Software System (R&D in Grid Middleware and Upper Layer) as a prototype for the future Grid Infrastructure for scientific research in Japan
     (2) To provide a testbed to prove that a high-end Grid Computing Environment (100+ Tflop/s expected by 2007) can be practically utilized in Nano-science Simulations over SuperSINET
     (3) To participate in international collaboration (U.S., Europe, Asia-Pacific)
     (4) To contribute to standardization activities, e.g., GGF

  5. Participating Organizations:
     • National Institute of Informatics (NII) (Center for Grid Research & Development)
     • Institute for Molecular Science (IMS) (Computational Nano-science Center)
     • Universities and National Laboratories (Joint R&D): AIST, Titech, Osaka-u, Kyushu-u, Kyushu Inst. Tech., Utsunomiya-u, etc.
     • Research Collaboration: ITBL Project, National Supercomputing Centers, etc.
     • Participating Vendors (IT and Chemicals/Materials)
     • Consortium for Promotion of Grid Applications in Industry

  6. NAREGI Research Organization and Collaboration (organization chart; Project Leader: Dr. K. Miura, NII)

  7. NAREGI Software Stack (NII, IMS, and collaborating research organizations), from top to bottom:
     • Grid-Enabled Nano-Applications
     • Grid PSE, Grid Visualization, Grid Workflow
     • Grid Programming: Grid RPC, Grid MPI (see the sketch below)
     • Distributed Information Service; Super Scheduler (Globus, Condor, UNICORE → OGSA)
     • Grid VM
     • Packaging
     • High-Performance & Secure Grid Networking (SuperSINET)
     • Computing Resources
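  The Grid RPC entry in this stack refers to the GGF GridRPC programming model (implemented in NAREGI by Ninf-G). A minimal client sketch under that API follows; the configuration file name, the server host grid.example.org, and the remote function nano/pi_trial with its argument list are hypothetical placeholders for illustration, not part of any NAREGI release.

    /* Minimal GridRPC client sketch (GGF GridRPC API, as implemented
     * by Ninf-G).  Host name, function name, and argument list are
     * hypothetical; error handling is reduced to the essentials. */
    #include <stdio.h>
    #include "grpc.h"

    int main(int argc, char *argv[])
    {
        grpc_function_handle_t handle;
        long   n_trials = 1000000;   /* input argument (placeholder)  */
        double result   = 0.0;       /* output argument (placeholder) */

        /* Read the client configuration (server list, protocol options). */
        if (grpc_initialize("client.conf") != GRPC_NO_ERROR)
            return 1;

        /* Bind a handle to a function deployed on a remote Grid resource. */
        grpc_function_handle_init(&handle, "grid.example.org",
                                  "nano/pi_trial");

        /* Synchronous call: arguments are marshalled, the remote job is
         * launched through the Grid middleware, and the result returns. */
        if (grpc_call(&handle, n_trials, &result) == GRPC_NO_ERROR)
            printf("remote result = %f\n", result);

        grpc_function_handle_destruct(&handle);
        grpc_finalize();
        return 0;
    }

  The API also defines an asynchronous variant (grpc_call_async plus grpc_wait), which lets a workflow overlap several remote calls; this is how task-parallel applications are typically expressed on top of GridRPC.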

  8. R&D in Grid Software and Networking Area (Work Packages):
     • WP-1: Lower and Middle-Tier Middleware for Resource Management: Matsuoka (Titech), Kohno (ECU), Aida (Titech)
     • WP-2: Grid Programming Middleware: Sekiguchi (AIST), Ishikawa (AIST)
     • WP-3: User-Level Grid Tools & PSE: Miura (NII), Sato (Tsukuba-u), Kawata (Utsunomiya-u)
     • WP-4: Packaging and Configuration Management: Miura (NII)
     • WP-5: Networking, Security & User Management: Shimojo (Osaka-u), Oie (Kyushu Tech.), Imase (Osaka-u)
     • WP-6: Grid-Enabling Tools for Nanoscience Applications: Aoyagi (Kyushu-u)

  9. Nano-science and Technology Applications Targeted
     • Participating Organizations:
       - Institute for Molecular Science
       - Institute for Solid State Physics
       - AIST
       - Tohoku University
       - Kyoto University
       - Industry (Materials, Nano-scale Devices)
       - Consortium for Promotion of Grid Applications in Industry
     • Research Topics and Groups:
       - Functional nano-molecules (CNT, Fullerene, etc.)
       - Nano-molecule assembly (bio-molecules, etc.)
       - Magnetic properties
       - Electronic structure
       - Molecular system design
       - Nano-simulation software integration system
       - etc.

  10. Examples of Nano-Applications Research (1) (illustrations):
     • Nano-Electronic Systems: orbiton (orbital wave) in manganese-oxide ferromagnetic half-metal; applications: half-metal magnetic tunneling device, memory device
     • Functional Nano-Molecules: molecular isomerization by low energy
     • Nano-Molecular Assembly: protein folding

  11. Examples of Nano-Applications Research (2) (illustrations):
     • Nano-Magnetism: magnetic nano-dots; controlling the arrangement of nano-dots by self-organization
     • Nano-devices and Quantum Transport: quantum wires and quantum dots (metal, organic molecules)
     • Nano-simulation Software Integration System: nano-structure search engine and nano-structure building system; computational science platform with acceleration, ensemble, extended-ensemble, and zooming engines; ab initio MO engine (UTChem), density functional and fragment-method bridging engines; interaction force, average energy, and potential function calculations; design systems for nano-components, nano-systems, and reaction paths; material characteristics prediction system; OCTA suite (COGNAC, PASTA, SUSHI, MUFFIN)

  12. Adaptation of Nano-science Applications to the Grid Environment: RISM (Reference Interaction Site Model, solvent distribution analysis) runs at Site A, and FMO (Fragment Molecular Orbital method, electronic structure analysis) runs at Site B. The two codes are coupled across sites through the Grid middleware using MPICH-G2/Globus, with data transformation between their different meshes, to compute electronic structure in solution.
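  Because MPICH-G2 presents both sites as ranks of a single MPI job, the coupling pattern this slide describes can be sketched in plain MPI. Everything below beyond the split-and-exchange pattern is an illustrative assumption: the rank layout (lower half of ranks = Site A/RISM, upper half = Site B/FMO), the mesh size, and the resample_mesh() placeholder standing in for the actual mesh transformation.

    /* Sketch of RISM/FMO coupling over MPI; MPICH-G2 runs standard
     * MPI calls across sites, so only mpi.h is needed.  Run with at
     * least 2 ranks.  Rank layout, buffer size, and resample_mesh()
     * are hypothetical. */
    #include <stdio.h>
    #include <mpi.h>

    #define NPTS 4096                 /* illustrative mesh size */

    /* Placeholder for "data transformation between different meshes". */
    static void resample_mesh(const double *in, int n_in,
                              double *out, int n_out)
    {
        for (int i = 0; i < n_out; i++)
            out[i] = in[(long)i * n_in / n_out];
    }

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Split the cross-site job: lower ranks solve RISM (Site A),
         * upper ranks solve FMO (Site B). */
        int site = (rank < size / 2) ? 0 : 1;
        MPI_Comm site_comm;
        MPI_Comm_split(MPI_COMM_WORLD, site, rank, &site_comm);

        double solvent[NPTS] = {0}, field[NPTS] = {0};
        if (site == 0) {
            /* ... RISM solvent-distribution step on site_comm ... */
            if (rank == 0)            /* Site A root ships its mesh */
                MPI_Send(solvent, NPTS, MPI_DOUBLE, size / 2, 0,
                         MPI_COMM_WORLD);
        } else {
            if (rank == size / 2) {   /* Site B root receives, remaps */
                MPI_Recv(solvent, NPTS, MPI_DOUBLE, 0, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
                resample_mesh(solvent, NPTS, field, NPTS);
            }
            /* ... FMO electronic-structure step on site_comm ... */
        }

        MPI_Comm_free(&site_comm);
        MPI_Finalize();
        return 0;
    }

  In a production coupling, the split would follow MPICH-G2's topology information (which site each rank actually runs on) rather than rank order, but the communicator-per-code pattern is the same.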

  13. NAREGI Five-year Plan

  14. Network Topology Map of SuperSINET

  15. National University Supercomputer Centers (user, configuration):
     • Hokkaido University: Hitachi SR8000 (8 CPUs x 32 nodes), 256 GF / 1 TB
     • Tohoku University, Information Synergy Center: NEC SX-7 (32 CPUs/node, 240 CPUs), 2,119 GF / 3 TB
     • University of Tokyo, Information Technology Center: Hitachi SR8000/MPP (8 CPUs x 144), 2,073 GF / 2.3 TB; Hitachi SR8000/128 (8 CPUs x 128), 1,843 GF / 1 TB
     • Nagoya University, Information Technology Center: Fujitsu VPP5000/64 (64 CPUs), 614 GF / 1 TB
     • Kyoto University, Academic Center for Computing and Media Studies: Fujitsu PRIMEPOWER HPC2500 (128 CPUs x 11 + 64 CPUs), 8,785 GF / 5.6 TB
     • Osaka University, Cybermedia Center: NEC SX-5/128M8 (16 CPUs x 8), 1,280 GF / 1 TB
     • Kyushu University, Computing & Communications Center: Fujitsu VPP5000/64 (64 CPUs), 627 GF / 0.7 TB

  16. National University Supercomputer Centers (continued):
     • Tsukuba University, Science Information Processing Center: Fujitsu VPP5000/80 (80 CPUs), 768 GF / 1.1 TB
     • Tokyo Institute of Technology, Global Scientific Information and Computing Center: NEC SX-5 (16 CPUs), 128 GF / 96 GB; SGI Origin2000 (256 CPUs), 256 GB

  17. NAREGI Phase 1 Testbed (~3,000 CPUs, ~17 Tflops), interconnected by Super-SINET (10 Gbps MPLS):
     • Center for Grid R&D (NII): ~5 Tflops
     • Computational Nano-science Center (IMS): ~10 Tflops
     • Small test application clusters at ISSP, Kyoto Univ., Tohoku Univ., KEK, AIST, and Kyushu Univ.
     • TiTech Campus Grid, Osaka Univ. BioGrid, AIST SuperCluster

  18. Computer System for Grid Software Infrastructure R&D, Center for Grid Research and Development (5 Tflops, 700 GB):
     • File Server (PRIMEPOWER 900 + ETERNUS3000 + ETERNUS LT160): 1 node / 8 CPUs (SPARC64 V, 1.3 GHz), memory 16 GB, storage 10 TB, back-up max. 36.4 TB
     • High-perf. distributed-memory compute server (PRIMERGY RX200): 128 CPUs (Xeon, 3.06 GHz) + control node, memory 130 GB, storage 9.4 TB, InfiniBand 4X (8 Gbps)
     • High-perf. distributed-memory compute server (PRIMERGY RX200): 128 CPUs (Xeon, 3.06 GHz) + control node, memory 65 GB, storage 9.4 TB, InfiniBand 4X (8 Gbps)
     • Distributed-memory compute server (Express 5800): 128 CPUs (Xeon, 2.8 GHz) + control node, memory 65 GB, storage 4.7 TB, GbE (1 Gbps)
     • Distributed-memory compute server (Express 5800): 128 CPUs (Xeon, 2.8 GHz) + control node, memory 65 GB, storage 4.7 TB, GbE (1 Gbps)
     • Distributed-memory compute server (HPC LinuxNetworx): 128 CPUs (Xeon, 2.8 GHz) + control node, memory 65 GB, storage 4.7 TB, GbE (1 Gbps)
     • Distributed-memory compute server (HPC LinuxNetworx): 128 CPUs (Xeon, 2.8 GHz) + control node, memory 65 GB, storage 4.7 TB, GbE (1 Gbps)
     • SMP compute server (PRIMEPOWER HPC2500): 1 node (UNIX, SPARC64 V, 1.3 GHz / 64 CPUs), memory 128 GB, storage 441 GB
     • SMP compute server (SGI Altix3700): 1 node (Itanium2, 1.3 GHz / 32 CPUs), memory 32 GB, storage 180 GB
     • SMP compute server (IBM pSeries690): 1 node (Power4, 1.3 GHz / 32 CPUs), memory 64 GB, storage 480 GB
     • Internal networks (Intra NW, NW-A, NW-B) through L3 switches at 1 Gbps (upgradable to 10 Gbps); external network to SuperSINET

  19. Computer System for Nano Application R&D, Computational Nano-science Center (10 Tflops, 5 TB):
     • SMP compute server: 5.0 TFLOPS, 16 ways x 50 nodes (POWER4+, 1.7 GHz), multi-stage crossbar network, memory 3,072 GB, storage 2.2 TB
     • Distributed-memory compute server (4 units): 5.4 TFLOPS, 818 CPUs (Xeon, 3.06 GHz) + control nodes, Myrinet2000 (2 Gbps), memory 1.6 TB, storage 1.1 TB/unit
     • Front-end servers; file server: 16 CPUs (SPARC64 GP, 675 MHz), memory 8 GB, storage 30 TB, back-up 25 TB
     • CA/RA server; firewall and VPN to the Center for Grid R&D; L3 switch at 1 Gbps (upgradable to 10 Gbps); SuperSINET

  20. Operational and Deployment Issues:
     • Maintaining scalability (Grid VM)
     • Organizational issues
     • Training and education
     • Expanding user communities
