
Multi-physics Extension of OpenFMO Framework

Toshiya Takami, J. Maki, J. Ooba, Y. Inadomi, H. Honda, R. Susukita, K. Inoue, T. Kobayashi, R. Nogita, M. Aoyagi

Research Inst. for Inf. Tech.

Kyushu University, Japan


Scientific Studies of 3D-RISM/FMO

  • Scientific Studies on Coupled Simulations

    • FMO in Water

    • 3D-RISM/FMO

    • Nonlinear Science

Contents

  • Scientific Studies on Coupled Simulations

    • FMO in Water

    • 3D-RISM/FMO

    • Nonlinear Science

  • Computing Environment

    • Grid to Peta Computer

    • TeraGrid, EGEE, NAREGI

    • Peta-scale Computer

    • Mediator-API

    • Performance prediction

  • OpenFMO for Multi-physics Simulations

    • Multi-scale design of FMO with a skeleton and MO-API

    • One-sided-communication implementation for HPC

    • Open-framework for Multi-physics Simulations


Fragment Molecular Orbital Method (1)

  • The fragment MO method, developed by Dr. Kitaura at AIST, Japan, is known as an approximate method for all-electron calculations of large molecules.

  • The target molecule is divided into fragments of one or two residues each. The SCF calculation of each fragment is performed under the electrostatic (ES) potential made by the other fragments. After corrections with respect to pairs of fragments, the total energy is obtained (see the energy expression after the references below).

  • This algorithm has been implemented in ABINIT-MP (http://moldb.nihs.go.jp/abinitmp/), GAMESS (http://www.msg.ameslab.gov/GAMESS/), etc.

  • Works on FMO by Dr. Kitaura’s group:

    • K. Kitaura, E. Ikeo, T. Asada, T. Nakano and M. Uebayasi, Chem. Phys. Lett. 313, 701 (1999).

    • K. Kitaura, S. Sugiki, T. Nakano, Y. Komeiji and M. Uebayasi, Chem. Phys. Lett. 336, 163 (2001).

    • D.G. Fedorov and K. Kitaura, J. Chem. Phys. 120, 6832 (2004).

    • D.G. Fedorov and K. Kitaura, J. Chem. Phys. 121, 2483 (2004).
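
For reference, a minimal sketch of the standard two-body FMO energy expression implied by the procedure above; the notation here is assumed for illustration, with E_I the fragment (monomer) energies and E_IJ the fragment-pair (dimer) energies, both evaluated in the converged ES field:

```latex
E_{\mathrm{FMO2}} = \sum_{I} E_I + \sum_{I>J} \left( E_{IJ} - E_I - E_J \right)
```

The pair sum provides the pairwise corrections mentioned above; it is also the origin of the quadratic cost discussed later in this talk.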


[Figure: schematic flow of FMO: Initialize → SCF of each fragment (in parallel) → converged? → SCF of each fragment pair (in parallel) → Total Energy]

Fragment Molecular Orbital Method (2)

  • The flow of an FMO calculation is represented in the figure above.

    • The main part is a self-consistent loop of SCF calculations of the fragments under the electrostatic potential made by the other fragments.

    • This loop is executed until the total electrostatic potential converges.

    • After convergence, fragment-pair calculations are carried out over all combinations of two fragments in order to refine the result.

  • Parallel execution

    • Calculations of fragments and fragment pairs can be parallelized.

    • Since each fragment SCF calculation can itself be parallelized, FMO is executed under a hierarchical parallelization scheme (a minimal sketch follows the reference below).

D.G. Fedorov, R.M. Olson, K. Kitaura, M.S. Gordon, and S. Koseki, J. Comp. Chem. 25, 872 (2004).
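
A minimal sketch of such a two-level scheme using plain MPI communicator splitting; the number of workgroups, the round-robin fragment assignment, and the commented-out scf_fragment call are illustrative assumptions, not the GAMESS or OpenFMO implementation:

```c
/* Hypothetical sketch of hierarchical parallelization for FMO:
 * the world communicator is split into workgroups, and each
 * workgroup runs the parallel SCF of one fragment at a time. */
#include <mpi.h>
#include <stdio.h>

#define N_GROUPS 4          /* number of workgroups (assumed) */
#define N_FRAGMENTS 16      /* number of fragments (assumed) */

int main(int argc, char **argv)
{
    int world_rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    /* Upper level: assign every process to one workgroup. */
    int group = world_rank % N_GROUPS;
    MPI_Comm group_comm;
    MPI_Comm_split(MPI_COMM_WORLD, group, world_rank, &group_comm);

    /* Lower level: each workgroup handles its share of fragments;
     * a parallel fragment SCF would run over group_comm here. */
    for (int frag = group; frag < N_FRAGMENTS; frag += N_GROUPS) {
        int group_rank;
        MPI_Comm_rank(group_comm, &group_rank);
        if (group_rank == 0)
            printf("workgroup %d computes fragment %d\n", group, frag);
        /* scf_fragment(frag, group_comm);  -- hypothetical call */
    }

    MPI_Comm_free(&group_comm);
    MPI_Finalize();
    return 0;
}
```

Each workgroup executes the SCF of one fragment at a time over its private communicator, so fragment-level parallelism and intra-SCF parallelism compose without interfering.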


RISM/SCF: MO in Aqueous Solution (1)

  • SCF calculations of molecules in water

    • RISM/SCF [Tenno-Hirata-Kato, 1993, 1994]

    • RISM/MCSCF [Sato-Hirata-Kato, 1996]

    • 3D-RISM/DFT [Kovalenko-Hirata, 1999]

    • 3D-RISM/SCF [Sato-Kovalenko-Hirata, 2000]

S. Ten-no, F. Hirata and S. Kato, CPL 214, 391 (1993)

S. Ten-no, F. Hirata and S. Kato, JCP 100, 7443 (1994).

H. Sato, F. Hirata and S. Kato, JCP 105, 1546 (1996).

A. Kovalenko and F. Hirata, JCP 110, 10095 (1999).

H. Sato, A. Kovalenko and F. Hirata, JCP 112, 9463 (2000).

  • RISM (Reference Interaction Site Model)

    • Statistical mechanics of molecular liquids without any fitting parameters

    • 3D-RISM is the three-dimensional version of RISM.

[Figure from JCP 100, 7443 (1994)]

F. Hirata, ed., “Molecular Theory of Solvation,” (Kluwer Pub., 2003)


RISM/SCF: MO in Aqueous Solution (2)

  • We have performed several test executions on small proteins in aqueous solution.

[Figures: chignolin (138 atoms) and met-enkephalin (75 atoms), shown in the gas phase and solvated]

RISM/SCF: MO in Aqueous Solution (3)

  • Our recent work: 3D-RISM/SCF calculation as a multi-physics simulation in molecular science

    • A fictitious parameter is introduced to find many avoided-crossing structures in the orbital energies.

    • Localization of one-electron orbitals is analyzed through eigenvalue statistics (Brody analysis; the distribution used is shown below).

T. Takami, J. Maki, J. Ooba, T. Kobayashi, R. Nogita, and M. Aoyagi, “Interaction and Localization of One-electron Orbitals in an Organic Molecule: the Fictitious Parameter Analysis for Multi-physics Simulations,” J. Phys. Soc. Jpn. 76, 013001 (2007).

[Figure from JPSJ 76, 013001 (2007)]
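
For reference, the standard form of the Brody distribution used in such eigenvalue-statistics analyses, where s is the normalized level spacing and the parameter β interpolates between Poisson (β = 0) and Wigner (β = 1) statistics:

```latex
P_\beta(s) = (\beta + 1)\,\alpha\, s^{\beta} \exp\left(-\alpha s^{\beta+1}\right),
\qquad
\alpha = \left[ \Gamma\!\left( \frac{\beta+2}{\beta+1} \right) \right]^{\beta+1}
```

The constant α fixes the mean spacing to unity, so the fitted β alone characterizes the degree of level repulsion, and hence of orbital localization.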


Computing Environments

  • Computing Environment

    • Grid to Peta Computer

    • TeraGrid, EGEE, NAREGI

    • Peta-scale Computer

    • Mediator-API

    • Performance prediction


From Grid to Peta-scale Computing

  • TeraGrid

    • http://www.teragrid.org/

  • EGEE

    • http://www.eu-egee.org/

  • NAREGI

    • http://www.naregi.org/

  • Next-generation Supercomputer Project

    • http://www.nsc.riken.jp/


3D-RISM/FMO by Mediator-API

  • The Mediator-API provides transformation and exchange of data among the components of a coupled simulation (a hypothetical sketch follows below).

  • It is parallelized with GridMPI to achieve electronic-state calculations of a protein molecule in water.

S. Ho, S. Itoh, S. Ihara and R.D. Schlichting, “Agent middleware for heterogeneous scientific simulations,” in Proceedings of ACM/IEEE SC 1998 Conference (SC’98), 1998, p. 15.
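
A hypothetical sketch of one such mediator transformation; all types, names, and the nearest-grid-point scheme are illustrative assumptions, not the actual Mediator-API. It deposits FMO-side fragment point charges onto the kind of regular 3D grid a 3D-RISM solver consumes:

```c
/* Hypothetical mediator transformation: map point charges from one
 * component onto the uniform 3D grid of another component. */
#include <math.h>
#include <stddef.h>

typedef struct { double x, y, z, q; } PointCharge;

typedef struct {
    int nx, ny, nz;       /* grid dimensions */
    double origin[3];     /* corner of the solvent box */
    double h;             /* uniform grid spacing */
    double *rho;          /* charge density, nx*ny*nz entries */
} Grid3D;

void mediator_charges_to_grid(const PointCharge *pc, size_t n, Grid3D *g)
{
    double cell_vol = g->h * g->h * g->h;
    for (size_t k = 0; k < n; k++) {
        /* Nearest grid point in each direction. */
        int i = (int)floor((pc[k].x - g->origin[0]) / g->h + 0.5);
        int j = (int)floor((pc[k].y - g->origin[1]) / g->h + 0.5);
        int l = (int)floor((pc[k].z - g->origin[2]) / g->h + 0.5);
        if (i < 0 || i >= g->nx || j < 0 || j >= g->ny ||
            l < 0 || l >= g->nz)
            continue;     /* charge lies outside the box */
        g->rho[((size_t)i * g->ny + j) * g->nz + l] += pc[k].q / cell_vol;
    }
}
```

In a coupled run, each side of the simulation would call such a transformation through the mediator rather than knowing the other side's data layout.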


Performance Prediction of FMO

[Figure: the assumed hierarchical computer: 10,000 nodes of multi-core CPUs, 1,000,000 CPUs in total, each CPU about 100 times faster than a current Pentium 4 (P4); such a machine may be available within ten years.]

  • If we assume this hierarchical computer (10,000 nodes of 100-core CPUs):

    • The total execution time is represented as a quadratic function of the number of fragments (a sketch of the model follows below).

    • The execution time can be estimated for the 10,000-node computer, given sufficient performance in each node.

    • An all-electron calculation of a molecule with 100,000 fragments (approx. 2,000,000 atoms) could be executed by FMO within a day.

[Graph: total execution time of FMO]
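
A minimal sketch of such a quadratic cost model; the notation is assumed for illustration, with N the number of fragments, P the number of workgroups, a the per-fragment cost of the monomer SCF loop, and b the per-pair cost:

```latex
T(N) \approx \frac{1}{P}\left( a\,N + \frac{b}{2}\,N(N-1) \right)
```

The linear term comes from the self-consistent monomer loop and the quadratic term from the corrections over all fragment pairs, matching the FMO flow shown earlier.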

T. Takami, J. Maki, J. Ooba, Y. Inadomi, H. Honda, T. Kobayashi, R. Nogita, and M. Aoyagi, “Open-architecture Implementation of Fragment Molecular Orbital Method for Peta-scale Computing,” to appear in Proceedings of HPCNano06 (held at SC06, Tampa, FL), 2007; arXiv:cs/0701075v1 [cs.DC].


OpenFMO Project

  • OpenFMO for Multi-physics Simulations

    • Multi-scale design of FMO with a skeleton and MO-API

    • One-sided-communication implementation for HPC

    • Open-framework for Multi-physics Simulations


OpenFMO Project (1)

It has been shown that FMO calculations can exhibit peta-scale performance on the next-generation supercomputer. However, the present implementations are known to have significant problems in:

  • memory allocation

  • communication between processes,

    and may not be executable on peta-scale machines.

    We therefore began a project named OpenFMO. The main objective of this project is to construct an FMO program that can be executed on peta-scale computers.


OpenFMO Project (2)

This project stands for the following forms of “openness”:

  • Open Architecture Implementation of Skeleton and APIs(Dr. Maki, Dr. Inadomi, Dr. Honda)

    • The layered structure of the control program (skeleton) and the molecular orbital API (MO-API) has been developed successfully. It was found that the one-sided communication implementation using MPI-2 functions outperforms the usual one based on ordinary (point-to-point) MPI (a minimal sketch follows the references below).

  • Open Interface to Multi-physics Simulations(Dr. Kobayashi, T.T. (myself))

    • FMO can also be opened to multi-physics simulations. Since it is based on the electrostatic interaction between fragments, each fragment can be substituted by any general object that provides a static charge distribution.

  • Open Source License

    • The source code of the OpenFMO skeleton program is released publicly under an open-source license such as the GPL.

T. Takami, J. Maki, J. Ooba, Y. Inadomi, H. Honda, T. Kobayashi, R. Nogita, and M. Aoyagi, “Open-architecture Implementation of Fragment Molecular Orbital Method for Peta-scale Computing,” to appear in Proceedings of HPCNano06 (held at SC06, Tampa, FL), 2007; arXiv:cs/0701075v1 [cs.DC].
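
A minimal sketch of the MPI-2 one-sided pattern referred to above, under the assumption that each process exposes one fragment density block in an RMA window; the block length, the neighbor choice, and all names are illustrative, not OpenFMO code:

```c
/* Sketch of MPI-2 one-sided communication for fragment densities:
 * each process exposes its density in an RMA window; others fetch
 * it with MPI_Get, with no matching send on the owner's side. */
#include <mpi.h>
#include <stdlib.h>

#define DENS_LEN 1024   /* length of one density block (assumed) */

int main(int argc, char **argv)
{
    int rank, nproc;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nproc);

    /* Expose the locally computed fragment density. */
    double *mine = malloc(DENS_LEN * sizeof(double));
    for (int i = 0; i < DENS_LEN; i++) mine[i] = (double)rank;

    MPI_Win win;
    MPI_Win_create(mine, DENS_LEN * sizeof(double), sizeof(double),
                   MPI_INFO_NULL, MPI_COMM_WORLD, &win);

    /* Fetch a neighbor's density without involving the neighbor. */
    double *remote = malloc(DENS_LEN * sizeof(double));
    int target = (rank + 1) % nproc;

    MPI_Win_lock(MPI_LOCK_SHARED, target, 0, win);
    MPI_Get(remote, DENS_LEN, MPI_DOUBLE,
            target, 0, DENS_LEN, MPI_DOUBLE, win);
    MPI_Win_unlock(target, win);       /* completes the transfer */

    MPI_Win_free(&win);
    free(mine);
    free(remote);
    MPI_Finalize();
    return 0;
}
```

Because MPI_Get needs no matching send on the owner's side, a workgroup can fetch whichever converged densities it needs without synchronizing with their owners.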


OpenFMO Project (3)

  • Subjects Achieved and Future Schedule:

    • In 2006

      • A skeleton program based on the parallelization scheme of GAMESS-FMO (by Dr. Maki).

      • Interfaces of MO-APIs (by Dr. Inadomi and Dr. Maki).

      • Web pages of OpenFMO (by Dr. Maki and T.T. (myself))

    • The first half of 2007

      • The new-style skeleton that can be executed on peta-scale resources (by Dr. Maki and Dr. Inadomi, see below).

      • Determine the multi-physics interfaces (Dr. Kobayashi and myself)

    • The latter half of 2007

      • Beta release of the multi-physics application

J. Maki, Y. Inadomi, T. Takami, R. Susukita, H. Honda, J. Ooba, T. Kobayashi, R. Nogita, K. Inoue, M. Aoyagi, “One-sided Communication Implementation in FMO Method,” to appear in Proceedings of HPC Asia 2007.

OpenFMO web site: http://www.OpenFMO.org/


Multi-physics Extension (1)

  • Multi-physics/multi-scale applications will play an important role as benchmarks for peta-scale computers, since there is a limit to the scalability of any single application.

  • From a scientific point of view, they will also be a significant milestone when we tackle complex multi-scale problems.

[Figures: multiple scales in a phenomenon; stack structure of a multi-physics simulator]


Multi-physics Extension (2)

  • Conditions:

    • A light-weight, reconfigurable structure is required in order to keep up with the rapidly changing world of high-performance computing

      • it must be highly adaptive to the computational environment

    • The program should be usable by a wide range of researchers, including beginners

      • reliability and stability of the program are required

    • The open architecture must be preserved after the multi-physics extension

      • it should be developed on a set of accepted standards


Multi-physics Extension (3)

[Figures (two slides): stack structure, with the Multi-physics Simulation layer on top, the 3D-RISM and FMO components in the middle (shown in either order), and the Rapidly Changing Computing Environment at the bottom.]


Multi-physics Extension (4)

  • Component-based configuration, which is already achieved in the original OpenFMO through the Skeleton/MO-API implementation (Completed!!; a hypothetical interface sketch follows this list)

  • Standard communication protocol between the components (Not yet completed!!)

    • Web/Grid services with the many WS-XX standards

    • RPC-based invocation / MPI parallel programming

    • ...

  • Standard data representation, or Mediator-like APIs for transformation of the physical data (Not yet completed!!)

    • BMSML (BioMolecular Simulation Markup Language)

    • CML (Chemical Markup Language)

    • netCDF (Network Common Data Format)

    • ...
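
As a concrete illustration of the component-based configuration, here is a hypothetical sketch, not the actual OpenFMO MO-API, of the kind of interface through which any component, an FMO fragment engine or otherwise, could provide a static charge distribution to its peers; all names and signatures are assumptions:

```c
/* Hypothetical component interface: any simulator that can expose a
 * static charge distribution and accept an external electrostatic
 * potential can be plugged into the coupled simulation. */
#include <stddef.h>

typedef struct SimComponent SimComponent;

struct SimComponent {
    const char *name;    /* e.g. "FMO" or "3D-RISM" */
    void *state;         /* component-private data */

    /* Write up to cap point charges (x, y, z, q) into out[];
     * return how many were written. */
    size_t (*get_charges)(SimComponent *self, double (*out)[4], size_t cap);

    /* Receive the electrostatic potential produced by the partner. */
    void (*set_external_potential)(SimComponent *self,
                                   const double *esp, size_t n);

    /* Advance one self-consistency step; nonzero when converged. */
    int (*iterate)(SimComponent *self);
};

/* Coupled driver: alternate the two components until both converge;
 * a mediator would translate charges/potentials between the calls. */
void couple(SimComponent *a, SimComponent *b, int max_iter)
{
    for (int it = 0; it < max_iter; it++) {
        int done_a = a->iterate(a);
        int done_b = b->iterate(b);
        /* mediator exchange of charges/ESP would happen here */
        if (done_a && done_b)
            break;
    }
}
```

With such an interface, swapping 3D-RISM for another solvent model, or FMO for another electronic-structure component, would not change the coupling driver, which is the point of the component-based design.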


Summary

  • Several implementations and applications of (3D-)RISM/SCF as a multi-scale/multi-physics simulation were reviewed; various interesting phenomena can be investigated through such scientific simulations.

  • I have also noted that these applications and studies are performed on rapidly changing computing environments: the Grid, peta-scale computers, vector/scalar architectures, network topologies, and so on.

  • We are still struggling with the multi-physics extension of OpenFMO, which was originally introduced as open-source software in order to avoid a dead end in the development of large and complex applications.


Acknowledgements

  • Collaborated with

    • Dr. J. Maki, Dr. Y. Inadomi, Dr. H. Honda, Dr. R. Susukita, Prof. K. Inoue, Ms. R. Nogita, Dr. T. Kobayashi, Prof. M. Aoyagi (PSI/NAREGI Project Members at Kyushu Univ.)

  • Thanks to

    • Dr. T. Ikegami, Dr. S. Sekiguchi (AIST, Tsukuba)

    • Prof. S. Matsuoka (Tokyo Inst. Tech., Japan)

  • Special Thanks to

    • Director K. Murakami, Prof. T. Nanri, Dr. F-L. Gu

    • Organizer of this Symposium

