Presentation Transcript

Dr. Isabel Campos Plasencia

(on behalf of the inteugrid team)

Instituto de Física de Cantabria, IFCA (Santander)

Consejo Superior de Investigaciones Científicas (CSIC)

Interactive European Grid: An interoperable infrastructure targeting interactivity, visualization and parallelism

EGEE User Forum

Manchester, 9th – 11th May 2007

The Interactive European Grid

Project acronym: int.eu.grid

Contract number: 031857

Instrument: I3

Duration: 2 years, May 2006 – April 2008

Coordinator: Jesús Marco de Lucas, CSIC

“providing transparently the researcher’s desktop with the power of a supercomputer, using distributed resources”

http://www.interactive-grid.eu

Outline of the presentation
  • Objectives & challenges of int.eu.grid
  • Applications requirements
  • Middleware versus Apps.
    • MPI Support
    • Interactive steering
    • Visualization
  • Example: Open-MPI, Grid Visualization

The challenge of a stable infrastructure: int.eu.grid
  • From the middleware point of view
    • Parallel Computing (MPI)
      • Support intracluster Jobs with OpenMPI
      • Support intercluster Jobs with PACX-MPI
    • Advanced visualization tools allowing simulation steering
      • GVid, glogin
    • A Job scheduler that supports it all
    • A user-friendly interface to the grid supporting all these features
      • Integrating all the features in the Migrating Desktop
  • From the Infrastructure point of view
    • Operate a production level infrastructure 24/7
    • Support Virtual Organizations at all levels
      • Running the VO (user support)
  • From the Applications point of view
    • Analyze requirements of reference applications
    • Ensure that the middleware copes with the reference applications’ demands
    • Application Porting Support
    • Promote collaborative environments like AccessGrid
Applications Requirements
  • Understanding the application → user input to the NA team
    • Description in terms of
      • Area of knowledge and state of the art
      • Results expected and impact on the scientific community
      • Understanding the computational approach at the algorithmic level
    • Resources needed
      • Software & Hardware
      • GRID services
    • GRID added value
      • Why on the GRID?
  • Interactive environment
  • Graphics & Visualization
  • Quality of Service and network reliability
Project Pilot Applications

Fusion

Astrophysics

Environment

Medical Imaging


Applications in Environmental Research

Evolution of pollution clouds in the atmosphere

[Figure: IMS Model Suite output; colour scale shows height above surface in m]

Pattern: Requirements for Middleware
  • Distributing the task among N processors
    • MPI support
  • The job should be started immediately on the user desktop
    • MPI interactive job scheduling
  • The graphical interface should be forwarded to the user desktop
    • Graphical interface to the grid → Migrating Desktop
    • Supporting visualization → GVid
  • The user should be able to steer the simulation
    • Real-time steering → glogin
GRID MPI Support
  • Why MPI support?
    • The standard API for distributed-memory parallelisation
    • Write once, run everywhere
    • This is what applications are

What MPI is
  • An API: a description of the semantics, but NOT the implementation
  • Almost platform independent (modulo problems with MPI-IO)

What MPI is NOT
  • There is no implementation
  • No specification of how to start the processes
    • How to get the binary onto the remote sites
    • How to start the binaries on the remote sites (ssh, PBS, …)
MPI Support

There are many issues in handling MPI job types that are already worked out for Linux clusters, supercomputers, etc., but which have to be addressed in a particular way when running MPI on the Grid.
Problems of MPI Support on the Grid
  • There is no standard way to start an MPI program
    • No common syntax for mpirun
    • MPI-2 defines mpiexec as the start-up mechanism, but support for mpiexec is only optional
    • Resource Brokers should handle different MPI implementations
    • Different schedulers and different MPI implementations at each site have different ways to specify the machinefile
  • Non-shared filesystems (Oh!)
    • Many Grid sites don’t have support for a shared home directory
    • Many MPI implementations expect the executable to be available on the nodes where the process is started
    • Mixed setup in general: some sites have shared filesystems, some do not
MPI Support in Grid Environments
  • In Grid Environments there are two possible cases
    • Intra Cluster Jobs
      • All processes run on the same cluster
    • Inter Cluster Jobs
      • Processes are distributed across several clusters/sites

[Diagram: processes 1, 2, …, SIZE inside MPI_COMM_WORLD, with point-to-point (P2P) and collective communication]

Grid scheduler language needs “translation” to local scheduler syntax

[Diagram: at Local Site B, a job flows UI → RB → CE → WN; the general grid scheduler language must be translated into the local scheduler syntax. Translate at the UI? NO, of course! At the RB? NO. At the CE? YES, but how?]

The RB cannot be updated often without compromising the whole job-submission procedure.

Problems of MPI Support on the Grid
  • Our solution: an intermediate layer, mpi-start

[Diagram: RESOURCE BROKER → MPI-START → MPI implementation and local scheduler]

mpi-start
  • Goals
    • Hide differences between MPI implementations
    • Hide differences between local scheduler implementations
    • Support simple file distribution → hides the filesystem details (shared or non-shared) from the user
    • Provide a simple but powerful enough unique interface for the Resource Broker to specify MPI jobs → the Resource Broker does not have to contain hardcoded MPI support

mpi-start
  • mpi-start: design

[Diagram: mpi-start core, portable (bash scripting), with plugins for schedulers (PBS, SGE), MPI implementations (Open MPI, PACX-MPI, MPICH), hooks and the filesystem]
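The "translation" problem above is exactly what such a layer has to solve: find out which scheduler started the job, turn its node list into a machinefile, and build the right mpirun command line. A minimal, hypothetical sketch of that dispatch logic in bash (illustration only, not the actual mpi-start source; the helper names are made up):

```shell
#!/bin/bash
# Hypothetical sketch of an mpi-start-style dispatch layer:
# detect the local scheduler's node list and build a mpirun command.

detect_machinefile() {
    # PBS exports PBS_NODEFILE (one host per slot);
    # SGE exports PE_HOSTFILE (lines of "host nslots queue ...").
    if [ -n "$PBS_NODEFILE" ]; then
        cat "$PBS_NODEFILE"
    elif [ -n "$PE_HOSTFILE" ]; then
        awk '{ for (i = 0; i < $2; i++) print $1 }' "$PE_HOSTFILE"
    else
        echo localhost   # fallback: run locally
    fi
}

build_cmdline() {
    # $1 = MPI flavour, $2 = machinefile, $3 = process count,
    # remaining arguments = executable and its arguments.
    local flavour=$1 machines=$2 np=$3
    shift 3
    case "$flavour" in
        openmpi|mpich)
            echo "mpirun -machinefile $machines -np $np $*" ;;
        *)
            echo "unsupported MPI flavour: $flavour" >&2
            return 1 ;;
    esac
}
```

Plugging new schedulers or MPI flavours in then means adding a branch, not touching the Resource Broker.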

MPI Job Example

Executable = "IMB-MPI1";

Arguments = "pingpong";

JobType = "Parallel";

JobSubType = "openmpi";

NodeNumber = 16;

StdOutput = "std.out";

StdError = "std.err";

OutputSandbox = {"std.out","std.err"};

InputSandbox = {"IMB-MPI1"};

mpirun -machinefile $TMP/machines -np 16 pingpong

MPI Job Example

Include in the JDL the following:

InputSandbox = {"MyHooks.sh", ....};

Environment = {"I2G_MPI_PRE_RUN_HOOK=./MyHooks.sh",
               "I2G_MPI_POST_RUN_HOOK=./MyHooks.sh"};

# cat MyHooks.sh
pre_run_hook ()
{
    echo "pre run hook called"
    wget www.myhome.xx/mysources.tgz
    tar xzvf mysources.tgz
    make …
    return 0
}
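The hook contract above can be exercised without the grid at all: the wrapper sources the file named in the environment variable and calls the function around the MPI run. A small, hypothetical sketch of that wrapper logic (illustration only, not the actual mpi-start implementation):

```shell
#!/bin/bash
# Hypothetical sketch of how a wrapper honours pre/post run hooks:
# source the hooks file named by an environment variable, then call
# the hook function if it is defined; a failing hook aborts the job.

run_hook() {
    # $1 = name of the env variable holding the hooks file,
    # $2 = name of the hook function to call.
    local file=${!1} func=$2
    if [ -z "$file" ]; then
        return 0                 # no hook configured: nothing to do
    fi
    . "$file"                    # load the hook definitions
    if declare -F "$func" > /dev/null; then
        "$func" || { echo "$func failed" >&2; return 1; }
    fi
}

# Around the actual MPI run the wrapper would do:
#   run_hook I2G_MPI_PRE_RUN_HOOK  pre_run_hook
#   ... start mpirun ...
#   run_hook I2G_MPI_POST_RUN_HOOK post_run_hook
```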

Dissemination Effort
  • School organized in Dublin at TCD
    • Course including Grids and MPI support
    • Hosted by TCD (Brian Coghlan)
    • Date: end of June 2007

MPI Support in Grid Environments
  • For inter-cluster jobs we support PACX-MPI
  • Middleware to seamlessly run an MPI application on a network of parallel computers
  • PACX-MPI is an optimized, standard-conforming MPI implementation; the application just needs to be recompiled(!)
  • PACX-MPI uses the locally installed, optimized vendor implementation for the communication inside each cluster

[Diagram: one application runs as a single PACX-MPI job spanning two Open MPI jobs, one on Cluster 1 and one on Cluster 2]

PACX-MPI Design

[Diagram: grid site topology — numbered MPI processes on the worker nodes, with the CE bridging to the outside]
  • A grid site has in general the following topology
    • CE = Computing Element (head node), public IP
    • WN = Worker Nodes, private IP
  • Requirements
    • Connectivity of the CE to the clusters, and start-up of daemons
    • Files: application & input files
    • Daemons are started on the CE; ssh connectivity to the CE

An MPI job requesting N processes per cluster spawns N+2 processes; two of them run on the CE as daemons, making the bridge between clusters.
PACX MPI – Design

[Diagram: global process ranks mapped onto local ranks across Cluster 1 and Cluster 2]
  • External communication
    • Handled via the Computing Element, the only node with a public IP
    • TCP/IP daemons do the job
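In the JDL, requesting such an inter-cluster run would then differ from the earlier intra-cluster Open MPI example mainly in the job subtype; a hedged sketch (the exact subtype string is an assumption, by analogy with "openmpi"):

```
JobType    = "Parallel";
JobSubType = "pacx-mpi";   // assumed subtype name, by analogy with "openmpi"
NodeNumber = 16;
```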
Example: Visualization of plasma in fusion devices
  • The application visualizes the behaviour of plasma inside a Fusion device
  • Runs are foreseen as part of a so-called Fusion Virtual Session
  • The plasma is analyzed as a many body system consisting of N particles
  • Inputs
    • Geometry of the vacuum chamber
    • Magnetic field in the environment
    • Initial number, position, direction, velocity of particles
    • Possibility of collisions between particles
    • Density of particles inside the device
  • Solves a set of Stochastic Differential Equations with Runge-Kutta method
  • Outputs
    • Trajectories of the particles
    • Average of relevant magnitudes: densities, temperatures...
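The numerical core can be stated generically: the particle state follows a stochastic differential equation that is advanced in discrete time steps. As an illustration only (these are not the application's actual plasma equations), an Itô SDE and its simplest one-step update, which stochastic Runge-Kutta schemes refine:

```latex
% Generic Ito SDE for the particle state X_t,
% with drift a, diffusion b and Wiener process W_t:
dX_t = a(X_t)\,dt + b(X_t)\,dW_t
% Simplest discretization (Euler-Maruyama); stochastic
% Runge-Kutta schemes improve on this update:
X_{n+1} = X_n + a(X_n)\,\Delta t + b(X_n)\,\Delta W_n,
\qquad \Delta W_n \sim \mathcal{N}(0,\Delta t)
```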

TJ-II Stellarator at CIEMAT (Spain)

Graphical display using OpenGL with interactive capabilities

Porting the application to int.eu.grid
  • Spread the calculation over hundreds of Worker Nodes on the Grid to increase the number of particles in the plasma.
  • Design of a Grid collaborative environment for fusion device designing and analysis.

Uses most of the capabilities of the int.eu.grid middleware:
  • N particles distributed among P processes: MPI
  • Particle trajectories are displayed graphically
  • Interactive simulation steering

Middleware for Visualization & Steering

Our middleware is based on the combination of a grid video streamer (GVid) together with an interactive grid-enabled login tool (glogin).

  • glogin
    • Lightweight tool for support of interactivity on the grid
    • Grid-authenticated shell access: “glogin host”
    • No dedicated daemon (such as sshd) needed
    • TCP port forwarding enables access to grid worker nodes with private IPs
    • X11 forwarding
  • GVid
    • Grid Video Service
    • Visualization can be executed remotely on a grid resource
    • Transmits the visualization output to the user desktop
    • Communicates the interaction events back to the remote rendering machine
    • Uses glogin as a bi-directional communication channel
Fusion Application MPI Schema

[Diagram: MPI job distribution across processes P0–P3 with MPI synchronization; the processes are independent and every process does its own I/O; master P0 does the rendering]

The user interacts with the Master process for Visualization and Steering

[Diagram: on the user side, a Java GVid decoder renders the stream on the user screen and intercepts keyboard and mouse events; the events are sent back to master P0, which runs the GVid encoder and receives them, with workers P1–P3 behind it]

Some related events

GRIDS & E-SCIENCE
24 – 29 September, Santander, Spain
http://grid.ifca.unican.es