
CCS Overview





Presentation Transcript


  1. CCS Overview Rene Salmon, Center for Computational Science

  2. Introduction • What is CCS? • Established June 2001 • Establish new collaborations • Infrastructure to exchange ideas • Interdisciplinary research • Computational science research • High-end workstations • HPC hardware and software

  3. Software • Visualization: Tecplot, AVS • Compilers: SGI, Absoft • Math libraries: IMSL, BLAS • Finite element modeling: ABAQUS, PATRAN • Molecular dynamics: NAMD, Gaussian, Amber, VMD • Matlab, Mathematica

  4. High Performance Computing

  5. Multiprocessor Machines

  6. Multiprocessor Machines • SGI: 4 compute nodes, 32 CPUs, 700 MHz R16000 MIPS, 8 GB RAM, 3.2 GB/s peak memory bandwidth, NUMAlink interconnect (1.6 GB/s each direction), 1 TB SGI storage array • Linux cluster: 34 nodes, 68 CPUs, 2.4 GHz AMD Opteron, 68 GB RAM, 12.8 GB/s memory bandwidth, Gigabit Ethernet interconnect (85 MB/s), 1 TB storage array

  7. Multiprocessor Machines • Single OS: easier to program, inter-process communication with OpenMP • Multiple OS: harder to program, inter-process communication with MPI

  8. Multiprocessor Machines • SGI: high cost, complex hardware, support contract, proprietary software (Irix, compilers) • Linux cluster: low cost, commodity parts, community-driven support, open-source software (Linux, compilers)

  9. Parallel Programming • OpenMP (Open specifications for Multi-Processing): compiler directives and library, shared memory, thread based, process synchronization • MPI (Message Passing Interface): libraries, distributed memory, process based, process synchronization, master/slave model

  10. Processes vs. Threads • Threads share a single address space and can access one another's variables, saving time and memory • Processes have separate address spaces and must use inter-process communication
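The shared-address-space point can be illustrated with a small sketch (Python stands in here for the Fortran used elsewhere in this deck; variable names are illustrative): two threads update a single list in place, each handling half the indices, with a lock for synchronization.

```python
import threading

# Threads share one address space: every thread can read and
# write this list directly.
z = [0.0] * 8
lock = threading.Lock()

def saxpy_chunk(a, x, b, start, end):
    """Compute z[i] = a*x[i] + b for this thread's slice of indices."""
    for i in range(start, end):
        with lock:               # synchronization, as on the OpenMP side
            z[i] = a * x[i] + b  # writes are immediately visible to all threads

x = list(range(8))
threads = [threading.Thread(target=saxpy_chunk, args=(2.0, x, 1.0, s, s + 4))
           for s in (0, 4)]      # two threads, indices 0-3 and 4-7
for t in threads:
    t.start()
for t in threads:
    t.join()

print(z)  # -> [1.0, 3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 15.0]
```

Separate processes could not do this: each would get its own copy of `z`, and the results would have to be exchanged explicitly, which is exactly the cost in time and memory the slide refers to.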

  11. OpenMP

      program foobar
      ...
!$omp parallel do
      do i=1, n
         z(i) = a*x(i) + b
      enddo
      end program foobar

  12. MPI

      program foobar
      use mpi
      ...
      call MPI_INIT(...)
      call MPI_COMM_RANK(.., myid, ..)
      call MPI_COMM_SIZE(.., numprocs, ..)
      data_chunk = SIZE_X/numprocs
      j = 1 + myid*data_chunk
      n = j + (data_chunk - 1)
      x_local = x(j:n)
      do i=1, data_chunk
         z_local(i) = a*x_local(i) + b
      enddo
      call MPI_GATHER(z_local, ..., z, ...)
      call MPI_FINALIZE(...)
      end program foobar
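The index arithmetic in the MPI snippet above can be sketched serially in Python (the sizes and coefficients are made up for illustration): the loop over `myid` plays the role of the ranks that MPI_COMM_RANK would assign, and list concatenation stands in for MPI_GATHER.

```python
# Serial sketch of the block decomposition used in the MPI example:
# numprocs ranks each take data_chunk contiguous elements of x.
SIZE_X = 12
numprocs = 4
a, b = 2.0, 1.0
x = [float(i) for i in range(1, SIZE_X + 1)]  # 1-based data, like the Fortran

data_chunk = SIZE_X // numprocs
gathered = []
for myid in range(numprocs):       # in real MPI each rank runs once with its own myid
    j = 1 + myid * data_chunk      # first 1-based index of this rank's slice
    n = j + (data_chunk - 1)       # last 1-based index
    x_local = x[j - 1:n]           # Python slices are 0-based, end-exclusive
    z_local = [a * xi + b for xi in x_local]
    gathered.extend(z_local)       # stands in for MPI_GATHER on the root rank

print(gathered == [a * xi + b for xi in x])  # -> True: matches the serial result
```

The point of the decomposition is that each rank touches only its own `data_chunk` elements, so the work (and the memory for `x_local`/`z_local`) scales down as `numprocs` grows.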

  13. Queuing System: PBSPro • Resource manager • Schedules/decides when jobs run • Allocates resources to jobs • Full featured • Supports preemption • Priorities • Supports parallel and single-CPU jobs
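PBS jobs are described by batch scripts submitted with qsub. The fragment below is a minimal, hypothetical sketch; the directive values, queue name, and program name are assumptions for illustration, not taken from CCS documentation.

```shell
#!/bin/bash
#PBS -N ccs_example          # job name (placeholder)
#PBS -l nodes=1:ppn=4        # request 1 node with 4 processors (assumed limits)
#PBS -l walltime=01:00:00    # 1-hour time limit
#PBS -q workq                # queue name is a placeholder
#PBS -j oe                   # merge stdout and stderr into one file

cd "$PBS_O_WORKDIR"          # PBS sets this to the directory qsub ran from
./a.out                      # the compiled program (placeholder name)
```

It would be submitted and monitored with `qsub job.pbs` and `qstat`; PBS then decides when the job runs and which resources it gets, per the slide above.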

  14. CCS Queuing System • Q1 (lowest priority): access for the entire Tulane community, research purposes only • Q2: contribute intellectually to the leadership of CCS, e.g. by giving (or arranging) seminars or serving on CCS committees • Q3: financial support from individual grants (personnel, computer/software purchases, computer/software maintenance) • Q4 (highest priority): faculty and students with CCS-funded projects

  15. Grid Computing • Log in to server • Compile • Move or prepare data • Create and submit job script to queue • Monitor status • Get results • Move data • Visualization

  16. Grid Computing

  17. Grids Nationally • National Lambda Rail (NLR) • Nationwide optical fiber infrastructure • Open Science Grid • DOE and NSF roadmap • Join U.S. labs and universities into a single, managed grid • Goal: build a national grid infrastructure for the benefit of scientific applications

  18. LONI: Louisiana Optical Network Initiative • Secured NLR membership in March 2004 • Louisiana Board of Regents, Tulane, LSU • State allocated $40 million to create and maintain LONI • What is LONI? • Statewide optical network • Interconnects universities and colleges • Takes advantage of NLR access • 40 Gbps, about 1000 times faster

  19. LONI: Louisiana Optical Network Initiative • LONI Members • Tulane University, Tulane HSC • LSU, LSU Medical Centers in Shreveport and New Orleans • Louisiana Tech University • University of Louisiana at Lafayette • Southern University • University of New Orleans

  20. LONI: Louisiana Optical Network Initiative • Provide NLR access • High-quality, high-definition videoconferencing • High-speed access to data • Remote visualization • Remote instrumentation • High Performance Computing • Collaborative research projects and grants • Attract better research faculty • Increased potential of receiving national and international grant funding

  21. LONI: Louisiana Optical Network Initiative • End of summer 2005: $500,000 high-performance computer • All connected via LONI • Tulane pilot grid • SURA test bed • Experience • Grid research

  22. Accessing Resources • Go to the website: http://www.ccs.tulane.edu • Fill out the resource request form • Access local CCS and national grid resources
