
159.703 Parallel Computing






Presentation Transcript


  1. 159.703 Parallel Computing Dr. Ruili Wang IIST, A/H 3.82 Email: r.wang@massey.ac.nz Phone: 2548 159.703 Parallel Computing (Dr.Wang)

  2. 159.703 Parallel Computing • Six weeks • One assignment (35%) • One presentation (15%) • Contributing to our learning • Attending the lectures • Taking part in discussions • Reading the books

  3. Books (References) • Introduction to Parallel Computing (2nd Ed.), Ananth Grama, Anshul Gupta, George Karypis and Vipin Kumar, Addison-Wesley, 2003. • Parallel and Distributed Computing: a survey of models, paradigms and approaches, Claudia Leopold (2001)

  4. Traditional Computing (SC) Programs have traditionally been written for serial computers: • One instruction executed at a time • Using one processor Processing speed is limited by how fast data can move through hardware • Speed of light = 30 cm/nanosecond • Limit of copper wire = 9 cm/nanosecond
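The wiring limit above can be made concrete with a back-of-the-envelope calculation: how far can a signal travel within a single clock cycle? The clock frequency below is a hypothetical example, not a figure from the slide.

```python
# Back-of-the-envelope check of the signal-propagation limit.
# Assumed figures (from the slide): light ~30 cm/ns, copper ~9 cm/ns.

def max_signal_distance_cm(speed_cm_per_ns: float, clock_ghz: float) -> float:
    """Distance a signal can travel during one clock cycle."""
    cycle_ns = 1.0 / clock_ghz          # one cycle at clock_ghz GHz, in ns
    return speed_cm_per_ns * cycle_ns

# At a hypothetical 3 GHz clock, one cycle lasts 1/3 ns:
light_cm = max_signal_distance_cm(30.0, 3.0)   # 10 cm in vacuum
copper_cm = max_signal_distance_cm(9.0, 3.0)   # 3 cm in copper wire
```

So at gigahertz clock rates a signal cannot even cross a desktop-sized machine in one cycle, which is one physical reason single-processor speed cannot grow without bound.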

  5. High Computing Power Needed Simulation and Modeling problems: • Based on successive approximations • More calculations give more precision • Problems dependent on computations / manipulations of large amounts of data • E.g. Climate Modeling, Fluid Turbulence, Pollution Dispersion, Human Genome, Ocean Circulation, Quantum Chromodynamics, Semiconductor Modeling, Superconductor Modeling, Combustion Systems Image and Signal Processing • Entertainment (Image Rendering) • Database and Data Mining • Seismic
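The "successive approximations" idea above can be sketched with a toy example: Newton's method for a square root, where each extra iteration shrinks the error, mirroring how simulation codes buy precision with more calculations.

```python
# "More calculations, more precise": Newton's method for sqrt(x)
# as a minimal successive-approximation scheme.

def newton_sqrt(x: float, iterations: int) -> float:
    guess = x  # crude starting guess
    for _ in range(iterations):
        guess = 0.5 * (guess + x / guess)  # Newton update step
    return guess

# Error after 1..5 iterations when approximating sqrt(2):
errors = [abs(newton_sqrt(2.0, n) - 2.0 ** 0.5) for n in range(1, 6)]
# Each additional iteration reduces the error (until float precision).
```

Real simulation problems iterate in the same spirit over far larger state, which is what drives the demand for computing power.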

  6. Traditional Supercomputers Technology • Single processors were created to be as fast as possible. • Peak performance was achieved with good memory bandwidth. Benefits • Supports sequential programming (which many people understand) • 30+ years of compiler and tool development • I/O is relatively simple Limitations • Single high performance processors are extremely expensive • Significant cooling requirements • Single processor performance is reaching its asymptotic limit • http://www.mhpcc.edu/training/workshop/parallel_intro/MAIN.html#paradigms

  7. Parallel Computing (PC) • Multiple processors • Network • Environment to create and manage parallel processing • Operating System • Parallel Programming Paradigm • Message Passing • MPI (Message Passing Interface) • PVM (Parallel Virtual Machine) • Data Parallel • Fortran 90 / High Performance Fortran • Others • OpenMP • shmem • A parallel algorithm and a parallel program
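The message-passing paradigm named above (what MPI and PVM provide) can be sketched in miniature: tasks share nothing and communicate only by sending and receiving messages. This sketch uses Python threads and queues in place of real processes on a network; all names here are illustrative, not MPI API calls.

```python
# A minimal message-passing sketch: a worker receives a message,
# computes, and sends its result back. Queues stand in for the
# network channels that MPI/PVM would provide.
import threading
import queue

def worker(inbox: queue.Queue, outbox: queue.Queue) -> None:
    n = inbox.get()        # "receive" a message
    outbox.put(n * n)      # "send" the result back

to_worker: queue.Queue = queue.Queue()
from_worker: queue.Queue = queue.Queue()

t = threading.Thread(target=worker, args=(to_worker, from_worker))
t.start()
to_worker.put(7)            # send 7 to the worker
result = from_worker.get()  # receive the reply (49)
t.join()
```

The key property, shared with real MPI programs, is that the two sides exchange data only through explicit send/receive operations, never through shared variables.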

  8. Platforms • Large-scale applications in science and engineering rely on large configurations of parallel computers • Hundreds of processors • Data-intensive platforms such as database or web servers for transaction processing and data mining often use clusters of workstations • that provide high aggregate disk bandwidth.

  9. Parallel Computing (PC) • PC refers to solving a task fast by employing multiple processors simultaneously. • desire for more power • availability of appropriate technology • Von Neumann bottleneck, i.e. access times of both memory and disks • Currently, networks of commodity PCs or workstations, so-called clusters, are gaining wide acceptance as affordable hardware platforms for parallelism (price/performance ratio).

  10. Distributed Computing • A distributed system is a collection of autonomous computers that are interconnected with each other and cooperate, thereby sharing resources such as printers and databases • e.g. Internet, LAN, WAN • issues: security, heterogeneity, etc.

  11. Comparison between PC and DC Commonalities • Multiple processors • Processors interconnected by some network • Multiple computational activities in progress at the same time • Some authors regard PC as a sub-field of DC

  12. Comparison between PC and DC Differences • PC splits an application up into tasks that are executed at the same time, whereas • DC splits an application up into tasks that are executed at different locations, using different resources.

  13. Concurrent Computing • CC focuses on the existence of multiple threads of control. • It investigates the implications of multiple computations being in progress at the same time. • A central issue is conflict-free sharing of resources. • The computations can be executed in parallel and on distributed systems • But they can also run in a time-shared manner on a sequential machine, which is called pseudoparallelism • CC emphasizes the competition aspect between the processes or threads, whereas • PC and DC emphasize the cooperation aspect
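The "conflict-free sharing of resources" issue above can be illustrated with the classic shared-counter example: two threads update one variable, and a lock serializes access so no update is lost. This is a minimal sketch, not a complete treatment of synchronization.

```python
# Conflict-free sharing of a resource: two threads increment a
# shared counter, with a lock ensuring the updates do not collide.
import threading

counter = 0
lock = threading.Lock()

def add_many(times: int) -> None:
    global counter
    for _ in range(times):
        with lock:          # only one thread mutates the counter at a time
            counter += 1

threads = [threading.Thread(target=add_many, args=(10_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# counter is exactly 20000: no increment was lost
```

On a single processor the two threads run time-shared (the pseudoparallelism the slide mentions), yet the same locking discipline is needed, because interleaving alone can already cause conflicts.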

  14. High-Performance Computing or Supercomputing • Here, the focus is on the speed of execution. • Parallelism is an important means to achieve speed; processor-internal parallelism and parallelism-unrelated performance optimizations are also considered

  15. Others • Wireless and Mobile Computing • Pervasive Computing and Ubiquitous Computing • What is the difference between them?

  16. Ubiquitous? Pervasive? Sorry, they don't compute • http://www.computerworld.com/news/2000/story/0,11280,41901,00.html. • Ubiquitous means everywhere. • Pervasive means "diffused throughout every part of." • Ubiquitous computing would be everywhere, and pervasive computing would be in all parts of your life. • Pervasive computing involves devices like handhelds -- small, easy-to-use devices -- through which we'll be able to get information on anything and everything. That's the sort of thing that Web-enabled cell phones promise. • Ubiquitous computing, though, eschews our having to use computers at all. Instead, it's computing in the background, with technology embedded in the things we already use.

  17. Ubiquitous computing "helped kick off the recent boom in mobile computing research," notes its inventor, Mark Weiser, who came out with the concept at Xerox's Palo Alto Research Center, "although it is not the same thing as mobile computing, nor a superset nor a subset." That means people who use ubiquitous computing to mean computing anytime, anyplace are using the term loosely. • I'd be happy to call pervasive computing mobile computing, and to call ubiquitous computing embedded or invisible or transparent computing -- or even just built-in functions
