
Concurrent Programming




  1. Concurrent Programming

  2. Concurrency
  • Concurrency means that a program has multiple paths of execution running at (almost) the same time. Examples:
  • A web server can handle connections from several clients while still listening for new connections.
  • A multi-player game may allow several players to move things around the screen at the same time.
  How can we do this? How do the tasks communicate with each other?

  3. Performing Concurrency
  A computer can implement concurrency by...
  • parallel execution: run tasks on different CPUs in the same computer (requires a multi-processor machine)
  • time-sharing: divide CPU time into slices, and multiplex several tasks on the same CPU
  • distributed computing: use the CPUs of several computers in a cluster to run different tasks

  4. Design for Concurrency
  If we have multiple processes or threads of execution for a single job, we must decide issues of...
  • Allocating CPU: previous slide
  • Memory: do all tasks share the same memory, or have separate memory?
  • Communication: how can tasks communicate?
  • Control: how do we start a task? stop a task? wait for a task?

  5. Memory
  • shared memory:
  • if one thread changes memory, it affects the others.
  • problem: how do you share the stack?
  • separate memory:
  • each thread has its own memory area
  • combination:
  • each thread has a separate stack area
  • they share a static area, may share the current environment, and may compete for the same heap.

  6. Shared Memory and Environment

    /* vfork( ) creates a new process that shares the parent's memory */
    pid = vfork( );
    if ( pid == 0 ) {       /* child process, pid = 0 */
        task1( );
        task3( );
    } else {                /* parent process, pid = child's pid */
        task2( );
    }

  [Figure: stack with main's frame, task1( )'s frame, the SP, and free space above.]
  Tasks don't share registers. After the child calls task1( ), where is the Stack Pointer (SP) of the child process? Where will task2( )'s frame be placed?

  7. Processes and memory
  • Heavy-weight processes: each process gets its own memory area and its own environment.
  • Unix "process" fits this model.
  • Light-weight processes: processes share the same memory area, but each has its own context and maybe its own stack.
  • "threads" in Java, C, and C# fit this model

  8. Heavy-weight Processes • UNIX fork( ) system call: child process gets a copy of parent's memory pages

  9. Example: Web Server

    /* bind to port 80 */
    bind( socket, &address, sizeof(address) );
    while ( 1 ) {     /* run forever */
        /* wait for a client to connect */
        client = accept( socket, &clientAddress, &len );
        /* fork a new process to handle the client */
        pid = fork( );
        if ( pid == 0 )
            handleClient( client, clientAddress );
    }

  The server forks a new process to handle each client, so the server can keep listening for more connections.

  10. Example: fork and wait for child

    pid = fork( );
    if ( pid == 0 )
        childProcess( );
    else {
        wait( &status );   // wait for child to exit
    }

  wait( ) causes the process to wait for a child to exit.

  11. Threads: light-weight processes
  • Threads share a memory area.
  • Conserves resources, better communication between tasks.

    task1 = new Calculator( );
    task2 = new AlarmClock( );
    Thread thread1 = new Thread( task1 );
    Thread thread2 = new Thread( task2 );
    thread1.start( );
    thread2.start( );
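As a runnable sketch of the pattern above (the slide's Calculator and AlarmClock stand in for any Runnable; the Counter class here is illustrative, not from the slides):

```java
// Two light-weight tasks sharing the same heap, each running on its own thread.
public class TwoThreads {
    static class Counter implements Runnable {
        final int limit;
        long sum = 0;
        Counter(int limit) { this.limit = limit; }
        public void run() {
            for (int i = 1; i <= limit; i++) sum += i;  // the task's work
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Counter task1 = new Counter(100);
        Counter task2 = new Counter(1000);
        Thread thread1 = new Thread(task1);
        Thread thread2 = new Thread(task2);
        thread1.start();            // both tasks now run concurrently
        thread2.start();
        thread1.join();             // wait for both tasks to finish
        thread2.join();
        System.out.println(task1.sum + " " + task2.sum);  // prints "5050 500500"
    }
}
```

Note that start( ) begins concurrent execution, while calling run( ) directly would just execute the task in the current thread.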

  12. States of a Thread

  13. Stack Management for Threads
  • In some implementations (like C) threads share the same memory, but each requires its own stack space.
  • Each thread must be able to call functions separately.
  [Figure: a cactus stack — main's stack at the root, with thread1–thread5 stack areas branching off it. Dynamic and static links can refer to the parent's stack.]

  14. Communication between Tasks
  • Reading and writing to a shared buffer.
  • Producer - consumer model (see Java Tutorial)
  • Using an I/O channel called a pipe.
  • Signaling: exceptions or interrupts.

    pin = new PipedInputStream( );
    pout = new PipedOutputStream( pin );
    task1 = new ReaderTask( pin );
    task2 = new WriterTask( pout );
    Thread thread1 = new Thread( task1 );
    Thread thread2 = new Thread( task2 );
    thread1.start( );
    thread2.start( );

  [Figure: task1 and task2 connected by a pipe.]
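A complete, runnable version of the pipe idea above (the slide's ReaderTask/WriterTask are replaced here by a lambda and the main thread; this is a sketch, not the slide's exact classes):

```java
import java.io.*;

public class PipeDemo {
    public static void main(String[] args) throws Exception {
        PipedInputStream pin = new PipedInputStream();
        PipedOutputStream pout = new PipedOutputStream(pin);  // connect the two ends

        // Writer task: sends three ints into the pipe, then closes its end.
        Thread writer = new Thread(() -> {
            try (DataOutputStream out = new DataOutputStream(pout)) {
                for (int i = 1; i <= 3; i++) out.writeInt(i);
            } catch (IOException e) { throw new RuntimeException(e); }
        });
        writer.start();

        // Reader task (the main thread): readInt() blocks until data arrives.
        try (DataInputStream in = new DataInputStream(pin)) {
            for (int i = 0; i < 3; i++) System.out.println(in.readInt());  // prints 1, 2, 3
        }
        writer.join();
    }
}
```

The pipe buffers data internally, so the writer can run ahead of the reader; the reader simply blocks when the pipe is empty.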

  15. Thread Coordination
  [Figure: thread1 and thread2 alternate between processing and sleeping; each calls wait( ) to sleep and is woken when the other calls notify( ).]
  yield( ) gives other threads a chance to use the CPU.
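The wait( )/notify( ) hand-off in the diagram can be sketched as follows (the Handoff class and its method names are illustrative assumptions, not from the slides):

```java
// One thread waits for a value; another produces it and notifies the waiter.
public class Handoff {
    private boolean ready = false;
    private int value;

    public synchronized void put(int v) {
        value = v;
        ready = true;
        notify();                    // wake the waiting consumer
    }

    public synchronized int take() throws InterruptedException {
        while (!ready) wait();       // release the lock and sleep until notified
        ready = false;
        return value;
    }

    public static void main(String[] args) throws InterruptedException {
        Handoff h = new Handoff();
        Thread producer = new Thread(() -> h.put(42));
        producer.start();
        System.out.println(h.take());  // blocks until the producer calls put(); prints 42
        producer.join();
    }
}
```

Note that wait( ) and notify( ) must be called while holding the object's lock (inside a synchronized method or block), and the wait is in a loop to guard against spurious wakeups.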

  16. Critical Code: avoiding race conditions
  Example: one thread pushes data onto a stack, another thread pops data off the stack.
  Problem: you may have a race condition, where one thread starts to pop data off the stack, but is interrupted (by the scheduler) and the other thread pushes data onto the stack.

    thread1:
    push( Object value ) {
        n = top;
        stack[n] = value;
        top = n + 1;
    }

    thread2:
    pop( ) {
        return stack[--top];
    }

  [Figure: both threads race to read and update the shared top index, so a push can be lost or the wrong element popped.]

  17. Exclusive Access to Critical Code
  • programmer control: use a shared flag variable or semaphore to indicate when the critical block is free
  • executor control: use synchronization features of the language to restrict access to critical code

    public synchronized void push( Object value ) {
        if ( top < stack.length - 1 )
            stack[++top] = value;
    }
    public synchronized Object pop( ) {
        if ( top >= 0 )
            return stack[top--];
        return null;
    }
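Putting the synchronized stack to work, a quick demonstration that the lock prevents the lost updates from the previous slide (the SafeStack class name, its fixed capacity, and the main driver are illustrative assumptions):

```java
// A stack whose critical sections are protected by the object's intrinsic lock.
public class SafeStack {
    private final Object[] stack = new Object[1000];
    private int top = -1;            // index of the current top element

    public synchronized void push(Object value) {
        if (top < stack.length - 1)
            stack[++top] = value;    // read-modify-write of top is now atomic
    }

    public synchronized Object pop() {
        return (top >= 0) ? stack[top--] : null;
    }

    public synchronized int size() { return top + 1; }

    public static void main(String[] args) throws InterruptedException {
        SafeStack s = new SafeStack();
        Runnable pusher = () -> { for (int i = 0; i < 100; i++) s.push(i); };
        Thread t1 = new Thread(pusher), t2 = new Thread(pusher);
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(s.size());  // always 200: no pushes are lost
    }
}
```

Without synchronized, two threads could read the same value of top and overwrite each other's element, so the final size would sometimes be less than 200.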

  18. Avoiding Deadlock • Deadlock: when two or more tasks are waiting for each other to release a required resource. • Program waits forever. • Rule for Avoiding Deadlock: exercise for students
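One common rule (a possible answer to the exercise above): always acquire locks in a fixed global order, so no cycle of threads can each hold a lock the other needs. A sketch, with illustrative Account/transfer names not taken from the slides:

```java
// Transfers between two accounts, each protected by its own lock.
public class Accounts {
    static class Account {
        final int id;
        int balance;
        Account(int id, int balance) { this.id = id; this.balance = balance; }
    }

    // Lock-ordering rule: always lock the account with the smaller id first,
    // so two opposite transfers can never wait on each other in a cycle.
    static void transfer(Account from, Account to, int amount) {
        Account first  = (from.id < to.id) ? from : to;
        Account second = (from.id < to.id) ? to : from;
        synchronized (first) {
            synchronized (second) {
                from.balance -= amount;
                to.balance += amount;
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Account a = new Account(1, 100), b = new Account(2, 100);
        Thread t1 = new Thread(() -> { for (int i = 0; i < 1000; i++) transfer(a, b, 1); });
        Thread t2 = new Thread(() -> { for (int i = 0; i < 1000; i++) transfer(b, a, 1); });
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println(a.balance + " " + b.balance);  // prints "100 100"
    }
}
```

If each transfer instead locked from before to, the two threads could each grab one lock and wait forever for the other: exactly the deadlock described above.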

  19. Design Patterns and Threads
  Observer Pattern: one task is a source of events that other tasks are interested in. Each task wants to be notified when an interesting event occurs.
  Solution: wrap the source task in an Observable object. Other tasks register with the Observable as observers. The Observable task calls notifyObservers( ) when an interesting event occurs.
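A minimal hand-rolled version of this pattern (rather than java.util.Observable, which is now deprecated; the EventSource class and event strings here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// A source task that notifies registered observers of interesting events.
public class EventSource {
    public interface Observer { void update(String event); }

    private final List<Observer> observers = new ArrayList<>();

    public synchronized void addObserver(Observer o) { observers.add(o); }

    // Called by the source task when an interesting event occurs.
    public synchronized void notifyObservers(String event) {
        for (Observer o : observers) o.update(event);
    }

    public static void main(String[] args) {
        EventSource source = new EventSource();
        source.addObserver(e -> System.out.println("task A saw: " + e));
        source.addObserver(e -> System.out.println("task B saw: " + e));
        source.notifyObservers("ball moved");   // both observers are notified
    }
}
```

The methods are synchronized because, in a threaded program, observers may register while the source task is delivering events.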

  20. Simple Producer-Consumer Cooperation Using Semaphores Figure 11.2

  21. Multiple Producers-Consumers Figure 11.3

  22. Producer-Consumer Monitor Figure 11.4

  23. States of a Java Thread Figure 11.5

  24. Ball Class Figure 11.6

  25. Initial Application Class Figure 11.7

  26. Final Bouncing Balls init Method Figure 11.8

  27. Final Bouncing Balls paint Method Figure 11.9

  28. Bouncing Balls Mouse Handler Figure 11.10

  29. Bouncing Balls Mouse Handler Figure 11.11

  30. Buffer Class Figure 11.12

  31. Producer Class Figure 11.13

  32. Consumer Class Figure 11.14

  33. Bounded Buffer Class Figure 11.15

  34. Sieve of Eratosthenes Figure 11.16

  35. Test Drive for Sieve of Eratosthenes Figure 11.17
