
Concurrency


Presentation Transcript


  1. Concurrency

  2. What is Concurrency • Ability to execute two operations at the same time • Physical concurrency • multiple processors on the same machine • distributing across networked machines • Logical concurrency • illusion or partial parallelism • Designer/programmer doesn’t care which!

  3. Real or Apparent? Depends on your point of view • Multiple computers in distributed computing, or multiprocessors in one computer • Multiple clients on the same or multiple computers • Multiple servers on one or many machines • Where does the complexity lie? • The text's emphasis is concurrency at the server • not in the code, but in managing/launching servers

  4. Why is concurrency important? • One machine is only capable of a limited speed • Multiple machines/processors • share workload and gain better utilization • optimize responsibility/requirements to each machine’s ability • place secure processes in secure environments • parallel processors are tailored to the problem domain • Single machine • OS can give limited parallelism through scheduling

  5. Concurrency considerations • What level of concurrency to consider • How it is handled on a single processor • understand process scheduling • How to share resources • How to synchronize activity • How to synchronize I/O specifically

  6. Process Scheduling

  7. Process • The activation of a program • Can have multiple processes of the same program • Entities associated with a process: • instruction pointer • the user who owns it • memory location of user data areas • its own run-time stack

  8. Process Scheduling • Order of process execution is unpredictable • Duration of execution is unpredictable • Whether concurrency is apparent or real is not relevant to the designer • There will generally be a need to resynchronize activity between cooperating processes

  9. Context switch • When OS switches running process, it manipulates internal process tables • Loading/storing registers must be done • Threads minimize that effort. Same process but a different stack. • Necessary overhead but must balance overhead with advantage of concurrency

  10. Operating Systems Scheduling (state-queue figure)

      Before (process 200 blocks when reading from the disk):
          Running: 200    Ready: 205, 198, 201    Blocked: 177, 206, 180, 185

      After:
          Running: 205    Ready: 198, 201         Blocked: 177, 206, 180, 185, 200

  11. What other activities are important in scheduling? • Jobs go from RUNNING to READY when they lose their time slice • Jobs go from BLOCKED to READY when the I/O operation they are waiting for completes • Jobs go from RUNNING to being removed completely upon exit
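A minimal sketch of this three-state model in C++ (the enum names and transition function are illustrative, not from the slides):

      #include <iostream>

      // The three scheduling states from the slide.
      enum class State { Ready, Running, Blocked };
      // Events that drive the transitions listed above.
      enum class Event { Dispatch, TimeSliceExpired, BlockOnIO, IOComplete };

      State next(State s, Event e) {
          switch (e) {
              case Event::Dispatch:         return s == State::Ready   ? State::Running : s;
              case Event::TimeSliceExpired: return s == State::Running ? State::Ready   : s;
              case Event::BlockOnIO:        return s == State::Running ? State::Blocked : s;
              case Event::IOComplete:       return s == State::Blocked ? State::Ready   : s;
          }
          return s;
      }

      int main() {
          State s = State::Ready;
          s = next(s, Event::Dispatch);    // READY -> RUNNING
          s = next(s, Event::BlockOnIO);   // RUNNING -> BLOCKED (waiting on I/O)
          s = next(s, Event::IOComplete);  // BLOCKED -> READY (I/O finished)
          std::cout << (s == State::Ready ? "back to READY\n" : "?\n");
      }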

  12. Concurrency at many levels • Process level • Unix fork() call (review example) • network client-server level • Subprocess level (like a procedure): thread • Statement level • Scheduling level

  13. How do you get limited parallelism from the OS? (timeline figure: CPU vs. disk controller)

      CPU:             Process A runs, blocks on read | Process B runs | Process C runs for its time slice | ...
      Disk controller: disk reads for A while A is blocked | disk reads for B while B is blocked

      There are times when both processors (CPU and controller) are busy, so real parallelism does occur, but not at the level of the CPU.

  14. What if YOU have to create the concurrency?

  15. High-level Concurrency

  16. Unix and 95/98/NT
      • Unix: process concurrency
          • use fork and exec
          • fork: clone self; parent and child differ by ppid; two processes
          • exec: a new process image replaces the original; a single, different process
      • 95/98/NT: a thread is part of the same process
          • own copy of locals • shared copy of globals • shared resources like file descriptors
          • each has its own activation record
          • parent and child do not always begin/continue at the same spot as in fork
          • thread ceases execution on return
          • _beginthread() needs a procedure
          • CreateProcess() is like fork/exec
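A minimal sketch of the Unix fork/exec pattern described above (the program passed to exec, /bin/ls, is just an illustration):

      #include <iostream>
      #include <sys/wait.h>
      #include <unistd.h>

      int main() {
          pid_t pid = fork();                      // clone self: two processes from here
          if (pid == 0) {
              // Child: exec replaces this process image with a new program.
              execl("/bin/ls", "ls", "-l", (char*)nullptr);
              std::cerr << "exec failed\n";        // reached only if exec fails
              return 1;
          }
          waitpid(pid, nullptr, 0);                // parent waits for the child
          std::cout << "child done\n";
      }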

  17. Unix fork()

      #include <iostream>
      #include <unistd.h>

      int main() {
          fork();
          std::cout << "Hi";
      }

      produces HiHi. Question is… who "Hi"ed first?

  18. After fork() there are two processes running the same code, and either may print first:

      Process 1 (parent) prints "Hi", then Process 2 (child) prints "Hi"  →  output HiHi
      OR
      Process 2 (child) prints "Hi", then Process 1 (parent) prints "Hi"  →  output HiHi

      Either way the output is HiHi; which process "Hi"ed first depends on the scheduler.

  19. Another Example

      fork(); cout << "a";
      fork(); cout << "b";

      How many processes are generated? How many possible outputs can you see?
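A self-contained version you can compile to experiment with (the flushes are added here so each character appears as soon as it is printed; they are not in the slide):

      #include <iostream>
      #include <unistd.h>

      int main() {
          fork(); std::cout << "a" << std::flush;   // printed by every process alive here
          fork(); std::cout << "b" << std::flush;   // printed by every process alive here
      }

Running it several times shows that the interleaving varies from run to run.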

  20. Use in servers (and clients)

      while (1) {
          talksock = accept(listsock, ...);
          cid = fork();
          if (cid > 0) {
              // parent: close talksock; keep using listsock; repeat loop
          } else {
              // child: close listsock; use talksock; … exit()
          }
      }
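A fuller, runnable sketch of that forking-server pattern (the port number and the one-line reply are assumptions of this sketch, not the slide's code):

      #include <cstdlib>
      #include <netinet/in.h>
      #include <sys/socket.h>
      #include <unistd.h>

      int main() {
          int listsock = socket(AF_INET, SOCK_STREAM, 0);
          sockaddr_in addr{};
          addr.sin_family = AF_INET;
          addr.sin_addr.s_addr = INADDR_ANY;
          addr.sin_port = htons(5000);               // assumed port
          bind(listsock, (sockaddr*)&addr, sizeof addr);
          listen(listsock, 5);

          while (1) {
              int talksock = accept(listsock, nullptr, nullptr);
              pid_t cid = fork();
              if (cid > 0) {
                  close(talksock);                   // parent: the child owns talksock
              } else {
                  close(listsock);                   // child: never accepts
                  const char msg[] = "hello\n";
                  write(talksock, msg, sizeof msg - 1);
                  close(talksock);
                  exit(0);                           // child exits; only the parent loops
              }
          }
      }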

  21. Threads (Windows style)

  22. Non-threaded

      #include <iostream>
      using namespace std;

      void count(int i) {
          int j;
          for (j = 1; j <= i; j++)
              cout << j << endl;
      }

      int main() {
          count(4);
      }

      Output: 1 2 3 4

  23. Threaded

      #include <iostream>
      #include <process.h>      // _beginthread (Windows)
      using namespace std;

      void count(int i) {
          int j;
          for (j = 1; j <= i; j++)
              cout << j << endl;
      }

      int main() {
          // The slide passes count to _beginthread by casting it to the expected
          // void(*)(void*) signature and smuggling the argument 5 in the pointer.
          _beginthread((void (*)(void*)) count, 0, (void*) 5);
          count(4);
      }

      Output interleaves the two counts, e.g.: 1 2 1 2 3 3 4 4 …
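The same interleaving can be shown portably with std::thread, a sketch not in the original slides (which use the Windows-specific _beginthread):

      #include <iostream>
      #include <thread>

      void count(int i) {
          for (int j = 1; j <= i; j++)
              std::cout << j << std::endl;
      }

      int main() {
          std::thread t(count, 5);   // second thread counts to 5
          count(4);                  // main thread counts to 4
          t.join();                  // wait for the thread before exiting
      }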

  24. Examples of fork/exec and threads (examples in server code)

  25. The Synchronization Problem

  26. Concurrency frequently requires synchronization!

      Cooperation synchronization: A is working on something; B must wait for A to finish.
          A does this:  x = f + g;
          B does this:  h = x + y;

      Competition synchronization: A needs to read a stream; B needs to read the same stream; only one can read at a time.
          A does this:  instr >> a;
          B does this:  instr >> b;

      We'll see how to do this later!

  27. The synchronization problem

      Shared memory: T = 3.   Task A: T = T * 2.   Task B: T = T + 1.

      One interleaving (time runs downward):
          Task A: fetch T (3)
          Task A: loses CPU
          Task B: fetch T (3)
          Task B: incr T (4)
          Task B: store T          T = 4
          Task A: gets CPU
          Task A: double T (6)     (doubles the stale 3 it fetched earlier)
          Task A: store T          T = 6, and B's update is lost

      TRY THIS: What other combinations could occur?
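A minimal sketch that reproduces this race with two threads (deliberately unsynchronized; the explicit fetch/store locals mirror the slide's timeline):

      #include <iostream>
      #include <thread>

      int T = 3;   // shared memory, intentionally unprotected

      int main() {
          std::thread a([] { int t = T; t = t * 2; T = t; });  // Task A: fetch, double, store
          std::thread b([] { int t = T; t = t + 1; T = t; });  // Task B: fetch, incr, store
          a.join();
          b.join();
          std::cout << "T = " << T << std::endl;  // 7 or 8 if serialized; 4 or 6 if an update is lost
      }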

  28. The essence of the problem • There are times during which exclusive access must be granted • These areas of our program are called critical sections • Sometimes this is handled by disabling interrupts so the process keeps the processor • Most often it is handled through a more controlled mechanism like a semaphore
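A sketch of a critical section guarded by one such controlled mechanism, here a std::mutex standing in for the semaphore introduced on the next slides:

      #include <iostream>
      #include <mutex>
      #include <thread>

      int T = 3;
      std::mutex m;   // guards the shared variable T

      int main() {
          std::thread a([] {
              std::lock_guard<std::mutex> lock(m);  // enter critical section
              T = T * 2;
          });                                       // lock released when it leaves scope
          std::thread b([] {
              std::lock_guard<std::mutex> lock(m);
              T = T + 1;
          });
          a.join(); b.join();
          std::cout << "T = " << T << std::endl;    // 7 or 8; never a lost update
      }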

  29. Where do we see it? • EVERYWHERE • Database access • Any data structures • File/printer/memory resources • Any point where processing entities compete for the same data

  30. Synchronization Solutions for C/C++ (Java later)

  31. Semaphore • One means of synchronizing activity • Managed by the operating system • implementing it yourself will not work: user code cannot guarantee mutual exclusion • Typical wait and release functions called by applications • A count is associated with the semaphore • a 0/1 (binary) count implies only a single resource to manage • a larger value means multiple resources • A queue holds the waiting (blocked) processes

  32. wait and release

      wait(semA) {
          if (semA > 0)
              decrement semA;
          else
              put caller in semA queue (block it);
      }

      release(semA) {
          if (semA queue is empty)
              increment semA;
          else
              remove a job from semA queue (unblock it);
      }

      This represents what the operating system does when an application asks for access to the resource by calling wait or release on the semaphore.
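A user-level sketch of those wait/release semantics built from a mutex and condition variable; as the slide says, a real semaphore is managed by the OS, so this only illustrates the logic:

      #include <condition_variable>
      #include <mutex>

      class Semaphore {
          int count;
          std::mutex m;
          std::condition_variable queue;   // plays the role of the blocked-job queue
      public:
          explicit Semaphore(int initial) : count(initial) {}

          void wait() {                    // block caller while count == 0, then decrement
              std::unique_lock<std::mutex> lock(m);
              queue.wait(lock, [this] { return count > 0; });
              --count;
          }

          void release() {                 // increment and unblock one waiting job, if any
              std::lock_guard<std::mutex> lock(m);
              ++count;
              queue.notify_one();
          }
      };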

  33. Standard example: producer-consumer

      semaphore fullspots, emptyspots;     // guard the shared buffer
      fullspots.count = 0;
      emptyspots.count = BUFLEN;

      task producer;
          loop
              wait(emptyspots);
              DEPOSIT(VALUE);
              release(fullspots);
          end loop;
      end producer;

      task consumer;
          loop
              wait(fullspots);
              FETCH(VALUE);
              release(emptyspots);
          end loop;
      end consumer;

      Why do you need TWO semaphores? Are adding and removing the same?

  34. Competition Synchronization

      What if multiple processes want to put objects in the buffer? We might have a similar synchronization problem. Use a BINARY semaphore for access and COUNTING semaphores for slots.

      semaphore access, fullspots, emptyspots;
      access.count = 1;              // BINARY
      fullspots.count = 0;
      emptyspots.count = BUFLEN;

      task producer;
          loop
              wait(emptyspots);
              wait(access);
              DEPOSIT(VALUE);
              release(access);
              release(fullspots);
          end loop;
      end producer;

      task consumer;
          loop
              wait(fullspots);
              wait(access);
              FETCH(VALUE);
              release(access);
              release(emptyspots);
          end loop;
      end consumer;

      Remind you of a printer queue problem?
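A runnable C++20 translation of the slide's pseudocode (the buffer type, BUFLEN value, item count, and std::counting_semaphore are choices of this sketch):

      #include <iostream>
      #include <queue>
      #include <semaphore>
      #include <thread>

      constexpr int BUFLEN = 4;

      std::queue<int> buffer;
      std::counting_semaphore<BUFLEN> emptyspots(BUFLEN);  // free slots
      std::counting_semaphore<BUFLEN> fullspots(0);        // filled slots
      std::binary_semaphore access_sem(1);                 // competition sync on the buffer

      int main() {
          std::thread producer([] {
              for (int v = 1; v <= 8; ++v) {
                  emptyspots.acquire();        // wait(emptyspots)
                  access_sem.acquire();        // wait(access)
                  buffer.push(v);              // DEPOSIT(VALUE)
                  access_sem.release();        // release(access)
                  fullspots.release();         // release(fullspots)
              }
          });
          std::thread consumer([] {
              for (int i = 0; i < 8; ++i) {
                  fullspots.acquire();         // wait(fullspots)
                  access_sem.acquire();        // wait(access)
                  int v = buffer.front();      // FETCH(VALUE)
                  buffer.pop();
                  access_sem.release();        // release(access)
                  emptyspots.release();        // release(emptyspots)
                  std::cout << v << std::endl;
              }
          });
          producer.join();
          consumer.join();
      }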

  35. Concurrency and Asynchronous I/O • When an application blocks, it waits for an operation on a device to complete • How do you wait on multiple devices? • There is support for this in the OS • let me wait on multiple devices • wake me when one answers • (more later)
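One such OS facility on Unix is select(), sketched here for two descriptors; fd1 and fd2 are assumed to be already-open descriptors, and this is an illustration rather than the text's example:

      #include <cstdio>
      #include <sys/select.h>

      // Block until either descriptor is readable, then report which woke us.
      void wait_on_two(int fd1, int fd2) {
          fd_set readfds;
          FD_ZERO(&readfds);
          FD_SET(fd1, &readfds);
          FD_SET(fd2, &readfds);

          int maxfd = (fd1 > fd2 ? fd1 : fd2) + 1;
          if (select(maxfd, &readfds, nullptr, nullptr, nullptr) > 0) {
              if (FD_ISSET(fd1, &readfds)) printf("fd1 is ready\n");
              if (FD_ISSET(fd2, &readfds)) printf("fd2 is ready\n");
          }
      }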
