
Hyper-Threading Technology

Presented By

Nagarajender Rao Katoori


Introduction

To enhance performance:

  • Increase in clock rate

    • Involves reducing clock cycle time

    • Can increase performance by increasing the number of instructions finishing per second

    • Hardware limitations constrain this approach

  • Cache hierarchies

    • Keeping frequently used data in the processor caches reduces average access time
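As background (a standard textbook relation, not from the slides), the benefit of caching can be made concrete through the average memory access time, which falls as more accesses hit in the cache:

```latex
\mathrm{AMAT} = t_{\mathrm{hit}} + r_{\mathrm{miss}} \times t_{\mathrm{miss\ penalty}}
```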


  • Pipelining

    • Implementation Technique whereby multiple instructions are overlapped in execution

    • Limited by the dependencies between instructions

    • Affected by stalls, so the effective CPI is greater than 1

  • Instruction Level Parallelism

    • It refers to techniques to increase the number of instructions executed in each clock cycle.

    • Exists whenever the machine instructions that make up a program are insensitive to the order in which they are executed; if no dependencies exist between them, they may be executed in parallel (see the short C sketch below)
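The following minimal C sketch (illustrative only, not from the slides) shows the dependence idea behind ILP: the first three statements are mutually independent, so a superscalar core may issue them in the same clock cycle, while the final chain is serialized by data dependencies.

```c
/* Illustrative sketch of instruction-level parallelism (not from the slides). */
void ilp_demo(int *out, int a, int b, int c, int d) {
    /* Independent statements: a superscalar core can execute these in parallel. */
    int x = a + b;
    int y = c + d;
    int z = a * d;

    /* Dependent chain: each statement needs the previous result,
       so these execute one after another. */
    int w = x + y;
    w = w + z;
    out[0] = w;
}
```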


  • Thread level parallelism

    • Chip Multi Processing

      • Two processors, each with full set of execution and architectural resources, reside on a single die.

    • Time Slice Multi Threading

      • A single processor executes multiple threads by switching between them at fixed time intervals

    • Switch on Event Multi Threading

      • The processor switches threads on long-latency events such as cache misses


  • Simultaneous Multi Threading

    • Multiple threads can execute on a single processor without switching.

    • The threads execute simultaneously and make much better use of the resources.

    • It maximizes performance relative to transistor count and power consumption.


Hyper-Threading Technology

  • Hyper-Threading Technology brings the simultaneous multi-threading approach to the Intel architecture.

  • Hyper-Threading Technology makes a single physical processor appear to software as two logical processors (a brief sketch of how the OS sees this follows after this list)

  • Hyper-Threading Technology was developed by Intel Corp.

  •  Hyper-Threading Technology provides thread-level-parallelism (TLP) on each processor resulting in increased utilization of processor and execution resources.

  • Each logical processor maintains its own copy of the architecture state
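A hedged, Linux-only sketch (not part of the slides) of what "two logical processors" means to software: the operating system simply sees twice as many processors. `sysconf(_SC_NPROCESSORS_ONLN)` is POSIX; the topology files mentioned in the comments are Linux-specific.

```c
/* Linux-only sketch: report how many logical processors the OS sees.
 * On a processor with Hyper-Threading enabled, this count is typically
 * twice the number of physical cores. */
#include <stdio.h>
#include <unistd.h>

int main(void) {
    long logical = sysconf(_SC_NPROCESSORS_ONLN);  /* logical CPUs online */
    printf("Logical processors visible to the OS: %ld\n", logical);
    /* Physical-core topology can be read from /proc/cpuinfo
     * ("cpu cores" vs. "siblings") or from /sys/devices/system/cpu/. */
    return 0;
}
```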


Hyper-Threading Technology Architecture

Figure: A processor without Hyper-Threading Technology has one architecture state per set of processor execution resources; a processor with Hyper-Threading Technology has two architecture states sharing one set of processor execution resources.

Ref: Intel Technology Journal, Volume 06 Issue 01, February 14, 2002


The following resources are duplicated to support Hyper-Threading Technology:

  • Register Alias Tables

  • Next-Instruction Pointer

  • Instruction Streaming Buffers and Trace Cache Fill Buffers

  • Instruction Translation Look-aside Buffer


Figure: Intel Xeon processor pipeline with Hyper-Threading Technology


Sharing of Resources

  • Major sharing schemes are:

    • Partition

    • Threshold

    • Full Sharing

Partition

  • Each logical processor uses half the resources

  • Simple and low in complexity

  • Ensures fairness and progress

  • Good for major pipeline queues


Partitioned Queue Example

  • Yellow thread – the faster thread

  • Green thread – the slower thread


Partitioned Queue Example

  • Partitioning resources ensures fairness and forward progress for both logical processors.


Threshold

  • Puts a threshold on the number of resource entries a logical processor can use

  • Limits maximum resource usage

  • For small structures where resource utilization is bursty and the time of utilization is short, uniform, and predictable

  • e.g., the processor scheduler


Full Sharing

  • Most flexible mechanism for resource sharing; it does not limit the maximum resource usage for a logical processor

  • Good for large structures in which working-set sizes are variable and there is no fear of starvation

  • e.g., all processor caches are shared

    • Some applications benefit from a shared cache because they share code and data, minimizing redundant data in the caches
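A hypothetical software analogy (this is not Intel's hardware implementation) of how the three schemes differ: each one is simply a different per-logical-processor limit on entries in a shared structure of N entries.

```c
/* Hypothetical analogy only: the three sharing schemes expressed as
 * per-logical-processor entry limits on a structure of N entries.
 * This illustrates the policies, not the actual Xeon hardware. */
#define N 64

enum scheme { PARTITION, THRESHOLD, FULL_SHARING };

/* Maximum entries one logical processor may occupy under each scheme. */
static int entry_limit(enum scheme s, int threshold) {
    switch (s) {
    case PARTITION:    return N / 2;      /* each logical processor gets half    */
    case THRESHOLD:    return threshold;  /* capped below N, e.g. the scheduler  */
    case FULL_SHARING: return N;          /* no per-logical-processor cap at all */
    }
    return N;
}

/* A logical processor may allocate another entry only while under its limit. */
static int may_allocate(enum scheme s, int entries_in_use, int threshold) {
    return entries_in_use < entry_limit(s, threshold);
}
```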



SINGLE-TASK AND MULTI-TASK MODES

  • Two modes of operations

    • single-task (ST)

    • multi-task (MT).

  • MT-mode: there are two active logical processors, and some of the resources are partitioned between them.

  • There are two flavors of ST-mode: single-task logical processor 0 (ST0) and single-task logical processor 1 (ST1).

  • In ST0- or ST1-mode, only one logical processor is active, and resources that were partitioned in MT-mode are re-combined to give the single active logical processor use of all of the resources


SINGLE-TASK AND MULTI-TASK MODES


  • HALT is an instruction that stops processor execution.

  • On a processor with Hyper-Threading Technology, executing HALT transitions the processor from MT-mode to ST0- or ST1-mode, depending on which logical processor executed the HALT.

  • In ST0- or ST1-modes, an interrupt sent to the halted logical processor would cause a transition to MT-mode.
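HLT is a privileged x86 instruction, so the pattern below (a generic idle-loop sketch, not code from the slides) only executes in kernel mode; it shows how an operating system idles one logical processor so the core can drop from MT-mode to ST0-/ST1-mode until an interrupt arrives.

```c
/* Generic kernel-mode idle-loop pattern for x86 (GCC/Clang inline asm).
 * Halting one logical processor lets the core leave MT-mode; the next
 * interrupt wakes that logical processor and MT-mode resumes. */
static void idle_loop(void) {
    for (;;) {
        __asm__ volatile("sti; hlt" ::: "memory");  /* enable interrupts, then halt */
    }
}
```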


OPERATING SYSTEM

  • For best performance, the operating system should implement two optimizations.

    • The first is to use the HALT instruction if one logical processor is active and the other is not. HALT allows the processor to transition from MT-mode to either ST0- or ST1-mode.

    • The second optimization is in scheduling software threads to logical processors. The operating system should schedule threads to logical processors on different physical processors before scheduling two threads to the same physical processor.
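A hedged Linux/glibc sketch of the second optimization. The mapping used here (logical CPU 0 on one physical core, logical CPU 2 on another) is an assumption for illustration; real code should read the mapping from /sys/devices/system/cpu/cpuN/topology/ before pinning.

```c
/* Sketch: schedule two software threads onto logical CPUs that are
 * assumed to sit on DIFFERENT physical cores, so neither thread shares
 * execution resources with the other. CPU IDs 0 and 2 are hypothetical. */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>

static void pin_to_cpu(pthread_t t, int cpu) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);
    pthread_setaffinity_np(t, sizeof(set), &set);
}

static void *work(void *arg) { (void)arg; return NULL; }

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, work, NULL);
    pthread_create(&t2, NULL, work, NULL);
    pin_to_cpu(t1, 0);   /* assumed: logical CPU 0 lives on physical core 0 */
    pin_to_cpu(t2, 2);   /* assumed: logical CPU 2 lives on physical core 1 */
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return 0;
}
```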


Business Benefits of Hyper-Threading Technology

  • Higher transaction rates for e-Businesses

  • Improved reaction and response times for end-users and customers.

  • Increased number of users that a server system can support

  • Handle increased server workloads

  • Compatibility with existing server applications and operating systems


Figure: Performance increases from Hyper-Threading Technology on an OLTP workload

Figure: Web server benchmark performance


Conclusion

Intel’s Hyper-Threading Technology brings the concept of simultaneous multi-threading to the Intel Architecture.

It will become increasingly important going forward as it adds a new technique for obtaining additional performance for lower transistor and power costs.

The goal was to implement the technology at minimum cost while ensuring forward progress on each logical processor, even when the other is stalled, and to deliver full performance even when there is only one active logical processor.


References

  • “Hyper-Threading Technology Architecture and Microarchitecture” by Deborah T. Marr, Frank Binns, David L. Hill, Glenn Hinton, David A. Koufaty, J. Alan Miller, Michael Upton, Intel Technology Journal, Volume 06 Issue 01, February 14, 2002, pages 4–15.

  • “Hyperthreading Technology in the Netburst Microarchitecture” by David Koufaty, Deborah T. Marr, IEEE Micro, Vol. 23, Issue 2, March–April 2003, pages 56–65.

  • http://cache-www.intel.com/cd/00/00/22/09/220943_220943.pdf

  • http://www.cs.washington.edu/research/smt/papers/tlp2ilp.final.pdf

  • http://mos.stanford.edu/papers/mj_thesis.pdf


Thank you

