
System Performance Assessment Tools for Windows Longhorn

Presentation Transcript


  1. System Performance Assessment Tools for Windows Longhorn Richard G. Russell, Development Manager, Windows Core Operating System Division, rgr@microsoft.com, Microsoft Corporation

  2. Introduction • Large difference in platform capability moving forward • High-end dual-core 64-bit gaming system vs. low-end value single-core system • Media center mobile system vs. thin-and-light or ultra-mobile system • There are Microsoft Windows codenamed “Longhorn” client features that need to scale with platform capability: • The window manager and themes • Video playback • Gaming • TV recording • How do Longhorn and applications make scaling decisions?

  3. Situation Today • Scaling decisions are left to applications • Microsoft Windows XP makes no material scaling decisions • Windows XP provides few tools to help applications make decisions based on platform capability • Graphics capability information from D3D is useful for games • Applications can enumerate features via WMI (see the sketch below) • Only a few applications have their own system assessment code • Games are the exception • But many games do a poor job of this – often leaving tunable settings up to the user
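
To ground the WMI bullet, here is a minimal sketch of what feature enumeration looks like on Windows XP today: connect to the ROOT\CIMV2 namespace and run a WQL query. The class and properties chosen (Win32_Processor's Name and MaxClockSpeed) are one illustrative example, and error handling is trimmed for brevity.

```cpp
#define _WIN32_DCOM
#include <comdef.h>
#include <wbemidl.h>
#include <cstdio>
#pragma comment(lib, "wbemuuid.lib")

int main()
{
    // Initialize COM and default security for WMI calls.
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);
    CoInitializeSecurity(nullptr, -1, nullptr, nullptr,
        RPC_C_AUTHN_LEVEL_DEFAULT, RPC_C_IMP_LEVEL_IMPERSONATE,
        nullptr, EOAC_NONE, nullptr);

    // Connect to the local CIMV2 namespace.
    IWbemLocator* pLoc = nullptr;
    CoCreateInstance(CLSID_WbemLocator, nullptr, CLSCTX_INPROC_SERVER,
        IID_IWbemLocator, reinterpret_cast<void**>(&pLoc));
    IWbemServices* pSvc = nullptr;
    pLoc->ConnectServer(_bstr_t(L"ROOT\\CIMV2"), nullptr, nullptr,
        nullptr, 0, nullptr, nullptr, &pSvc);

    // Enumerate processor information via a WQL query.
    IEnumWbemClassObject* pEnum = nullptr;
    pSvc->ExecQuery(_bstr_t(L"WQL"),
        _bstr_t(L"SELECT Name, MaxClockSpeed FROM Win32_Processor"),
        WBEM_FLAG_FORWARD_ONLY | WBEM_FLAG_RETURN_IMMEDIATELY,
        nullptr, &pEnum);

    IWbemClassObject* pObj = nullptr;
    ULONG returned = 0;
    while (pEnum->Next(WBEM_INFINITE, 1, &pObj, &returned) == S_OK) {
        _variant_t name, mhz;
        pObj->Get(L"Name", 0, &name, nullptr, nullptr);
        pObj->Get(L"MaxClockSpeed", 0, &mhz, nullptr, nullptr);
        wprintf(L"%s @ %ld MHz\n", V_BSTR(&name), V_I4(&mhz));
        pObj->Release();
    }

    pEnum->Release(); pSvc->Release(); pLoc->Release();
    CoUninitialize();
    return 0;
}
```

Note how much boilerplate a single query takes – one reason so few applications do their own assessment.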

  4. Longhorn Helps Solve These Problems • New tool: the Windows System Assessment Tool (WinSAT) • Built into Longhorn • WinSAT assesses system: • Features* – something that can be enumerated or detected • Attributes* – something that must be measured • Capabilities* – a fundamental ability to perform a task or action • WinSAT provides performance metrics and other information necessary for Longhorn and applications to make scaling decisions * See detailed definitions at the end of the deck

  5. WinSAT Helps Answer Scaling Questions • Can the system efficiently run with desktop composition enabled? • Can the system efficiently support advanced window management and themes? • Aero-Express or Aero-Glass • Can the system play high-definition video without dropping frames? • How many channels of TV can a system record simultaneously? • How many enemy robot AI agents should a game configure? • Can a game enable full water simulation?

  6. Better Automatic Configuration is the Goal • Thus avoiding: • Making the user experiment or guess at the best settings • The use of default “lowest common denominator” settings • Assuming high-end systems – turning on too many features that will run poorly on some systems • Developers avoid having to develop, test, deploy, and maintain custom system assessment code Allow the operating system and applications to better automatically configure themselves to provide the best user experience of which an individual PC is capable

  7. What Does WinSAT Do? • WinSAT assesses these subsystems: • Graphics • Memory • Processor • Storage • WinSAT assesses: • Attributes • Features • Capabilities • Only items that are relevant to Longhorn and application scalability are assessed

  8. WinSAT and Assessment Details

  9. WinSAT Component Diagram

  10. WinSAT.exe • WinSAT.exe is a command line tool that contains the individual assessments • Graphics, D3D, Memory, Storage, Computation • Video Playback, Feature Enumerator* (TBI) • Easy to run • Lots of parameters for each assessment • Command line help built into Windows* (TBI) • Must be run as administrator from the command line • Canonical output is XML • Human-readable output to standard output • Easy to script (see the launch sketch below) * To Be Implemented (TBI) – not present in WinHEC Longhorn Distribution
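
Since the deck does not spell out the actual parameters (“lots of parameters for each assessment”), here is a hypothetical sketch of scripting WinSAT.exe from a test harness. The assessment name and flags on the command line are invented; consult the WinSAT command line document on the WinHEC CD for the real ones.

```cpp
#include <windows.h>

int wmain()
{
    // Hypothetical command line: run one assessment and write canonical
    // XML to a file. The "mem" and "-xml" tokens are assumptions.
    wchar_t cmd[] = L"WinSAT.exe mem -xml results.xml";

    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};
    if (!CreateProcessW(nullptr, cmd, nullptr, nullptr, FALSE,
                        0, nullptr, nullptr, &si, &pi))
        return 1;

    // Wait for the assessment to finish, then collect its exit code.
    WaitForSingleObject(pi.hProcess, INFINITE);
    DWORD exitCode = 0;
    GetExitCodeProcess(pi.hProcess, &exitCode);
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return static_cast<int>(exitCode);
}
```

Remember that WinSAT.exe must be run as administrator, so a harness like this needs to run elevated itself.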

  11. WinSAT.exe • Full 32-bit and 64-bit parity • Multi-Core/Multi-CPU aware where appropriate • Memory assessment • Computational assessment • Runs on Windows XP (just copy WinSAT.exe) • WinSAT will be updated over time to reflect changes and improvements in platform technology

  12. WinSAT API • Provided via a simple COM interface (sketched below) • Provides simple and easy programmatic access to WinSAT and the data store • Access to the XML data – returns an MSXML DOM • Allows WinSAT.exe to be run • With an arbitrary command line • With a “formal assessment”* (TBI) • From a limited user account* (TBI) • A formal assessment is a set of assessments that is executed with a pre-defined set of parameters • Used to generate metrics used by Longhorn and reported by the WinSAT API • The API can be disabled through group policies • Off, or administrator only * To Be Implemented (TBI) – not present in WinHEC Longhorn Distribution
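
A sketch of what calling this COM interface could look like. The concrete names used here (winsatcominterfacei.h, the CQueryWinSAT coclass, IQueryRecentWinSATAssessment, and its get_XML method) are taken from the WinSAT API as it eventually shipped and are assumptions relative to this pre-release deck; the XPath query is invented.

```cpp
#include <comdef.h>
#include <msxml2.h>
#include <winsatcominterfacei.h> // assumed header; see lead-in above

// Retrieve MSXML DOM nodes from the most recent formal assessment.
// The caller is assumed to have initialized COM (CoInitializeEx).
HRESULT QueryRecentAssessment(IXMLDOMNodeList** ppNodes)
{
    IQueryRecentWinSATAssessment* pQuery = nullptr;
    HRESULT hr = CoCreateInstance(__uuidof(CQueryWinSAT), nullptr,
        CLSCTX_INPROC_SERVER, __uuidof(IQueryRecentWinSATAssessment),
        reinterpret_cast<void**>(&pQuery));
    if (FAILED(hr))
        return hr;

    // The API hands back an MSXML DOM; here we select nodes with a
    // hypothetical XPath query against the stored assessment XML.
    hr = pQuery->get_XML(_bstr_t(L"/WinSAT/Metrics"), ppNodes);
    pQuery->Release();
    return hr;
}
```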

  13. The Data Store • Will only store data from formal assessments scheduled from the API • Data from assessments run from user command lines is not managed or accessible through the API • Such data is saved in normal XML files (see the parsing sketch below) • The XML schemas are very simple • The schemas will change! • A history of formal assessments is kept* (TBI) • Up to 100 assessments (oldest deleted) • Always keeps the first one generated during first user logon • The data store files are read-only to all entities except the WinSAT API* (TBI) • Data is provided to callers as an MSXML DOM * To Be Implemented (TBI) – not present in WinHEC Longhorn Distribution
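
Because command-line runs land in normal XML files rather than the managed store, an application or test script can read them with plain MSXML. A minimal sketch, assuming a hypothetical file name and element name, since the schemas are simple but explicitly subject to change:

```cpp
#include <comdef.h>
#include <msxml2.h>
#include <cstdlib>
#pragma comment(lib, "msxml2.lib")

// Load a WinSAT output file and read one metric value. The file name
// ("memory.xml") and XPath ("//MemoryMetric") are hypothetical; real
// element names come from the (still-changing) schemas.
double ReadMetric()
{
    IXMLDOMDocument* pDoc = nullptr;
    if (FAILED(CoCreateInstance(CLSID_DOMDocument30, nullptr,
            CLSCTX_INPROC_SERVER, IID_IXMLDOMDocument,
            reinterpret_cast<void**>(&pDoc))))
        return 0.0;

    double value = 0.0;
    VARIANT_BOOL loaded = VARIANT_FALSE;
    pDoc->load(_variant_t(L"memory.xml"), &loaded);
    if (loaded == VARIANT_TRUE) {
        IXMLDOMNode* pNode = nullptr;
        pDoc->selectSingleNode(_bstr_t(L"//MemoryMetric"), &pNode);
        if (pNode) {
            BSTR text = nullptr;
            if (SUCCEEDED(pNode->get_text(&text)) && text) {
                value = _wtof(text); // metric value as a double
                SysFreeString(text);
            }
            pNode->Release();
        }
    }
    pDoc->Release();
    return value;
}
```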

  14. The Graphics Assessment • Designed to assess a system’s ability to efficiently run a composited desktop and the Aero themes • Metric is Effective Frames Per Second • Drives the hardware in a very similar way to the desktop composition engine* (TBI) • Used to make decisions about enabling desktop composition and theme level • Focused on evaluating shader texture load performance, system-to-graphics-memory bandwidth, and back-buffer read-back * To Be Implemented (TBI) – not present in WinHEC Longhorn Distribution

  15. The D3D Assessment • Designed to assess a system’s ability to render 3D gaming graphics • Metric is Effective Frames Per Second • Targeted to Pixel Shader v2.0 or better hardware • Focused on three aspects: • Shader ALU performance • Shader texture load performance • Post-pixel blend performance • Evaluates 8-bit and 16-bit render targets

  16. The System Memory Assessment • Focused on throughput, not latency • Designed to assess how well large objects can be moved in system memory • Metric is megabytes per second (MB/s) • Methodology (sketched below): • Uses MMX or SSE registers for copies • Uses temporal reads • Uses non-temporal writes • Uses explicit block prefetching (16 KB stride) • All data is aligned
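
To make the methodology bullets concrete, here is a minimal sketch of a copy loop in that spirit: temporal (ordinary cached) reads, non-temporal streaming writes, and explicit prefetching on a 16 KB stride. This illustrates the technique, not WinSAT's actual code; it assumes SSE2, 16-byte-aligned buffers, and a size that is a multiple of 16 bytes.

```cpp
#include <emmintrin.h> // SSE2: _mm_load_si128, _mm_stream_si128, _mm_sfence
#include <xmmintrin.h> // _mm_prefetch
#include <cstddef>

// Copy 'bytes' from src to dst using temporal reads, non-temporal
// writes, and explicit block prefetching with a 16 KB stride.
void StreamingCopy(void* dst, const void* src, std::size_t bytes)
{
    const std::size_t kStrideBytes = 16 * 1024;            // 16 KB stride
    const std::size_t kStrideVecs  = kStrideBytes / sizeof(__m128i);

    auto*       d = static_cast<__m128i*>(dst);
    const auto* s = static_cast<const __m128i*>(src);
    const std::size_t n = bytes / sizeof(__m128i);

    for (std::size_t i = 0; i < n; ++i) {
        // At each 16 KB block boundary, hint the next block into cache.
        if (i % kStrideVecs == 0)
            _mm_prefetch(reinterpret_cast<const char*>(s + i) + kStrideBytes,
                         _MM_HINT_T0);
        const __m128i v = _mm_load_si128(s + i); // temporal (cached) read
        _mm_stream_si128(d + i, v);              // non-temporal write
    }
    _mm_sfence(); // order the streaming stores before the copy is "done"
}
```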

  17. The Storage Assessment • Is a “sweep” test – divides a physical disk into regions and evaluates the following for each: • Random read and write performance • Sequential read and write performance • Uses up to 16 regions • Metric is megabytes per second (MB/s) • Reports metrics for each region • Reports an aggregated metric for the disk using the geometric mean of the region metrics (sketched below)
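
The aggregation step is simple enough to sketch directly: the per-region throughputs combine through the geometric mean, which keeps one unusually fast or slow region from dominating the disk-wide number. The processor assessment on the next slide aggregates its sub-assessments the same way. The region values below are made-up sample data.

```cpp
#include <cmath>
#include <vector>

// Geometric mean of per-region throughput values (MB/s), computed in
// log space to avoid overflow when multiplying many values together.
double GeometricMean(const std::vector<double>& regions)
{
    double logSum = 0.0;
    for (double mbps : regions)
        logSum += std::log(mbps);
    return std::exp(logSum / static_cast<double>(regions.size()));
}

// Example: four of up to 16 swept regions, outer tracks faster
// (made-up data):
// double diskMetric = GeometricMean({58.0, 52.0, 44.0, 31.0}); // ~45.0 MB/s
```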

  18. The Processor Assessment • Designed to assess the system’s computational ability • Metric is megabytes per second processed • Is not a synthetic test – uses native Windows components • Data compression and decompression • Data encryption and decryption • Windows Media Video encoding and decoding* (TBI) • Values from each sub-assessment are aggregated using the geometric mean * To Be Implemented (TBI) – not present in WinHEC Longhorn Distribution

  19. Windows XP vs. Longhorn Differences • Some assessments may produce moderately different results on Longhorn vs. Windows XP • Graphics and D3D: • Due to changes in the driver model (XPDM vs. LDDM) • Due to desktop composition • Processor assessments: • Due to differences in the Windows components used in those assessments

  20. Demo

  21. Scenarios

  22. For the OEM • Focus on platform performance as it relates to key Longhorn features • Measuring performance consistency of similarly configured models • Understanding performance differences between models • Diagnostics: misconfigured systems may perform poorly • For example, the memory test is very good at picking out misconfigured DRAM

  23. Independent Software Vendor • ISVs can build simple (or complex!) rules based on WinSAT metrics into an application to make scaling decisions • Since WinSAT is built into Longhorn, good correlation data can be collected during internal testing and external ALPHA and BETA testing • In short, ISVs can use WinSAT just as Microsoft uses it for Longhorn and our applications

  24. Media Player Example • User installs a new media player • User then tries to play a high-definition video clip • The player checks the WinSAT data via the API and determines that the system is not capable of effectively playing HD video • Note that this happens without running any tests, nor does the player need to guess – it happens immediately (a decision sketch follows below) • The player can then inform the user that they may see frame drops and other video artifacts – do they wish to continue?
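
A hypothetical sketch of the check the player makes, assuming it has already pulled metrics from the WinSAT store (for instance via the API sketch on slide 12). The metric names and thresholds are invented for illustration; a real player would calibrate them against its own correlation data (see slide 23).

```cpp
// Metrics a player might read from the WinSAT data store (hypothetical).
struct WinSatMetrics {
    double videoEffectiveFps; // e.g., from the planned video playback assessment
    double memoryMBps;        // from the system memory assessment
};

// Decide, without running any test, whether HD playback is likely to be
// frame-drop free. Thresholds are illustrative placeholders.
bool CanPlayHdWithoutDrops(const WinSatMetrics& m)
{
    return m.videoEffectiveFps >= 29.97   // full HD frame rate sustained
        && m.memoryMBps        >= 2000.0; // memory bandwidth headroom
}
```

If the function returns false, the player warns the user about possible frame drops and asks whether to continue, exactly as the slide describes.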

  25. Longhorn Upgrade • User upgrades an existing system to Longhorn • WinSAT runs as part of the first user logon experience (this used to be called OOBE) • Longhorn uses WinSAT metrics to make configuration decisions and recommendations to the user • “Your system is capable of running Aero-Glass, would you like this option configured?”

  26. Longhorn Graphics Card Upgrade • User installs a new graphics card in their system – upgrading from UMA graphics • Longhorn detects this after the reboot and recommends that the user reassess the system (the user says yes) • WinSAT runs and the system determines that it is now possible to run Aero-Glass • The user is informed, much the same way as in the previous example

  27. Wrapping Up

  28. Where is the Development Process? • The WinSAT tool is in the WinHEC Longhorn build • It is in an early ALPHA state • A large correlation study is underway • Not all assessments and features are implemented in Longhorn Beta 1 • We may add further assessments • The metrics may change • Tuning and enhancements will happen through the Beta cycles • We may make interim versions of WinSAT available on the beta site

  29. Key Features and Updates to be Implemented • Consumer-suitable visuals for the Graphics and D3D assessments • Windows Graphics Foundation (WGF) 2.0 support • Video decode assessment • System feature enumeration • Integration with: • The desktop composition system • The first user logon process • LUA support • WMI integration • Detailed technical documentation • In Windows Help format • Group Policy support

  30. Call to Action • Become familiar with WinSAT • Correlate the performance of your product with WinSAT metrics • Think about how WinSAT metrics can help your application make scaling and configuration decisions • Help us make WinSAT better • How does it rank systems and components? • Is it statistically reliable? • How well does it correlate with your product’s performance? • Report bugs and problems

  31. Community Resources • Windows Hardware & Driver Central (WHDC) • www.microsoft.com/whdc/default.mspx • Technical Communities • www.microsoft.com/communities/products/default.mspx • Non-Microsoft Community Sites • www.microsoft.com/communities/related/default.mspx • Microsoft Public Newsgroups • www.microsoft.com/communities/newsgroups • Technical Chats and Webcasts • www.microsoft.com/communities/chats/default.mspx • www.microsoft.com/webcasts • Microsoft Blogs • www.microsoft.com/communities/blogs

  32. Additional Resources • See the WinSAT command line document on the WinHEC CD • Please send your questions to Winsatfb@microsoft.com • We will notify anyone who sends e-mail to this alias of updates as they become available

  33. © 2005 Microsoft Corporation. All rights reserved. This presentation is for informational purposes only. Microsoft makes no warranties, express or implied, in this summary.

  34. Key Definitions

  35. An Attribute Is: • A system characteristic that is measured • Attributes cannot be predetermined accurately; they must be measured on the system as built • WinSAT measures the performance attributes of: • System memory • The storage sub-system • The CPU (or CPUs) • The graphics sub-system

  36. A Feature Is: • A prominent or distinctive system part, component, or characteristic • Features are either present or absent • Features are not measured; they are detected or enumerated • Examples: • x64, Hyper-Threading, memory size, optical drive type (DVD, CD, writer, etc.), MMX, SSE, 3DNow!, 1394, number of processors (logical and physical), number of cores per processor package, cache size, hardware MPEG-2 or WMV decode support, Shader Model 2.0 support

  37. A Capability Is: • A system’s ability to effectively perform a specific function or task • Capabilities are often simply present or absent, but can also be present to a degree, expressed as some measure or metric • To be present, a capability requires: • A specified (or minimum) level of one or more attributes • The presence of one or more specified features • Some capabilities can be assessed directly • For example, the ability to play high-definition video • Some must be inferred or determined by correlation with features and measured attributes (see the sketch below)
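
The definition above maps naturally onto a small predicate: a capability check is a conjunction of feature presence tests and attribute thresholds. A minimal sketch with invented features, attributes, and levels:

```cpp
// Illustrative feature and attribute sets; real ones would come from
// WinSAT's enumeration and measurements.
struct Features   { bool shaderModel20; bool hardwareWmvDecode; };
struct Attributes { double graphicsFps; double memoryMBps; };

// Infer the "play high definition video" capability: required features
// must be present AND measured attributes must meet minimum levels.
// The specific requirements and thresholds here are made up.
bool HasHdVideoCapability(const Features& f, const Attributes& a)
{
    return f.hardwareWmvDecode       // specified features present
        && f.shaderModel20
        && a.graphicsFps >= 30.0     // minimum attribute levels
        && a.memoryMBps  >= 2000.0;
}
```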

  38. © 2005 Microsoft Corporation. All rights reserved. This presentation is for informational purposes only. Microsoft makes no warranties, express or implied, in this summary.
