This proposed framework by Shravan Surineni and Kevin Karcz aims to identify wireless performance parameters, establish testing methods, and classify metrics for accurate evaluation. It outlines performance categories and testing methodologies, and highlights grey areas that must be resolved to obtain consistent and reliable results.
Proposed framework for WPP
Shravan Surineni, Kevin Karcz
InterOperability Lab, University of New Hampshire
Purpose
• The purpose is to identify the parameters that affect wireless performance and the methods to measure them
• Well-defined metrics, classified into categories, are needed before we start to evaluate the parameters
• It is likely that these parameters will be handled differently by different people
• The purpose of this framework is to discuss that structure and its details
Expectations from WPP
• WPP will define a set of parameters relevant to wireless performance that can be used by designers as well as users
• Outline a test method in which these parameters can be measured within a given margin of error
• Provide a set of guidelines for obtaining repeatable results with a reasonable error (a sketch of one way to report such an error follows)
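As a rough illustration of what "within a given margin of error" could mean in practice, here is a minimal sketch, not part of the WPP proposal itself, that reports a repeated throughput measurement together with a 95% confidence interval. The trial values are made-up placeholders and the t value is the tabulated one for 7 degrees of freedom.

```python
# Minimal sketch: report a measured parameter with its margin of error
# over repeated trials. The throughput numbers are hypothetical.
import math
import statistics

trials_mbps = [22.1, 21.8, 22.4, 21.9, 22.0, 22.3, 21.7, 22.2]  # hypothetical runs

mean = statistics.mean(trials_mbps)
stdev = statistics.stdev(trials_mbps)        # sample standard deviation
sem = stdev / math.sqrt(len(trials_mbps))    # standard error of the mean
t_95 = 2.365                                 # two-sided 95% t value, 7 degrees of freedom

print(f"throughput = {mean:.2f} Mb/s +/- {t_95 * sem:.2f} Mb/s "
      f"(95% CI, n={len(trials_mbps)})")
```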
Performance Categories
• Identify and define parameters at various levels and by user experience (one possible way to organize them is sketched below)
• Physical layer - EVM, spectral mask, CFO, transmit power, sensitivity, immunity to interference, range
• MAC layer - association time, roaming time, throughput, rate adaptation
• Network level - throughput with TCP traffic, UDP traffic
• System level - performance in overlapping BSS
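The sketch below is illustrative only: one possible way to organize the proposed categories and parameters as data, so that a test plan can select a subset per application. The names simply mirror the slide; they are not a defined standard.

```python
# Hypothetical organization of the slide's categories and parameters.
PERFORMANCE_CATEGORIES = {
    "physical": ["EVM", "spectral_mask", "CFO", "transmit_power",
                 "sensitivity", "interference_immunity", "range"],
    "mac": ["association_time", "roaming_time", "throughput", "rate_adaptation"],
    "network": ["tcp_throughput", "udp_throughput"],
    "system": ["overlapping_bss_performance"],
}

def metrics_for(categories):
    """Return the flat list of metrics for the selected categories."""
    return [m for c in categories for m in PERFORMANCE_CATEGORIES[c]]

print(metrics_for(["mac", "network"]))
```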
How can we use existing specifications?
• PICS contain PHY and MAC specifications for conformance with the protocol
• They provide a direct indication of the performance of a particular device
• Some of them can be measured directly using commercially available equipment
• Detailed methods are required to avoid confusion in making measurements and to get results that are repeatable with a reasonable error
Clear methods of testing are needed
• Can each layer of the network be measured independently?
• The hassle of real-world scenario testing vs. a PHY test mode: PHY test mode or black-box testing? What models should be used to test performance under multipath?
• Which metrics need to look at the interaction of multiple layers?
• Need test scenarios that give repeatable measurements with a reasonable error
Some grey areas of testing
• Throughput
  • Is throughput measured as MAC-layer payload? At the IP layer? At the TCP or UDP layer? (see the sketch after this list)
  • One DUT may have better PER measurements at the PHY layer than a second DUT, yet get worse throughput if its rate-selection algorithm is poor.
  • It is difficult to maintain consistency in an open (uncontrolled) environment.
  • Can system throughput be measured in a cabled environment without an antenna?
  • What if the DUT has a phased-array antenna?
  • What if the device is mini-PCI and inherently has no antenna?
• Range test
  • What if a higher TX level causes higher adjacent-channel interference and brings the aggregate throughput down for a neighboring BSS?
• Power consumption
  • Is this just the DC power drain at the CardBus card interface?
  • Should CPU load be included if the DUT implements much of its MAC functionality on a host PC?
• Roaming
  • Quickest time: 1 STA, 2 APs on the same channel
  • More realistic: AP reboots, multiple STAs roam to a new AP on a new channel
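To make the first grey area concrete, here is a back-of-the-envelope sketch, under assumed numbers rather than a defined WPP method, showing how the same delivered frame stream yields different "throughput" figures depending on which layer's payload is counted. The frame size and frame rate are arbitrary examples; the header sizes are the minimum IPv4 and UDP header lengths.

```python
# Assumed example values, not a prescribed WPP measurement.
MSDU_BYTES = 1500        # MAC-layer payload per frame (example)
IP_HEADER = 20           # IPv4 header without options
UDP_HEADER = 8           # UDP header
FRAMES_PER_SEC = 2000    # hypothetical delivered frame rate

mac_payload_mbps = MSDU_BYTES * 8 * FRAMES_PER_SEC / 1e6
udp_goodput_mbps = (MSDU_BYTES - IP_HEADER - UDP_HEADER) * 8 * FRAMES_PER_SEC / 1e6

print(f"MAC-payload throughput: {mac_payload_mbps:.1f} Mb/s")
print(f"UDP goodput:            {udp_goodput_mbps:.1f} Mb/s")
```

The gap widens further once TCP headers, retransmissions, and management overhead are counted, which is why the measurement layer must be stated explicitly.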
Applications may weigh test results differently
Outline of Proposed Method
• Define parameters and classify them into various categories
• Define test methods for these parameters
• A partial or complete list of items may be tested for different applications
• It is likely that these results will be weighted differently by different applications (a simple weighting sketch follows)
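The last point can be illustrated with a small hypothetical sketch: the same normalized test results scored under two different application profiles. The metric names, scores, and weights are invented for illustration only.

```python
# Hypothetical normalized results (0..1) for one DUT.
results = {"range": 0.9, "throughput": 0.6, "roaming_time": 0.4, "power": 0.8}

# Invented application profiles; weights in each profile sum to 1.
profiles = {
    "voip_handset": {"roaming_time": 0.4, "power": 0.4, "throughput": 0.1, "range": 0.1},
    "home_ap":      {"throughput": 0.5, "range": 0.4, "roaming_time": 0.05, "power": 0.05},
}

for name, weights in profiles.items():
    score = sum(weights[m] * results[m] for m in weights)
    print(f"{name}: weighted score = {score:.2f}")
```

The same DUT ranks differently under the two profiles, which is exactly why the framework leaves the weighting of results to the application rather than fixing a single figure of merit.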