
Atlanta Compromise Proposal for Resolving OTA CIDs




  1. Atlanta Compromise Proposal for Resolving OTA CIDs Date: 2007-11-15 Authors: D. Victor, Broadcom.

  2. Abstract • This contribution describes a methodology for resolving several CIDs related to the over-the-air (OTA) test environment and metrics in the TGT Draft 1.0 text. • The number of CIDs resolved by this methodology is more than 40. • The CIDs addressed are: 773, 774, 775, 782, 783, 784, 785, 786, 789, 790, 793, 794, 795, 796, 797, 798, 799, 800, 801, 803, 804, 805, 806, 807, 808, 809, 810, 812, 813, 814, 815, 816, 817, 820, 819, 821, 822, 824, 825, 826, … D. Victor, Broadcom.

  3. Overview • The OTA testing methodology in draft 1.0 of the TGT draft is over-specified and complicated. • To resolve several CIDs, all the OTA procedures should be collapsed into a single unified test procedure that can be applied to various test environments. This: • Reduces the complexity of the draft • Simplifies procedures • Future-proofs the draft for new implementations • Specific configuration information should be recorded in a test report, not specified in the TGT draft.

  4. Grand-unifying Over-the-Air Test Plan

  5. OTA equipment & setup • Two wireless nodes (TUs) • For example, one AP and one STA • A data source and sink • For example, two computers • A throughput measurement application • For example, iperf (iperf.sourceforge.net), which is used to represent the throughput measurement application from here on • Two carts (if needed) • Test conditions • Ideally, a “clean” channel with: • No APs or traffic on the channel under test, on overlapping channels, or on immediately adjacent channels • No frequency-hopping devices operating in the band being tested • If impairments are present, they must be recorded, with a note that the repeatability of the test results may be compromised.

  6. OTA Test Methodology Acronyms: • TU – Test Unit • UUT – Unit Under Test • REF – Reference Unit Procedure: • Two wireless nodes are placed in known, repeatable locations. This combination of locations is called a configuration. • Each TU (UUT and REF) is set to a known, repeatable orientation. • Average throughput is measured with iperf in each direction for 20 seconds. • Each TU is independently oriented to 0º, 90º, 180º, or 270º between measurements. • The iperf throughput measurement is repeated for each of the 16 combinations of orientations. • The maximum 20-second-average throughput is recorded as the metric for that location (see Appendix). • Specific configuration details are recorded in the test report. • The exact placement of the nodes should be repeatable to within 1 cm.
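The per-location sweep in slide 6 can be sketched in code. This is an illustrative sketch only, not part of the draft text: `measure_throughput_mbps` is a hypothetical placeholder for a 20-second iperf run (e.g. `iperf -c <peer> -t 20`), and physically rotating each unit is assumed to happen outside the function.

```python
from itertools import product

# The four repeatable orientations each test unit is rotated through.
ORIENTATIONS = (0, 90, 180, 270)  # degrees

def measure_throughput_mbps(uut_deg, ref_deg):
    # Hypothetical placeholder: in practice, rotate the UUT and REF to
    # these orientations, run iperf for 20 s in each direction, and
    # return the average throughput in Mbps.
    raise NotImplementedError

def sweep_location(measure=measure_throughput_mbps):
    """Measure all 16 UUT/REF orientation pairs at one location and
    return the best pair and its maximum 20 s average throughput."""
    results = {
        (uut, ref): measure(uut, ref)
        for uut, ref in product(ORIENTATIONS, ORIENTATIONS)
    }
    best = max(results, key=results.get)
    return best, results[best]
```

Recording the max over the 16 orientations, rather than the mean, matches the repeatability argument on slide 8: the maximum is less sensitive to incidental channel impairments during any single measurement.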

  7. (Diagram: UUT placement in the test configurations)

  8. Benefits of Compromise Test • Test is simple • Can be as manual or as automated as desired • No mandate for potentially noisy equipment (e.g., motorized turntables) • No presumption of device configurability (tilt of display, elevation, etc.) • Results are more reliable due to repeatability • Configuration data is recorded for repeatability • Devices are stationary during measurement periods • The max() statistic is more repeatable than the avg() statistic • Less dependence of the test on exact channel conditions (test engineer position, etc.) • Higher degree of confidence in measurement.

  9. Appendix

  10. Orientation Combinations (Diagram: a grid of 16 REF/UUT placements, each unit at 0º, 90º, 180º, or 270º) • 16 REF and UUT orientation combinations • 1 of these 16 measurements will be the maximum used to record the result for a given pair of location coordinates.
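As a quick sanity check on the appendix grid (an illustrative snippet, not part of the draft), the 16 combinations are simply the Cartesian product of the four orientations with themselves:

```python
from itertools import product

# All (REF, UUT) orientation pairs from the appendix grid.
ORIENTATIONS = (0, 90, 180, 270)
pairs = list(product(ORIENTATIONS, repeat=2))
print(len(pairs))  # 16 combinations per location
```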
