
The HPEC Challenge Benchmark Suite

Ryan Haney, Theresa Meuse, Jeremy Kepner, and James Lebak
Massachusetts Institute of Technology Lincoln Laboratory
HPEC 2005


Presentation Transcript


1. The HPEC Challenge Benchmark Suite
Ryan Haney, Theresa Meuse, Jeremy Kepner and James Lebak
Massachusetts Institute of Technology Lincoln Laboratory
HPEC 2005
This work is sponsored by the Defense Advanced Research Projects Agency under Air Force Contract FA8721-05-C-0002. Opinions, interpretations, conclusions, and recommendations are those of the authors and are not necessarily endorsed by the United States Government.

2. Acknowledgements
• Lincoln Laboratory PCA Team: Matthew Alexander, Jeanette Baran-Gale, Hector Chan, Edmund Wong
• Shomo Tech Systems: Marti Bancroft
• Silicon Graphics Incorporated: William Harrod
• Sponsor: Robert Graybill, DARPA PCA and HPCS Programs

3. HPEC Challenge Benchmark Suite
• PCA program kernel benchmarks
  • Single-processor operations
  • Drawn from many different DoD applications
  • Represent both “front-end” signal processing and “back-end” knowledge processing
• HPCS program Synthetic SAR benchmark
  • Multi-processor compact application
  • Representative of a real application workload
  • Designed to be easily scalable and verifiable

4. Outline
• Introduction
• Kernel Level Benchmarks
• SAR Benchmark
  • Overview
  • System Architecture
  • Computational Components
• Release Information
• Summary

5. Spotlight SAR System
• Principal performance goal: throughput
  • Maximize the rate of results
  • Overlapped IO and computing (a double-buffering sketch follows below)
• Intent of the compact application:
  • Scalable
  • High compute fidelity
  • Self-verifying
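One standard way to realize "overlapped IO and computing" is double buffering: a reader thread fetches the next block of raw data while the main thread processes the current one. Below is a minimal sketch in C with POSIX threads; read_block and process_block are hypothetical stand-ins for the benchmark's file IO and compute stages, not its actual interface.

    #include <pthread.h>
    #include <stdlib.h>

    #define BLOCK_SIZE (1 << 20)
    #define NUM_BLOCKS 16

    /* Hypothetical stand-ins for the benchmark's file IO and kernels. */
    static void read_block(float *buf, int i)    { (void)buf; (void)i; }
    static void process_block(float *buf, int i) { (void)buf; (void)i; }

    struct read_args { float *buf; int index; };

    static void *reader(void *p)
    {
        struct read_args *a = (struct read_args *)p;
        read_block(a->buf, a->index);
        return NULL;
    }

    int main(void)
    {
        float *buf[2];
        buf[0] = malloc(BLOCK_SIZE * sizeof(float));
        buf[1] = malloc(BLOCK_SIZE * sizeof(float));

        read_block(buf[0], 0);                 /* prime the pipeline */
        for (int i = 0; i < NUM_BLOCKS; i++) {
            pthread_t t;
            struct read_args a = { buf[(i + 1) % 2], i + 1 };
            if (i + 1 < NUM_BLOCKS)            /* read block i+1 ...      */
                pthread_create(&t, NULL, reader, &a);
            process_block(buf[i % 2], i);      /* ... while computing i   */
            if (i + 1 < NUM_BLOCKS)
                pthread_join(t, NULL);
        }

        free(buf[0]); free(buf[1]);
        return 0;
    }

With two buffers, the IO latency of each block is hidden behind the computation on the previous block, which is exactly the throughput goal the slide states.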

6. SAR System Architecture
[Block diagram: a Scalable Data and Template Generator produces raw SAR data files and template files. In the front-end sensor processing stage, Kernel #1 (Data Read and Image Formation) reads the raw SAR files and forms SAR images; Template Insertion adds templates, and Kernel #2 (Image Storage) writes the SAR image files. In the back-end knowledge formation stage, Kernel #3 (Image Retrieval) reads SAR image files and groups of template files back in, and Kernel #4 (Detection Validation) produces detections as sub-image detection files. Edges in the diagram are marked as either file IO or computation.]
• The HPEC community has traditionally focused on computation … but file IO performance is increasingly important

7. Data Generation and Computational Stages
[Block diagram: the same pipeline as the previous slide, highlighting the data generation stage and the four computational kernels: the Scalable Data and Template Generator produces the raw SAR and template files; Kernel #1 (Image Formation) forms SAR images, Template Insertion and Kernel #2 (Image Storage) write them, Kernel #3 (Image Retrieval) reads them back, and Kernel #4 (Detection Validation) produces detections, with raw SAR, SAR image, template, and detection files passed between stages.]

8. SAR Overview
• Radar captures echo returns from a ‘swath’ on the ground
• Notional linear FM chirp pulse train, plus two ideally non-overlapping echoes returned from different positions on the swath
• Summation and scaling of echo returns realizes a challengingly long antenna aperture along the flight path
[Figure: imaging geometry with synthetic aperture L along the flight path, fixed to broadside; range extent X = 2X0 and cross-range extent Y = 2Y0. The received ‘raw’ SAR signal is the delayed transmitted SAR waveform, scaled by a reflection coefficient that differs for each return from the swath.]
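In standard spotlight SAR notation (the symbols below follow the usual textbook convention, e.g. Soumekh's, rather than the slide's verbatim labels), the received raw signal at along-track position u is a sum of delayed copies of the transmitted chirp, each scaled by its reflection coefficient:

    s(t,u) = \sum_i \sigma_i \, p\!\left( t - \frac{2\sqrt{X_i^2 + (Y_i - u)^2}}{c} \right),
    \qquad
    p(t) = e^{\,j(\omega_0 t + \alpha t^2)}

where \sigma_i is the reflection coefficient of the i-th point on the swath at (X_i, Y_i), p(t) is the transmitted linear FM chirp with carrier \omega_0 and chirp rate \alpha, and c is the propagation speed.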

9. Scalable Synthetic Data Generator
• Generates synthetic raw spotlight SAR complex data
• Data size is scalable to enable rigorous testing of high performance computing systems
  • A user-defined scale factor determines the size of the images generated
• Generates ‘templates’ that consist of rotated and pixelated capital letters
[Figure: example spotlight SAR returns, range vs. cross-range.]
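A minimal sketch in C of the kind of raw record the generator synthesizes: a baseband linear FM chirp, delayed by each scatterer's round-trip time, scaled by its reflection coefficient, and summed. All names and parameters are illustrative, not the benchmark's actual interface.

    #include <complex.h>

    /* Synthesize one slow-time record of raw spotlight SAR data as a sum
     * of delayed, scaled baseband linear FM chirp echoes.
     *   fs       - sample rate (Hz)
     *   alpha    - chirp rate (rad/s^2)
     *   tau[i]   - round-trip delay of scatterer i (s)
     *   sigma[i] - reflection coefficient of scatterer i            */
    void raw_sar_record(float complex *out, int nsamp, double fs, double alpha,
                        const double *tau, const double *sigma, int nscat)
    {
        for (int k = 0; k < nsamp; k++) {
            double t = k / fs;
            double complex acc = 0.0;
            for (int i = 0; i < nscat; i++) {
                double td = t - tau[i];              /* delayed chirp argument */
                acc += sigma[i] * cexp(I * alpha * td * td);
            }
            out[k] = acc;
        }
    }

Scaling the image size then amounts to scaling nsamp, the number of slow-time records, and the swath over which the scatterers and templates are placed.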

10. Kernel 1 — SAR Image Formation
Spatial frequency domain interpolation (spotlight SAR reconstruction). The received samples s(t,u) fit a polar swath in the spatial frequency domain; the processed samples F(kx,ky) fit a rectangular swath.
• Fourier transform: s(t,u) → s(ω,ku)
• Matched filtering with the reference s*0(ω,ku)
• Interpolation onto the rectangular grid: kx = sqrt(4k² − ku²), ky = ku, giving F(kx,ky)
• Inverse Fourier transform: F(kx,ky) → f(x,y), the formed image
[Figure: spotlight SAR reconstruction, cross-range pixels vs. range pixels.]
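The interpolation step resamples each slow-time row of the matched-filtered data from its nonuniform kx samples, kx = sqrt(4k² − ku²), onto a uniform rectangular grid. A minimal sketch in C, using linear interpolation for brevity (the benchmark's own polar interpolator is higher-order; all names below are illustrative):

    #include <complex.h>

    /* Resample one slow-time row from its nonuniform kx samples onto a
     * uniform kx grid (Stolt mapping kx = sqrt(4k^2 - ku^2), ky = ku).
     * kx_in[] must be ascending; the caller computes
     * kx_in[j] = sqrt(4*k[j]*k[j] - ku*ku) for this row's ku.          */
    void stolt_row(const float complex *in, const double *kx_in, int n_in,
                   float complex *out, double kx0, double dkx, int n_out)
    {
        int j = 0;
        for (int i = 0; i < n_out; i++) {
            double kx = kx0 + i * dkx;           /* target uniform sample */
            while (j + 1 < n_in - 1 && kx_in[j + 1] < kx)
                j++;
            if (kx < kx_in[0] || kx > kx_in[n_in - 1]) {
                out[i] = 0.0f;                   /* outside the measured band */
            } else {
                double w = (kx - kx_in[j]) / (kx_in[j + 1] - kx_in[j]);
                out[i] = (1.0 - w) * in[j] + w * in[j + 1];
            }
        }
    }

After this mapping the data sit on a rectangular (kx,ky) grid, so a standard 2-D inverse FFT (with its corner turn between dimensions) completes the reconstruction.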

11. Template Insertion (untimed)
• Inserts rotated, pixelated capital letter templates into each SAR image
  • Non-overlapping locations and rotations
  • Randomly selects 50% of the templates
• Inserted templates are used as ideal detection targets in Kernel 4
[Figure: the same image shown with the random 50% of templates inserted and with 100% of templates inserted, X pixels vs. Y pixels.]
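A sketch of the insertion rule in C: each (pre-rotated, pre-placed) template is added into the image with probability 0.5, matching the "randomly selects 50%" behavior. Function and parameter names are illustrative; the benchmark's own non-overlapping placement and rotation logic is not reproduced here.

    #include <stdlib.h>

    /* Add a template's pixels into the image at (row, col). The caller
     * has already chosen non-overlapping placements and pre-rotated
     * the template.                                                  */
    static void add_template(float *image, int ncols,
                             const float *tmpl, int trows, int tcols,
                             int row, int col)
    {
        for (int r = 0; r < trows; r++)
            for (int c = 0; c < tcols; c++)
                image[(row + r) * ncols + (col + c)] += tmpl[r * tcols + c];
    }

    /* Insert each template with probability 0.5 (the 50% random rule). */
    void insert_templates(float *image, int ncols,
                          const float **tmpls, int trows, int tcols, int ntmpl,
                          const int *rows, const int *cols)
    {
        for (int i = 0; i < ntmpl; i++)
            if (rand() % 2)
                add_template(image, ncols, tmpls[i], trows, tcols,
                             rows[i], cols[i]);
    }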

12. Kernel 4 — Detection
• Detects targets in SAR images (image A vs. image B)
  • Image difference
  • Threshold
  • Extract sub-regions
  • Correlate each sub-region with every template; the maximum is the target ID
• Computationally difficult: many small correlations over random pieces of a large image
• Goal: 100% recognition, no false alarms
[Figure: image A, image B, the image difference, the thresholded difference, and an extracted sub-region.]
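A sketch of the two core Kernel 4 operations in C: the image difference with thresholding, and the small correlation of one sub-region against every template, whose maximum gives the target ID. Extraction of the sub-regions from the thresholded mask is omitted; names are illustrative.

    #include <math.h>

    /* Absolute image difference followed by thresholding:
     * mask[i] = 1 where the two images differ by more than thresh.   */
    void diff_threshold(const float *a, const float *b, unsigned char *mask,
                        int npix, float thresh)
    {
        for (int i = 0; i < npix; i++)
            mask[i] = fabsf(a[i] - b[i]) > thresh;
    }

    /* Correlate one sub-region against every template; the index of
     * the maximum correlation is taken as the target ID.             */
    int classify_region(const float *region, int trows, int tcols,
                        const float **tmpls, int ntmpl)
    {
        int best = 0;
        float best_corr = -INFINITY;
        for (int t = 0; t < ntmpl; t++) {
            float corr = 0.0f;
            for (int i = 0; i < trows * tcols; i++)
                corr += region[i] * tmpls[t][i];
            if (corr > best_corr) { best_corr = corr; best = t; }
        }
        return best;
    }

Because the sub-regions are small and scattered at random locations in a large image, the workload is many short correlations with poor locality rather than one large, regular computation.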

13. Benchmark Summary and Computational Challenges
[Figure: the system diagram from slide 6, with the front-end sensor processing stages (Scalable Data and Template Generator, Kernel #1 Image Formation, Template Insertion) and back-end knowledge formation stages (Kernel #4 Detection Validation) each annotated with their computational challenges.]
• Scalable synthetic data generation
• Pulse compression
• Polar interpolation
• FFT, IFFT (corner turn)
• Sequential store, non-sequential retrieve
• Large & small IO
• Large-image difference & threshold
• Many small correlations on random pieces of a large image

14. Outline
• Introduction
• Kernel Level Benchmarks
• SAR Benchmark
• Release Information
• Summary

15. HPEC Challenge Benchmark Release
• http://www.ll.mit.edu/HPECChallenge/
  • Future site of documentation and software
• Initial release is available to PCA, HPCS, and HPEC SI program members through the respective program web pages
  • Documentation
  • ANSI C kernel benchmarks
  • Single-processor MATLAB SAR system benchmark
• Complete release will be made available to the public in the first quarter of CY06

16. Summary
• The HPEC Challenge is a publicly available suite of benchmarks for the embedded space
  • Representative of a wide variety of DoD applications
  • Benchmarks stress computation, communication and I/O
• Benchmarks are provided at multiple levels
  • Kernel: small enough to easily understand and optimize
  • Compact application: representative of real workloads
  • Single-processor and multi-processor
• For more information, see http://www.ll.mit.edu/HPECChallenge/
