
Identification of Tree Locations in Geographic Images

This thesis focuses on the identification of tree locations in geographic images using various image processing techniques and virtual reality tools. The purpose is to create a utility for accurately and efficiently placing trees in virtual wildfire simulations.


Presentation Transcript


  1. Identification of Tree Locations in Geographic Images A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Engineering By David Brown Dr. Frederick C. Harris, Jr., Thesis Advisor December, 2008

  2. Committee • Dr. Frederick C. Harris, Jr. • Dr. Sergiu M. Dascalu • Dr. Timothy J. Brown

  3. Overview • Purpose • Background • Methods • Software Specification and Design • Implementation and Results • Conclusion and Future Work

  4. Purpose

  5. Purpose • To create an item placement utility for VFIRE (Virtual Fire In Realistic Environments) • VFIRE is a virtual reality application for visualizing wildfire simulations. • Current area of interest is Kyle Canyon in Southern Nevada.

  6. Purpose Main uses of VFIRE: • Fire Training • Fire Planning • Fire Model Verification Wildfire Visualization [25]

  7. Purpose • The placement of items in the visualization should correspond to their locations in the real environment. • This utility is intended to place large numbers of trees with reasonable speed and accuracy. • It can also be used to place a small number of houses with reasonable speed and accuracy.

  8. Background

  9. Background – Geographic Images Photography • Standard Color • Panchromatic • Multispectral • Hyperspectral • Color Infrared (CIR)

  10. Background – Photography • CIR images look different from most other types. True Color Image [5, page 45] False Color Image [5, page 45]

  11. Background – Vegetation Maps • Vegetation data can be displayed as a map. • Vegetation maps have been created by LANDFIRE to show various attributes. Map of Vegetation Cover Map of Vegetation Type

  12. Background – Point Operations • Each output pixel is based on a single input pixel. • Changing the brightness of an image is a point operation. Original Image Image After Increasing Brightness
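As a minimal sketch of a point operation (not code from the thesis), the brightness change above can be written with NumPy; the 8-bit image type and the offset value are illustrative assumptions.

```python
import numpy as np

def increase_brightness(image: np.ndarray, offset: int = 40) -> np.ndarray:
    """Point operation: each output pixel depends only on the corresponding input pixel."""
    # Work in a wider integer type so the addition cannot wrap, then clip back to the 8-bit range.
    return np.clip(image.astype(np.int16) + offset, 0, 255).astype(np.uint8)
```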

  13. Background – Neighborhood Operations • Blurring an image is a neighborhood operation. • The blur filter is applied to each input neighborhood. Original Image Blurred Image
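A corresponding sketch of a neighborhood operation, here a simple mean blur using SciPy; the window size is an assumed example, not a value from the thesis.

```python
import numpy as np
from scipy import ndimage

def blur(image: np.ndarray, size: int = 5) -> np.ndarray:
    """Neighborhood operation: each output pixel is the mean of a size-by-size input window."""
    return ndimage.uniform_filter(image.astype(np.float32), size=size)
```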

  14. Background – Edge Detection • The LoG (Laplacian of Gaussian) Filter is an edge detection filter in which the level of detail can be controlled. Laplacian of Gaussian (LoG) filter [36]
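A rough illustration of the LoG filter using SciPy's gaussian_laplace; the sigma value, which controls the level of detail, is an assumed example rather than a parameter taken from the thesis.

```python
import numpy as np
from scipy import ndimage

def log_response(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Laplacian of Gaussian: larger sigma suppresses fine detail and keeps only coarser edges."""
    response = ndimage.gaussian_laplace(image.astype(np.float32), sigma=sigma)
    # Edges correspond to zero crossings of this response.
    return response
```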

  15. Background – Template Matching • Can be used as the first step in image analysis • Used when finding the location of a known item • Neighborhood operation where the filter mask is a template of the desired item • Filtering produces a correlation image that can be scanned for bright spots.
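The thesis builds its templates interactively at runtime and scans the resulting correlation image itself; as a sketch of the underlying technique only, normalized cross-correlation is available in scikit-image (the library choice and names below are assumptions for illustration).

```python
import numpy as np
from skimage.feature import match_template

def correlation_image(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Slide the template over the image; bright spots in the result mark likely matches."""
    # pad_input=True keeps the output the same size as the input image.
    return match_template(image, template, pad_input=True)

# Usage sketch: the brightest spot gives the best single match.
#   corr = correlation_image(img, tmpl)
#   row, col = np.unravel_index(np.argmax(corr), corr.shape)
```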

  16. Background – Virtual Reality [2] Requirements: • Virtual World • Immersion • Sensory Feedback • Interactivity

  17. Background – HMDs [2] Head Mounted Displays • 100% Field of Regard • May cause dizziness • Only one person can view at one time A Head Mounted Display [2, page 14]

  18. Background – Multi-Sided Projection Displays [2] • Field of regard depends on the number of sides. • Wider field of view than HMDs • No dizziness • Many people can view at once • More bulky and expensive than HMDs Three-Sided Projection Display [25]

  19. Background – Head Tracking • View must be adjusted for location and orientation of head. • Stereoscopic display can be used to create depth perception. Head-Tracking Active Stereo Goggles [17]

  20. Background – Input Devices • A wand is a commonly used input device for virtual reality systems. Virtual Reality Wand [17]

  21. Background – Related Work Applications • Plantation Management • Assessing Forest Health • Harvestable Lumber Estimation • Fuel Load Estimation

  22. Background – Related Work Culvenor [6] • Even-Aged Mountain Ash (Eucalyptus) • NIR Selected from CIR • Identify Local Maxima • Identify Local Minima • Cluster Intermediate Pixels

  23. Background – Related Work Pouliot et al. [33] • Uniformly Spaced Spruce Trees in an Arboretum • Absolute Difference of NIR and Red Bands • Moving Window, Local Maximum Filtering
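A sketch of the moving-window local maximum filtering attributed to Pouliot et al.; the band arithmetic, window size, and threshold below are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def treetop_candidates(nir: np.ndarray, red: np.ndarray,
                       window: int = 5, threshold: float = 0.1) -> np.ndarray:
    """Mark pixels that are the maximum of their moving window in |NIR - Red|."""
    diff = np.abs(nir.astype(np.float32) - red.astype(np.float32))
    local_max = ndimage.maximum_filter(diff, size=window)
    return (diff == local_max) & (diff > threshold)   # boolean mask of candidate treetops
```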

  24. Background – Related Work Brandtberg and Walter [11] • 80-Year-Old Stands of Scots pine, Norway spruce, birch, and aspen • Perform Scale-Space Edge Detection to Extract Tree Crown Perimeters • Analyze Perimeter Curvatures to Estimate Centroids 10-cm CIR Brightness Scale Space Image After Edge Detection Estimating Centers

  25. Background – Related Work Larsen [23] • Image of Norway spruce • Template Created from Ray-Traced 3D Tree Model • Model Incorporates Aircraft Position, Sun Position, and Species-Specific Light-Scattering Parameters Norway Spruce 3D Template Model

  26. Background – Related Work Image Analysis Software [4] • Will perform template matching

  27. Background – Related Work Image Analysis Software • Not likely to output locations in geospatial coordinates • Not likely to provide geospatially aligned overlays of vegetation maps • Not likely to display vegetation map data for selected locations • Not likely to make placements based on vegetation maps

  28. Methods

  29. Methods Goals: • Achieve adequate tree-placement accuracy using whatever images (if any) are available. • Enhance accuracy using vegetation maps. • Make tree placements using vegetation maps alone if no photographic image is available.

  30. Methods System: • Interactive (not fully automatic) • Template Matching (no image constraints) • Templates Created at Runtime (quickly create multiple templates) • Vegetation maps provide information about terrain. • Placements can be made based on vegetation maps alone.

  31. Methods System: • The algorithm is not tailored to any particular image. • The user-defined templates are tailored to the image. • The algorithm is tailored to the correlation image produced using the templates.

  32. Methods Data for Kyle Canyon • 4-Meter Photographic GeoTIFFs • Red, Green, Blue, NIR • 1-Meter Photographic GeoTIFF • Panchromatic • 5-Meter Vegetation Maps Sampled from 30-Meter Data • Vegetation Cover, Vegetation Type, Vegetation Height
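The slides do not name the I/O libraries used, so purely as an assumed illustration, a GeoTIFF band and the geospatial coordinates of a pixel can be read with rasterio; the filename and pixel indices below are hypothetical.

```python
import rasterio  # assumed library for GeoTIFF I/O; not necessarily what the thesis used

# Hypothetical filename for the 1-meter panchromatic image of Kyle Canyon.
with rasterio.open("kyle_canyon_pan_1m.tif") as src:
    pan = src.read(1)            # panchromatic band as a 2-D array
    x, y = src.xy(1200, 3400)    # geospatial coordinates of pixel (row=1200, col=3400)
    print(src.crs, x, y)
```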

  33. Methods 1-Meter Panchromatic Image • Trees look like blobs. • Species, size, shadow, and density are different in different parts of the image.

  34. Software Specification and Design

  35. Software Specification and Design Use Cases

  36. Software Specification and Design The system consists of five groups of global functions that use two existing libraries.

  37. Implementation and Results

  38. Implementation and Results Detection Process • The user selects a tree to use as a template. • The tree is the gray blob. • The shadow is the dark, elongated region. Site of First Template (Zoomed In)

  39. Implementation and Results Detection Process • The user draws a highlighting mark over the tree and shadow. Template Defined by User

  40. Implementation and Results Detection Process • Area Near Template, Zoomed Out Area Near Template (Zoomed Out)

  41. Implementation and Results Detection Process • Correlation Image Stored in Red Buffer of Workspace Image • Other buffers are used for intermediate processing. Correlation Image
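As a rough sketch of scanning the correlation image for bright spots (the thesis does this with its own user-tuned parameters), one simple approach combines a correlation threshold with a minimum spacing between detections; both values here are assumptions.

```python
import numpy as np
from scipy import ndimage

def bright_spots(corr: np.ndarray, threshold: float = 0.6, min_spacing: int = 5) -> np.ndarray:
    """Return (row, col) locations of isolated bright spots in a correlation image."""
    window = 2 * min_spacing + 1
    # Keep only pixels that are the strongest in their neighborhood and above the threshold.
    is_peak = (corr == ndimage.maximum_filter(corr, size=window)) & (corr > threshold)
    return np.argwhere(is_peak)
```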

  42. Implementation and Results Detection Results • Detection Results Using a Single Template Result Using One Template

  43. Implementation and Results Detection Process • User controls tuning parameters for tree detection. Tuning Parameter Window

  44. Implementation and Results Detection Process • User specifies data for trees associated with each template. • Locations, types, etc. are then written to a file. Preparation to Create Output File

  45. Implementation and Results Detection Process • Entire Image of Kyle Canyon (8km × 6km) Entire Kyle Canyon Image

  46. Implementation and Results Detection Process • Vegetation Map of Same Area (8km × 6km) Entire Vegetation Map

  47. Implementation and Results Detection Process • Vegetation Map As Overlay onto Image (8km × 6km) Overlay of Vegetation Map onto Photographic Image

  48. Implementation and Results Detection Process • Text Output Resulting When User Clicks on Image Text Output from Clicking on Image

  49. Implementation and Results Partially Random Placement • Tree Placements Made According to Map of Vegetation Coverage, Without Using Image Partially Random Placements
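As a sketch of partially random placement driven by a vegetation cover map (the actual rules and user options are those described in the thesis, not these), one could scale a per-cell tree density by fractional cover and jitter each tree within its cell; the density parameter and the 0–100 percent cover convention are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def partially_random_placements(cover: np.ndarray, trees_per_cell: float = 0.5) -> np.ndarray:
    """Place trees with density proportional to vegetation cover (assumed 0-100 percent per cell)."""
    counts = rng.poisson(cover / 100.0 * trees_per_cell)       # tree count drawn for each map cell
    placements = []
    for r, c in np.argwhere(counts > 0):
        n = counts[r, c]
        jitter = rng.random((n, 2))                             # random position inside the cell
        placements.append(np.column_stack([np.full(n, r), np.full(n, c)]) + jitter)
    return np.vstack(placements) if placements else np.empty((0, 2))
```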

  50. Implementation and Results Partially Random Placement • User can control how placements are made when no image is available. Options for Partially Random Placements
