
Object Detection at Different Resolutions in Archaeological Sidescan Sonar Images


Presentation Transcript


  1. Object Detection at Different Resolutions in Archaeological Sidescan Sonar Images. Louis Atallah and Changjing Shang, Institute of Informatics, The British University in Dubai / The University of Edinburgh, {latallah, shang}@inf.ed.ac.uk. Richard Bates, School of Geography and Geosciences, The University of St Andrews, crb@st-andrews.ac.uk. June 2005.

  2. Talk plan • Motivation and background • Survey and Images used • The scale-saliency algorithm • Object detection • Object matching

  3. Motivation and Background This work is part of ‘Rapid Archaeological Site Survey and Evaluation’ (RASSE), a three-year research project funded by the Aggregates Levy Sustainability Fund (ALSF) administered by English Heritage, and based at the University of St Andrews, School of Geography and Geosciences. Partners include: • The University of Ulster • The British University in Dubai / The University of Edinburgh • Wessex Archaeology • Reson Offshore.

  4. Motivation and background The project involves exploring the following areas: • The Stirling Castle (lost in a storm in 1703), located on the Goodwin Sands, a series of banks off the East Kent coast. • Hastings Shingle Bank, where aggregate extraction is already taking place. The RASSE project has identified a test site within the Hastings Shingle Bank Licence Area, located approximately 15 km south of Hastings. • Placing artificial targets at a low-spring-tide water depth of 3 m in Plymouth Sound (the area already has an artificial target, a 5 m long boat, in place). • This work is a preliminary experiment carried out in Belfast Lough as part of the project, aiming to differentiate between useful archaeological material and other objects.

  5. The Dataset The survey was carried out in Smelt Mill Bay (Belfast Lough) in July/August 2001. A test site of material objects was placed on the seafloor; these included car tyres, ceramic balls and leather jackets, among other types. Three different sidescan systems were used to survey the objects, and images were obtained from each: EdgeTech 272-TD (100/500 kHz), Imagenex 885 (675 kHz) and Geoacoustics 159-A (100/500 kHz). The images used in this work are the EdgeTech 272-TD images, each containing 8 objects.

  6. Images Types of objects: 1 & 2 car tyres; 3 amphora shoulder and neck; 4 ceramic ball; 5, 6 & 7 baskets; 8 leather jacket. Any problems with these images? They are typical commercial sonar images: noisy!

  7. Preprocessing • The images are taken at different depths; the sonar geometry is first used to correct for this. • Images are de-noised using a Wiener filter (see the sketch after this slide). • Still, how can we locate the objects in these images? • We look into local appearance-based feature detection: scale saliency.
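A minimal sketch of the de-noising step, assuming the sidescan image is already loaded as a 2-D array; the kernel size and the normalisation are illustrative choices, not the values used in the project:

```python
# Hedged de-noising sketch (assumed pipeline, not the authors' exact code).
import numpy as np
from scipy.signal import wiener

def denoise_sonar(img, kernel=(5, 5)):
    """Suppress speckle-like noise in a sidescan image with an adaptive Wiener filter."""
    img = img.astype(float)
    # Normalise to [0, 1] so the filter's internal noise estimate is stable.
    img = (img - img.min()) / (np.ptp(img) + 1e-9)
    return wiener(img, mysize=kernel)
```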

  8. The scale-saliency algorithm • Salient areas in an image are areas that stand out from the background, also defined as areas with local unpredictability or complexity. • First, we calculate the local Shannon entropy over a range of scales, H(s, x), where s is the radius of the circle used to calculate the saliency of point x. There may, however, be several entropy peaks over several scales. • We weight the entropy with W(s, x), which describes the change in magnitude of the local statistics at the scale peaks. • The weighted scale saliency is used to detect the range of most important scales for a given pixel. • That is per pixel; how do we find the objects in an image? (A per-pixel sketch follows this slide.)
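A hedged sketch of the per-pixel computation described above, in the spirit of Kadir and Brady's scale saliency; the scale range, histogram bin count and simplified peak test are assumptions, and the pixel is assumed to lie away from the image border:

```python
# Sketch of per-pixel scale saliency: entropy H(s, x) over scales, weighted by
# the inter-scale change of the local grey-level distribution, W(s, x).
import numpy as np

def local_hist(img, x, y, radius, bins=16):
    """Normalised grey-level histogram inside a circular window (img scaled to [0, 1])."""
    yy, xx = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    mask = xx**2 + yy**2 <= radius**2
    patch = img[y - radius:y + radius + 1, x - radius:x + radius + 1]
    h, _ = np.histogram(patch[mask], bins=bins, range=(0, 1))
    return h / max(h.sum(), 1)

def scale_saliency(img, x, y, scales=range(3, 12)):
    """Return (peak scale, weighted saliency Y = H * W) for one pixel."""
    hists = [local_hist(img, x, y, s) for s in scales]
    # Shannon entropy of the local distribution at each scale.
    H = np.array([-np.sum(p[p > 0] * np.log(p[p > 0])) for p in hists])
    # Change of the local distribution between adjacent scales, used as the weight.
    W = np.array([0.0] + [np.abs(hists[i] - hists[i - 1]).sum() * scales[i]
                          for i in range(1, len(hists))])
    Y = H * W
    s_idx = int(np.argmax(H))          # simplified: keep the entropy-peak scale
    return list(scales)[s_idx], Y[s_idx]
```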

  9. Scale saliency - finding objects • A KNN-based clustering algorithm groups neighbouring salient pixels into individual salient regions, with a threshold T removing the least salient features (a grouping sketch follows this slide). • Example results are shown for thresholds T = 4.9, T = 3.5 and T = 4.5.
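A minimal grouping sketch under stated assumptions: the slide's KNN-style clustering is approximated here by a fixed-radius neighbour merge, and `link_radius` is a hypothetical parameter:

```python
# Threshold salient pixels by T, then merge points whose neighbours lie within
# link_radius pixels into single regions (union-find over KD-tree pairs).
import numpy as np
from scipy.spatial import cKDTree

def group_salient(points, Y, T=4.5, link_radius=5.0):
    """points: (N, 2) salient pixel coordinates; Y: (N,) saliency values."""
    keep = Y > T                       # remove the least salient features
    pts = points[keep]
    parent = list(range(len(pts)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    tree = cKDTree(pts)
    for i, j in tree.query_pairs(link_radius):
        parent[find(i)] = find(j)      # link nearby salient points

    labels = np.array([find(i) for i in range(len(pts))])
    return pts, labels                 # one label per surviving point = one region
```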

  10. Detected Objects

  11. Results • The parameters of the method (T and the number of scales) are varied, ideally using a large training set. • Results are very encouraging: the method detects parts of almost all objects (success rates of more than 90%). • The method is also robust to changes in intensity, rotation and contrast, which vary in a real survey.

  12. Object Classification • A subset of images is used for training. First, the salient areas are found and the object parts labelled from 1-9 (8 object types plus 1 for background). • Features are obtained for each part, including location, entropy, saliency over scale and weighted saliency, together with a normalised histogram of the selected area. • The feature size is 38; PCA is used to lower the dimensionality. • A PCA dimension of 11 gives the best classification rates. • The rates vary according to the training/testing set selection. • In general, promising results, with misclassification rates as low as 3.6% in some cases. (A classification sketch follows this slide.)
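A hedged sketch of the classification stage: the 38-D region features and the PCA dimension of 11 come from the slide, while the feature scaling and the k-NN classifier used here are assumptions, since the slides do not name the classifier:

```python
# PCA-reduced classification of salient-region feature vectors.
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def train_classifier(X_train, y_train, n_components=11):
    """X_train: (n_regions, 38) feature matrix; y_train: labels 1-9
    (8 object types + 1 background). Returns a fitted pipeline."""
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=n_components),
                          KNeighborsClassifier(n_neighbors=3))  # classifier choice is an assumption
    return model.fit(X_train, y_train)

# Usage: preds = train_classifier(X_train, y_train).predict(X_test)
```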

  13. Conclusions and Future work • Scale saliency can be used as a method to locate objects in large surveys. • Object matching can be performed using scale-saliency features. • Future work will address differentiating between archaeological material and other underwater objects. • Observing objects over a period of time, matching them against changing surroundings. • Applications to wreck detection.
