
Non-Standard Metrics to Improve Classification of Hyperspectral Data

Mittweida, 3 July 2014

Uwe Knauer, Andreas Backhaus, Udo Seiffert
Biosystems Engineering, Fraunhofer IFF



Presentation Transcript


  1. Non-Standard Metrics to Improve Classification of Hyperspectral Data • Mittweida, 3 July 2014 • Uwe Knauer, Andreas Backhaus, Udo Seiffert • Biosystems Engineering • Fraunhofer IFF

  2. Outline • Introduction • Datasets • Experiments and Results • Summary

  3. Introduction: Hyperspectral Data • High-dimensional functional data • Intensity of reflected or transmitted light as a function of wavelength • Acquisition modes: frame measurements, tuneable filters, full data cube, line measurements, spot measurements • [Device images; sources: Panalytical, NEO, Nuance, Cubert]
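As a rough illustration of this data representation, the sketch below (NumPy assumed, hypothetical cube dimensions) flattens a line-scan data cube into per-pixel spectra, the usual input for per-pixel classification:

```python
import numpy as np

# Hypothetical dimensions: 100 scan lines, 320 spatial pixels per line,
# 256 spectral bands (matching the sensor described on slide 5).
lines, samples, bands = 100, 320, 256
cube = np.random.rand(lines, samples, bands)

# Flatten the two spatial axes so every pixel becomes one 256-dim spectrum.
spectra = cube.reshape(-1, bands)
print(spectra.shape)  # (32000, 256)
```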

  4. Introduction: Studies & Applications • Rich data as a key to enable novel non-contact measurement applications in industrial, agricultural, or academic environments • Biosystems Engineering operates a number of different sensors at a spectral lab in Magdeburg • ASD Fieldspec (VIS, NIR, SWIR) • NEO SWIR 320e (SWIR) • NEO VNIR 1600 (VIS, NIR) • Nuance CIR (VIS, NIR) • JDSU MicroNIR (NIR, SWIR) • …

  5. Introduction: Typical Approach • NEO SWIR 320m-e • Spatial resolution: 320 px • Spectral resolution: 256 bands • 970-2500 nm @ 6 nm bands • [Diagram: processing pipeline from input spectrum (1 … j) through normalization and feature extraction to an ANN with outputs 1 … k, trained by a learning stage]
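A minimal sketch of such a normalization / feature extraction / ANN pipeline, using scikit-learn stand-ins (a Normalizer, PCA, and an MLP in place of the prototype-based networks actually used in the talk; data and labels are synthetic):

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import Normalizer
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 256))      # synthetic pixel spectra
X[:, 0] *= 5.0                       # plant a strong, learnable signal
y = (X[:, 0] > 0).astype(int)

pipe = Pipeline([
    ("normalize", Normalizer()),           # per-spectrum L2 normalization
    ("features", PCA(n_components=20)),    # feature extraction
    ("ann", MLPClassifier(hidden_layer_sizes=(30,), max_iter=500,
                          random_state=0)),
])
pipe.fit(X, y)
print(round(pipe.score(X, y), 2))
```

The stage names ("normalize", "features", "ann") are illustrative, not from the talk.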

  6. Introduction: Scope of our Study • Classification treated as a black box • Toolbox: application-driven selection of the most sophisticated tool • What if a single tool is not sufficient? • We selected a number of datasets where different prototype-based methods failed to achieve a sufficient accuracy • Investigating multiple classifier fusion • Focusing on different parametrizations and non-standard metrics • How to combine the existing tools?

  7. Datasets • D1 – Detecting aluminium in waste • D2 – Immature vs mature coffee beans • D3 – Putrid vs normal hazelnuts • Randomly chosen, equally distributed, N=2000 samples per class, 256-dim feature space

  8. Datasets • D4 – Detecting fungi-infested hazelnuts • D5 – Anomaly detection on the surface of fluffed pulp • Randomly chosen, equally distributed, N=2000 samples per class, 256-dim feature space
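The balanced sampling described for these datasets can be sketched as follows (NumPy assumed; the label map and spectra here are synthetic placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-pixel labels (0/1) and spectra for one dataset.
labels = rng.integers(0, 2, size=100000)
spectra = rng.random((100000, 256))

# Randomly chosen, equally distributed: N = 2000 pixels per class.
idx = np.concatenate([
    rng.choice(np.flatnonzero(labels == c), size=2000, replace=False)
    for c in (0, 1)
])
X, y = spectra[idx], labels[idx]
print(X.shape, np.bincount(y))  # (4000, 256) [2000 2000]
```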

  9. Creating a Diverse Ensemble of Classifiers • Model types, number of prototypes, metrics • RBF: Backhaus, A., Bollenbeck, F., Seiffert, U.: Robust classification of the nutrition state in crop plants by hyperspectral imaging and artificial neural networks. In: Proc. 3rd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, Lisboa, Portugal (2011) • SNG: Hammer, B., Strickert, M., Villmann, T.: Supervised Neural Gas with general similarity measure. Neural Processing Letters 21, 21–44 (2005) • GLVQ: Hammer, B., Villmann, T.: Generalized relevance learning vector quantization. Neural Networks 15, 1059–1068 (2002) • Number of prototypes / hidden neurons: 20, 30, 40 • KLD
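As one example of a non-standard metric from such a pool, here is a minimal sketch of a Kullback-Leibler-divergence-style dissimilarity between spectra, treating each normalized spectrum as a discrete distribution (the epsilon guard is an implementation choice, not from the talk):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence between two non-negative spectra, each
    normalized to sum to one before comparison."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 4.0, 6.0])   # same shape as a after normalization
c = np.array([3.0, 2.0, 1.0])   # different shape

print(kl_divergence(a, b))       # 0.0 (invariant to overall intensity)
print(kl_divergence(a, c) > 0)   # True
```

Intensity invariance is one reason divergence-type metrics can behave differently from Euclidean distance on reflectance spectra.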

  10. Experiments and Results: Baseline Conditions • Do non-standard metrics improve results? • Results obtained with 10-fold CV • Sometimes, slightly, it depends
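The 10-fold cross-validation protocol can be sketched with scikit-learn; NearestCentroid serves here as a simple stand-in for the prototype-based classifiers (synthetic two-class data):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 16)),   # class 0
               rng.normal(1, 1, (100, 16))])  # class 1, shifted mean
y = np.repeat([0, 1], 100)

# One prototype per class, Euclidean metric, evaluated with 10-fold CV.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(NearestCentroid(), X, y, cv=cv)
print(round(scores.mean(), 2))
```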

  11. Experiments and Results: Baseline Conditions • How to choose parameters?

  12. Multiple Classifier Fusion • Base classifiers • Trained combiner • Binary combination • CRAGORS • Random Forest • AdaBoost on decision trees • Investigating different pools of input classifiers • RBF • GLVQ • SNG • Euclidean • Gamma • All + KLD
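A minimal stacking sketch of a trained combiner: simple scikit-learn base classifiers (stand-ins for the RBF/GLVQ/SNG pool) are fitted on one split, and a Random Forest is trained on their stacked predictions on a second split (all data synthetic):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (150, 8)), rng.normal(1.5, 1, (150, 8))])
y = np.repeat([0, 1], 150)
X_base, X_comb, y_base, y_comb = train_test_split(
    X, y, test_size=0.5, random_state=0, stratify=y)

# Pool of base classifiers, trained on the first split.
pool = [LogisticRegression().fit(X_base, y_base),
        NearestCentroid().fit(X_base, y_base)]

# Trained combiner: Random Forest on the stacked base-classifier outputs.
meta_X = np.column_stack([clf.predict(X_comb) for clf in pool])
combiner = RandomForestClassifier(n_estimators=50, random_state=0)
combiner.fit(meta_X, y_comb)
print(round(combiner.score(meta_X, y_comb), 2))
```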

  13. Experiments and Results • Dimension of feature space: 12, 60, 72, 74 • Results obtained with 10-fold CV • How many classifiers contributed to the result?

  14. Experiments and Results • How many classifiers contributed to the result? • Results obtained with 10-fold CV • Knauer, U., Seiffert, U.: Cascaded reduction and growing of result sets for combining object detectors. In: Zhou, Z.-H., Roli, F., Kittler, J. (eds.) MCS 2013. LNCS, vol. 7872, pp. 121–133. Springer, Heidelberg (2013) • Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. In: European Conference on Computational Learning Theory, pp. 23–37 (1995) • Breiman, L.: Random forests. Machine Learning 45, 5–32 (2001)

  15. Experiments and Results: Estimation of Relevance • Knauer, U., Backhaus, A., Seiffert, U.: Fusion trees for fast and accurate classification of hyperspectral data with ensembles of gamma-divergence-based RBF networks. Neural Computing and Applications, Springer, 2014
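The relevance idea can be loosely illustrated with the impurity-based feature importances of a Random Forest combiner, where each "feature" is one base classifier's output; this is an analogy, not the cited paper's exact procedure (base-classifier outputs simulated):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
y = rng.integers(0, 2, 300)

# Simulated outputs of three base classifiers with different quality.
strong = np.where(rng.random(300) < 0.9, y, 1 - y)  # 90 % agreement
weaker = np.where(rng.random(300) < 0.7, y, 1 - y)  # 70 % agreement
noise = rng.integers(0, 2, 300)                      # uninformative

meta_X = np.column_stack([strong, weaker, noise])
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(meta_X, y)

# One relevance score per base classifier; low-relevance classifiers
# could be dropped to reduce computational load.
relevance = forest.feature_importances_
print(relevance)
```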

  16. Experiments and Results • Relevance-based selection vs random selection

  17. Experiments and Results • Importance of proper parameter settings

  18. Summary • Fusion of classifier outputs significantly improves accuracy for all datasets • Creating a pool of prototype-based classifiers by varying the value of gamma seems to be an effective way • Most significant accuracy gain observed for SNG and GLVQ • Best accuracy obtained with RBF networks • Analysis of tree structures enables relevance-based selection of classifiers to reduce computational load

  19. Thank you for your attention. Questions?
