

  1. Backpropagation Neural Network for Soil Moisture Retrieval Using NAFE’05 Data: A Comparison of Different Training Algorithms Soo See Chai, Department of Spatial Sciences, Curtin University of Technology

  2. CONTENT • Neural Network and Soil Moisture Retrieval • Backpropagation Neural Network • Training of Neural Network • Testing Results • Q and A

  3. Neural Network For Soil Moisture Retrieval • Radiometric signatures of a vegetation-covered field reflect an integrated response of the soil and vegetation system to the observing microwave system • Surface parameters and radiometric signatures:

  4. Backpropagation Neural Network

  5. Different Backpropagation Training Algorithms • There are several different training algorithms, with a variety of computation and storage requirements • No one algorithm is best suited to all locations • MATLAB: 11 different training algorithms • Reviewed so far: basic gradient descent and the Levenberg-Marquardt (LM) algorithm • How about the other algorithms?
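As a hedged illustration of how such a comparison can be set up in the MATLAB Neural Network Toolbox, the sketch below loops over a reduced list of the built-in training functions, swapping the training algorithm while keeping the same small feed-forward network. The input matrix P, target vector T and the particular algorithm subset are placeholders, not the actual NAFE’05 script.

    % Illustrative only: compare several backpropagation training functions
    % on the same feed-forward network (2 inputs -> 4 hidden -> 1 output).
    P = rand(2, 112);                  % hypothetical inputs (e.g. TbH, soil temperature)
    T = rand(1, 112);                  % hypothetical target: soil moisture (%v/v)
    algs = {'traingd','traingdm','trainrp','trainscg','traincgf','trainbfg','trainlm'};
    rmse = zeros(1, numel(algs));
    for k = 1:numel(algs)
        net = feedforwardnet(4, algs{k});     % sigmoid hidden, linear output by default
        net.trainParam.epochs = 200;
        net.trainParam.showWindow = false;    % suppress the training GUI
        net = train(net, P, T);
        Y = net(P);
        rmse(k) = sqrt(mean((Y - T).^2));     % accuracy measure per algorithm
    end
    disp([algs; num2cell(rmse)])              % different but often similar accuracies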

  6. Data Preparation • Roscommon area: 1/11, 8/11, 15/11 • Determine the area coordinates: • Roscommon: • Top latitude: -32.15380 • Bottom latitude: -32.18370 • Left longitude: 150.120 • Right longitude: 150.46900 • MATLAB: cut the area and extract the fields from the PLMR file • Copy the latitude, longitude, brightness temperature and altitude data into Excel • Extract the aircraft altitude of the medium-resolution mapping, which is around 1050 m to 1270 m ASL
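A minimal MATLAB sketch of this clipping step, assuming the PLMR data have already been read into column vectors lat, lon, tb and alt; the file names and variable names are placeholders, not the actual NAFE’05 file layout.

    % Illustrative sketch: keep only the PLMR points inside the Roscommon
    % bounding box and within the medium-resolution altitude range.
    load('plmr.mat', 'lat', 'lon', 'tb', 'alt');          % hypothetical column vectors
    inArea = lat <= -32.15380 & lat >= -32.18370 & ...    % top/bottom latitude
             lon >= 150.120   & lon <= 150.46900;         % left/right longitude
    inAlt  = alt >= 1050 & alt <= 1270;                   % ~1050-1270 m ASL
    keep   = inArea & inAlt;
    roscommon = [lat(keep), lon(keep), tb(keep), alt(keep)];
    xlswrite('roscommon.xls', roscommon);                 % copy the data into Excel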

  7. Roscommon 1/11, Roscommon 8/11, Roscommon 15/11

  8. Example:

  9. A bit of Statistics … • Find the minimum and maximum of the average Tb for each data set • Next, find the range (max − min) • Find the width for each class (3 classes: training, validation and testing): width = range / 3 • Find the starting and ending point for each class
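A short sketch of this calculation, with a hypothetical vector tbAvg standing in for the average Tb values of one data set.

    % Illustrative sketch: three equal-width classes over the range of average Tb.
    tbAvg = 200 + 60*rand(1, 124);          % hypothetical average Tb values (K)
    tbMin = min(tbAvg);
    tbMax = max(tbAvg);
    width = (tbMax - tbMin) / 3;            % width for each class = range / 3
    edges = tbMin + width*(0:3);            % starting and ending point of each class
    class1 = tbAvg >= edges(1) & tbAvg <  edges(2);
    class2 = tbAvg >= edges(2) & tbAvg <  edges(3);
    class3 = tbAvg >= edges(3) & tbAvg <= edges(4);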

  10. We have now: • Group 1 of dates 1/11, 8/11 and 15/11 (combined: GRP1) • Group 2 of dates 1/11, 8/11 and 15/11 (combined: GRP2) • Group 3 of dates 1/11, 8/11 and 15/11 (combined: GRP3) • GRP1: randomly divide the samples into 3 subsets: 60% for training, 30% for validation and 10% for testing • Same with GRP2 and GRP3 • All training data in one file, all validation data in one file, all testing data in one file
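A minimal sketch of the random 60% / 30% / 10% split for one group; the matrix grp1Data (one sample per row) and its size are placeholders.

    % Illustrative sketch: randomly divide one group into training (60%),
    % validation (30%) and testing (10%) subsets.
    grp1Data = rand(40, 4);                              % hypothetical samples (rows)
    n      = size(grp1Data, 1);
    idx    = randperm(n);                                % shuffle the rows
    nTrain = round(0.6*n);
    nVal   = round(0.3*n);
    trainSet = grp1Data(idx(1:nTrain), :);
    valSet   = grp1Data(idx(nTrain+1:nTrain+nVal), :);
    testSet  = grp1Data(idx(nTrain+nVal+1:end), :);      % remaining ~10%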

  11. Training: K-Fold Cross Validation • The number of samples is small, so to get a more reliable accuracy result, K-fold cross validation is used • Training data + validation data = 112 samples • With 8-fold cross validation, each time 14 samples are used for validation and 98 samples for training • To make sure the split is random enough, the data are reshuffled each time (e.g. the validation block sits in a different position in the first and second runs)
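A schematic MATLAB sketch of the 8-fold scheme, with the 112 combined training + validation samples assumed to sit in a matrix data112; the fold body is left as a comment.

    % Illustrative sketch: 8-fold cross validation over 112 samples,
    % i.e. 14 validation and 98 training samples per fold, reshuffled per run.
    data112  = rand(112, 4);                 % hypothetical training + validation samples
    K        = 8;
    foldSize = size(data112, 1) / K;         % = 14
    idx      = randperm(size(data112, 1));   % randomize the order for this run
    for k = 1:K
        valIdx   = idx((k-1)*foldSize + 1 : k*foldSize);
        trainIdx = setdiff(idx, valIdx);     % the remaining 98 samples
        valSet   = data112(valIdx, :);
        trainSet = data112(trainIdx, :);
        % ... train the NN on trainSet, evaluate RMSE on valSet ...
    end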

  12. Training: NN Parameter Determination • A series of experiments, by trial and error, looking for the lowest RMSE • If the lowest RMSE is reached, save the input weights, layer weights and biases of the NN to be reused with the other training algorithms • Fixed: • Layers: 3 (1 input, 1 hidden, 1 output) • Input: H-polarized brightness temperature (TbH) and physical soil temperature at 4 cm • Hidden: sigmoid function • Output: linear function • Soil moisture (%v/v)
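The fixed architecture above maps onto a MATLAB feed-forward network roughly as sketched below; the two input rows (TbH and 4 cm soil temperature) and the sigmoid-hidden / linear-output layout follow the slide, while the data values are placeholders and the weight files reuse the names given on the next slide.

    % Illustrative sketch: 3-layer backpropagation NN (1 input, 1 hidden, 1 output)
    % with a sigmoid hidden layer and a linear output layer.
    P = [200 + 60*rand(1, 98);             % row 1: hypothetical TbH (K)
         10  + 15*rand(1, 98)];            % row 2: hypothetical soil temperature at 4 cm
    T = 30*rand(1, 98);                    % hypothetical soil moisture target (%v/v)
    net = feedforwardnet(4);               % 4 hidden neurons; tansig hidden, purelin output
    net = configure(net, P, T);            % fix the input/output dimensions
    iw = net.IW{1,1};  lw = net.LW{2,1};  b = net.b;
    save('W2.mat', 'iw');  save('LW2.mat', 'lw');  save('B2.mat', 'b');   % reuse later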

  13. Experiments carried out: • Decision: • Learning rate, lr = 0.005 • Momentum, mc = 0.4 • Input weight, iw = W2.mat • Layer weight, lw = LW2.mat • Bias, b = B2.mat • No. of hidden neurons = 4 • No. of epochs = 200
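Putting the chosen values together, a hedged sketch of one training run with gradient descent with momentum; it assumes the saved files W2.mat, LW2.mat and B2.mat contain variables iw, lw and b as in the previous sketch, and the data matrices are again placeholders.

    % Illustrative sketch: train with the decided hyper-parameters
    % (lr = 0.005, mc = 0.4, 4 hidden neurons, 200 epochs).
    P = rand(2, 98);  T = rand(1, 98);          % hypothetical inputs and targets
    net = feedforwardnet(4, 'traingdm');        % gradient descent with momentum
    net = configure(net, P, T);
    load('W2.mat',  'iw');   net.IW{1,1} = iw;  % reuse the saved input weights
    load('LW2.mat', 'lw');   net.LW{2,1} = lw;  % reuse the saved layer weights
    load('B2.mat',  'b');    net.b       = b;   % reuse the saved biases
    net.trainParam.lr     = 0.005;
    net.trainParam.mc     = 0.4;
    net.trainParam.epochs = 200;
    net = train(net, P, T);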

  14. Testing Result I – Roscommon

  15. Conclusions • The different training algorithms for the backpropagation NN give different but similar accuracy results • The training data are representative of the testing data

  16. Questions • Is the NN architecture transferable? • Is the number of data samples a factor contributing to the accuracy of the retrieval? • Does adding ancillary data (besides soil temperature), such as vegetation water content and land cover information, help? • Does adding V-polarized brightness temperature as an input help? • Should these data be added directly, or should the NN account for them?

  17. THANK YOU !
