
Canadian Weather Analysis Using Connectionist Learning Paradigms








  1. Canadian Weather Analysis Using Connectionist Learning Paradigms Imran Maqsood*, Muhammad Riaz Khan, Ajith Abraham *Environmental Systems Engineering Program, Faculty of Engineering, University of Regina, Regina, Saskatchewan S4S 0A2, Canada, E-mail: maqsoodi@uregina.ca Partner Technologies Incorporated, 1155 Park Street, Regina, Saskatchewan S4N 4Y8, Canada, E-mail: riaz917@hotmail.com Faculty of Information Technology, School of Business Systems, Monash University, Clayton 3800, Australia, E-mail: ajith.abraham@ieee.org 7th Online World Conference on Soft Computing in Industrial Applications (on WWW), September 23 - October 4, 2002

  2. CONTENTS • Introduction • MLP, ERNN and RBFN Background • Experimental Setup of a Case Study • Test Results • Conclusions WSC7

  3. 1. INTRODUCTION • Weather forecasts provide critical information about future weather conditions • Weather forecasting remains a complex task because of the chaotic and unpredictable nature of the atmosphere • Combined with the threat of global warming and the greenhouse gas effect, the impact of extreme weather phenomena on society is increasingly costly, causing infrastructure damage, injury, and loss of life.

  4. Accurate weather forecast models are important to countries whose agriculture depends heavily on the weather • Several artificial intelligence techniques have previously been used to model the chaotic behavior of weather • However, many of them use simple feed-forward neural networks trained with the backpropagation algorithm

  5. Study Objectives • To develop accurate and reliable predictive models for forecasting the weather of Vancouver, BC, Canada • To compare the performance of multi-layer perceptron (MLP) neural networks, Elman recurrent neural networks (ERNN), and radial basis function networks (RBFN) for weather analysis

  6. 2. ANN BACKGROUND INFORMATION ANN Advantages: • Ability to solve complex and non-linear problems • Quick response • Self-organization • Real-time operation • Fault tolerance via redundant information coding • Adaptability and generalization

  7. (a) Multi-Layered Perceptron (MLP) Networks • The network is arranged in layers of neurons • Every neuron in a layer computes the sum of its inputs and passes this sum through a nonlinear function as its output • Each neuron has only one output, but this output is multiplied by a weighting factor when it is used as an input to another neuron in the next higher layer • There are no connections among neurons in the same layer. Figure: Architecture of a 3-layered MLP network for weather forecasting (inputs I1…In, hidden layer with weights wij, outputs O1…Om with weights wjk)
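The layered sum-and-squash computation described above can be sketched in a few lines of NumPy. This is a minimal illustration with random weights and made-up sizes, not the authors' trained network; the log-sigmoid hidden activation and pure linear output match the architecture table later in the deck.

```python
import numpy as np

def logsig(x):
    """Log-sigmoid activation used in the hidden layer."""
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, W_ih, b_h, W_ho, b_o):
    # Each hidden neuron sums its weighted inputs, then squashes the sum
    h = logsig(W_ih @ x + b_h)
    # Output neurons are pure linear: a weighted sum, no nonlinearity
    return W_ho @ h + b_o

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 45, 1        # 45 hidden neurons, as reported later
W_ih = rng.normal(size=(n_hidden, n_in))
W_ho = rng.normal(size=(n_out, n_hidden))
y = mlp_forward(np.array([0.5, -0.2, 0.1]),
                W_ih, np.zeros(n_hidden), W_ho, np.zeros(n_out))
```

Note there are no intra-layer connections: each weight matrix only links one layer to the next.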

  8. (b) Elman Recurrent Neural Networks (ERNN) • ERNN are a subclass of recurrent networks • They are multilayer perceptron networks augmented with one or more context layers that store the output values of one layer delayed by one time step; these stored values are used to activate that or some other layer at the next time step • The Elman network can learn sequences that cannot be learned with other recurrent neural networks. Figure: Architecture of a 3-layered ERNN (input, hidden, and output layers, with delayed D-1 feedback into the context layer)
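One Elman time step can be sketched as follows: the context layer holds the hidden activations from the previous step and feeds them back as extra inputs. Weights and sizes here are illustrative assumptions, not the paper's; the tan-sigmoid hidden activation matches the architecture table later in the deck.

```python
import numpy as np

def elman_step(x, context, W_in, W_ctx, b_h, W_out, b_o):
    # Hidden layer sees the current input plus the one-step-delayed
    # hidden state stored in the context layer
    h = np.tanh(W_in @ x + W_ctx @ context + b_h)
    y = W_out @ h + b_o          # pure linear output layer
    return y, h                  # h becomes the context for the next step

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 3, 45, 1
W_in = rng.normal(size=(n_hidden, n_in))
W_ctx = rng.normal(size=(n_hidden, n_hidden)) * 0.1
W_out = rng.normal(size=(n_out, n_hidden))

context = np.zeros(n_hidden)     # empty memory before the sequence starts
for t in range(5):               # run over a short input sequence
    y, context = elman_step(rng.normal(size=n_in), context, W_in, W_ctx,
                            np.zeros(n_hidden), W_out, np.zeros(n_out))
```

The feedback through `context` is what lets the network condition each forecast on recent history rather than on a single input vector.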

  9. (c) Radial Basis Function Network (RBFN) • The network output is Output = Σi wi φi(x) + w0, with adjustable weights wi, a bias w0, and basis functions φi with adjustable centers ci and spreads σi • The network consists of 3 layers: input layer, hidden layer, and output layer • The neurons in the hidden layer respond locally to their input and are known as RBF neurons, while the neurons of the output layer only sum their inputs and are called linear neurons • The network is inherently well suited for weather prediction, because it naturally uses unsupervised learning to cluster the input data. Figure: Architecture of the RBFN (inputs I1…Im, hidden units φ1…φn, pure linear output)
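The output formula Output = Σi wi φi(x) + w0 can be sketched with Gaussian basis functions (the hidden activation reported later in the deck). The centers, spreads, and weights below are made-up illustrations, not values fitted to the weather data.

```python
import numpy as np

def rbfn_output(x, centers, spreads, weights, bias):
    # Each hidden RBF neuron responds locally: its activation decays
    # with the squared distance between the input and its center
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * spreads ** 2))
    # The linear output neuron just sums the weighted activations
    return weights @ phi + bias

centers = np.array([[0.0, 0.0], [1.0, 1.0]])   # adjustable centers c_i
spreads = np.array([0.5, 0.5])                 # adjustable spreads sigma_i
weights = np.array([1.0, -1.0])                # adjustable weights w_i
y = rbfn_output(np.array([0.0, 0.0]), centers, spreads, weights, bias=0.2)
# At x = c_1: phi_1 = 1, phi_2 = exp(-4) ≈ 0.018, so y ≈ 1.18
```

Because each φi is nearly zero far from its center, fitting the centers amounts to clustering the inputs, which is the unsupervised step the slide refers to.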

  10. 3. EXPERIMENTAL SETUP Weather Data: • Vancouver, BC, Canada • 1-yr data: Sep 2000 - Aug 2001 • Observed parameters (most important): • Minimum temperature (°C) • Maximum temperature (°C) • Wind speed (km/hr)

  11. Training and Testing Datasets • Dataset 1: MLP and ERNN • Testing dataset = 11-20 January 2001 • Training dataset = remaining data • Dataset 2: RBFN, MLP and ERNN • Testing dataset = 01-15 April 2001 • Training dataset = remaining data • This split was used to ensure that there is no bias in the training and test datasets
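A date-based hold-out split like the one above can be expressed with pandas. The DataFrame here is a hypothetical stand-in (random values over the Sep 2000 - Aug 2001 year); the actual Vancouver observations are not reproduced.

```python
import numpy as np
import pandas as pd

# One year of daily records, matching the study period
days = pd.date_range("2000-09-01", "2001-08-31", freq="D")
df = pd.DataFrame(
    {"min_temp": np.random.default_rng(2).normal(5.0, 8.0, len(days))},
    index=days,
)

# Dataset 1 split: 11-20 January 2001 held out for testing
test_mask = (df.index >= "2001-01-11") & (df.index <= "2001-01-20")
test_set = df[test_mask]
train_set = df[~test_mask]       # all remaining days used for training
```

Holding out a contiguous block of future-like dates, rather than a random sample, keeps the evaluation closer to the real forecasting task.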

  12. Simulation System Used • Pentium-III, 1 GHz processor, 256 MB RAM • All experiments were simulated using MATLAB Steps taken before starting the training process: • The error level was set to 10^-4 • The number of hidden neurons was varied (10-80) and the optimal number for each network was then decided.

  13. 4. TEST RESULTS Training Convergence of MLP and ERNN. Figure: Convergence of the LM and OSS training algorithms using the MLP network. Figure: Convergence of the LM and OSS training algorithms using the ERNN.

  14. Comparison of actual vs. 10-day-ahead forecasting using the OSS and LM approaches: (a) Minimum Temperature (11-20 Jan 2001)

Performance evaluation parameter   MLP (OSS)   MLP (LM)   ERNN (OSS)   ERNN (LM)
Mean absolute % error (MAPE)       0.0221      0.0202     0.0182       0.0030
Root mean square error (RMSE)      0.0199      0.0199     0.0199       0.0031
Mean absolute deviation (MAD)      0.7651      0.8411     0.7231       0.1213
Correlation coefficient            0.9657      0.9940     0.9826       0.9998
Training time (minutes)            0.3         1          0.3          7
Number of iterations (epochs)      1015        7          673          11
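The four evaluation metrics in the tables can be computed as sketched below. The exact normalizations (e.g. whether MAPE is scaled by 100) may differ from the authors'; the actual/forecast arrays are illustrative values, not data from the study.

```python
import numpy as np

def evaluate(actual, forecast):
    """Return (MAPE, RMSE, MAD, correlation coefficient)."""
    err = actual - forecast
    mape = np.mean(np.abs(err / actual))       # mean absolute % error (as fraction)
    rmse = np.sqrt(np.mean(err ** 2))          # root mean square error
    mad = np.mean(np.abs(err))                 # mean absolute deviation
    r = np.corrcoef(actual, forecast)[0, 1]    # correlation coefficient
    return mape, rmse, mad, r

actual = np.array([4.0, 5.5, 3.2, 6.1, 2.8])      # hypothetical observed temps
forecast = np.array([4.2, 5.1, 3.5, 6.0, 3.0])    # hypothetical model output
mape, rmse, mad, r = evaluate(actual, forecast)
```

Lower MAPE, RMSE, and MAD and a correlation coefficient near 1 indicate a better forecast, which is how the tables rank the LM-trained networks above the OSS-trained ones.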

  15. (b) Maximum Temperature (11-20 Jan 2001)

Performance evaluation parameter   MLP (OSS)   MLP (LM)   ERNN (OSS)   ERNN (LM)
Mean absolute % error (MAPE)       0.0170      0.0087     0.0165       0.0048
Root mean square error (RMSE)      0.0200      0.0099     0.0199       0.0067
Mean absolute deviation (MAD)      0.8175      0.4217     0.7944       0.2445
Correlation coefficient            0.964       0.999      0.945        0.982
Training time (minutes)            0.4         30         1.8          30
Number of iterations (epochs)      850         7          1135         10

  16. (c) Maximum Wind-Speed (11-20 Jan 2001)

Performance evaluation parameter   MLP (OSS)   MLP (LM)   ERNN (OSS)   ERNN (LM)
Mean absolute % error (MAPE)       0.0896      0.0770     0.0873       0.0333
Root mean square error (RMSE)      0.1989      0.0162     0.0199       0.0074
Mean absolute deviation (MAD)      0.8297      0.6754     0.7618       0.3126
Correlation coefficient            0.9714      0.9974     0.9886       0.9995
Training time (minutes)            0.3         1          0.5          8
Number of iterations (epochs)      851         8          1208         12

  17. Comparison of Relative Percentage Error between Actual and Forecasted Parameters. Figures: relative percentage error of the MLP network and ERNN for minimum temperature, maximum temperature, and wind speed.

  18. Comparison of Training of the Connectionist Models

Network model   Hidden neurons   Hidden layers   Hidden-layer activation   Output-layer activation
MLP             45               1               Log-sigmoid               Pure linear
ERNN            45               1               Tan-sigmoid               Pure linear
RBFN            180              2               Gaussian function         Pure linear

  19. Comparison among the three neural network techniques for 15-day-ahead forecasting (1-15 Apr 2001). Figures: actual values vs. RBFN, MLP, and RNN forecasts over days of the month for maximum temperature (°C), minimum temperature (°C), and wind speed (km/h).

  20. Performance Evaluation of the RBFN, MLP and ERNN Techniques

Model   Evaluation parameter      Max. Temperature   Min. Temperature   Wind Speed
RBFN    MAP                       3.821              3.622              4.135
        MAD                       0.420              1.220              0.880
        Correlation coefficient   0.987              0.947              0.978
MLP     MAP                       6.782              6.048              6.298
        MAD                       1.851              1.898              1.291
        Correlation coefficient   0.943              0.978              0.972
ERNN    MAP                       5.802              5.518              5.658
        MAD                       0.920              0.464              0.613
        Correlation coefficient   0.946              0.965              0.979

  21. 5. CONCLUSIONS • In this paper, we developed and compared the performance of the multi-layer perceptron (MLP) neural network, Elman recurrent neural network (ERNN), and radial basis function network (RBFN) • It can be inferred that the ERNN could yield more accurate results if good data selection strategies, training paradigms, and network input and output representations are determined properly

  22. • The Levenberg-Marquardt (LM) approach appears to be the best learning algorithm; however, it requires more memory and is computationally more complex than the one-step-secant (OSS) algorithm • Empirical results clearly demonstrate that, compared with the MLP neural network and ERNN, the RBFN is much faster and more reliable for the weather forecasting problem considered • Comparing the neurocomputing techniques with other statistical techniques would be another future research topic

  23. THANK YOU!
