
Artificial Neural Networks: An Alternative Approach to Risk-Based Design


Presentation Transcript


  1. Artificial Neural Networks: An Alternative Approach to Risk-Based Design. By George Mermiris

  2. Introduction • Inspiration from the study of the human brain and physical neurons • Response speed for physical neurons is 10⁻³ s, compared to 10⁻⁹ s for electrical circuits • Massively parallel structure: 10¹¹ neurons with 10⁴ connections per neuron • The efficiency of the brain is directly dependent on the accumulated experience → new connections are established, which determine our capabilities

  3. The Biological Model [Figure: a biological neuron with dendrites, cell body, axon, and synapses]

  4. Artificial Neural Networks (ANN): Basic Forms, Feed-Forward Networks General pattern: • p: input vector • w: weight matrix • b: bias vector • n: net input to the activation function (n = wp + b) • f: activation function • a: output vector of the network a = f(n) = f(wp + b)
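To make the general pattern concrete, here is a minimal sketch in Python/NumPy of a single neuron computing a = f(wp + b) with a log-sigmoid activation (the numerical values are placeholders, not data from the presentation):

```python
import numpy as np

def logsig(n):
    """Log-sigmoid activation: squashes the net input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

# Placeholder values, purely for illustration
p = np.array([1.0, -2.0, 0.5])   # input vector p
w = np.array([0.3, -0.1, 0.7])   # weight (row) vector w
b = 0.2                          # bias b

n = w @ p + b                    # net input n = wp + b
a = logsig(n)                    # neuron output a = f(n)
print(a)
```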

  5. Multi-Neuron, Single-Layer ANN a = f(n) = f(Wp + b)

  6. Multi-Layer, Multi-Neuron Network
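A multi-layer, multi-neuron network simply chains this computation layer by layer, a(k) = f(k)(W(k) a(k-1) + b(k)). The sketch below, with arbitrarily chosen layer sizes and random weights, illustrates the forward pass only (it shows the structure, not the specific networks used later in the presentation):

```python
import numpy as np

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

def purelin(n):
    return n

def forward(p, layers):
    """Propagate input p through a list of (W, b, f) layers."""
    a = p
    for W, b, f in layers:
        a = f(W @ a + b)          # a(k) = f(k)(W(k) a(k-1) + b(k))
    return a

# Illustrative 3-2-1 network with random weights (sizes chosen arbitrarily)
rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(2, 3)), rng.normal(size=2), logsig),   # hidden layer
    (rng.normal(size=(1, 2)), rng.normal(size=1), purelin),  # output layer
]
print(forward(np.array([0.5, -1.0, 2.0]), layers))
```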

  7. Abbreviated Form of a Network

  8. Activation Functions: Linear Function, Log-Sigmoid Function, Hyperbolic Tangent Sigmoid Function
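The slide names the three functions without their formulas; their standard definitions (the same functions that appear later in the presentation under the MATLAB-style names logsig and purelin, plus the hyperbolic tangent sigmoid, commonly called tansig) are:

```python
import numpy as np

def purelin(n):
    """Linear function: a = n."""
    return n

def logsig(n):
    """Log-sigmoid: a = 1 / (1 + e^(-n)), output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

def tansig(n):
    """Hyperbolic tangent sigmoid: a = (e^n - e^(-n)) / (e^n + e^(-n)), output in (-1, 1)."""
    return np.tanh(n)
```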

  9. Training Neural Networks • Training a network follows the same concept as for humans: the larger its experience, the better its response • For an ANN, learning is established by suitable adjustment of its weights and biases • Requirements: training data and a proper training algorithm

  10. The Backpropagation Algorithm A three-fold concept • Performance Index: Approximate Squared Error: F(x) = (t - a)^T (t - a) = e^T e • The Steepest Descent Algorithm for the function F and its modifications (g: gradient)
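The steepest-descent update itself does not survive in the transcript; in its standard form it reads as follows, where α_k is the learning rate (a symbol assumed here, consistent with the slide defining g as the gradient of F):

```latex
x_{k+1} = x_k - \alpha_k \, g_k, \qquad g_k = \left. \nabla F(x) \right|_{x = x_k}
```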

  11. The Backpropagation Algorithm • Chain Rule of Calculus: calculation of the first derivatives of the performance index, starting from the last layer and backpropagating to the first (!) • Levenberg-Marquardt algorithm: the main variation of the method, based on the concept of Newton's method with a small approximation
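The Levenberg-Marquardt update is likewise not reproduced in the transcript; its standard form, written with the Jacobian J of the error vector e, is given here for reference (an assumption about what the slide showed, not a quotation):

```latex
\Delta x_k = - \left[ J^{T}(x_k) \, J(x_k) + \mu_k I \right]^{-1} J^{T}(x_k) \, e(x_k)
```

For small μ_k the step approaches the Gauss-Newton step, while for large μ_k it behaves like steepest descent with a small step size, which is what makes the algorithm robust for network training.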

  12. Example 1: Resistance Experiment Case 1: 1 cm wave amplitude ANN Architecture: 1-4-3-1 Activation Function: Log-Sigmoid for hidden layers and Linear for output layer
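As a rough sketch of how a 1-4-3-1 network of this kind could be set up and trained, the following NumPy code uses plain steepest-descent backpropagation on synthetic placeholder data (the actual resistance measurements are not in the transcript, and the presentation itself relies on the Levenberg-Marquardt algorithm rather than plain gradient descent):

```python
import numpy as np

rng = np.random.default_rng(1)

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

# Architecture 1-4-3-1: log-sigmoid hidden layers, linear output layer
sizes = [1, 4, 3, 1]
W = [rng.normal(scale=0.5, size=(sizes[i + 1], sizes[i])) for i in range(3)]
b = [np.zeros((sizes[i + 1], 1)) for i in range(3)]

# Placeholder training data (a stand-in for the resistance measurements)
P = np.linspace(0.0, 1.0, 20).reshape(1, -1)   # inputs, shape (1, Q)
T = 0.5 + 0.3 * np.sin(2 * np.pi * P)          # targets, shape (1, Q)

alpha = 0.1
for epoch in range(5000):
    # Forward pass, keeping each layer's output
    a0 = P
    a1 = logsig(W[0] @ a0 + b[0])
    a2 = logsig(W[1] @ a1 + b[1])
    a3 = W[2] @ a2 + b[2]                      # purelin output layer

    e = T - a3                                 # error, F = e^T e
    # Backpropagate the sensitivities s(k) = dF/dn(k)
    s3 = -2 * e                                # linear output layer derivative = 1
    s2 = (a2 * (1 - a2)) * (W[2].T @ s3)       # logsig derivative = a(1 - a)
    s1 = (a1 * (1 - a1)) * (W[1].T @ s2)

    # Steepest-descent updates: W <- W - alpha * s a^T, b <- b - alpha * s
    for Wk, bk, s, a_prev in ((W[2], b[2], s3, a2), (W[1], b[1], s2, a1), (W[0], b[0], s1, a0)):
        Wk -= alpha * (s @ a_prev.T) / P.shape[1]
        bk -= alpha * s.mean(axis=1, keepdims=True)

print("final mean squared error:", float(np.mean(e ** 2)))
```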

  13. Example 1: Resistance Experiment

  14. Example 1: Resistance Experiment Case 2: 2 cm wave amplitude ANN Architecture: 1-3-2-1 Activation Function: Log-Sigmoid for hidden layers and Linear for output layer

  15. Example 1: Resistance Experiment

  16. a (101) a (121) a (211) p W (105) W (1210) W (2110) logsig purelin logsig + + + b (101) b (121) b (211) 51 Example 2: Section Areas Curve Input: L, Amax, , LCB, Cp ANN Architecture: 5-10-12-21 Activation Function: Log – Sigmoid for hidden layers and Linear for output layer

  17. Example 2: Section Areas Curve Training Set (network input): - L = [153 156 159 … 180], in m - Amax = [335 345 355 … 425], in m² - ∇ = [36000 37000 38000 … 45000], in m³ - LCB = [-2.4 -2.5 -2.6 … -3.3], in m - Cp = [0.702 0.688 0.660 … 0.588] Network output: ordinates of the SA curves for each combination Generalisation Sets (testing the network), [L Amax ∇ LCB Cp]: - Set1 = [160 360 38500 -2.65 0.6664] - Set2 = [178.5 420 44500 -3.25 0.594] - Set3 = [150 325 35000 -2.3 0.718]
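A sketch of how the generalisation sets would be assembled and fed to the trained 5-10-12-21 network is given below; the function and parameter names are hypothetical, since the trained weights are not part of the transcript:

```python
import numpy as np

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

# Each generalisation set is one input column [L, Amax, displaced volume, LCB, Cp]
set1 = np.array([160.0, 360.0, 38500.0, -2.65, 0.6664])
set2 = np.array([178.5, 420.0, 44500.0, -3.25, 0.594])
set3 = np.array([150.0, 325.0, 35000.0, -2.3, 0.718])
P_test = np.column_stack([set1, set2, set3])   # shape (5, 3)

def sa_curve(P, W1, b1, W2, b2, W3, b3):
    """Return the 21 ordinates of the section areas curve for each input column.
    W1 (10x5), W2 (12x10), W3 (21x12) and the biases would come from training."""
    a1 = logsig(W1 @ P + b1)
    a2 = logsig(W2 @ a1 + b2)
    return W3 @ a2 + b3                        # linear output layer
```

In practice the five inputs would typically be scaled to a common range before training, since they differ by several orders of magnitude, which otherwise slows the convergence of sigmoid networks.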

  18. Example 2: Section Areas Curve (Set1)

  19. Example 2: Section Areas Curve (Set2)

  20. Example 2: Section Areas Curve (Set3)

  21. Strong points of ANN • Readily applicable to any stage of the design process, especially to preliminary design, where rough approximations are necessary • Potential to include different design parameters in the training set and avoid iterations • Results are obtained very fast and with high accuracy • No highly sophisticated mathematical technique is involved, only basic concepts of Linear Algebra and Calculus • Very short computation times on common PCs

  22. Weak points of ANN • The basic requirement is the existence of historical data for the creation of the training set • Not readily applicable to novel ship types • The results are very sensitive to the network's architecture and the training method selected each time, although these two parameters are very easily adjusted • There is no specific network architecture for a specific calculation: different architectures can provide the same results. The general rule is to use the simplest possible network

  23. Future Work • Other networks and training algorithms: recurrent ANN • Suitable database for creating the training set for different applications • Application to the Global Ship Design including Risk Data and Human Reliability Data

  24. Thank You!
