Financial Informatics – XIII: Neural Computing Systems

Presentation Transcript

  1. Financial Informatics – XIII: Neural Computing Systems. Khurshid Ahmad, Professor of Computer Science, Department of Computer Science, Trinity College, Dublin 2, IRELAND. November 19th, 2008. https://www.cs.tcd.ie/Khurshid.Ahmad/Teaching.html

  2. Neural Networks: Real Networks. London, Michael and Michael Häusser (2005). Dendritic Computation. Annual Review of Neuroscience, Vol. 28, pp 503–532.

  3. Real Neural Networks: Cells and Processes. A neuron is a cell with appendages. Every cell has a nucleus; in a neuron, one set of appendages, the dendrites, brings in inputs, while another set carries out the signals generated by the cell.

  4. Real Neural Networks: Cells and Processes. The human brain is mainly composed of neurons: specialised cells that exist to transfer information rapidly from one part of an animal's body to another. This communication is achieved by the transmission (and reception) of electrical impulses (and chemicals) between neurons and other cells of the animal. Like other cells, neurons have a cell body containing a nucleus, enshrouded in a membrane with a double-layered ultrastructure and numerous pores. [Figure: a neuron, labelling the dendrites, soma, nucleus, axon and axon terminals.] SOURCE: http://en.wikipedia.org/wiki/Neurons

  5. Real Neural Networks. All the neurons of an organism, together with their supporting cells, constitute a nervous system. Estimates vary, but it is often reported that there are as many as 100 billion neurons in a human brain. Neurobiologists and neuroethologists have argued that intelligence is roughly proportional to the number of neurons when different animal species are compared. Typically the nervous system includes: the Spinal Cord, the least differentiated component of the central nervous system, which includes neuronal connections that provide for spinal reflexes, together with pathways conveying sensory data to the brain and pathways conducting impulses, mainly of motor significance, from the brain to the spinal cord; and the Medulla Oblongata, in which the fibre tracts of the spinal cord are continued and which also contains clusters of nerve cells called nuclei.

  6. Real Neural Networks. Inputs to and outputs from an animal nervous system: the Cerebellum receives data from most sensory systems and from the cerebral cortex, and eventually influences the motor neurons supplying the skeletal musculature. It produces muscle tonus in relation to equilibrium, locomotion and posture, as well as non-stereotyped movements based on individual experience.

  7. Real Neural Networks. Processing of some information in the nervous system takes place in the Diencephalon. This forms the central core of the cerebrum and has influence over a number of brain functions, including complex mental processes, vision, and the synthesis of hormones reaching the blood stream. The diencephalon comprises the thalamus, epithalamus, hypothalamus, and subthalamus. The retina is a derivative of the diencephalon; the optic nerve and the visual system are therefore intimately related to this part of the brain.

  8. Real Neural Networks. Inputs to the nervous system are relayed through the Telencephalon (Cerebral Hemispheres), which includes the cerebral cortex, corpus striatum, and medullary center. Nine-tenths of the human cerebral cortex is neocortex, a possible result of evolution, and contains areas for all modalities of sensation (except smell), motor areas, and large expanses of association cortex in which presumably intellectual activity takes place. The corpus striatum, a large mass of gray matter, deals with motor functions, and the medullary center contains fibres connecting the cortical areas of the two hemispheres.


  10. Real Neural Networks: Cells and Processes. Neurons have a variety of appendages, referred to as cytoplasmic processes or neurites, which end in close apposition to other cells. In higher animals, neurites are of two varieties: axons, processes generally of uniform diameter that conduct impulses away from the cell body; and dendrites, short branched processes that conduct impulses towards the cell body. The ends of the neurites, i.e. of the axons and dendrites, are called synaptic terminals, and the cell-to-cell contacts they make are known as synapses. SOURCE: http://en.wikipedia.org/wiki/Neurons

  11. Real Neural Networks: Cells and Processes. A typical neuron has a fan-in of c. 10^4 and a fan-out of c. 10^4; it sums its excitatory (+) and inhibitory (−) inputs, fires asynchronously at rates of up to c. 200 spikes per second, and conducts signals at 1–100 metres per second. 10^10 neurons with 10^4 connections each and an average of 10 spikes per second = 10^15 adds/sec. This is a lower bound on the equivalent computational power of the brain.
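The arithmetic behind this lower bound is easy to check; a minimal sketch in Python, using the figures quoted above:

```python
# Back-of-envelope lower bound on the brain's computational throughput,
# using the figures quoted on the slide.
neurons = 10**10        # ~10^10 neurons in the human brain
connections = 10**4     # ~10^4 synaptic connections per neuron
spike_rate = 10         # average spikes per second

# One "add" per connection per spike:
adds_per_sec = neurons * connections * spike_rate
print(f"{adds_per_sec:.0e} adds/sec")  # 1e+15 adds/sec
```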

  12. Real Neural Networks: Cells and Processes. Henry Markram (2006). The Blue Brain Project. Nature Reviews Neuroscience, Vol. 7, pp 153–160.

  13. Neural Networks: Real and Artificial. [Diagram, 'Neural Networks & Neurosciences', relating: Observed Biological Processes (Data); Biologically Plausible Mechanisms for Neural Processing & Learning (Biological Neural Network Models); and Theory (Statistical Learning Theory & Information Theory).] http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience

  14. Neural Networks & Neuro-economics. The behaviour of economic actors: pattern recognition, risk-averse/prone activities; risk/reward. [Diagram, 'Neural Networks & Neurosciences', relating: Biologically Plausible Mechanisms for Neural Processing & Learning (Biological Neural Network Models); and Theory (Statistical Learning Theory & Information Theory).] http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience

  15. Neural Networks: Artificial Neural Networks. The study of the behaviour of neurons, either as 'single' neurons or as clusters of neurons controlling aspects of perception, cognition or motor behaviour in animal nervous systems, is currently being used to build information systems that are capable of autonomous and intelligent behaviour.

  16. Neural Networks: The Uses of Artificial Neural Networks

  17. Neural Networks: Artificial Neural Networks. Artificial neural networks emulate threshold behaviour, simulate co-operative phenomena through a network of 'simple' switches, and are used in a variety of applications, such as banking, currency trading, robotics, and experimental and animal psychology studies. These information systems, popularly known as neural networks or neuro-computing systems, can be simulated by solving first-order difference or differential equations.
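The simulation by first-order difference equations mentioned above can be sketched in a few lines. The decay constant, weights and threshold below are illustrative assumptions, not values from the slides:

```python
# A "simple switch" neuron simulated by a first-order difference equation:
#     a[t+1] = (1 - decay) * a[t] + sum_i w_i * x_i[t]
# with threshold behaviour on the output.

def step(activation, weights, inputs, decay=0.5):
    """One update of the difference equation."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return (1 - decay) * activation + net

def output(activation, threshold=1.0):
    """Threshold ('switch') behaviour: fire iff activation exceeds threshold."""
    return 1 if activation > threshold else 0

a = 0.0
weights = [0.6, 0.4]
for _ in range(5):                  # a constant input drives activation up
    a = step(a, weights, [1.0, 1.0])
print(a, output(a))                 # 1.9375 1 : the unit has begun to fire
```

Iterating the equation under constant input shows the activation converging towards a fixed point, at which the threshold decides whether the 'switch' is on or off.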

  18. Neural Networks: Artificial Neural Networks. The basic premise of the course, Neural Networks, is to introduce our students to an alternative paradigm for building information systems.

  19. Neural Networks: Artificial Neural Networks. • 'Statisticians generally have good mathematical backgrounds with which to analyse decision-making algorithms theoretically. […] However, they often pay little or no attention to the applicability of their own theoretical results' (Raudys 2001: xi). • Neural network researchers 'advocate that one should not make assumptions concerning the multivariate densities assumed for pattern classes'. Rather, they argue that 'one should assume only the structure of decision making rules'; hence the emphasis on, for instance, the minimisation of classification errors. • In neural networks there are algorithms that have a theoretical justification, and some that have no theoretical elucidation. • Given that both statistical and other soft-computing algorithms (e.g. neural nets, fuzzy logic) have strengths and weaknesses, one should integrate the two classifier design strategies (ibid.). Raudys, Šarūnas (2001). Statistical and Neural Classifiers: An Integrated Approach to Design. London: Springer-Verlag.

  20. Neural Networks: Artificial Neural Networks. • Artificial Neural Networks are extensively used in dealing with problems of classification and pattern recognition. The complementary methods, albeit of a much older discipline, are based in statistical learning theory. • Many problems in finance, for example bankruptcy prediction and equity return forecasts, have been studied using methods developed in statistical learning theory:

  21. Neural Networks: Artificial Neural Networks. Many problems in finance, for example bankruptcy prediction and equity return forecasts, have been studied using methods developed in statistical learning theory: "The main goal of statistical learning theory is to provide a framework for studying the problem of inference, that is of gaining knowledge, making predictions, making decisions or constructing models from a set of data. This is studied in a statistical framework, that is there are assumptions of statistical nature about the underlying phenomena (in the way the data is generated)." (Bousquet, Boucheron, Lugosi). Statistical learning theory can be viewed as the study of algorithms that are designed to learn from observations, instructions, examples and so on. Olivier Bousquet, Stephane Boucheron, and Gabor Lugosi. Introduction to Statistical Learning Theory (http://www.econ.upf.edu/~lugosi/mlss_slt.pdf)

  22. Neural Networks: Artificial Neural Networks. "Bankruptcy prediction models have used a variety of statistical methodologies, resulting in varying outcomes. These methodologies include linear discriminant analysis, regression analysis, logit regression, and weighted-average maximum likelihood estimation, and more recently neural networks." The five ratios are net cash flow to total assets, total debt to total assets, exploration expenses to total reserves, current liabilities to total debt, and the trend in total reserves (over a three-year period, with the change in reserves computed as the changes between Years 1 & 2 and Years 2 & 3). Yang et al. (1999) found that (a variant of) the neural network and Fisher discriminant analysis 'achieve the best overall estimation', although discriminant analysis gave superior results. Z. R. Yang, Marjorie B. Platt and Harlan D. Platt (1999). Probabilistic Neural Networks in Bankruptcy Prediction. Journal of Business Research, Vol. 44, pp 67–74.

  23. Neural Networks: Artificial Neural Networks. A neural network can be described as a type of multiple regression in that it accepts inputs and processes them to predict some output. Like multiple regression, it is a data-modelling technique. Neural networks have been found particularly suitable for complex pattern recognition compared with statistical multiple discriminant analysis (MDA), since the networks are not subject to the restrictive assumptions of MDA models. Shaikh A. Hamid and Zahid Iqbal (2004). Using neural networks for forecasting volatility of S&P 500 Index futures prices. Journal of Business Research, Vol. 57, pp 1116–1125.

  24. Neural Networks: Artificial Neural Networks. Neural networks for forecasting volatility of the S&P 500 Index: a neural network was trained to generate volatility forecasts for the S&P 500 Index over different time horizons. The results were compared with option pricing models used to compute implied volatility from S&P 500 Index futures options (the Barone-Adesi and Whaley American futures options pricing model). Forecasts from neural networks outperform implied volatility forecasts and are not found to be significantly different from realized volatility. Implied volatility forecasts are found to be significantly different from realized volatility in two of three forecast horizons. Shaikh A. Hamid and Zahid Iqbal (2004). Using neural networks for forecasting volatility of S&P 500 Index futures prices. Journal of Business Research, Vol. 57, pp 1116–1125.

  25. Neural Networks: Artificial Neural Networks. The 'remarkable qualities' of neural networks: the dynamics of a single-layer perceptron progress from the simplest algorithms to the most complex: • Initial training → each pattern class is characterised by its sample mean vector → the neuron behaves like a E[uclidean] D[istance] C[lassifier]; • Further training → the neuron begins to evaluate correlations and variances of features → the neuron behaves like a standard linear Fisher classifier; • More training → the neuron minimises the number of incorrectly identified training patterns → the neuron behaves like a support vector classifier. • Statisticians and engineers usually design decision-making algorithms from experimental data by progressing from simple algorithms to more complex ones. Raudys, Šarūnas (2001). Statistical and Neural Classifiers: An Integrated Approach to Design. London: Springer-Verlag.
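The staged behaviour described above concerns a single-layer perceptron. As a concrete, much simpler illustration of training such a unit, here is the classic perceptron error-correction rule on a toy linearly separable problem; the data, learning rate and epoch count are illustrative assumptions, and this is not Raudys's specific staged algorithm:

```python
# Minimal single-layer perceptron with the classic error-correction rule.
# Illustrates training a linear classifier from labelled examples.

def train(samples, labels, lr=0.1, epochs=20):
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):           # y in {0, 1}
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                          # error-correction step
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Linearly separable toy data: class 1 only when both features are on.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train(X, y)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0 for x in X]
print(preds)  # [0, 0, 0, 1]
```

For linearly separable data like this, the perceptron convergence theorem guarantees the rule finds a separating hyperplane in finitely many updates.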

  26. Neural Networks: Real and Artificial. [Diagram, 'Neural Networks & Neurosciences', relating: Observed Biological Processes (Data); Biologically Plausible Mechanisms for Neural Processing & Learning (Biological Neural Network Models); and Theory (Statistical Learning Theory & Information Theory).] http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience

  27. Neural Networks & Neuro-economics. The behaviour of economic actors: pattern recognition, risk-averse/prone activities; risk/reward. [Diagram, 'Neural Networks & Neurosciences', relating: Biologically Plausible Mechanisms for Neural Processing & Learning (Biological Neural Network Models); and Theory (Statistical Learning Theory & Information Theory).] http://en.wikipedia.org/wiki/Neural_network#Neural_networks_and_neuroscience

  28. Neural Networks: Neuro-economics. Observed Biological Processes (Data). Investors systematically deviate from rationality when making financial decisions, yet the mechanisms responsible for these deviations have not been identified. Using event-related fMRI, we examined whether anticipatory neural activity would predict optimal and suboptimal choices in a financial decision-making task. Two types of deviation from the optimal investment strategy of a rational risk-neutral agent are identified: risk-seeking mistakes and risk-aversion mistakes. Camelia M. Kuhnen and Brian Knutson (2005). "Neural Antecedents of Financial Decisions." Neuron, Vol. 47, pp 763–770.

  29. Neural Networks: Neuro-economics. Observed Biological Processes (Data). Nucleus accumbens [NAcc] activation preceded risky choices as well as risk-seeking mistakes, while anterior insula activation preceded riskless choices as well as risk-aversion mistakes. These findings suggest that distinct neural circuits linked to anticipatory affect promote different types of financial choices, and indicate that excessive activation of these circuits may lead to investing mistakes. Camelia M. Kuhnen and Brian Knutson (2005). "Neural Antecedents of Financial Decisions." Neuron, Vol. 47, pp 763–770.

  30. Neural Networks: Neuro-economics. Observed Biological Processes (Data). Association of anticipatory neural activation with subsequent choice: the left panel indicates a significant effect of anterior insula activation on the odds of making riskless (bond) choices and risk-aversion mistakes (RAM) after a stock choice (Stock(t−1)). The right panel indicates a significant effect of NAcc activation on the odds of making risk-aversion mistakes, risky choices, and risk-seeking mistakes (RSM) after a bond choice (Bond(t−1)). The odds ratio for a given choice is defined as the probability of making that choice divided by the probability of not making it. The percent change in the odds ratio results from a 0.1% increase in NAcc or anterior insula activation. Error bars indicate the standard errors of the estimated effect; * coefficient significant at p < 0.05. Camelia M. Kuhnen and Brian Knutson (2005). "Neural Antecedents of Financial Decisions." Neuron, Vol. 47, pp 763–770.

  31. Neural Networks: Neuro-economics. Observed Biological Processes (Data). Nucleus accumbens [NAcc] activation preceded risky choices as well as risk-seeking mistakes, while anterior insula activation preceded riskless choices as well as risk-aversion mistakes. These findings suggest that distinct neural circuits linked to anticipatory affect promote different types of financial choices, and indicate that excessive activation of these circuits may lead to investing mistakes. Camelia M. Kuhnen and Brian Knutson (2005). "Neural Antecedents of Financial Decisions." Neuron, Vol. 47, pp 763–770.

  32. Neural Networks: Neuro-economics. Observed Biological Processes (Data). To explain investing decisions, financial theorists invoke two opposing metrics: expected reward and risk. Recent advances in the spatial and temporal resolution of brain imaging techniques enable investigators to visualize changes in neural activation before financial decisions. Research using these methods indicates that although the ventral striatum plays a role in the representation of expected reward, the insula may play a more prominent role in the representation of expected risk. Accumulating evidence also suggests that antecedent neural activation in these regions can be used to predict upcoming financial decisions. These findings have implications for predicting choices and for building a physiologically constrained theory of decision-making. Brian Knutson and Peter Bossaerts (2007). "Neural Antecedents of Financial Decisions." The Journal of Neuroscience (August 1, 2007), Vol. 27 (No. 31), pp 8174–8177.

  33. Neural Networks: Real and Artificial Organising Principles and Common Themes: Association between neurons and competition amongst the neurons Two examples: Categorisation and attentional modulation of conditioning Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)

  34. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet: feature nodes respond to the presence or absence of marks at particular locations; category nodes respond, via a weighted sum of inputs, to the patterns of activation in the feature nodes. Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)

  35. Artificial Neural Networks Artificial Neural Networks (ANN) are computational systems, either hardware or software, which mimic animate neural systems comprising biological (real) neurons. An ANN is architecturally similar to a biological system in that the ANN also uses a number of simple, interconnected artificial neurons.

  36. Artificial Neural Networks In a restricted sense artificial neurons are simple emulations of biological neurons: the artificial neuron can, in principle, receive its input from all other artificial neurons in the ANN; simple operations are performed on the input data; and, the recipient neuron can, in principle, pass its output onto all other neurons. Intelligent behaviour can be simulated through computation in massively parallel networks of simple processors that store all their long-term knowledge in the connection strengths.

  37. Artificial Neural Networks. According to Igor Aleksander, neural computing is the study of cellular networks that have a natural propensity for storing experiential knowledge. Neural computing systems bear a resemblance to the brain in the sense that knowledge is acquired through training rather than programming, and is retained due to changes in node functions. Functionally, the knowledge takes the form of stable states, or cycles of states, in the operation of the net. A central property of such nets is that they recall these states or cycles in response to the presentation of cues.

  38. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet: e.g. 2^5 patterns for representing 32 letters. The line patterns (vertical, horizontal, short and long strokes) and circular patterns in the Roman alphabet can be represented in a binary system, e.g.:
O  0 1 0 0 1
I  0 0 1 0 1
E  0 1 0 0 1
A  0 0 0 0 1
U  0 1 1 0 1
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
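The letter-to-bit-vector mapping shown above can be written down directly; a minimal sketch (the codes are the ones given on the slide, where five binary features allow 2^5 = 32 distinct patterns):

```python
# Letters coded as 5-bit feature vectors, following the slide's table.
# Five binary features give 2**5 = 32 distinct patterns, enough for a
# 26-letter (or the slide's 32-letter) alphabet.
codes = {
    "O": (0, 1, 0, 0, 1),
    "I": (0, 0, 1, 0, 1),
    "E": (0, 1, 0, 0, 1),
    "A": (0, 0, 0, 0, 1),
    "U": (0, 1, 1, 0, 1),
}
print(2 ** 5)  # 32 available patterns
# N.B. as printed on the slide, O and E share the same code.
```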

  39. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. The transformation (mapping) of linear and circular patterns into binary patterns requires a degree of pre-processing and judicious guesswork! The line patterns (vertical, horizontal, short and long strokes) and circular patterns in the Roman alphabet can be represented in a binary system, e.g.:
O  0 1 0 0 1
I  0 0 1 0 1
E  0 1 0 0 1
A  0 0 0 0 1
U  0 1 1 0 1
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)

  40. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet. The association between feature nodes and category nodes should be allowed to change over time (during training), more specifically as a result of repeated activation of the connection. [Figure: the feature pattern 0 0 0 0 1 driving, via a weighted sum of inputs, the category node for A.] Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
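The strengthening-with-repeated-activation described above can be sketched as a Hebbian-style update. The particular rule and learning rate below are illustrative assumptions; Levine's chapter discusses several such learning laws:

```python
# Hebbian-style strengthening: a feature-to-category connection grows
# each time the feature node and the category node are active together.

def hebbian_update(weights, features, category_activity, lr=0.2):
    """Strengthen connections between co-active feature and category nodes."""
    return [w + lr * x * category_activity for w, x in zip(weights, features)]

A = [0, 0, 0, 0, 1]      # the slide's feature pattern for 'A'
w = [0.0] * 5            # connections from the 5 feature nodes to node 'A'
for _ in range(3):       # repeated presentation of the pattern
    w = hebbian_update(w, A, category_activity=1.0)
print(w)                 # only the active feature's connection strengthens
```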

  41. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet. The association between feature nodes and category nodes should be allowed to change over time (during training), more specifically as a result of repeated activation of the connection. [Figure: the feature pattern 0 1 0 0 1 driving, via weighted sums of inputs, the category nodes A and E.] Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)

  42. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet. Competition amongst the category nodes to win over an individual letter shape, and then to inhibit other neurons from responding to that shape. This is especially helpful when there is noise in the signal, e.g. sloppy writing. [Figure: feature nodes feeding, via weighted sums of inputs, competing category nodes.] Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)
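The competition described above is often modelled as winner-take-all: the category node with the largest weighted input responds and the rest are suppressed. A minimal sketch, in which the weight vectors and the noisy input are illustrative assumptions:

```python
# Winner-take-all competition among category nodes: the node with the
# largest weighted input sum wins and inhibits the others, which helps
# when the input is noisy (e.g. sloppy handwriting).

def weighted_sum(weights, features):
    return sum(w * x for w, x in zip(weights, features))

def winner_take_all(activations):
    """Only the most active category node responds; the rest are inhibited."""
    winner = max(range(len(activations)), key=activations.__getitem__)
    return [1 if i == winner else 0 for i in range(len(activations))]

# Two category nodes ('A' and 'E') with illustrative weight vectors,
# responding to a noisy 'A'-like feature pattern.
W = {"A": [0.1, 0.0, 0.1, 0.0, 0.9], "E": [0.1, 0.8, 0.1, 0.0, 0.5]}
noisy_A = [0, 0.2, 0, 0, 1]           # mostly 'A', with some noise
acts = [weighted_sum(W["A"], noisy_A), weighted_sum(W["E"], noisy_A)]
print(winner_take_all(acts))          # [1, 0]: the 'A' node wins
```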

  43. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet: e.g. 2^5 patterns for representing 32 letters. The line patterns (vertical, horizontal, short and long strokes) and circular patterns in the Greek alphabet can be represented in a binary system, e.g.:
θ  0 1 0 0 1
ι  0 0 1 0 1
ε  0 1 0 0 1
α  0 0 0 0 1
υ  0 1 1 0 1
Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)

  44. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet. Since the weights change with repeated presentation, our system will learn to 'identify' Greek letters of the alphabet. [Figure: the feature pattern 0 0 0 0 1 driving, via a weighted sum of inputs, the category node for α.] Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)

  45. Neural Networks: Real and Artificial. Organising Principles and Common Themes: association between neurons and competition amongst the neurons. A network for identifying handwritten letters of the alphabet. The association between feature nodes and category nodes should be allowed to change over time (during training), more specifically as a result of repeated activation of the connection. [Figure: the feature pattern 0 1 0 0 1 driving, via weighted sums of inputs, the category nodes α and ε.] Levine, Daniel S. (1991). Introduction to Neural and Cognitive Modeling. Hillsdale, NJ & London: Lawrence Erlbaum Associates, Publishers (See Chapter 1)

  46. Artificial Neural Networks and Learning. Artificial Neural Networks 'learn' by adapting in accordance with a training regimen: the network is subjected to particular information environments on a particular schedule in order to achieve the desired end result. There are three major types of training regimen, or learning paradigms: SUPERVISED, UNSUPERVISED, and REINFORCEMENT (or GRADED) learning.

  47. Biological and Artificial NNs

  48. Biological and Artificial NNs

  49. Biological and Artificial NNs

  50. Biological and Artificial NNs