
USING PROBABILISTIC NEURAL NETWORKS FOR ATTRIBUTE ASSESSMENT AND GENERAL REGRESSION





Presentation Transcript


  1. USING PROBABILISTIC NEURAL NETWORKS FOR ATTRIBUTE ASSESSMENT AND GENERAL REGRESSION By David Lubo-Robles*, Thang Ha, S. Lakshmivarahan, and Kurt J. Marfurt The University of Oklahoma

  2. Introduction
  • The Probabilistic Neural Network (PNN) is an interpolation method in which different mathematical operations are organized into layers, forming a neural network structure (Specht, 1990; Hampson et al., 2001).
  • Compared with Multi-Layer Feedforward Networks, which commonly use a sigmoid activation function, the PNN uses a Gaussian activation function (Specht, 1990).
  • The PNN was designed fundamentally for classification between two or more classes; however, it can be modified into the General Regression Neural Network (GRNN) for mapping/regression tasks (Masters, 1995).
  • Objectives:
  • Predict missing logs in well data.
  • Assess seismic attributes.
  • Predict reservoir properties using seismic attributes.
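A minimal sketch of the PNN idea described above, in hypothetical Python (not the AASPI implementation; the function name and toy data are invented for illustration): each training pattern becomes a Gaussian kernel, activations are averaged per class, and the class with the largest score wins.

```python
import numpy as np

def pnn_classify(x, patterns, labels, sigma=0.5):
    """Minimal PNN (after Specht, 1990): one Gaussian kernel per
    training pattern, summed per class; the largest sum wins."""
    d2 = np.sum((patterns - x) ** 2, axis=1)        # squared distance to each pattern
    act = np.exp(-d2 / (2.0 * sigma ** 2))          # pattern layer: Gaussian activations
    classes = np.unique(labels)
    scores = [act[labels == c].mean() for c in classes]  # summation layer
    return classes[int(np.argmax(scores))]          # output layer: winning class

# Two toy classes in a 2-D attribute space:
patterns = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [0.9, 1.1]])
labels = np.array([0, 0, 1, 1])
print(pnn_classify(np.array([0.05, 0.0]), patterns, labels))  # 0
```

The single smoothing parameter σ controls how far each training pattern's influence extends; the slides that follow show how it is chosen.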

  3. Multi-Layer Feedforward Network (MLFN):
  [Diagram: an input layer (i1, ..., i4), hidden layers, and an output layer connected by weights (W1, W2, ..., W13). Each unit computes a linear combination, net = W1*i1 + W2*i2 + ..., followed by a nonlinear activation function.]
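For contrast with the PNN's Gaussian kernels, a single MLFN unit as drawn above computes a weighted sum followed by a nonlinear (here sigmoid) activation. A minimal sketch under that assumption (hypothetical helper names):

```python
import numpy as np

def sigmoid(net):
    """Common MLFN activation: squashes net input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-net))

def mlfn_neuron(inputs, weights, bias=0.0):
    """One MLFN unit: net = W1*i1 + W2*i2 + ... , then a sigmoid.
    (The PNN replaces this sigmoid with a Gaussian kernel.)"""
    net = np.dot(weights, inputs) + bias   # linear part
    return sigmoid(net)                    # nonlinear part

# net = 0.5*1.0 + (-0.25)*2.0 = 0, so the output is sigmoid(0) = 0.5:
print(mlfn_neuron(np.array([1.0, 2.0]), np.array([0.5, -0.25])))  # 0.5
```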

  4. Probabilistic Neural Network (PNN):
  [Diagram: an input layer, a pattern layer with one unit per training pattern (p11, p12, p21, p22), a summation layer with one unit per class (Class 1, Class 2), and an output layer selecting the winning class.]

  5. General Regression Neural Network (GRNN):
  [Diagram: an input layer, a pattern layer (p11, ...), and a summation layer computing a numerator and a denominator; the output layer returns the predicted value as their ratio.]
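The numerator/denominator structure in the GRNN diagram is a Gaussian-weighted average of the training targets (Masters, 1995). A minimal sketch, assuming Euclidean distances and a single σ (hypothetical function name, not the AASPI code):

```python
import numpy as np

def grnn_predict(x, patterns, targets, sigma=0.5):
    """GRNN output for one input x:
    numerator   = sum_i( w_i * y_i )
    denominator = sum_i( w_i )
    where w_i is the Gaussian activation of training pattern i."""
    d2 = np.sum((patterns - x) ** 2, axis=1)   # squared distance to each pattern
    w = np.exp(-d2 / (2.0 * sigma ** 2))       # pattern-layer activations
    return np.dot(w, targets) / np.sum(w)      # summation layer -> predicted value

# Halfway between two patterns, the prediction is their average:
X = np.array([[0.0], [1.0]])
y = np.array([0.0, 10.0])
print(grnn_predict(np.array([0.5]), X, y))  # 5.0
```

Because the output is a weighted average of observed targets, the GRNN interpolates smoothly and never predicts outside the range of the training values.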

  6. GRNN:
  [Diagram: training data (pattern) and validation data (input). Attributes #1 and #2 versus depth/time in the training well are used to predict the property along the validation well.]

  7. Optimizing σ: exhaustive search. Cross-validation is used to train the neural network (Hampson et al., 2001).
  [Diagram: for each candidate smoothing parameter σ1, σ2, σ3, σ4, the property predicted from the training data is compared against the validation data, yielding errors e1, e2, e3, e4.]
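The exhaustive search over σ can be sketched as follows (hypothetical Python, not the AASPI implementation): each candidate σ is scored by leave-one-out cross-validation, predicting every training sample from all the others, and the σ with the lowest mean squared error is kept.

```python
import numpy as np

def grnn_predict(x, patterns, targets, sigma):
    """Gaussian-weighted average of training targets (GRNN output)."""
    d2 = np.sum((patterns - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.dot(w, targets) / np.sum(w)

def optimize_sigma(patterns, targets, sigmas):
    """Exhaustive search: score each candidate sigma by leave-one-out
    cross-validation and return the (sigma, error) pair with the
    lowest mean squared error."""
    best_sigma, best_err = None, np.inf
    n = len(targets)
    for s in sigmas:
        keep = np.ones(n, dtype=bool)
        err = 0.0
        for i in range(n):
            keep[i] = False                  # leave sample i out
            pred = grnn_predict(patterns[i], patterns[keep], targets[keep], s)
            err += (pred - targets[i]) ** 2
            keep[i] = True
        if err / n < best_err:
            best_sigma, best_err = s, err / n
    return best_sigma, best_err

# On a smooth linear property, a small sigma (local averaging) wins:
X = np.linspace(0.0, 1.0, 11).reshape(-1, 1)
y = 2.0 * X.ravel()
s, e = optimize_sigma(X, y, [0.05, 0.5, 5.0])
print(s)  # 0.05
```

This also illustrates the cost noted in the conclusions: the loop over candidate σ values makes runtime grow linearly with the size of the search range.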

  8. Attribute assessment: combination list.
  [Diagram: leaving out well #2, every attribute combination (1; 2; 3; 1,2; 1,3; 2,3; 1,2,3) is trained on the remaining wells and validated, yielding errors e1, e2, e3, e1,2, e1,3, e2,3, e1,2,3.]
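The combination list above can be sketched as an exhaustive loop over attribute subsets (hypothetical Python, not the AASPI code): each non-empty subset of attribute columns is used to predict the left-out well with a GRNN, and the subset with the lowest validation error is kept.

```python
import numpy as np
from itertools import combinations

def grnn_predict(x, patterns, targets, sigma):
    """Gaussian-weighted average of training targets (GRNN output)."""
    d2 = np.sum((patterns - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.dot(w, targets) / np.sum(w)

def exhaustive_grnn(X_train, y_train, X_val, y_val, sigma=0.1):
    """Try every non-empty attribute combination, predict the left-out
    (validation) well with a GRNN, and keep the combination with the
    lowest validation mean squared error."""
    n_attr = X_train.shape[1]
    best_combo, best_err = None, np.inf
    for r in range(1, n_attr + 1):
        for combo in combinations(range(n_attr), r):
            cols = list(combo)
            preds = np.array([grnn_predict(x[cols], X_train[:, cols],
                                           y_train, sigma) for x in X_val])
            err = np.mean((preds - y_val) ** 2)
            if err < best_err:
                best_combo, best_err = combo, err
    return best_combo, best_err

# Toy data: attribute 0 carries the signal; attributes 1 and 2 are constant:
X_tr = np.array([[0.0, 1.0, 2.0], [0.25, 1.0, 2.0], [0.5, 1.0, 2.0],
                 [0.75, 1.0, 2.0], [1.0, 1.0, 2.0]])
y_tr = X_tr[:, 0]
X_va = np.array([[0.1, 1.0, 2.0], [0.6, 1.0, 2.0]])
y_va = X_va[:, 0]
print(exhaustive_grnn(X_tr, y_tr, X_va, y_va)[0])  # (0,)
```

Note the combinatorial cost: with n attributes there are 2^n - 1 subsets to evaluate, which is why this exhaustive scheme is practical only for modest attribute counts such as the four logs used in the case study.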

  9. [Figure-only slide.]

  10. CASE STUDY: PREDICTING POROSITY USING WELL LOGS IN DIAMOND M FIELD, TX.

  11. Geologic background:
  • Diamond M Field is located in Scurry County, TX, approximately 80 mi northeast of Midland, Texas.
  • The trend is part of the Horseshoe Atoll reef complex, an arcuate chain of reef mounds made of mixed types of bioclastic debris that accumulated during the late Paleozoic in the Midland Basin (Vest, 1970).
  • The massive carbonates present in the Atoll are separated by correlative shale beds (Davogustto, 2013).
  [Map of the Top Reef, modified from Walker et al., 1991.]
  Wells: Garnet, Emerald, Topaz, Jade.

  12. Logs used for prediction: Gamma Ray (GR), Resistivity (RD), Density (ρ), and Photoelectric Factor (PEFZ).
  [Cross-plots of porosity versus GR (API), ρ (g/cm³), RD (ohm-ft), and PEFZ, each annotated with Pearson's correlation coefficient.]

  13. Combination lists.
  [Leaving out well Garnet: error = 0.00122. Leaving out well Emerald: error = 0.00125.]

  14. Combination lists.
  [Leaving out well Jade: error = 0.00256. Leaving out well Topaz: error = 0.000977.]

  15. [Summary of validation errors: 0.000977, 0.00256, 0.00122, and 0.00125.]

  16. Cross-plots.
  [Predicted versus measured porosity: Garnet (corr = 0.7004), Jade (corr = 0.6998), Emerald (corr = -0.0881), Topaz (corr = 0.7020).]

  17. Future work. [Figure-only slide.]

  18. Conclusions
  • PNN provides a faster learning speed than MLFN.
  • PNN is insensitive to outliers; therefore, it can be more accurate than MLFN.
  • GRNN is a powerful technique for predicting missing data using known properties as input.
  • The novel Exhaustive GRNN technique allows us to determine the best group of attributes for a regression task. Computation time increases as a larger range of σ is searched.
  • Future work:
  • We will apply our technique to seismic attributes in order to obtain a predicted rock-property survey.
  • We will expand our Exhaustive GRNN technique to classification tasks (Exhaustive PNN).
  • Simulated annealing and the gradient descent method will be implemented to obtain a different σ for each attribute and class.

  19. References
  • Davogustto, O., 2013, Quantitative geophysical investigations at the Diamond M Field, Scurry County, Texas: Ph.D. dissertation, The University of Oklahoma.
  • Hampson, D. P., J. S. Schuelke, and J. A. Quirein, 2001, Use of multiattribute transforms to predict log properties from seismic data: Geophysics, 66, 220-236.
  • Hughes, D., and N. Correll, 2016, Distributed machine learning in materials that couple sensing, actuation, computation and communication: arXiv:1606.03508v1.
  • Masters, T., 1995, Advanced algorithms for neural networks.
  • Specht, D., 1990, Probabilistic neural networks: Neural Networks, 3, 109-118.
  • Walker, D. A., J. Golonka, A. M. Reid, and S. T. Reid, 1991, The effects of late Paleozoic paleolatitude and paleogeography on carbonate sedimentation in the Midland Basin, Texas, in Permian Basin plays, Tomorrow's Technology Today: Society of Economic Paleontologists and Mineralogists, Permian Basin Chapter, 141-162.
