
Uncertainty in petroleum reservoirs



Presentation Transcript


  1. Uncertainty in petroleum reservoirs

  2. Finding the reservoir I

  3. Finding the reservoir II The subsurface is packed with density gradients: top and base of the reservoir (I think …). Interpreting this is a far cry from hard science.

  4. Geological properties Exploration well – we try to infer properties on the km scale from a point measurement.

  5. Porosity and permeability (Figure: illustrations of high vs. low porosity and high vs. low permeability.)

  6. Internal barriers? Interface depth? OK – what is inside this reservoir?

  7. Fluid properties (Figure: water-wet reservoir vs. oil-wet reservoir.)

  8. Uncertain factors • The geometry of the reservoir – including internal compartmentalization. • The spatial distribution of porosity and permeability. • Depth of fluid interfaces. • Fluid and fluid-reservoir properties. • …

  9. What to do with it? Deterministic models: Attempts at modelling and quantifying uncertainty are certainly made, but mainly in the form of variable (stochastic) input, not stochastic dynamics. Before production: A range of input values is tried out, and the future production is simulated. These simulations are an important basis for investment decisions. After production start: When the field is producing we have measured values of e.g. produced rates of oil, gas and water which can be compared with the simulated predictions → a misfit can be evaluated, and the models updated.

  10. History matching (or revisionism) • Select a set of ”true” observations you want to reproduce in your simulations. • Select a (limited) set of parameters to update. • Update your parameters as best you can. • Simulate your model and compare simulated results with observations. • Discrepancy below tolerance? If no, repeat the update; if yes, you have an updated model (see the sketch below).
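
A minimal sketch of this loop, assuming a toy linear stand-in for the reservoir simulator and a crude random-perturbation update; every name and value below is hypothetical and only illustrates the structure of the iteration:

    import numpy as np

    rng = np.random.default_rng(0)
    G = rng.normal(size=(5, 3))                              # toy linear "simulator"
    simulate = lambda m: G @ m                               # stands in for a reservoir run
    misfit = lambda d_sim, d_obs: float(np.sum((d_sim - d_obs) ** 2))

    def history_match(m, d_obs, tol=1e-6, max_iter=2000):
        best = m.copy()
        for _ in range(max_iter):
            if misfit(simulate(best), d_obs) < tol:          # discrepancy below tolerance?
                break                                        # yes: we have an updated model
            trial = best + 0.1 * rng.normal(size=best.shape) # no: perturb the parameters
            if misfit(simulate(trial), d_obs) < misfit(simulate(best), d_obs):
                best = trial                                 # keep the improvement
        return best

    d_obs = simulate(np.array([1.0, -2.0, 0.5]))             # the ”true” observations
    m_updated = history_match(np.zeros(3), d_obs)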

  11. History matching – it is just plain stupid Traditionally History Matching is perceived as an optimization problem – a very problematic approach: • The problem is highly nonlinear, and severely underdetermined. • The observations we are comparing with can be highly uncertain. • The choice of parameterization is somewhat arbitrary – we will optimize in the wrong space anyway.

  12. A probabilistic problem – Bayesian setting. {m}: model parameters, {d}: observed data. (Figure: diagram relating the prior, the likelihood and the posterior.)
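
The slide only shows the labels prior, likelihood and posterior; written out, the Bayes' rule it refers to is

    P(m \mid d) \;=\; \frac{P(d \mid m)\, P(m)}{P(d)} \;\propto\; P(d \mid m)\, P(m),

i.e. the posterior over the model parameters is the likelihood of the observed data times the prior, up to a normalizing constant.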

  13. The objective function Gaussian likelihood: P(d|m) ∝ exp( −(S(m) − d)^T C^{-1} (S(m) − d) ), where C is the covariance of the measurement errors, d the observed data, and S(m) the result from the simulator. Evaluation of S(m) requires running the simulator and is very costly.
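
A minimal sketch of evaluating the log of this likelihood for a given simulator output; the numbers are made up for illustration, and in practice d_sim would come from the costly S(m) run:

    import numpy as np

    def gaussian_log_likelihood(d_sim, d_obs, C):
        """log P(d|m) up to an additive constant, for the Gaussian likelihood
        exp(-(S(m) - d)^T C^{-1} (S(m) - d)); d_sim = S(m), d_obs = d,
        C = covariance matrix of the measurement errors."""
        r = d_sim - d_obs                          # residual between simulator and data
        return -float(r @ np.linalg.solve(C, r))   # avoids forming C^{-1} explicitly

    # Hypothetical usage: three observed rates with independent measurement errors.
    d_obs = np.array([100.0, 80.0, 60.0])
    d_sim = np.array([ 95.0, 85.0, 58.0])          # would be the output of S(m)
    C = np.diag([4.0, 4.0, 9.0])                   # measurement-error covariance
    print(gaussian_log_likelihood(d_sim, d_obs, C))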

  14. How to find the posterior?? EnKF: Data assimilation technique based on ”resampling” of a finite ensemble in a Gaussian approximation. Gives good results when the Gaussian approximation applies, and fails spectacularly when it does not. BASRA (McMC with proxy functions): Flexible and fully general approach. ”Guaranteed” to converge to the correct posterior, but the convergence rate can be slow.

  15. Kalman filter Kalman filter: Technique for sequential state estimation based on combining measurements and a linear equation of motion. Very simple example: the forecast state estimate is combined with a measurement to give an updated estimate, weighted by the forecast error (co)variance and the measurement error.
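
The slide lists only the ingredients (forecast, measurement, and their error variances); in the usual scalar notation – not shown on the slide – the update it sketches is

    x_a = x_f + K\,(d - x_f), \qquad K = \frac{\sigma_f^2}{\sigma_f^2 + \sigma_d^2}, \qquad \sigma_a^2 = (1 - K)\,\sigma_f^2,

where x_f is the forecast state estimate with error variance σ_f², d is the measurement with error variance σ_d², and x_a is the updated estimate.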

  16. EnKF • When the equation of motion is nonlinear, predicting the state covariance becomes difficult. The EnKF approach is to let an ensemble (i.e. a sample) evolve with the equation of motion, and use the sample covariance as a plug-in estimator for the state covariance. • Gaussian likelihood. • Gaussian prior. • A combined parameter and state estimation problem. • The updated state is a linear combination of the prior states. Computationally efficient – but limiting.
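
A minimal sketch of one EnKF analysis step (the standard perturbed-observation form), assuming a linear observation operator H and measurement-error covariance R; the dimensions and values are invented for illustration:

    import numpy as np

    def enkf_analysis(A, d_obs, H, R, rng):
        """One EnKF analysis step (perturbed-observation form).
        A: (n_state, n_ens) forecast ensemble of states/parameters,
        d_obs: observed data, H: observation operator, R: measurement-error covariance."""
        n_ens = A.shape[1]
        anomalies = A - A.mean(axis=1, keepdims=True)
        C = anomalies @ anomalies.T / (n_ens - 1)          # sample covariance as plug-in estimate
        K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)       # Kalman gain built from the ensemble
        D = d_obs[:, None] + rng.multivariate_normal(np.zeros(len(d_obs)), R, size=n_ens).T
        return A + K @ (D - H @ A)                         # each updated member is a linear
                                                           # combination of the prior members

    rng = np.random.default_rng(1)
    A = rng.normal(size=(4, 50))                           # 4 variables, 50 ensemble members
    H = np.array([[1.0, 0.0, 0.0, 0.0]])                   # we observe the first variable
    R = np.array([[0.1]])                                  # measurement-error covariance
    A_updated = enkf_analysis(A, np.array([0.5]), H, R, rng)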

  17. (Figure: an ensemble of models, each carrying Permeability, Porosity, Relperm and MULTFLT parameters, is integrated forward in time; at every observation the EnKF update A^a = A^f X replaces each member by a linear combination of the forecast members.)

  18. EnKF update: sequential (Figure: WOPR – well oil production rate – vs. time, with updates up to the last historical data point and the future prediction beyond it.) The EnKF method updates the models every time data is available. • When new data becomes available we can continue without ”going back”.

  19. BASRA Workflow • Select a limited (≲ 50) set of parameters {m} to update, with an accompanying prior. • Perturb the parameter set {m} → {m} + δ{m} and evaluate a new misfit O’({m}). • Accept the new state with probability P = min{1, exp(-δO({m}))}. • When this has converged we have one realization {m} from the posterior which can be used for uncertainty studies; repeat to get an ensemble of realizations. The evaluation of the misfit is prohibitively expensive, and advanced proxy modelling is essential.
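
A minimal sketch of the perturb/accept step as a random-walk Metropolis update; the quadratic objective below is a hypothetical stand-in for the real misfit, which would come from the simulator or a proxy of it:

    import numpy as np

    def metropolis_step(m, objective, step_size, rng):
        """One perturb/accept step. `objective` plays the role of the misfit O({m});
        in practice it is an expensive simulator run or a cheap proxy."""
        m_new = m + step_size * rng.normal(size=m.shape)   # {m} -> {m} + delta{m}
        delta_O = objective(m_new) - objective(m)
        if rng.random() < min(1.0, np.exp(-delta_O)):      # accept with P = min{1, exp(-dO)}
            return m_new
        return m

    rng = np.random.default_rng(2)
    objective = lambda m: float(np.sum(m ** 2))            # hypothetical stand-in misfit
    m = np.ones(5)
    for _ in range(1000):
        m = metropolis_step(m, objective, 0.2, rng)        # after convergence, m is one
                                                           # realization from the posterior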

  20. BASRA Results (Figures: convergence of the proxy functions; marginal posteriors compared with the prior; the posterior ensemble.)

  21. Current trends • Reservoir modelling usually involves a chain of weakly coupled models and applications – strive hard to update parameters early in the chain. • Update of slightly more exotic variables like surface shapes and the direction of channels. • The choice of parameterization is somewhat arbitrary – we will optimize in the wrong space anyway. A more systematic approach to choosing parameterization would be very valuable.
