Daniel Sánchez Portal, Ricardo Díez Muiño
Centro de Física de Materiales, Centro Mixto CSIC-UPV/EHU
**Electronic structure calculations: methodology and applications to nanostructures**

Quantum Monte Carlo methods in electronic structure calculations

Lectures on Quantum Monte Carlo methods:
- Tuesday, March 10th: 9.45-12.00, theoretical background
- Wednesday, March 11th: ??, 1st exercise
- Friday, March 13th: 9.45-12.30, 2nd exercise

**Outline**
- Brief introduction
- Monte Carlo methods
- Quantum Monte Carlo
- Examples: variational Monte Carlo, diffusion Monte Carlo, other methods

**Introduction**
- The fundamental object in quantum mechanics is the wave function.
- We want to find special wave functions such that $\hat{H}\Psi = E\Psi$, where $\hat{H}$ is the many-body Hamiltonian (kinetic energy plus electron-nucleus and electron-electron interactions).
- This is a fundamentally many-body equation!
- A large variety of methods have been proposed and are being used to solve this problem.

**Quantum Monte Carlo (QMC)**
The solution of the Schrödinger equation with a many-body wave function requires the evaluation of multi-dimensional integrals.
For instance, the expectation value of the energy involves integrals over all electron coordinates $\mathbf{R} = (\mathbf{r}_1, \dots, \mathbf{r}_N)$:

$$\langle E \rangle = \frac{\int \Psi^*(\mathbf{R})\,\hat{H}\,\Psi(\mathbf{R})\,d\mathbf{R}}{\int \Psi^*(\mathbf{R})\,\Psi(\mathbf{R})\,d\mathbf{R}}$$

In general terms, Quantum Monte Carlo (QMC) refers to the use of Monte Carlo methodology (i.e., random sampling) to numerically solve these integrals and/or solve differential equations intrinsic to quantum mechanics.

**Classical Monte Carlo**
The central idea of classical Monte Carlo methods is to represent the solution of a mathematical (or physical) problem by a parameter of a true or hypothetical distribution, and to estimate the value of this parameter by sampling from this distribution.

**Monte Carlo methods: general definition**
- Monte Carlo methods can be broadly described as statistical simulation methods, where "statistical simulation" means any method that uses sequences of random numbers to perform the simulation.
- Monte Carlo simulations are statistical and non-deterministic: each simulation will give a different result, but the results will agree within some statistical error.
- In general, all these methods based on random events follow a similar pattern:
  - A domain of possible inputs is defined.
  - Inputs are randomly generated from the domain.
  - A deterministic calculation is performed using the inputs.
  - The individual calculations are aggregated into the final result.
- Monte Carlo methods are naturally used to simulate random, or stochastic, processes. However, many Monte Carlo applications have no apparent stochastic content, such as the evaluation of a definite integral or the inversion of a system of linear equations.

**A textbook example: the calculation of the number π**
Step I. Draw a square on the ground, then inscribe a circle within it.
Step II. Uniformly scatter objects of uniform size throughout the square, for example grains of rice or sand.
Step III. Count the number of objects inside the circle, multiply by four, and divide by the total number of objects inside the square.
Step IV. The proportion of objects within the circle vs. objects within the square will approximate π/4, which is the ratio of the circle's area to the square's area, thus giving an approximation to π. The approximation of π becomes more accurate both as the grains are dropped more uniformly and as more are dropped.

**Monte Carlo methods: brief historical note**
Although statistical simulation methods have been used for centuries, the name "Monte Carlo" was coined by Nicholas Metropolis (inspired by Stanislaw Ulam's interest in poker) during the Manhattan Project of World War II, in Los Alamos, because of the similarity of statistical simulation to games of chance. The use of randomness and the repetitive nature of the process are analogous to the activities conducted at a casino.

First Monte Carlo article, in 1953: N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller and E. Teller, J. Chem. Phys. 21, 1087 (1953).

**A simple application: Monte Carlo integration**

**Monte Carlo integration: traditional algorithm**
Suppose we would like to perform the following integral over a region of volume V:

$$E = \int_{V} f(\mathbf{x})\, d\mathbf{x}$$

The plain Monte Carlo algorithm samples points uniformly from the integration region to estimate the integral and its error.
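The textbook π example above can be sketched directly; this is a minimal illustration, and the function and variable names are our own:

```python
import random

def estimate_pi(n_points, seed=0):
    """Estimate pi by uniformly scattering points in the square [-1, 1]^2
    and counting the fraction that falls inside the inscribed unit circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_points):
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:        # the point lies inside the circle
            inside += 1
    return 4.0 * inside / n_points      # area ratio is pi/4, so multiply by 4

print(estimate_pi(100_000))
```

As the slides note, the estimate improves as more points are dropped; its statistical error shrinks like 1/√N.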
If the sample has size N and the points in the sample are denoted by x1, …, xN, then the estimate for the integral is given by

$$E \approx \frac{V}{N} \sum_{i=1}^{N} f(x_i)$$

and the variance by

$$\sigma^2 \approx \frac{V^2}{N}\,\mathrm{Var}(f), \qquad \mathrm{Var}(f) = \frac{1}{N-1}\sum_{i=1}^{N}\bigl(f(x_i) - \langle f \rangle\bigr)^2 .$$

Monte Carlo integration gives only a probabilistic error bound: we can only give a probability that the Monte Carlo estimate lies within a certain range of the true value.

**Monte Carlo integration: variance-reducing techniques**
In the traditional Monte Carlo algorithm, the sample x1, …, xN used to evaluate the integral is randomly chosen over the full available domain, with equal probability for each value. Several techniques improve the result by modifying this choice:

Stratified sampling: Divide the full integration space into subspaces, perform a Monte Carlo integration in each subspace, and add up the partial results in the end. If the subspaces and the number of points in each subspace are chosen carefully, this can lead to a dramatic reduction in the variance compared with crude Monte Carlo, but it can also lead to a larger variance if the choice is not appropriate.

Control variates: Mathematically, this technique is based on the linearity of the integral:

$$\int f(x)\,dx = \int g(x)\,dx + \int \bigl[f(x) - g(x)\bigr]\,dx$$

If the integral of g is known, the only uncertainty comes from the integral of (f − g), which will have a smaller variance than f if g has been chosen carefully.

Importance sampling: discussed below.

**Monte Carlo integration: importance sampling**
Simple schemes for Monte Carlo integration can suffer from low efficiency. Many functions of interest have significant weight in only a few regions. For example, most of the contributions to an integral of a simple Gaussian are located near the central peak. In a simple Monte Carlo integration scheme, points are sampled uniformly, wasting considerable effort sampling the tails of the Gaussian.
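The plain estimator and its one-sigma error can be sketched for a one-dimensional example (a minimal illustration; the function names are our own):

```python
import math
import random

def mc_integrate(f, a, b, n, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [a, b],
    returning the estimate E ~ V <f> and its statistical error
    sigma ~ V sqrt(Var(f)/N)."""
    rng = random.Random(seed)
    samples = [f(rng.uniform(a, b)) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    volume = b - a
    estimate = volume * mean
    error = volume * math.sqrt(var / n)
    return estimate, error

est, err = mc_integrate(lambda x: x * x, 0.0, 1.0, 50_000)
print(est, err)   # the exact value of the integral is 1/3
```

Note that the quoted error is only probabilistic: the true value lies within one sigma of the estimate roughly 68% of the time.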
Techniques for overcoming this problem act to increase the density of points in regions of interest and hence improve the overall efficiency. These techniques are called importance sampling.

**Monte Carlo integration: importance sampling**
Mathematically, importance sampling corresponds to a change of integration variables:

$$E = \int f(x)\,dx = \int \frac{f(x)}{p(x)}\,p(x)\,dx$$

In practice, this is a way to choose the x_i values at which we sample so as to reduce the variance. The idea behind importance sampling is that certain values of the input random variables in a simulation have more impact on the parameter being estimated than others.

If $p(x) \ge 0$ and $\int p(x)\,dx = 1$, then we can associate p(x) with a probability distribution function. The integral can be evaluated as

$$E \approx \frac{1}{N}\sum_{i=1}^{N} \frac{f(x_i)}{p(x_i)}$$

but with the values x_i chosen according to the probability distribution p(x). One has to be careful with values of p(x_i) = 0!

We try to make p(x) as similar as possible to f(x). The best case would obviously be p(x) = a·f(x) (a being a constant, with a = 1/E for normalization). Then the sum above would be a normalized sum of N terms of constant value E, and the integral would be exact. But we do not know E (it is our goal!), so we cannot fix p(x) in this way!
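The Gaussian example above can be sketched as follows. We estimate $E = \int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}$ by drawing samples from a normal distribution p(x), which is close in shape to the integrand; this is a minimal sketch, and the names are our own:

```python
import math
import random

def importance_sample(n, seed=0):
    """Estimate E = integral of exp(-x^2) over the real line (exact: sqrt(pi))
    by drawing x_i from a standard normal p(x) and averaging f(x_i)/p(x_i).
    p(x) is nonzero everywhere f(x) is, so the weights are well defined."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)                            # sample from p(x)
        p = math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
        f = math.exp(-x * x)
        total += f / p                                     # weight f by 1/p
    return total / n

print(importance_sample(100_000))   # close to sqrt(pi) = 1.7725
```

Because f/p is bounded here, the variance of the weighted estimator is small; uniform sampling over a wide interval would waste most points in the tails, exactly as the slide describes.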
In practice, one uses probability distribution functions p(x) as close in shape to f(x) as possible.

**Importance sampling in variational quantum Monte Carlo**

**Importance sampling: the Metropolis algorithm**
Generating samples according to a specified probability distribution: the Metropolis algorithm generates a random walk of points distributed according to a required probability distribution p(r1, r2, …, rN).
1. From an initial position (x1, x2, …, xN) in phase or configuration space, a proposed move to (y1, y2, …, yN) is generated.
2. The move is either accepted or rejected according to the Metropolis rule.

The Metropolis algorithm is based on the notion of detailed balance, which describes equilibrium for systems whose configurations have probability proportional to the Boltzmann factor. It can be interpreted as sampling the space of possible configurations in a thermal way. Consider two configurations A and B, each of which occurs with probability proportional to the Boltzmann factor:
1. Starting from a configuration A, with known energy E_A, make a change in the configuration to obtain a new (nearby) configuration B.
2. Compute E_B (typically as a small change from E_A).
3. If E_B < E_A, accept the new configuration, since it has lower energy (a desirable thing, according to the Boltzmann factor).
4. If E_B > E_A, accept the new (higher-energy) configuration with probability p = e^(−(E_B−E_A)/T).
This means that when the temperature is high, we don't mind taking steps in the "wrong" direction, but as the temperature is lowered, we are forced to settle into the lowest configuration we can find in our neighborhood.

If we follow these rules, then we will sample points in the space of all possible configurations with probability proportional to the Boltzmann factor, consistent with the theory of equilibrium statistical mechanics. We can compute average properties by summing them along the path we follow through possible configurations. Since one can start from any arbitrary state, it takes a finite number of steps to reach the desired equilibrium distribution.

- The equilibrium distribution p(x) is generated if the detailed balance condition is fulfilled:

$$p(x)\,P(x \to y) = p(y)\,P(y \to x)$$

- Detailed balance relates the transition probability P to the probability distribution p. Detailed balance is fulfilled for

$$P(x \to y) = \min\!\left(1, \frac{p(y)}{p(x)}\right)$$

- In words: if p(y) > p(x), one moves from x to y. Otherwise, the move is accepted only with probability p(y)/p(x).
- The algorithm generates the desired probability distribution in the limit of a large number of moves.

**Importance sampling: the Metropolis algorithm**
Generating samples according to a specified probability distribution:
- By taking a sufficient number of trial steps, all of phase space is explored, and the Metropolis algorithm ensures that the points are distributed according to the required probability distribution.
- For the Metropolis algorithm to be valid, it is essential that the random walk is ergodic: any point in configuration space may be reached from any other point.
- In some applications of the Metropolis algorithm, parts of configuration space may be difficult to reach.
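The acceptance rule min(1, p(y)/p(x)) described above can be sketched for a one-dimensional target distribution; this is a minimal illustration with names of our own, sampling a standard normal via its unnormalized log-density:

```python
import math
import random

def metropolis_sample(log_p, x0, n_steps, step_size=1.0, seed=0):
    """Generate a random walk whose points are distributed according to the
    (unnormalized) density exp(log_p(x)), using the Metropolis rule:
    accept a proposed move with probability min(1, p(y)/p(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        y = x + rng.uniform(-step_size, step_size)   # propose a nearby move
        delta = log_p(y) - log_p(x)
        if delta >= 0.0 or rng.random() < math.exp(delta):
            x = y                                    # accept; otherwise keep x
        samples.append(x)
    return samples

# Target: standard normal, log p(x) = -x^2/2 up to an additive constant.
chain = metropolis_sample(lambda x: -0.5 * x * x, x0=0.0, n_steps=200_000)
burn = chain[10_000:]                                # discard equilibration steps
mean = sum(burn) / len(burn)
var = sum((s - mean) ** 2 for s in burn) / len(burn)
print(mean, var)   # near 0 and 1 for the standard normal
```

Only the ratio p(y)/p(x) enters the rule, so the density never needs to be normalized; this is precisely what makes Metropolis sampling practical for Boltzmann factors and for |Ψ|².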
Long simulations or a modification of the algorithm are then necessary.

**Direct sampling versus Markov-chain sampling**
New candidates in the Metropolis algorithm are picked at random. A Markov chain is a process with a finite number of steps, in which the probability of being in a particular state at step n + 1 depends only on the state occupied at step n (loss of memory).

**Markov chains and random walks**
A Markov chain, named after Andrey Markov, is a stochastic (non-deterministic) process in which, given the present state, future states are independent of the past states (the Markov property). In other words, the description of the present state fully captures all the information that could influence the future evolution of the process. Future states will be reached through a probabilistic process instead of a deterministic one.
The Metropolis algorithm generates a Markov chain, which satisfies two conditions:
1. The outcome of each trial depends only on the preceding trial, and not on any prior trials.
2. Each trial belongs to a finite set of possible states.
An example of a Markov chain is a simple random walk, where the state space is the set of vertices of a graph and the transition steps involve moving to any of the neighbours of the current vertex with equal probability (regardless of the history of the walk).

**Summary of Monte Carlo**
- Monte Carlo integration offers an efficient tool for numerical evaluation of integrals in high dimensions.
- Importance sampling algorithms (Metropolis) improve the efficiency and accuracy of the calculation.
- The numerical error scales as 1/N^(1/2).

**Quantum Monte Carlo**
Calculating the electronic properties of matter using random numbers.
- The fundamental object in quantum mechanics is the wave function.
- We want to find special wave functions such that $\hat{H}\Psi = E\Psi$ (the many-body Schrödinger equation).
- This is a fundamentally many-body equation!
Quantum Monte Carlo looks for a direct representation of many-body effects in the wavefunction, at the cost of statistical uncertainty.

**Quantum Monte Carlo, 1st method: variational Monte Carlo**
Ertekin, Grossman, Wagner, Neaton (UC Berkeley and LBNL), http://www.nanohub.org

**General form of the wave function**
The trial wave function combines:
- a Slater determinant (Hartree-Fock),
- a two-body Jastrow factor,
- a three-body Jastrow factor,
with coefficients to optimize in the Jastrow factor.

**More on the trial wave functions**
Slater determinants and the Jastrow factor: it is important that the trial
wavefunction satisfies as many known properties of the exact wavefunction as possible. A determinantal wavefunction is correctly antisymmetric with respect to the exchange of any two electrons.

**Jastrow factors**
The correlation functions u are chosen to minimize the energy of the system under consideration, by increasing the probability of particles being at distances that minimize their interaction energy. Well-chosen correlation functions include correlation effects more efficiently than CI-based approaches.
An additional local set of constraints that may be readily imposed are those for electron-electron and electron-nucleus coalescence. These constraints are the cusp conditions, and they constrain the derivatives of the wavefunction:
- ζ = −Z for electron-nucleus,
- ζ = 1/2 for electron-electron (spin antiparallel),
- ζ = 1/4 for electron-electron (spin parallel).

**Probability distribution function (PDF)**
- Rewrite the expectation value in terms of a local energy sampled from a probability distribution:

$$\langle E \rangle = \int E_L(\mathbf{R})\,P(\mathbf{R})\,d\mathbf{R}, \qquad P(\mathbf{R}) = \frac{|\Psi_T(\mathbf{R})|^2}{\int |\Psi_T(\mathbf{R})|^2\,d\mathbf{R}}, \qquad E_L(\mathbf{R}) = \frac{\hat{H}\Psi_T(\mathbf{R})}{\Psi_T(\mathbf{R})}$$

- Generate random walkers with probability equal to the above PDF and average (Metropolis method).
- Either minimize the variance of the local energy with respect to the parameters, or minimize the energy itself.

**Fluctuation minimization**
- One way to approach the eigenfunction is to minimize the deviation of the local energy from a constant: for an exact eigenfunction, $E_L(\mathbf{R})$ is constant and its variance vanishes.
- Example: the harmonic oscillator, comparing an eigenfunction with a non-eigenfunction.

The VMC algorithm consists of two distinct phases:
- In the first, a walker consisting of an initially random set of electron positions is propagated according to the Metropolis algorithm, in order to equilibrate it and begin sampling.
- In the second phase, the walker continues to be moved, but energies and other observables are also accumulated for later averaging and statistical analysis.

**Flow chart for the variational Monte Carlo procedure**

**Variational MC: strengths and weaknesses**
- Powerful, simple, and full of physical insight.
- No sign problem (Pauli principle: antisymmetric wave functions).
- Large class of possible trial wave functions.
- The method favors simple over complex states.
- The trial wave function is insensitive to long-range order.
- The result cannot be better than the trial wave function allows.
The major limitation of the variational QMC method appears obvious: what happens when the assumed trial wavefunction isn't accurate enough?

**Quantum Monte Carlo, 2nd method: diffusion Monte Carlo**
Main weakness of variational MC: what happens when the assumed trial wavefunction isn't accurate enough?
Main idea of diffusion MC: avoid the dependence on the trial wave function. For this purpose, project out the ground state from a trial wave function using a function that depends on the Hamiltonian. It is based on the imaginary-time Schrödinger equation.
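The two-phase variational Monte Carlo procedure described above (Metropolis sampling of |Ψ_T|², then accumulation of the local energy) can be sketched for the 1D harmonic oscillator; this is a minimal illustration in units ħ = m = ω = 1, with a Gaussian trial function of our own choosing:

```python
import math
import random

def vmc_energy(alpha, n_steps=100_000, step=1.0, seed=0):
    """Variational Monte Carlo for the 1D harmonic oscillator with trial
    wavefunction psi(x) = exp(-alpha x^2), for which the local energy is
    E_L(x) = alpha + x^2 (1/2 - 2 alpha^2).
    Phase 1 equilibrates the walker; phase 2 accumulates the local energy."""
    rng = random.Random(seed)
    log_psi2 = lambda x: -2.0 * alpha * x * x              # log |psi|^2
    local_e = lambda x: alpha + x * x * (0.5 - 2.0 * alpha * alpha)
    x = 0.0
    energies = []
    for i in range(n_steps):
        y = x + rng.uniform(-step, step)                   # Metropolis move
        delta = log_psi2(y) - log_psi2(x)
        if delta >= 0.0 or rng.random() < math.exp(delta):
            x = y
        if i >= n_steps // 5:                              # phase 2: accumulate
            energies.append(local_e(x))
    return sum(energies) / len(energies)

print(vmc_energy(0.5))   # alpha = 1/2 is the exact ground state: E = 0.5
print(vmc_energy(0.3))   # a poorer trial function gives E > 0.5
```

For α = 1/2 the trial function is an exact eigenfunction, so the local energy is constant and the variance is zero, illustrating the fluctuation-minimization idea; for any other α the estimate lies above the exact ground-state energy, as the variational principle requires.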