  1. Percolation, Cluster and Pair Correlation Analysis (L22) Texture, Microstructure & Anisotropy, Fall 2009 A.D. Rollett, P. Kalu Last revised: 22nd Nov. ‘09

  2. Objectives • Introduce percolation analysis as a tool for understanding the properties of microstructures. • Apply percolation to electrical conductivity as an example of the dependence of a key transport property on grain boundary properties and texture. • Introduce cluster analysis via nearest neighbor distances. • Introduce pair correlation functions to analyze medium-range correlations in the positions of objects (e.g. precipitates).

  3. References • D. Stauffer, Introduction to Percolation Theory, 2nd ed., Taylor and Francis, 1992. • The first mention of percolation theory is S.R. Broadbent and J.M. Hammersley, “Percolation processes I. Crystals and mazes,” Proceedings of the Cambridge Philosophical Society, 53, 629-641 (1957). • S. Torquato, Random Heterogeneous Materials: Microstructure and Macroscopic Properties, Springer Verlag, 2001 (ISBN 0-387-95167-9).

  4. Definitions • Percolation is the study of how systems of discrete objects are connected to each other. • More specifically, percolation is the analysis of clusters - their statistics and their properties. • The applications of percolation are numerous: phase transitions (physics), forest fires, epidemics, fracture … • Applications in microstructure include conductivity, fracture, corrosion, clustering, correlation, particle analysis … • Transport Properties are particularly suited to percolation analysis because communication or transmission between successive neighboring elements is key.

  5. Notation: Percolation
  • p := probability of a bond (connection) between sites
  • ps := probability of a site being occupied
  • pc := percolation threshold (critical probability for the network to percolate)
  • s := size of a cluster
  • S := mean size of clusters, <s>
  • ns := number of clusters of size s
  • ws := probability that a given site belongs to a cluster of size s
  • d := dimensionality of the system
  • ξ := correlation length
  • r := radius, distance between sites
  • g(r) := correlation function (connectivity function)
  • c := proportionality constant
  • τ := critical exponent on cluster size, s
  • σ := critical exponent on the proportionality constant, c
  • P := fraction of sites in the critical (infinite size) network, or “power”
  • β := critical exponent on the size of the critical network
  • γ := critical exponent on the average cluster size, S
  • L := size of a finite system
  • Π := probability that a finite system will percolate
  • pav := average percolation threshold in a finite system
  • ν := critical exponent on the average threshold probability, pav

  6. Notation: Cluster Analysis
  • r := radius, distance
  • <rk> := average distance to the kth nearest neighbor
  • k, or n := order number of the nearest neighbor (1 = first, 2 = second, etc.)
  • pk := Poisson process probability for the occurrence of k objects in a given time or space interval
  • α, or λ := expected value (e.g. a density)
  • ∆t := time (space) interval
  • X := number of objects for evaluation of a cumulative probability
  • d := dimensionality
  • Γ := Gamma function
  • Wk := cumulative probability of finding at least k objects in a given volume
  • Vd := volume of an object in d dimensions
  • Sd := surface of an object in d dimensions
  • wk := probability density of the distance to the kth neighboring object

  7. Notation: Pair Correlation Function
  • N := total number of particles
  • n := chosen number of particles
  • r := radius, distance
  • PN(rN) drN := specific N-particle probability density function
  • ρn(rn) := generic n-particle probability density function
  • gn(rn) := n-particle correlation function (radial distribution function/RDF in 1D)
  • ρ = N/V := number density of particles
  • g2(r12) := 2-particle or pair correlation function

  8. An example • It is always easier to understand a concept with a picture, so let’s see what clusters mean in 2 dimensions on a square lattice. If we populate some of the cells (LHS) we can see that there are cases where the dots fall into neighboring cells. If we then draw in all the cases where these nearest neighbor links exist (vertical or horizontal bars, RHS) then we can connect cells together into clusters. One cluster is colored red, and the other one blue. Obviously they have different sizes. The isolated dots are left in black. This example is called site percolation.

  9. Percolation Threshold • A key concept is the percolation threshold, pc. • For site percolation (to be defined next), there is a critical concentration (of occupied sites), above which a cluster exists that spans the domain, i.e. connects the left hand edge to the right hand edge. • Example: for a square 2D lattice and bond percolation, the percolation threshold = 0.5. This same value is found for the triangular lattice and site percolation. • In general, mathematically exact results are available for some lattice types in 2D but rarely in 3D.
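As an illustrative sketch (not part of the original lecture), the spanning test is easy to simulate. The snippet below assumes Python with NumPy and SciPy, uses site percolation on a square lattice (whose threshold is ≈ 0.593, distinct from the bond value quoted above), and checks whether any cluster connects the left edge to the right edge:

```python
import numpy as np
from scipy import ndimage

def spans(p, L, rng):
    """One realization of site percolation on an L x L square lattice:
    occupy sites with probability p, label 4-connected clusters, and test
    whether any cluster touches both the left and right edges."""
    occupied = rng.random((L, L)) < p
    labels, _ = ndimage.label(occupied)  # 4-connectivity by default in 2D
    common = np.intersect1d(labels[:, 0], labels[:, -1])
    return bool(common[common > 0].size)

rng = np.random.default_rng(0)
L, trials = 128, 200
for p in (0.55, 0.59, 0.63):
    frac = sum(spans(p, L, rng) for _ in range(trials)) / trials
    print(p, frac)  # spanning fraction rises sharply near pc ~ 0.593
```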

  10. Site vs. Bond Percolation • For bond percolation, imagine that there is a bond or line drawn between each lattice site. Each bond has a certain probability, p, of being “good” or “existing” or “closed”, where the terminology depends on the field of enquiry. • Conversely, the discussion so far has been on site percolation where there is a certain probability, p, of any site being occupied, but perfect connectivity (i.e. good bonds) between nearest neighbor occupied sites. • As an example, when we discuss electrical conductivity in HTSC films, we will be dealing with bond percolation because it is the grain boundaries that determine the properties.

  11. Lattice Types • Although the use of different lattices is obvious to those who have written computer codes for numerical modeling of microstructures, the figure from Stauffer below illustrates three lattices: triangular and honeycomb in 2D, and simple cubic in 3D.

  12. Percolation Thresholds • Note that 2D grain structures can be regarded as being very close to hexagonal tilings, like the honeycomb lattice, or that the boundary networks contain mainly tri-junctions, like the triangular lattice. • 3D grain structures can be regarded as being similar to the fcc lattice. • Thus properties that are sensitive to percolation thresholds (e.g. fracture at weak boundaries) often exhibit thresholds that are similar to the values displayed in the table.

  13. Cluster Size - 1D • In order to discuss clusters, we need some definitions. • This is most easily done in ONE dimension, because exact solutions are available for 1D (but not, of course, for higher dimensions!): every site must be occupied for percolation, so pc = 1. • ns is the number of clusters of size s per lattice site. Note how the definition is given in terms of each individual site; this quantity is also called the normalized cluster number. • The probability that a given lattice site is a member of a cluster of size s is given by the product of the cluster size and ns. • These probabilities are related to the occupation probability in a simple way: ps = Σs ns s. • [Stauffer p. 21]

  14. Cluster Size - 1D, contd. • The probability, ws, that a given site belongs to a cluster of size s is obtained by dividing the probability that the site belongs to that size class, ns s, by the occupation probability, ps: ws = ns s / ps = ns s / Σs ns s. • Thus, the average cluster size, S, is given by: S = Σs s ws = Σs {ns s² / Σs' ns' s'}. • This definition of cluster size is also valid for higher dimensions (d > 1), although the infinite cluster must be excluded from the sums. • Lattice animals are very similar but are derived from graph theory.

  15. Cluster Size in 1D • To obtain the mean cluster size, S, in terms of the transition probability, p, significant work must be done, even in 1D. • The result, however, is very simple and elegant (p < pc): • It tells us that the cluster size diverges (goes to infinity) for probabilities as they approach the critical value, pc, which of course equals 1 in 1D.
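The 1D result, as given in Stauffer, is:

S = \frac{1+p}{1-p} \;\propto\; \frac{1}{p_c - p}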

  16. Correlation Length, ξ • It is useful to define a correlation function, g(r), that is the probability that a site at a distance r from an occupied site belongs to the same cluster. In 1D, this means that every site in between must be occupied, so the probability equals the occupation probability, p, raised to the rth power: g(r) = p^r. • Thus the correlation function (or connectivity function) decays exponentially to zero with distance, where ξ is the correlation length:
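In exponential form (which defines ξ):

g(r) = p^r = \exp\!\left(-\frac{r}{\xi}\right)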

  17. Correlation Length, ξ, contd. • The correlation length below the transition is given by the expression below. • Interestingly, in 1D it is proportional to the cluster size, S ∝ ξ. In higher dimensions, the relationship is more complex.
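From g(r) = p^r, the 1D correlation length is:

\xi = -\frac{1}{\ln p} \;\approx\; \frac{1}{p_c - p} \qquad (p \to p_c = 1)

(the approximation follows from expanding ln p about p = 1).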

  18. Higher Dimensions • To discuss higher dimensionalities, we need to explain that we are interested in behavior near the critical point, i.e. what happens when a system is about to make a transition from non-percolating to percolating. More precisely, we say that |p - pc| << 1. Leaving out much of the (important, but time-consuming) detail, the probability that a given point belongs to a cluster of size s turns out to be given by an expression like this:
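Following Stauffer's scaling form, the cluster numbers near threshold obey n_s ∝ s^{-τ} e^{-cs}, so the probability that a given site belongs to a cluster of size s is:

n_s\, s \;\propto\; s^{\,1-\tau}\, e^{-c s}, \qquad |p - p_c| \ll 1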

  19. Higher Dimensions, contd. • Note the appearance of an exponent τ, which turns out to be one of a set of “critical exponents”. The proportionality constant, c, is also described by an equation with another critical exponent, σ. • Then we can write a similar expression for the fraction of sites in the critical (infinite) network, P, which will be of particular interest for conductivity.
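In Stauffer's notation these take the form:

c \;\propto\; |p - p_c|^{1/\sigma}

and, since every occupied site belongs either to a finite cluster or to the infinite network (for p > pc):

P + \sum_s n_s\, s = p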

  20. Higher Dimensions, contd. • By further derivations, one can find that there is a simple relation between P (the fraction of sites in the critical network) and the deviation from the critical transition probability, with a new critical exponent, β. • Finally, we find a corresponding scaling law for the average cluster size, S.
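The standard scaling forms are:

P \;\propto\; (p - p_c)^{\beta} \quad (p > p_c), \qquad S \;\propto\; |p - p_c|^{-\gamma}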

  21. Critical Exponents • The table shows values of the critical exponents for a variety of situations. • Note that the values of the exponents do not depend on the type of lattice but only on the dimensionality of the problem. • The Bethe Lattice is a special type of lattice (not very realistic!). http://www.ibiblio.org/e-notes/Perc/perc.htm Note: a Bethe lattice or Cayley tree, introduced by Hans Bethe in 1935, is a connected cycle-free graph where each node is connected to z neighbours, where z is called the coordination number. It can be seen as a tree-like structure emanating from a central node, with all the nodes arranged in shells around the central one. The central node may be called the root or origin of the lattice.

  22. Critical Exponents • Not only is it remarkable that these exponents depend only on the dimensionality of the problem, but there are definite, theoretically derivable relationships between them. We give two of the basic relationships here.
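The two relationships, in the standard notation following Stauffer, are:

\beta = \frac{\tau - 2}{\sigma}, \qquad \gamma = \frac{3 - \tau}{\sigma}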

  23. Finite Size Systems • The percolation threshold becomes a probabilistic quantity for systems that are not infinite in size. In plain language, there is a certain probability that a spanning cluster exists in a given realization of a lattice with a specified filling probability. • The next step in this analysis is to analyze probabilities of the occurrence of spanning clusters.

  24. Percolation Cluster Examples • A spanning cluster is one that crosses completely from one side to the other (or top to bottom). • Non-spanning cluster shown in the picture. • See http://www.physics.buffalo.edu/gonsalves/ComPhys_1998/Java/Percolation.html for a java applet that allows you to experiment with percolation thresholds in 2D.

  25. Finite Size Systems • For systems of finite size, L, the transition from non-percolating to percolating is “fuzzier”. More precisely, there is a finite probability, Π, that a large enough cluster (of the right shape) will occur in a given realization of a finite system and span the domain. • As the system increases in size, the probability of this happening decreases for a given deviation, pc - p, of the transition probability below the critical value. • The graph reproduced from Stauffer shows the behavior schematically: the solid line for Π(L<∞) has a finite slope over an appreciable range of p. The dashed line shows the probability density for the same quantity.

  26. Approach to Critical Point • If one considers how to extract the critical probability, one approach is to seek the inflection point in dΠ/dp. More properly, one must integrate the slope of the spanning probability. • One then finds that the measured threshold approaches the true critical value as a function of the system size, through one of the “critical exponents”, ν. • Although actual values of the exponent are known, in practice one has to find the best value by inspection of, say, simulation results. It is possible (e.g. in certain 2D systems) for there to be no variation with system size, for cases in which the transition is symmetrical.
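The finite-size scaling relation takes the form:

p_{av} - p_c \;\propto\; L^{-1/\nu}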

  27. Electrical Conductivity • One obvious application of percolation analysis is to electrical conduction in materials with weak links, e.g. high temperature superconductors. Although the application of percolation may seem straightforward, the actual dependence of conductivity on the transition probability is not as simple as equating the conductivity to the strength, P, of the infinite (spanning) network. Think of the strength as the fraction of sites/cells that are part of the spanning network. Stauffer and Aharony quote a result from Last & Thouless (1971) in which the conductivity (solid line) increases considerably more slowly from the critical level than does the cluster strength (dashed line).

  28. High Tc Superconductors • As a result of the development of technologies that deposit (ceramic oxide) superconductors onto long lengths (>1 km!) of metal substrate tapes, analysis of the percolative nature of microstructures has been actively investigated. • The orientations of the grains in the nickel substrate are carried through to the grains in the superconductor layer (via epitaxial growth), see e.g. http://www.ornl.gov/HTSC/htsc.html and http://www.lanl.gov/orgs/mpa/stc/index.shtml . • See http://www.amsc.com/ for engineering applications.

  29. Boundaries in Hi-Tc Superconductors • The critical property of interest in the ceramic oxide superconductors is the strong inverse correlation between misorientation and the ability to transmit current across a grain boundary. This plot from Heinig et al. [Appl. Phys. Lett., 69 (1996) 577] shows how strongly boundaries above a certain angle effectively block current.

  30. Magneto-optical Imaging • Feldmann et al. [Appl. Phys. Lett., 77 (2000) 2906] have used magneto-optical (MO) imaging to great effect to reveal the effects of microstructure on electrical behavior. • The micrographs show EBSD maps of surface orientation for the Ni substrate in (a) and boundaries with ∆θ ≥ 1° in (d). The next column shows a percolation map in (b), such that connected points are shown in a single color, with boundaries ∆θ ≥ 4° in (e). The MO image of current density in the overlying YBCO film (~1 µm) is shown in (c); light color indicates low current density. (f) shows boundaries with ∆θ ≥ 8°.

  31. Crystallographic Effects on Percolation • Schuh et al. [Mat. Res. Soc. Symp. Proc., 819, (2004) N7.7.1] have shown that, although standard percolation theory is applicable to analysis of materials properties, the existence of texture results in strong correlations between each link of the network, where properties depend on grain boundary characteristics. • Standard percolation theory assumes that the strength (or probability of a connection) for each link is independent of all others in the system. • Grain boundaries meet at triple junctions (topology of boundary networks) and so one of the 3 boundaries must be a combination (product, in a sense) of the other two. • The impact is significant. To paraphrase the paper, for conductivity in simulated 2D networks of grains and associated boundaries, the percolative threshold from non-conducting to conducting is between 0.31 and 0.336 for different standard texture types, whereas the theoretical threshold for a random network (triangular mesh) is 0.347.

  32. Other Analyses: Neighbor Distances, Pair Correlation • Many examples of microstructures involve characterization of two-phase systems. • If the material contains a dispersion of particles in a matrix, there are many types of analysis that can be applied. • If we are interested in the clustering (or separation) of the particles, we can examine inter-particle distances. • If we are interested in the spatial distribution of the particles, we can characterize pair correlation functions.

  33. kth-Neighbor Analysis • If particles are clustered together, the distances between them will be small compared to the average distance. • Therefore, it is useful to measure the average distance, <rk>, between each particle and its kth neighbor, as a function of k. • If particles are placed randomly, this quantity can be described analytically (equations to be described). • In the simplest, 1D case, this quantity is proportional to the neighbor number. • In 2D, the function is more complicated but <rk> varies approximately as √k.

  34. Kth Neighbor Example • In this example from Tong et al., the analysis was performed on nucleus spacings during recrystallization to examine the dimensionality of nucleation, i.e. whether new grains appearing on lines were effectively random, or whether the restriction to lines was significant. The result shown by circles is for closely spaced nuclei on boundaries, for which the latter was the case. W.S. Tong et al., “Quantitative analysis of spatial distribution of nucleation sites: microstructural implications,” Acta Materialia 47(2) (1999) 435-445.

  35. Kth-Neighbor Analysis • There is a series of equations that are needed in order to understand the theoretical basis for <rk>. • The first is taken from standard probability theory for the “Poisson process”. This theory gives us a quantitative basis for predicting the probability that a given event will occur in a given interval of time or space. • Useful examples for applying the concept include radioactive decay (how likely it is that a decay event will be observed in a specified time interval, based on an average count rate) and counting trees in a forest (how likely it is that a specified number of trees will be found in a given area, based on an average tree density).

  36. Poisson Process Probability • We begin by summarizing the basic theory of the Poisson process for predicting the probability that a given event will occur within a certain interval of time or space. This is easiest to understand with the help of practical, physical examples. • As an example of a time-based Poisson process, consider radioactive decay. If we measure a sample of a radioactive substance, such as a uranium-bearing mineral, with a Geiger counter, we will obtain a certain number of counts per minute. In statistical terms, the count rate is the expected value, or rate of the process, for the event of interest, i.e. a single radioactive decay event. We will call this expected value “alpha” (α); in some texts this is written as <n>. • The critical assumptions that permit us to apply the basic theory of the Poisson process are as follows:
  1. The expected value, α, multiplied by a given time interval, ∆t, is the probability, α∆t, that a single event will occur in that time interval.
  2. The probability of no events occurring in that same time interval is 1 - α∆t, which requires that the probability of more than one event occurring in that same interval is of order ∆t (o(∆t)).
  3. The number of events in the given time interval is independent of the events that occur before the given time interval. Another way to say this is that the events are uncorrelated in time.

  37. Poisson, contd. • Based on these assumptions, we can write the probability, pk, of k counts being recorded in the time interval ∆t, based on an expected value, α, as follows.
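The Poisson probability is:

p_k = \frac{(\alpha \Delta t)^k}{k!}\, e^{-\alpha \Delta t}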

  38. Poisson, contd. • For space-based analysis, consider mapping out a forest and counting trees. The expected number of trees in a given area can be expressed as so many trees per hectare, α. The probability of finding 10 trees in two hectares, for example, is then simply the same expression, but with the area, A, substituting for the time interval. • So, if we count 131 trees per hectare, then the probability of finding only 10 trees in 2 hectares (clearly very unlikely) is worked out below. • Note that the formula contains unwieldy quantities from a numerical perspective, so it may be necessary to re-scale the number of interest, the area or time interval, and the expected value in order to make it possible to calculate a probability.
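With <n> = αA = 131 × 2 = 262, the Poisson formula gives:

p_{10} = \frac{262^{10}}{10!}\, e^{-262} \approx 7 \times 10^{-97}

A minimal Python check (an illustrative sketch, using the slide's own numbers), working in log space to dodge the unwieldy quantities the slide warns about:

```python
import math

alpha = 131     # expected trees per hectare (rate)
area = 2.0      # hectares
k = 10          # observed count of interest

lam = alpha * area                    # expected count <n> = 262
# evaluate log(p_k) = k*log(lam) - lam - log(k!) to avoid overflow/underflow
log_p = k * math.log(lam) - lam - math.lgamma(k + 1)
print(math.exp(log_p))                # ~ 7e-97: vanishingly unlikely
```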

  39. Poisson, contd. • Now, often a more useful probability is the one that describes how likely it is that at least k events will be observed in the specified interval. Since this probability, p(k ≥ X), is effectively a measure based on a cumulative distribution, a summation is required in order to arrive at the desired answer.
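In equation form:

p(k \geq X) = 1 - \sum_{k=0}^{X-1} \frac{(\alpha \Delta t)^k}{k!}\, e^{-\alpha \Delta t}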

  40. Cumulative Probabilities • Another way to understand this approach is to consider precipitates in a material. If the particles are located in the material in a completely random fashion, then we can model their positions on the basis of a Poisson process. Thus we can write the probability of observing n particles in a given area by the following, where <n> is the expected/average value for that number of particles in the specified region or time interval.
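In this notation the probability reads:

p_n = \frac{\langle n \rangle^{n}}{n!}\, e^{-\langle n \rangle}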

  41. Nearest Neighbor Distances • Now we can extend this approach to relate it to nearest neighbor distances between particles. Let’s define a density of particles as λd, so that the standard notation in 3D would be NV. For a given volume, Vd, where “d” denotes the dimension of the space (normally 2 or 3), the expected value that we are interested in is given by <n> = λd Vd (= NV Vd in 3D). From statistical mechanics [e.g. R.K. Pathria, Statistical Mechanics, Pergamon Press, New York, 1972], we know that the volume, Vd, and surface area, Sd, of a region of size (radius) r are given by the following, where Γ is the gamma function:
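The standard results are:

V_d = \frac{\pi^{d/2}}{\Gamma(1 + d/2)}\, r^d, \qquad S_d = \frac{d\, V_d}{r} = \frac{2\pi^{d/2}}{\Gamma(d/2)}\, r^{d-1}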

  42. Nearest Neighbor Distances, contd. • From these, one can find the cumulative probability function, Wk, for the probability of finding at least k particles in the volume of interest (see the basic equation above). • From this, we can find the probability density of the distance to the kth nearest neighbor by differentiation (the clever trick in all this!) of this quantity.
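Substituting <n> = λd Vd into the cumulative Poisson expression and differentiating with respect to r:

W_k(r) = 1 - \sum_{j=0}^{k-1} \frac{(\lambda_d V_d)^j}{j!}\, e^{-\lambda_d V_d}, \qquad w_k(r) = \frac{dW_k}{dr} = \frac{\lambda_d S_d\, (\lambda_d V_d)^{k-1}}{(k-1)!}\, e^{-\lambda_d V_d}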

  43. Nearest Neighbor Distances, contd. • This probability density function is not immediately useful to us, so we form an average by taking the first moment, i.e. integrating the product of the density, wk, and the radius, r, from 0 to infinity.
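\langle r_k \rangle = \int_0^{\infty} r\, w_k(r)\, dr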

  44. Nearest Neighbor Distances, contd. • Evaluating this expression for 1, 2 and 3 dimensions, we obtain the results below.
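The standard Poisson-process results are:

\langle r_k \rangle = \frac{k}{2\lambda_1} \;\;\text{(1D)}, \qquad
\langle r_k \rangle = \frac{\Gamma(k + 1/2)}{\Gamma(k)\,\sqrt{\pi \lambda_2}} \;\;\text{(2D)}, \qquad
\langle r_k \rangle = \frac{\Gamma(k + 1/3)}{\Gamma(k)\,(4\pi \lambda_3 / 3)^{1/3}} \;\;\text{(3D)}

Note that in 1D <rk> grows linearly with k, while in 2D Γ(k+1/2)/Γ(k) ≈ √k, consistent with the √k behavior noted earlier.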

  45. 2D Examples • Here are the results from Tong et al. of calculating the 2D nearest-neighbor average distance, using both the theory given above and results from distributing points at random over a plane and measuring interparticle distances directly. Note that in order to accommodate different particle densities (λ2 = NA), the vertical axis is normalized by the density. The theoretical result and the numerical ones lie essentially on top of one another, i.e. the agreement is near perfect.
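A minimal numerical sketch of this comparison (assuming NumPy and SciPy; not the authors' code) scatters points at random, measures <rk> with a k-d tree, and evaluates the 2D formula above:

```python
import numpy as np
from math import lgamma, exp, sqrt, pi
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
n, box = 20000, 100.0            # particle count and square domain edge (arbitrary)
lam = n / box**2                 # areal density, lambda_2 = N_A
pts = rng.uniform(0.0, box, size=(n, 2))
tree = cKDTree(pts)

kmax = 10
dists, _ = tree.query(pts, k=kmax + 1)   # neighbor 0 is the point itself

for k in range(1, kmax + 1):
    measured = dists[:, k].mean()
    # theory: Gamma(k + 1/2) / (Gamma(k) * sqrt(pi * lambda_2))
    theory = exp(lgamma(k + 0.5) - lgamma(k)) / sqrt(pi * lam)
    print(k, round(measured, 4), round(theory, 4))
```

Edge effects bias the measured distances slightly upward; for a domain much larger than <rk>, the two columns agree closely, as in the plot.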

  46. Application to Nucleation • Tong et al. exploited this analysis to diagnose the degree to which nuclei for a phase transformation, examined in cross section (therefore 2D), were behaving as clusters on grain boundaries (and thus obeying the 1D nearest-neighbor characteristic) versus being effectively dispersed at random throughout the material (thus 2D behavior). • The plot shows the normalized average distance to the kth neighbor for two sets of nuclei distributed randomly along grain boundaries in a 2D microstructure. The circles correspond to a case with a high density of points, such that points with k < 10 cluster as if on lines. In the second case (triangles), the low density of points means that the nuclei all behave as if they were randomly scattered throughout the area.

  47. Grain Morphology • The difference between the low (“a” - triangles) and high (“b” - circles) nucleus densities illustrated above is shown by these images of the fully transformed structures.

  48. Pair Correlation Functions • A simple concept that turns out to be very useful in particle analysis is that of pair correlation functions. • This is also an important concept in particle physics. • Conditional (2-point) probability function: given a vector whose tail is located within a particle, what is the probability that the other end (head) of the vector falls inside another particle (of the same phase)? • The average probability (over all vectors) is just the volume fraction of particles. • If particles are highly correlated in position in a given direction, then this probability will be higher than the volume fraction for vectors parallel to the correlated direction. • We can illustrate the idea with an example in aerospace aluminum.
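A minimal sketch of the computation (illustrative, assuming NumPy; not the code used for the examples that follow): for a binary image, the joint 2-point probability over all vectors is the autocorrelation of the phase indicator function, conveniently computed with FFTs (which imposes periodic boundaries); dividing by the volume fraction converts it to the conditional probability defined above.

```python
import numpy as np

def two_point_probability(img):
    """img: 2D array of 0/1 (1 = particle phase).
    Returns the joint probability that both ends of a vector land in the
    particle phase, as a map over vector offsets (zero offset at center).
    Note: FFT autocorrelation assumes periodic boundary conditions."""
    f = np.fft.rfft2(img)
    corr = np.fft.irfft2(f * np.conj(f), s=img.shape) / img.size
    return np.fft.fftshift(corr)

# For an uncorrelated (random) medium the map is flat at (volume fraction)^2,
# except at zero offset, where it equals the volume fraction itself.
rng = np.random.default_rng(1)
img = (rng.random((256, 256)) < 0.1).astype(float)
pcf = two_point_probability(img)
center = tuple(s // 2 for s in pcf.shape)
print(pcf[center], img.mean())           # both ~ 0.1
print(pcf[center[0], center[1] + 20])    # ~ 0.01 = 0.1**2
# The conditional PCF described above is pcf / img.mean().
```

Correlated particle positions show up as off-center peaks in this map, which is the basis of the examples on the next two slides.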

  49. Pair Correlation: example • Strict definition: conditional 2-point probability. • [Figure: input image (500×500 pixels) and output PCF (401×401). The distance from the center of one dot to its 5th neighbor in the input (53 pixels) reappears as the distance from the center of the PCF to a bright (red) dot. Color in a PCF is scaled from black (low probability) to white (high probability).]

  50. PCF: correlation lengths • [Figure: example from Al 7075, an aerospace aluminum alloy. Optical image of the transverse (S-T) plane and the corresponding PCF; axes ND/S and TD/T. The PCF indicates ~2 particles per stringer.]
