
Free Energy Estimates of All-atom Protein Structures Using Generalized Belief Propagation
Kamisetty H., Xing E.P. and Langmead C.J.
Presented by Raluca Gordan, February 12, 2008





Presentation Transcript


  1. Free Energy Estimates of All-atom Protein Structures Using Generalized Belief Propagation. Kamisetty H., Xing E.P. and Langmead C.J. Presented by Raluca Gordan, February 12, 2008

  2. Papers • Free Energy Estimates of All-atom Protein Structures Using Generalized Belief Propagation. Kamisetty H., Xing E.P. and Langmead C.J. • Constructing Free-Energy Approximations and Generalized Belief Propagation Algorithms. Yedidia J.S., Freeman W.T. and Weiss Y. • Understanding Belief Propagation and its Generalizations. Yedidia J.S., Freeman W.T. and Weiss Y. • Bethe Free Energy, Kikuchi Approximations, and Belief Propagation Algorithms. Yedidia J.S., Freeman W.T. and Weiss Y. • Effective Energy Functions for Protein Structure Prediction. Lazaridis T. and Karplus M.

  3. probabilistic graphical models • free energy • potential function • enthalpy • marginal probabilities • entropy • pair-wise MRF • inference • internal energy • factor graphs • Markov random field • Gibbs free energy • Bayes nets • belief propagation • region-based free energy • generalized belief propagation • region graph


  7. Free energy • Free energy = the amount of energy in a system that can be converted into work • Gibbs free energy = the amount of thermodynamic energy that can be converted into work at constant temperature and pressure • Enthalpy = the "heat content" of a system • Entropy = a measure of the degree of randomness or disorder of a system • G = H − T·S = (E + P·V) − T·S, where G = Gibbs free energy, H = enthalpy, S = entropy, E = internal energy, T = temperature, P = pressure, V = volume. Stryer L., Biochemistry (4th Edition)

  8. Gibbs free energy (G) • Thermodynamics works with changes in free energy, entropy, … • ΔG = ΔH − T·ΔS = (ΔE + P·ΔV) − T·ΔS • For nearly all biochemical reactions, ΔV is small and ΔH is almost equal to ΔE • Hence, we can write: ΔG ≈ ΔE − T·ΔS. Stryer L., Biochemistry (4th Edition)
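As a quick numeric sketch of the approximation ΔG ≈ ΔE − T·ΔS (all values below are made-up illustrative numbers, not from the slides or the papers):

```python
def delta_G(delta_E, delta_S, T, P=0.0, delta_V=0.0):
    """Gibbs free energy change: dG = (dE + P*dV) - T*dS."""
    return (delta_E + P * delta_V) - T * delta_S

# With a tiny volume change, dropping the P*dV term barely matters:
exact = delta_G(delta_E=-50.0, delta_S=-0.1, T=298.0, P=101.3, delta_V=1e-5)
approx = delta_G(delta_E=-50.0, delta_S=-0.1, T=298.0)  # dG ~ dE - T*dS
print(round(exact, 3), round(approx, 3))
```

The two results differ only in the third decimal, which is the point of the slide: for biochemical reactions the P·ΔV term is negligible.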

  9. Free energy functions: G = E − T·S • Energy functions are used in protein structure prediction, fold recognition, homology modeling, and protein design • E.g., approaches to protein structure prediction are based on the thermodynamic hypothesis, which postulates that the native state of a protein is the state of lowest free energy under physiological conditions • The contributions of Kamisetty H., Xing E.P. and Langmead C.J.: • the entropy component of their free energy estimate can be used to distinguish native protein structures from decoys (structures with internal energy similar to that of the native structure, but otherwise incorrect) • they compute estimates of ΔΔG upon mutation that correlate well with experimental values. Lazaridis T. and Karplus M., Effective Energy Functions for Protein Structure Prediction

  10. Free energy functions G = E – T· S • Internal energy functions E • model inter- and intramolecular interactions (e.g. van der Waals, electrostatic, solvent, etc.) • Entropy functions S • are harder to compute because they involve sums over an exponential number of terms

  11. The entropy term: G = E − T·S • Ignore the entropy term (+ simple; − limits the accuracy) • Use statistical potentials derived from known protein structures (PDB) (+ these statistics encode both the entropy S and the internal energy E; − the interactions are not independent*) • Model the protein structure as a probabilistic graphical model and use inference-based approaches to estimate the free energy (Kamisetty et al.) (+ fast and accurate) * Thomas P.D. and Dill K.A., Statistical Potentials Extracted From Protein Structures: How Accurate Are They?

  12. probabilistic graphical models • free energy • potential function • enthalpy • marginal probabilities • entropy • pair-wise MRF • inference • internal energy • factor graphs • Markov random field • Gibbs free energy • Bayes nets • belief propagation • region-based free energy • generalized belief propagation • region graph

  13. Probabilistic Graphical Models • Graphs that represent the dependencies among random variables • usually each random variable is a node, and the edges between the nodes represent conditional dependencies • E.g.: Bayesian networks, (pair-wise) Markov random fields, factor graphs

  14. Bayes Nets • x1, …, xn – random variables; each variable takes values from a discrete set of states • Arrows represent conditional probabilities • Each variable is independent of the other variables, given its parents • Joint probability: p(x1, …, xn) = Πi p(xi | Parents(xi)) • Marginal probability: p(xi) = Σ over all other variables of the joint

  15. Bayes Nets • x1, …, xn – random variables; each variable takes values from a discrete set of states • Arrows represent conditional probabilities • Each variable is independent of the other variables, given its parents • Joint probability: p(x1, …, xn) = Πi p(xi | Parents(xi)) • Marginal probability: p(xi) = Σ over all other variables of the joint • Belief: a marginal probability computed approximately
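The joint and marginal computations for a Bayes net can be sketched on a hypothetical two-variable network A → B (the probability tables below are made up for illustration):

```python
# Toy Bayes net A -> B over binary states {0, 1}; CPTs are illustrative only.
p_A = {0: 0.6, 1: 0.4}
p_B_given_A = {(0, 0): 0.9, (1, 0): 0.1,   # p(B=b | A=0)
               (0, 1): 0.2, (1, 1): 0.8}   # p(B=b | A=1)

def joint(a, b):
    # Joint probability factors over parents: p(a, b) = p(a) * p(b | a)
    return p_A[a] * p_B_given_A[(b, a)]

def marginal_B(b):
    # Marginal probability: sum the joint over the states of A
    return sum(joint(a, b) for a in (0, 1))

print(round(marginal_B(0), 2))  # 0.6*0.9 + 0.4*0.2
```

With two variables the marginal sum has only two terms; with n variables it has exponentially many, which is what motivates belief propagation later in the talk.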

  16. Markov Random Fields • xi – hidden variables (discrete values), yi – observed variables • compatibility functions (potentials): φi(xi, yi), often called the evidence for xi, and ψij(xi, xj) for connected vars xi and xj • Overall joint probability: p(x, y) = (1/Z) Π_(i,j) ψij(xi, xj) Πi φi(xi, yi) • where Z is a normalization constant (also called the partition function) • pair-wise MRF because the potentials are pair-wise
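As a concrete sketch of the pair-wise MRF joint probability and partition function Z, on a hypothetical 3-node chain with made-up potentials (evidence potentials omitted for brevity):

```python
from itertools import product

# Pair-wise MRF on the chain x0 - x1 - x2, binary states.
def psi(xi, xj):
    # compatibility potential: favors equal neighboring states (made-up values)
    return 2.0 if xi == xj else 1.0

edges = [(0, 1), (1, 2)]

def unnormalized(x):
    p = 1.0
    for i, j in edges:
        p *= psi(x[i], x[j])
    return p

# Partition function Z: sum over all joint states (exponential in general)
Z = sum(unnormalized(x) for x in product((0, 1), repeat=3))

def joint(x):
    return unnormalized(x) / Z
```

Z is computed here by enumerating all 2³ joint states; this brute-force sum is exactly what becomes intractable for larger models.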

  17. Factor Graphs • Bipartite graph: • variable nodes x1, …, xn (with discrete values for the vars) • function (factor) nodes fa (represent the interactions between variables) • The joint probability factors into a product of functions: p(x) = (1/Z) Πa fa(xa), where xa is the set of variables connected to factor a • E.g.: p(x1, x2, x3) = (1/Z) fA(x1, x2) fB(x2, x3)


  19. Graphical Models • factor graphs, pair-wise MRFs and Bayes nets are closely related representations, and each can be converted into the others. Understanding Belief Propagation and its Generalizations. Yedidia J.S., Freeman W.T. and Weiss Y. (2002)

  20. probabilistic graphical models • free energy • potential function • enthalpy • marginal probabilities • entropy • pair-wise MRF • inference • internal energy • factor graphs • Markov random field • Gibbs free energy • Bayes nets • belief propagation • region-based free energy • generalized belief propagation • region graph

  21. Belief Propagation (BP) • Beliefs = marginal probabilities that we compute approximately • Computing a marginal probability exactly requires sums whose number of terms grows exponentially with the number of variables • BP is a method for approximating the marginal probabilities in time that grows linearly with the number of variables (nodes) • BP on pair-wise MRFs, Bayes nets and factor graphs is mathematically equivalent at every iteration of the BP algorithm

  22. Belief Propagation (BP) • hidden variables xi, observed variables yi • compatibility functions (potentials) φi(xi, yi), ψij(xi, xj) • marginal probabilities pi(xi) • mij(xj) = the message from node i to node j about the state node j should be in • E.g.: if xj has 3 possible values {1, 2, 3}, then mij is a vector of 3 numbers • The belief at each node: bi(xi) ∝ φi(xi) Π_{k∈N(i)} mki(xi) • The message update rule: mij(xj) ← Σ_{xi} φi(xi) ψij(xi, xj) Π_{k∈N(i)\j} mki(xi)

  23. Belief Propagation (BP) • The message update rule: mij(xj) ← Σ_{xi} φi(xi) ψij(xi, xj) Π_{k∈N(i)\j} mki(xi) • The belief at each node: bi(xi) ∝ φi(xi) Π_{k∈N(i)} mki(xi)

  24. Belief Propagation (BP) • Iterative method • When the MRF has no cycles, the beliefs computed using BP are exact! • Even when the MRF has cycles, the BP algorithm is still well defined and empirically often gives good approximate answers.
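The BP loop described on the last three slides can be sketched as follows, on a hypothetical 3-node chain MRF with made-up potentials and evidence only on node 0; since a chain has no cycles, the converged beliefs are exact:

```python
states = (0, 1)
neighbors = {0: [1], 1: [0, 2], 2: [1]}      # chain x0 - x1 - x2

def psi(xi, xj):                             # pairwise compatibility
    return 2.0 if xi == xj else 1.0

def phi(i, xi):                              # evidence, only on node 0
    return (1.5, 0.5)[xi] if i == 0 else 1.0

# messages m[(i, j)][xj]: node i's message to node j, initialized uniform
m = {(i, j): [1.0, 1.0] for i in neighbors for j in neighbors[i]}

for _ in range(10):                          # iterate the message updates
    new = {}
    for (i, j) in m:
        vec = []
        for xj in states:
            # m_ij(xj) = sum_xi phi_i(xi) psi_ij(xi, xj) prod_{k in N(i)\j} m_ki(xi)
            s = 0.0
            for xi in states:
                term = phi(i, xi) * psi(xi, xj)
                for k in neighbors[i]:
                    if k != j:
                        term *= m[(k, i)][xi]
                s += term
            vec.append(s)
        z = sum(vec)
        new[(i, j)] = [v / z for v in vec]   # normalize for numerical stability
    m = new

def belief(i):
    # b_i(xi) proportional to phi_i(xi) times all incoming messages
    b = [phi(i, xi) for xi in states]
    for k in neighbors[i]:
        b = [b[x] * m[(k, i)][x] for x in states]
    z = sum(b)
    return [v / z for v in b]
```

On graphs with cycles the same loop runs unchanged, but the resulting beliefs are only approximations of the true marginals.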

  25. Graphical Models and Free Energy • Statistical physics (Boltzmann's law): p(x) = (1/Z) e^(−E(x)/T) • Kullback–Leibler distance: D(b ∥ p) = Σ_x b(x) ln(b(x)/p(x)) • D(b ∥ p) = 0 iff the beliefs are exact (b = p) • When the beliefs are exact, the Gibbs free energy achieves its minimal value −ln Z (also called the "Helmholtz free energy")
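A small sketch of Boltzmann's law and the KL distance, using a made-up 2-variable energy table with T = 1:

```python
import math

# Boltzmann distribution p(x) = exp(-E(x)/T)/Z for a toy system; the
# energies are made-up. KL(b || p) = 0 exactly when the beliefs b equal p.
T = 1.0
E = {(0, 0): -1.0, (0, 1): 0.0, (1, 0): 0.0, (1, 1): -1.0}

Z = sum(math.exp(-e / T) for e in E.values())
p = {x: math.exp(-E[x] / T) / Z for x in E}      # Boltzmann's law

def kl(b, q):
    # Kullback-Leibler distance D(b || q)
    return sum(b[x] * math.log(b[x] / q[x]) for x in b if b[x] > 0)

uniform = {x: 0.25 for x in E}
print(kl(p, p), kl(uniform, p))   # exact beliefs give 0.0, others give > 0
```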

  26. Approximating the Free Energy • Exact free energies involve summations over an exponential number of terms, hence the need for approximations • Mean-field free energy approximation • uses one-node beliefs and assumes the joint belief factors into their product • Bethe free energy approximation • uses one-node beliefs and two-node beliefs • Region-based free energy approximations • idea: break up the graph into a set of regions, compute the free energy over each region and then approximate the total free energy by the sum of the free energies over the regions
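The mean-field idea can be sketched on a hypothetical 2-node pair-wise MRF (T = 1, made-up potential). With fully factored beliefs, the approximate free energy F = U − S overshoots the exact value −ln Z by exactly the KL distance, so it never falls below it:

```python
import math
from itertools import product

def psi(x1, x2):
    return 2.0 if x1 == x2 else 1.0          # made-up pairwise potential

Z = sum(psi(x1, x2) for x1, x2 in product((0, 1), repeat=2))
F_exact = -math.log(Z)                       # exact free energy -ln Z

def mean_field_F(b1, b2):
    # U: average energy under the factored belief b(x1, x2) = b1(x1) * b2(x2)
    U = -sum(b1[x1] * b2[x2] * math.log(psi(x1, x2))
             for x1, x2 in product((0, 1), repeat=2))
    # S: the entropy of a factored belief is the sum of one-node entropies
    S = -sum(b[x] * math.log(b[x]) for b in (b1, b2) for x in (0, 1) if b[x] > 0)
    return U - S

F_mf = mean_field_F([0.5, 0.5], [0.5, 0.5])  # uniform one-node beliefs
```

Minimizing F_mf over the one-node beliefs is the mean-field approximation; Bethe and region-based approximations refine the entropy term with two-node and region beliefs.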

  27. Generalized Belief Propagation (GBP) • Region-based free energy approximations • idea: break up the graph into a set of regions, compute the free energy over each region and then approximate the total free energy by the sum of the free energies over the regions • GBP • a message-passing algorithm similar to BP • messages between regions vs. messages between nodes • the regions of nodes that communicate can be visualized in terms of a region graph (Yedidia, Freeman, Weiss) • the region-graph approximation method generalizes the Bethe method, the junction graph method and the cluster variation method • different choices of region graphs give different GBP algorithms • tradeoff: complexity vs. accuracy • how to choose the regions optimally is more art than science

  28. Generalized Belief Propagation • Usually improves on simple BP (when the graph contains cycles) • Good advice: when constructing the regions, try to include at least the shortest cycles inside regions • For region graphs with no cycles, GBP is guaranteed to work • Even when the region graph has cycles, GBP usually gives good results • Constructing Free-Energy Approximations and Generalized Belief Propagation Algorithms Yedidia, J.S., Freeman, W.T. and Weiss Y.

  29. Free Energy Estimates of All-atom Protein Structures Using Generalized Belief Propagation. Kamisetty H., Xing E.P. and Langmead C.J.

  30. Model • Model the protein structure as a complex probability distribution, using a pair-wise MRF • observed variables: backbone atom positions (continuous) • hidden variables: side-chain atom positions, represented using rotamers (discrete) • interactions (edges): two variables share an edge if they are closer than a threshold distance (Cα–Cα distance < 8Å) • potential functions: ψij, defined in terms of the energy of interaction between rotamer state p of residue i and rotamer state q of residue j
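A sketch of turning an interaction-energy table into edge potentials. The Boltzmann weight e^(−E/T) is a common choice for such potentials; the paper's exact functional form and energy values are not reproduced here, so the numbers below are made up:

```python
import math

T = 1.0
# energy[(p, q)]: interaction energy between rotamer state p of residue i
# and rotamer state q of residue j (two rotamer states each, made-up values)
energy = {(0, 0): -2.0, (0, 1): 0.5, (1, 0): 0.3, (1, 1): -1.0}

# lower interaction energy -> larger compatibility between rotamer states
psi_ij = {pq: math.exp(-e / T) for pq, e in energy.items()}
```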

  31. Model

  32. MRF to Factor Graph

  33. Building the Region Graph • big regions – 3 or 2 variables • small regions – one variable • To form the region graph, add edges from each big region to all small regions that contain a strict subset of the big region's nodes.
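The construction above can be sketched directly; the interaction edges and the grouping into big regions below are hypothetical, not taken from the paper:

```python
# Variables 1..4 with hypothetical interaction edges
edges = [(1, 2), (2, 3), (1, 3), (3, 4)]

# big regions: 2-3 closely coupled variables (chosen by hand here)
big_regions = [{1, 2, 3}, {3, 4}]
# small regions: one variable each
small_regions = [{v} for v in (1, 2, 3, 4)]

# Region-graph edges: from each big region to every small region whose
# node set is a strict subset of the big region's nodes
region_edges = [(tuple(sorted(big)), tuple(small))
                for big in big_regions
                for small in small_regions
                if small < big]    # '<' on Python sets tests strict subset
```

GBP then passes messages along these region-graph edges instead of along the original variable-to-variable edges.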

  34. Generalized Belief Propagation • Choice of regions • idea: place residues that are closely coupled together in the same big regions • balance accuracy/complexity • Aji and McEliece • "Two-way algorithm" (Yedidia, Freeman, Weiss) • Initialize the GBP messages to random starting points and run the algorithm until the beliefs converge, or for at most 100 iterations

  35. Results on the Decoy Datasets (G = E − T·S) • 48 datasets • Each dataset: multiple decoys and the native structure of a protein • all decoys had backbones similar to the native structure (Cα RMSD < 2.0Å) • When ranked in decreasing order of entropy, the native structure is ranked highest in 87.5% of the datasets • PROCHECK (protein structure validation): for the datasets in which the native structure was ranked 3rd or 4th, that structure had a very high number of "bad" bond angles • For decoys with dissimilar backbones, the native structure is ranked highest in 84% of the datasets

  36. Results on the Decoy Datasets • Comparison to other energy functions:

  37. Predicting ΔΔG upon mutation

  38. Summary • Model protein structures as complex probability distributions, using probabilistic graphical models (MRFs and FGs) • Use Generalized Belief Propagation (two-way algorithm) to approximate the free energy • Successfully use the method to • distinguish native structures from decoys • predict changes in free energy after mutation • Other applications: side chain placement (Yanover and Weiss), other inference problems over the graphical model.

  39. Questions?
