
Presentation Transcript


  1. Bayesian Image Modeling by Generalized Sparse Markov Random Fields and Loopy Belief Propagation
  Kazuyuki Tanaka, GSIS, Tohoku University, Sendai, Japan
  http://www.smapip.is.tohoku.ac.jp/~kazu/
  Collaborators: Muneki Yasuda (GSIS, Tohoku University, Japan), Shun Kataoka (GSIS, Tohoku University, Japan), D. M. Titterington (Department of Statistics, University of Glasgow, UK)
  SPDSA2013, Sendai, Japan

  2. Outline
  • Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation
  • Bayesian Image Modeling by a Generalized Sparse Prior
  • Noise Reduction by a Generalized Sparse Prior
  • Concluding Remarks

  3. Probabilistic Model and Belief Propagation
  Probabilistic information processing is built on probabilistic models (Bayesian networks, Markov random fields) and the Bayes formulas.
  Belief propagation = Bethe approximation: the message passed from node i to node j acts as an effective field.
  V: set of all the nodes (vertices) in graph G; E: set of all the links (edges) in graph G.
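To make the message-passing picture concrete, here is a minimal sum-product loopy belief propagation sketch for a pairwise Markov random field with q states per node. The function name loopy_bp, the damping factor, and the single shared pairwise factor are illustrative assumptions of this sketch, not anything specified on the slide.

```python
import numpy as np

def loopy_bp(unary, pairwise, edges, n_iter=50, damping=0.5):
    """Sum-product loopy belief propagation (Bethe approximation) on a
    pairwise MRF.  unary: (V, q) node factors, pairwise: (q, q) shared
    edge factor, edges: list of (i, j) node pairs.  Returns (V, q) beliefs."""
    V, q = unary.shape
    nbrs = {v: [] for v in range(V)}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
    # one message per directed edge, initialised to the uniform distribution
    msgs = {(i, j): np.full(q, 1.0 / q)
            for a, b in edges for (i, j) in ((a, b), (b, a))}

    for _ in range(n_iter):
        new = {}
        for (i, j), old in msgs.items():
            # effective field at i: node factor times all incoming messages except j's
            h = unary[i].copy()
            for k in nbrs[i]:
                if k != j:
                    h *= msgs[(k, i)]
            m = pairwise.T @ h            # sum over f_i of psi(f_i, f_j) * h(f_i)
            m /= m.sum()                  # normalise to avoid under/overflow
            new[(i, j)] = damping * old + (1.0 - damping) * m
        msgs = new

    beliefs = unary.copy()
    for v in range(V):
        for k in nbrs[v]:
            beliefs[v] *= msgs[(k, v)]
    return beliefs / beliefs.sum(axis=1, keepdims=True)
```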

  4. Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation
  The prior probability of natural images is assumed to be a pairwise Markov random field defined on the pixel grid.
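A pairwise Markov random field of this kind, assuming a single homogeneous factor ψ shared by all nearest-neighbour pixel pairs (the homogeneity is an assumption of this sketch), can be written as

```latex
P(\mathbf{f}) \;=\; \frac{1}{Z}\prod_{\{i,j\}\in E}\psi(f_i, f_j),
\qquad
Z \;=\; \sum_{\mathbf{f}}\;\prod_{\{i,j\}\in E}\psi(f_i, f_j),
```

where E is the set of nearest-neighbour pixel pairs and each f_i takes one of q grey levels.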

  5. Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation
  Supervised learning scheme by loopy belief propagation in pairwise Markov random fields, using the histogram obtained from the supervised data.
  M. Yasuda, S. Kataoka and K. Tanaka: J. Phys. Soc. Jpn, Vol.81, No.4, Article No.044801, 2012.
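The "histogram from supervised data" ingredient can be illustrated with a short sketch that collects the empirical nearest-neighbour grey-level histogram from training images. The function name neighbour_histogram and the symmetrisation step are assumptions of this sketch; roughly speaking, the interactions of the pairwise MRF are then tuned so that the pairwise marginals computed by LBP reproduce this histogram, as in the cited paper.

```python
import numpy as np

def neighbour_histogram(images, q):
    """Empirical joint histogram of nearest-neighbour grey-level pairs,
    accumulated over a set of supervised (training) images whose pixels
    take integer values in {0, ..., q-1}."""
    hist = np.zeros((q, q))
    for img in images:
        # horizontal and vertical nearest-neighbour pairs
        for a, b in ((img[:, :-1], img[:, 1:]), (img[:-1, :], img[1:, :])):
            np.add.at(hist, (a.ravel(), b.ravel()), 1.0)
    hist = hist + hist.T                 # treat neighbour pairs as unordered
    return hist / hist.sum()
```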

  6. Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation
  Supervised learning scheme by loopy belief propagation in pairwise Markov random fields (continued).

  7. Bayesian Image Modeling by a Generalized Sparse Prior
  Assumption: the prior probability is given as a Gibbs distribution with interaction α between every nearest-neighbour pair of pixels and exponent p.
  • p = 0: q-state Potts model
  • p = 2: discrete Gaussian graphical model
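A plausible explicit form of this generalized sparse prior, consistent with the two special cases above (with the convention 0^0 := 0, so that p = 0 counts unequal neighbouring pairs), is

```latex
P(\mathbf{f}\mid \alpha, p) \;=\;
\frac{1}{Z_{\mathrm{prior}}(\alpha, p)}
\exp\!\Bigl(-\alpha\sum_{\{i,j\}\in E}\lvert f_i - f_j\rvert^{p}\Bigr),
\qquad f_i \in \{0, 1, \dots, q-1\},
```

where Z_prior(α, p) is the partition function and E is the set of nearest-neighbour pixel pairs. For p = 0 this reduces to the q-state Potts model, and for p = 2 to the discrete Gaussian graphical model.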

  8. Prior in Bayesian Image Modeling: Loopy Belief Propagation (q = 16)
  In the region 0 < p < 0.3504…, a first-order phase transition appears and the solution α* does not exist.

  9. Bayesian Image Modeling by a Generalized Sparse Prior: Conditional Maximization of Entropy
  The prior is formulated by conditional maximization of entropy with a Lagrange multiplier and is evaluated by loopy belief propagation.
  K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.

  10. Prior Analysis by LBP and Conditional Maximization of Entropy in Bayesian Image Modeling
  Prior probability: repeat the LBP computation until A converges.
  Examples shown for q = 16 with p = 0.2 and with p = 0.5.

  11. Prior Analysis by LBP and Conditional Maximization of Entropy in the Generalized Sparse Prior
  Free energy of the prior and log-likelihood for p when the original image f* is given, computed by LBP (q = 16).
  K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.

  12. Degradation Process in Bayesian Image Modeling
  Assumption: the degraded image is generated from the original image by additive white Gaussian noise.
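Under the additive white Gaussian noise assumption, the degradation process takes the standard form below; σ² denotes the noise variance, a symbol not named on the slide itself.

```latex
g_i \;=\; f_i + n_i, \qquad n_i \sim \mathcal{N}(0, \sigma^{2}),
\qquad
P(\mathbf{g}\mid\mathbf{f}) \;=\;
\prod_{i\in V}\frac{1}{\sqrt{2\pi\sigma^{2}}}
\exp\!\Bigl(-\frac{(g_i - f_i)^{2}}{2\sigma^{2}}\Bigr).
```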

  13. Noise Reduction by the Generalized Sparse Prior
  The posterior probability is constructed from the prior and the degradation process, with the hyperparameters determined through Lagrange multipliers.
  K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.
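Combining the generalized sparse prior with the Gaussian degradation model above, the posterior follows from the Bayes formula. This is a sketch of its form, with the hyperparameters α and σ left to be fixed by the entropy/Lagrange-multiplier scheme of the cited paper:

```latex
P(\mathbf{f}\mid\mathbf{g}) \;\propto\; P(\mathbf{g}\mid\mathbf{f})\,P(\mathbf{f}\mid\alpha, p)
\;\propto\;
\exp\!\Bigl(-\sum_{i\in V}\frac{(f_i - g_i)^{2}}{2\sigma^{2}}
\;-\;\alpha\sum_{\{i,j\}\in E}\lvert f_i - f_j\rvert^{p}\Bigr).
```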

  14. Noise Reduction Procedure Based on LBP and Conditional Maximization of Entropy
  Input p and the data g, then run nested iterations:
  • repeat until C and M converge;
  • repeat until C converges;
  • repeat until u, B and L converge;
  computing the marginals and the free energy by LBP at each step.
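As a simplified sketch of the restoration step only, the following combines the posterior above with the loopy_bp routine from the earlier sketch and returns the pixel-wise maximiser of the LBP marginals (an MPM-type estimate). The hyperparameters σ and α are held fixed here, whereas the full procedure on this slide re-estimates them in the nested convergence loops; the function name denoise_mpm is an assumption of this sketch.

```python
import numpy as np
# uses loopy_bp() from the sketch after slide 3

def denoise_mpm(g, q, sigma, alpha, p):
    """LBP-based restoration of a grey-level image g (integer values 0..q-1)
    under the generalized sparse prior, with fixed sigma, alpha and p."""
    H, W = g.shape
    levels = np.arange(q)

    # unary factors from the Gaussian likelihood: exp(-(f - g_i)^2 / (2 sigma^2))
    unary = np.exp(-(levels[None, :] - g.reshape(-1, 1)) ** 2 / (2.0 * sigma ** 2))

    # shared pairwise factor exp(-alpha * |f_i - f_j|^p), with 0^0 := 0 for p = 0
    diff = np.abs(levels[:, None] - levels[None, :]).astype(float)
    pairwise = np.exp(-alpha * np.where(diff > 0, diff ** p, 0.0))

    # 4-nearest-neighbour edges of the pixel grid
    idx = np.arange(H * W).reshape(H, W)
    edges = [(idx[y, x], idx[y, x + 1]) for y in range(H) for x in range(W - 1)]
    edges += [(idx[y, x], idx[y + 1, x]) for y in range(H - 1) for x in range(W)]

    beliefs = loopy_bp(unary, pairwise, edges)
    return levels[np.argmax(beliefs, axis=1)].reshape(H, W)
```

The pure-Python message loop is slow on realistic image sizes; the sketch is only meant to show the structure of the computation.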

  15. Noise Reduction by Generalized Sparse Priors and Loopy Belief Propagation
  Original image, degraded image, and restored images for p = 0.2, p = 0.5 and p = 1.
  K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.

  16. Noise Reduction by Generalized Sparse Priors and Loopy Belief Propagation (second test image)
  Original image, degraded image, and restored images for p = 0.2, p = 0.5 and p = 1.
  K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.

  17. Summary
  • A formulation of Bayesian image modeling for image processing by means of generalized sparse priors and loopy belief propagation has been proposed.
  • Our formulation is based on the conditional maximization of entropy with some constraints.
  • Although first-order phase transitions often appear in our sparse priors, our algorithm works well even in such cases.

  18. References
  • S. Kataoka, M. Yasuda, K. Tanaka and D. M. Titterington: Statistical Analysis of the Expectation-Maximization Algorithm with Loopy Belief Propagation in Bayesian Image Modeling, Philosophical Magazine: The Study of Condensed Matter, Vol.92, Nos.1-3, pp.50-63, 2012.
  • M. Yasuda and K. Tanaka: TAP Equation for Nonnegative Boltzmann Machine, Philosophical Magazine: The Study of Condensed Matter, Vol.92, Nos.1-3, pp.192-209, 2012.
  • S. Kataoka, M. Yasuda and K. Tanaka: Statistical Analysis of Gaussian Image Inpainting Problems, Journal of the Physical Society of Japan, Vol.81, No.2, Article No.025001, 2012.
  • M. Yasuda, S. Kataoka and K. Tanaka: Inverse Problem in Pairwise Markov Random Fields using Loopy Belief Propagation, Journal of the Physical Society of Japan, Vol.81, No.4, Article No.044801, pp.1-8, 2012.
  • M. Yasuda, Y. Kabashima and K. Tanaka: Replica Plefka Expansion of Ising Systems, Journal of Statistical Mechanics: Theory and Experiment, Vol.2012, No.4, Article No.P04002, pp.1-16, 2012.
  • K. Tanaka, M. Yasuda and D. M. Titterington: Bayesian Image Modeling by Means of Generalized Sparse Prior and Loopy Belief Propagation, Journal of the Physical Society of Japan, Vol.81, No.11, Article No.114802, 2012.
  • M. Yasuda and K. Tanaka: Susceptibility Propagation by Using Diagonal Consistency, Physical Review E, Vol.87, No.1, Article No.012134, 2013.
