Graphical Models, Exponential Families, and Variational Inference
M. J. Wainwright and M. I. Jordan, Foundations and Trends in Machine Learning, vol. 1, no. 1–2 (2008): 1–305
Abstract
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval and statistical machine learning. …
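To make the inference problems mentioned in the abstract concrete, here is a minimal brute-force sketch (not taken from the monograph): it computes the log-partition function and the mean parameters of a toy pairwise model on a 5-cycle by exhaustive enumeration. The graph, the parameter values, and the names score, theta_node, and theta_edge are illustrative assumptions.

import itertools
import math

# Toy pairwise model on a 5-cycle with spins in {-1, +1}. Exact inference here
# enumerates all 2^n configurations; this exponential cost is what the
# variational methods surveyed in the monograph are designed to sidestep.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]
theta_node = {i: 0.1 for i in range(n)}     # node parameters (assumed values)
theta_edge = {e: 0.4 for e in edges}        # edge couplings (assumed values)

def score(x):
    """Unnormalized log-probability <theta, phi(x)> of configuration x."""
    s = sum(theta_node[i] * x[i] for i in range(n))
    s += sum(theta_edge[(i, j)] * x[i] * x[j] for (i, j) in edges)
    return s

configs = list(itertools.product([-1, +1], repeat=n))

# Log-partition function A(theta) = log sum_x exp(<theta, phi(x)>).
A = math.log(sum(math.exp(score(x)) for x in configs))

# Mean parameters mu_i = E[X_i], computed by exhaustive summation.
mu = [sum(x[i] * math.exp(score(x) - A) for x in configs) for i in range(n)]

print(f"A(theta) = {A:.4f}")
print("mean parameters:", [round(m, 4) for m in mu])

For n binary variables the sum runs over 2^n configurations, so even moderately sized graphs are out of reach; computing A(theta) and the mean parameters efficiently, or approximately, is exactly what the variational framework of the monograph addresses.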
Introduction
- Introduction: complexity and feasibility
In particular, the running time of an algorithm or the magnitude of an error bound can often be characterized in terms of structural properties of a graph; a minimal illustration follows this list.
- C. Chekuri, S. Khanna, J. Naor, and L. Zosin, “A linear programming formulation and approximation algorithms for the metric labeling problem,” SIAM Journal on Discrete Mathematics, vol. 18, no. 3, pp. 608–625, 2005.
- C. Daskalakis, A. G. Dimakis, R. M. Karp, and M. J. Wainwright, “Probabilistic analysis of linear programming decoding,” IEEE Transactions on Information Theory, vol. 54, no. 8, pp. 3565–3578, 2008.
- T. Hazan and A. Shashua, “Convergent message-passing algorithms for inference over general graphs with convex free energy,” in Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, Arlington, VA: AUAI Press, 2008.
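As a rough illustration of the point above about running time and graph structure, the following sketch computes exact node marginals on a Markov chain by forward-backward message passing in O(n k^2) time, versus the O(k^n) enumeration required on a general graph. It is a minimal sketch of the standard idea, not code from the survey; the chain length, state-space size, and potential values are arbitrary assumptions.

import numpy as np

# Exact marginals on a chain-structured model via forward-backward message
# passing. On a tree the same scheme costs O(n * k^2); brute-force enumeration
# would cost O(k^n). Potentials below are random and purely illustrative.
n, k = 6, 3                                            # 6 variables, 3 states each
rng = np.random.default_rng(0)
node_pot = rng.uniform(0.5, 1.5, size=(n, k))          # psi_i(x_i)
edge_pot = rng.uniform(0.5, 1.5, size=(n - 1, k, k))   # psi_{i,i+1}(x_i, x_{i+1})

# Forward messages: alpha[i] summarizes evidence from nodes 0..i-1.
alpha = np.ones((n, k))
for i in range(1, n):
    alpha[i] = (alpha[i - 1] * node_pot[i - 1]) @ edge_pot[i - 1]
    alpha[i] /= alpha[i].sum()                         # normalize for stability

# Backward messages: beta[i] summarizes evidence from nodes i+1..n-1.
beta = np.ones((n, k))
for i in range(n - 2, -1, -1):
    beta[i] = edge_pot[i] @ (beta[i + 1] * node_pot[i + 1])
    beta[i] /= beta[i].sum()

# Node marginals are proportional to alpha * psi * beta.
marg = alpha * node_pot * beta
marg /= marg.sum(axis=1, keepdims=True)
print(np.round(marg, 3))

The same message-passing recursion, organized on a junction tree, is what lets exact inference scale with a graph-theoretic measure of interaction (the treewidth) rather than with the total number of variables.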
Highlights
- Introduction: complexity and feasibility
In particular, the running time of an algorithm or the magnitude of an error bound can often be characterized in terms of structural properties of a graph.
- As we discuss, the computational complexity of a fundamental method known as the junction tree algorithm — which generalizes many of the recursive algorithms on graphs cited above — can be characterized in terms of a natural graph-theoretic measure of interaction among variables
- The junction tree algorithm provides a systematic solution to the problem of computing likelihoods and other statistical quantities associated with a graphical model
- One popular source of methods for attempting to cope with such cases is the Markov chain Monte Carlo (MCMC) framework, and there is a significant literature on the application of Markov chain Monte Carlo methods to graphical models [e.g., 28, 93, 202]. Our focus in this survey is rather different: we present an alternative computational methodology for statistical inference that is based on variational methods
- These techniques provide a general class of alternatives to Markov chain Monte Carlo, and have applications outside of the graphical model framework
- The principal object of interest in our exposition is a certain conjugate dual relation associated with exponential families. From this foundation of conjugate duality, we develop a general variational representation for computing likelihoods and marginal probabilities in exponential families
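For reference, the variational representation mentioned in the last bullet can be written in standard exponential-family notation as follows (reconstructed from memory of the usual formulation, so minor notational details may differ from the monograph). For a family p_\theta(x) = \exp\{\langle \theta, \phi(x) \rangle - A(\theta)\} with log-partition function A and mean-parameter space \mathcal{M}, conjugate duality gives

A(\theta) = \sup_{\mu \in \mathcal{M}} \bigl\{ \langle \theta, \mu \rangle - A^{*}(\mu) \bigr\},

where A^{*} is the conjugate dual of A (the negative entropy on the interior of \mathcal{M}), and the supremum is attained at the mean parameters \mu(\theta) = \mathbb{E}_\theta[\phi(X)]. Exact and approximate ways of handling \mathcal{M} and A^{*} in this optimization are what generate the families of algorithms surveyed here (mean field, sum-product/Bethe, and convex relaxations).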
Results
- F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, “Factor graphs and the sum-product algorithm,” IEEE Transactions on Information Theory, vol. 47, no. 2, pp. 498–519, 2001.
- S. L. Lauritzen and D. J. Spiegelhalter, “Local computations with probabilities on graphical structures and their application to expert systems,” Journal of the Royal Statistical Society, Series B, vol. 50, pp. 155–224, 1988.
- D. M. Malioutov, J. M. Johnson, and A. S. Willsky, “Walk-sums and belief propagation in Gaussian graphical models,” Journal of Machine Learning Research, vol. 7, pp. 2013–2064, 2006.
- M. Seeger, F. Steinke, and K. Tsuda, “Bayesian inference and optimal design in the sparse linear model,” in Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, San Juan, Puerto Rico, 2007.
- E. B. Sudderth, M. J. Wainwright, and A. S. Willsky, “Loop series and Bethe variational bounds for attractive graphical models,” in Advances in Neural Information Processing Systems, pp. 1425–1432, Cambridge, MA: MIT Press, 2008.
- R. Szeliski, R. Zabih, D. Scharstein, O. Veskler, V. Kolmogorov, A. Agarwala, M. Tappen, and C. Rother, “Comparative study of energy minimization methods for Markov random fields,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, pp. 1068–1080, 2008.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “Tree-based reparameterization framework for analysis of sum-product and related algorithms,” IEEE Transactions on Information Theory, vol. 49, no. 5, pp. 1120–1146, 2003.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “Tree-reweighted belief propagation algorithms and approximate ML estimation by pseudomoment matching,” in Proceedings of the Ninth International Conference on Artificial Intelligence and Statistics, 2003.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “Exact MAP estimates via agreement on (hyper)trees: Linear programming and message-passing,” IEEE Transactions on Information Theory, vol. 51, no. 11, pp. 3697–3717, 2005.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “A new class of upper bounds on the log partition function,” IEEE Transactions on Information Theory, vol. 51, no. 7, pp. 2313–2335, 2005.
- M. J. Wainwright and M. I. Jordan, “Log-determinant relaxation for approximate inference in discrete Markov random fields,” IEEE Transactions on Signal Processing, vol. 54, no. 6, pp. 2099–2109, 2006.
Conclusion
- Y. Weiss and W. T. Freeman, “Correctness of belief propagation in Gaussian graphical models of arbitrary topology,” in Advances in Neural Information Processing Systems, pp. 673–679, Cambridge, MA: MIT Press, 2000.
- Y. Weiss, C. Yanover, and T. Meltzer, “MAP estimation, linear programming, and belief propagation with convex free energies,” in Proceedings of the 23rd Conference on Uncertainty in Artificial Intelligence, Arlington, VA: AUAI Press, 2007.
- T. Werner, “A linear programming approach to max-sum problem: A review,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 7, pp. 1165–1179, 2007.
- W. Wiegerinck, “Variational approximations between mean field theory and the junction tree algorithm,” in Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence, pp. 626–633, San Francisco, CA: Morgan Kaufmann Publishers, 2000.
References
- A. Agresti, Categorical Data Analysis. New York: Wiley, 2002.
- S. M. Aji and R. J. McEliece, “The generalized distributive law,” IEEE Transactions on Information Theory, vol. 46, pp. 325–343, 2000.
- N. I. Akhiezer, The Classical Moment Problem and Some Related Questions in Analysis. New York: Hafner Publishing Company, 1966.
- S. Amari, “Differential geometry of curved exponential families — curvatures and information loss,” Annals of Statistics, vol. 10, no. 2, pp. 357–385, 1982.
- S. Amari and H. Nagaoka, Methods of Information Geometry. Providence, RI: AMS, 2000.
- G. An, “A note on the cluster variation method,” Journal of Statistical Physics, vol. 52, no. 3, pp. 727–734, 1988.
- A. Bandyopadhyay and D. Gamarnik, “Counting without sampling: New algorithms for enumeration problems using statistical physics,” in Proceedings of the 17th ACM-SIAM Symposium on Discrete Algorithms, 2006.
- O. Banerjee, L. El Ghaoui, and A. d’Aspremont, “Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data,” Journal of Machine Learning Research, vol. 9, pp. 485–516, 2008.
- F. Barahona and M. Groetschel, “On the cycle polytope of a binary matroid,” Journal of Combinatorial Theory, Series B, vol. 40, pp. 40–62, 1986.
- D. Barber and W. Wiegerinck, “Tractable variational structures for approximating graphical models,” in Advances in Neural Information Processing Systems, pp. 183–189, Cambridge, MA: MIT Press, 1999.
- O. E. Barndorff-Nielsen, Information and Exponential Families. Chichester, UK: Wiley, 1978.
- R. J. Baxter, Exactly Solved Models in Statistical Mechanics. New York: Academic Press, 1982.
- M. Bayati, C. Borgs, J. Chayes, and R. Zecchina, “Belief-propagation for weighted b-matchings on arbitrary graphs and its relation to linear programs with integer solutions,” Technical Report arXiv:0709.1190, Microsoft Research, 2007.
- M. Bayati and C. Nair, “A rigorous proof of the cavity method for counting matchings,” in Proceedings of the Allerton Conference on Control, Communication and Computing, Monticello, IL, 2007.
- M. Bayati, D. Shah, and M. Sharma, “Maximum weight matching for maxproduct belief propagation,” in International Symposium on Information Theory, Adelaide, Australia, 2005.
- M. J. Beal, “Variational algorithms for approximate Bayesian inference,” PhD thesis, Gatsby Computational Neuroscience Unit, University College, London, 2003.
- C. Berge, The Theory of Graphs and its Applications. New York: Wiley, 1964.
- E. Berlekamp, R. McEliece, and H. van Tilborg, “On the inherent intractability of certain coding problems,” IEEE Transactions on Information Theory, vol. 24, pp. 384–386, 1978.
- U. Bertele and F. Brioschi, Nonserial Dynamic Programming. New York: Academic Press, 1972.
- D. P. Bertsekas, Dynamic Programming and Stochastic Control. Vol. 1. Belmont, MA: Athena Scientific, 1995.
- D. P. Bertsekas, Nonlinear Programming. Belmont, MA: Athena Scientific, 1995.
- D. P. Bertsekas, Network Optimization: Continuous and Discrete Methods. Belmont, MA: Athena Scientific, 1998.
- D. P. Bertsekas, Convex Analysis and Optimization. Belmont, MA: Athena Scientific, 2003.
- D. Bertsimas and J. N. Tsitsiklis, Introduction to Linear Optimization. Belmont, MA: Athena Scientific, 1997.
- J. Besag, “Spatial interaction and the statistical analysis of lattice systems,” Journal of the Royal Statistical Society, Series B, vol. 36, pp. 192–236, 1974.
- J. Besag, “Statistical analysis of non-lattice data,” The Statistician, vol. 24, no. 3, pp. 179–195, 1975.
- J. Besag, “On the statistical analysis of dirty pictures,” Journal of the Royal Statistical Society, Series B, vol. 48, no. 3, pp. 259–279, 1986.
- J. Besag and P. J. Green, “Spatial statistics and Bayesian computation,” Journal of the Royal Statistical Society, Series B, vol. 55, no. 1, pp. 25–37, 1993.
- H. A. Bethe, “Statistical theory of superlattices,” Proceedings of the Royal Society of London, Series A, vol. 150, no. 871, pp. 552–575, 1935.
- P. J. Bickel and K. A. Doksum, Mathematical Statistics: Basic Ideas and Selected Topics. Upper Saddle River, NJ: Prentice Hall, 2001.
- D. M. Blei and M. I. Jordan, “Variational inference for Dirichlet process mixtures,” Bayesian Analysis, vol. 1, pp. 121–144, 2005.
- D. M. Blei, A. Y. Ng, and M. I. Jordan, “Latent Dirichlet allocation,” Journal of Machine Learning Research, vol. 3, pp. 993–1022, 2003.
- H. Bodlaender, “A tourist guide through treewidth,” Acta Cybernetica, vol. 11, pp. 1–21, 1993.
- B. Bollobas, Graph Theory: An Introductory Course. New York: Springer-Verlag, 1979.
- B. Bollobas, Modern Graph Theory. New York: Springer-Verlag, 1998.
- E. Boros, Y. Crama, and P. L. Hammer, “Upper bounds for quadratic 0-1 maximization,” Operations Research Letters, vol. 9, pp. 73–79, 1990.
- E. Boros and P. L. Hammer, “Pseudo-boolean optimization,” Discrete Applied Mathematics, vol. 123, pp. 155–225, 2002.
- J. Borwein and A. Lewis, Convex Analysis. New York: Springer-Verlag, 1999.
- S. Boyd and L. Vandenberghe, Convex Optimization. Cambridge, UK: Cambridge University Press, 2004.
- X. Boyen and D. Koller, “Tractable inference for complex stochastic processes,” in Proceedings of the 14th Conference on Uncertainty in Artificial Intelligence, pp. 33–42, San Francisco, CA: Morgan Kaufmann, 1998.
- A. Braunstein, M. Mezard, and R. Zecchina, “Survey propagation: An algorithm for satisfiability,” Technical Report, arXiv:cs.CC/0212002 v2, 2003.
- L. M. Bregman, “The relaxation method for finding the common point of convex sets and its application to the solution of problems in convex programming,” USSR Computational Mathematics and Mathematical Physics, vol. 7, pp. 191–204, 1967.
- L. D. Brown, Fundamentals of Statistical Exponential Families. Hayward, CA: Institute of Mathematical Statistics, 1986.
- C. B. Burge and S. Karlin, “Finding the genes in genomic DNA,” Current Opinion in Structural Biology, vol. 8, pp. 346–354, 1998.
- Y. Censor and S. A. Zenios, Parallel Optimization: Theory, Algorithms, and Applications. Oxford: Oxford University Press, 1988.
- M. Cetin, L. Chen, J. W. Fisher, A. T. Ihler, R. L. Moses, M. J. Wainwright, and A. S. Willsky, “Distributed fusion in sensor networks,” IEEE Signal Processing Magazine, vol. 23, pp. 42–55, 2006.
- D. Chandler, Introduction to Modern Statistical Mechanics. Oxford: Oxford University Press, 1987.
- C. Chekuri, S. Khanna, J. Naor, and L. Zosin, “A linear programming formulation and approximation algorithms for the metric labeling problem,” SIAM Journal on Discrete Mathematics, vol. 18, no. 3, pp. 608–625, 2005.
- L. Chen, M. J. Wainwright, M. Cetin, and A. Willsky, “Multitarget-multisensor data association using the tree-reweighted max-product algorithm,” in SPIE Aerosense Conference, Orlando, FL, 2003.
- M. Chertkov and V. Y. Chernyak, “Loop calculus helps to improve belief propagation and linear programming decoding of LDPC codes,” in Proceedings of the Allerton Conference on Control, Communication and Computing, Monticello, IL, 2006.
- M. Chertkov and V. Y. Chernyak, “Loop series for discrete statistical models on graphs,” Journal of Statistical Mechanics, p. P06009, 2006.
- M. Chertkov and V. Y. Chernyak, “Loop calculus helps to improve belief propagation and linear programming decodings of low density parity check codes,” in Proceedings of the Allerton Conference on Control, Communication and Computing, Monticello, IL, 2007.
- M. Chertkov and M. G. Stepanov, “An efficient pseudo-codeword search algorithm for linear programming decoding of LDPC codes,” Technical Report arXiv:cs.IT/0601113, Los Alamos National Laboratories, 2006.
- S. Chopra, “On the spanning tree polyhedron,” Operations Research Letters, vol. 8, pp. 25–29, 1989.
- S. Cook, “The complexity of theorem-proving procedures,” in Proceedings of the Third Annual ACM Symposium on Theory of Computing, pp. 151–158, 1971.
- T. H. Cormen, C. E. Leiserson, and R. L. Rivest, Introduction to Algorithms. Cambridge, MA: MIT Press, 1990.
- T. M. Cover and J. A. Thomas, Elements of Information Theory. New York: John Wiley and Sons, 1991.
- G. Cross and A. Jain, “Markov random field texture models,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 5, pp. 25–39, 1983.
- I. Csiszar, “A geometric interpretation of Darroch and Ratcliff’s generalized iterative scaling,” Annals of Statistics, vol. 17, no. 3, pp. 1409–1413, 1989.
- I. Csiszar and G. Tusnady, “Information geometry and alternating minimization procedures,” Statistics and Decisions, Supplemental Issue 1, pp. 205–237, 1984.
- J. N. Darroch and D. Ratcliff, “Generalized iterative scaling for log-linear models,” Annals of Mathematical Statistics, vol. 43, pp. 1470–1480, 1972.
- C. Daskalakis, A. G. Dimakis, R. M. Karp, and M. J. Wainwright, “Probabilistic analysis of linear programming decoding,” IEEE Transactions on Information Theory, vol. 54, no. 8, pp. 3565–3578, 2008.
- A. d’Aspremont, O. Banerjee, and L. El Ghaoui, “First order methods for sparse covariance selection,” SIAM Journal on Matrix Analysis and its Applications, vol. 30, no. 1, pp. 55–66, 2008.
- J. Dauwels, H. A. Loeliger, P. Merkli, and M. Ostojic, “On structured-summary propagation, LFSR synchronization, and low-complexity trellis decoding,” in Proceedings of the Allerton Conference on Control, Communication and Computing, Monticello, IL, 2003.
- A. P. Dawid, “Applications of a general propagation algorithm for probabilistic expert systems,” Statistics and Computing, vol. 2, pp. 25–36, 1992.
- R. Dechter, Constraint Processing. San Francisco, CA: Morgan Kaufmann, 2003.
- J. W. Demmel, Applied Numerical Linear Algebra. Philadelphia, PA: SIAM, 1997.
- A. P. Dempster, N. M. Laird, and D. B. Rubin, “Maximum likelihood from incomplete data via the EM algorithm,” Journal of the Royal Statistical Society, Series B, vol. 39, pp. 1–38, 1977.
- M. Deza and M. Laurent, Geometry of Cuts and Metric Embeddings. New York: Springer-Verlag, 1997.
- A. G. Dimakis and M. J. Wainwright, “Guessing facets: Improved LP decoding and polytope structure,” in International Symposium on Information Theory, Seattle, WA, 2006.
- M. Dudik, S. J. Phillips, and R. E. Schapire, “Maximum entropy density estimation with generalized regularization and an application to species distribution modeling,” Journal of Machine Learning Research, vol. 8, pp. 1217–1260, 2007.
- R. Durbin, S. Eddy, A. Krogh, and G. Mitchison, eds., Biological Sequence Analysis. Cambridge, UK: Cambridge University Press, 1998.
- J. Edmonds, “Matroids and the greedy algorithm,” Mathematical Programming, vol. 1, pp. 127–136, 1971.
- B. Efron, “The geometry of exponential families,” Annals of Statistics, vol. 6, pp. 362–376, 1978.
- G. Elidan, I. McGraw, and D. Koller, “Residual belief propagation: Informed scheduling for asynchronous message-passing,” in Proceedings of the 22nd Conference on Uncertainty in Artificial Intelligence, Arlington, VA: AUAI Press, 2006.
- J. Feldman, D. R. Karger, and M. J. Wainwright, “Linear programming-based decoding of turbo-like codes and its relation to iterative approaches,” in Proceedings of the Allerton Conference on Control, Communication and Computing, Monticello, IL, 2002.
- J. Feldman, T. Malkin, R. A. Servedio, C. Stein, and M. J. Wainwright, “LP decoding corrects a constant fraction of errors,” IEEE Transactions on Information Theory, vol. 53, no. 1, pp. 82–89, 2007.
- J. Feldman, M. J. Wainwright, and D. R. Karger, “Using linear programming to decode binary linear codes,” IEEE Transactions on Information Theory, vol. 51, pp. 954–972, 2005.
- J. Felsenstein, “Evolutionary trees from DNA sequences: A maximum likelihood approach,” Journal of Molecular Evolution, vol. 17, pp. 368–376, 1981.
- S. E. Fienberg, “Contingency tables and log-linear models: Basic results and new developments,” Journal of the American Statistical Association, vol. 95, no. 450, pp. 643–647, 2000.
- M. E. Fisher, “On the dimer solution of planar Ising models,” Journal of Mathematical Physics, vol. 7, pp. 1776–1781, 1966.
- G. D. Forney, Jr., “The Viterbi algorithm,” Proceedings of the IEEE, vol. 61, pp. 268–277, 1973.
- G. D. Forney, Jr., R. Koetter, F. R. Kschischang, and A. Reznick, “On the effective weights of pseudocodewords for codes defined on graphs with cycles,” in Codes, Systems and Graphical Models, pp. 101–112, New York: Springer, 2001.
- W. T. Freeman, E. C. Pasztor, and O. T. Carmichael, “Learning low-level vision,” International Journal on Computer Vision, vol. 40, no. 1, pp. 25–47, 2000.
- B. J. Frey, R. Koetter, and N. Petrovic, “Very loopy belief propagation for unwrapping phase images,” in Advances in Neural Information Processing Systems, pp. 737–743, Cambridge, MA: MIT Press, 2001.
- B. J. Frey, R. Koetter, and A. Vardy, “Signal-space characterization of iterative decoding,” IEEE Transactions on Information Theory, vol. 47, pp. 766–781, 2001.
- R. G. Gallager, Low-Density Parity Check Codes. Cambridge, MA: MIT Press, 1963.
- A. E. Gelfand and A. F. M. Smith, “Sampling-based approaches to calculating marginal densities,” Journal of the American Statistical Association, vol. 85, pp. 398–409, 1990.
- S. Geman and D. Geman, “Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 6, pp. 721–741, 1984.
- H. O. Georgii, Gibbs Measures and Phase Transitions. New York: De Gruyter, 1988.
- Z. Ghahramani and M. J. Beal, “Propagation algorithms for variational Bayesian learning,” in Advances in Neural Information Processing Systems, pp. 507–513, Cambridge, MA: MIT Press, 2001.
- Z. Ghahramani and M. I. Jordan, “Factorial hidden Markov models,” Machine Learning, vol. 29, pp. 245–273, 1997.
- W. Gilks, S. Richardson, and D. Spiegelhalter, eds., Markov Chain Monte Carlo in Practice. New York: Chapman and Hall, 1996.
- A. Globerson and T. Jaakkola, “Approximate inference using planar graph decomposition,” in Advances in Neural Information Processing Systems, pp. 473–480, Cambridge, MA: MIT Press, 2006.
- A. Globerson and T. Jaakkola, “Approximate inference using conditional entropy decompositions,” in Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, San Juan, Puerto Rico, 2007.
- A. Globerson and T. Jaakkola, “Convergent propagation algorithms via oriented trees,” in Proceedings of the 23rd Conference on Uncertainty in Artificial Intelligence, Arlington, VA: AUAI Press, 2007.
- A. Globerson and T. Jaakkola, “Fixing max-product: Convergent message passing algorithms for MAP LP-relaxations,” in Advances in Neural Information Processing Systems, pp. 553–560, Cambridge, MA: MIT Press, 2007.
- M. X. Goemans and D. P. Williamson, “Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming,” Journal of the ACM, vol. 42, pp. 1115–1145, 1995.
- G. Golub and C. Van Loan, Matrix Computations. Baltimore: Johns Hopkins University Press, 1996.
- V. Gomez, J. M. Mooij, and H. J. Kappen, “Truncating the loop series expansion for BP,” Journal of Machine Learning Research, vol. 8, pp. 1987–2016, 2007.
- D. M. Greig, B. T. Porteous, and A. H. Seheult, “Exact maximum a posteriori estimation for binary images,” Journal of the Royal Statistical Society B, vol. 51, pp. 271–279, 1989.
- G. R. Grimmett, “A theorem about random fields,” Bulletin of the London Mathematical Society, vol. 5, pp. 81–84, 1973.
- M. Grotschel, L. Lovasz, and A. Schrijver, Geometric Algorithms and Combinatorial Optimization. Berlin: Springer-Verlag, 1993.
- M. Grotschel and K. Truemper, “Decomposition and optimization over cycles in binary matroids,” Journal of Combinatorial Theory, Series B, vol. 46, pp. 306–337, 1989.
- P. L. Hammer, P. Hansen, and B. Simeone, “Roof duality, complementation, and persistency in quadratic 0-1 optimization,” Mathematical Programming, vol. 28, pp. 121–155, 1984.
- J. M. Hammersley and P. Clifford, “Markov fields on finite graphs and lattices,” Unpublished, 1971.
- M. Hassner and J. Sklansky, “The use of Markov random fields as models of texture,” Computer Graphics and Image Processing, vol. 12, pp. 357–370, 1980.
- T. Hazan and A. Shashua, “Convergent message-passing algorithms for inference over general graphs with convex free energy,” in Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, Arlington, VA: AUAI Press, 2008.
- T. Heskes, “On the uniqueness of loopy belief propagation fixed points,” Neural Computation, vol. 16, pp. 2379–2413, 2004.
- T. Heskes, “Convexity arguments for efficient minimization of the Bethe and Kikuchi free energies,” Journal of Artificial Intelligence Research, vol. 26, pp. 153–190, 2006.
- T. Heskes, K. Albers, and B. Kappen, “Approximate inference and constrained optimization,” in Proceedings of the 19th Conference on Uncertainty in Artificial Intelligence, pp. 313–320, San Francisco, CA: Morgan Kaufmann, 2003.
- J. Hiriart-Urruty and C. Lemarechal, Convex Analysis and Minimization Algorithms. Vol. 1, New York: Springer-Verlag, 1993.
- J. Hiriart-Urruty and C. Lemarechal, Fundamentals of Convex Analysis. Springer-Verlag: New York, 2001.
- R. A. Horn and C. R. Johnson, Matrix Analysis. Cambridge, UK: Cambridge University Press, 1985.
- J. Hu, H.-A. Loeliger, J. Dauwels, and F. Kschischang, “A general computation rule for lossy summaries/messages with examples from equalization,” in Proceedings of the Allerton Conference on Control, Communication and Computing, Monticello, IL, 2005.
- B. Huang and T. Jebara, “Loopy belief propagation for bipartite maximum weight b-matching,” in Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, San Juan, Puerto Rico, 2007.
- X. Huang, A. Acero, H.-W. Hon, and R. Reddy, Spoken Language Processing. New York: Prentice Hall, 2001.
- A. Ihler, J. Fisher, and A. S. Willsky, “Loopy belief propagation: Convergence and effects of message errors,” Journal of Machine Learning Research, vol. 6, pp. 905–936, 2005.
- E. Ising, “Beitrag zur Theorie des Ferromagnetismus,” Zeitschrift fur Physik, vol. 31, pp. 253–258, 1925.
- T. S. Jaakkola, “Tutorial on variational approximation methods,” in Advanced Mean Field Methods: Theory and Practice, (M. Opper and D. Saad, eds.), pp. 129–160, Cambridge, MA: MIT Press, 2001.
- T. S. Jaakkola and M. I. Jordan, “Improving the mean field approximation via the use of mixture distributions,” in Learning in Graphical Models, (M. I. Jordan, ed.), pp. 105–161, MIT Press, 1999.
- T. S. Jaakkola and M. I. Jordan, “Variational probabilistic inference and the QMR-DT network,” Journal of Artificial Intelligence Research, vol. 10, pp. 291–322, 1999.
- E. T. Jaynes, “Information theory and statistical mechanics,” Physical Review, vol. 106, pp. 620–630, 1957.
- J. K. Johnson, D. M. Malioutov, and A. S. Willsky, “Lagrangian relaxation for MAP estimation in graphical models,” in Proceedings of the Allerton Conference on Control, Communication and Computing, Monticello, IL, 2007.
- M. I. Jordan, Z. Ghahramani, T. S. Jaakkola, and L. Saul, “An introduction to variational methods for graphical models,” Machine Learning, vol. 37, pp. 183–233, 1999.
- T. Kailath, A. H. Sayed, and B. Hassibi, Linear Estimation. Englewood Cliffs, NJ: Prentice Hall, 2000.
- H. Kappen and P. Rodriguez, “Efficient learning in Boltzmann machines using linear response theory,” Neural Computation, vol. 10, pp. 1137–1156, 1998.
- H. J. Kappen and W. Wiegerinck, “Novel iteration schemes for the cluster variation method,” in Advances in Neural Information Processing Systems, pp. 415–422, Cambridge, MA: MIT Press, 2002.
- D. Karger and N. Srebro, “Learning Markov networks: Maximum bounded tree-width graphs,” in Symposium on Discrete Algorithms, pp. 392–401, 2001.
- S. Karlin and W. Studden, Tchebycheff Systems, with Applications in Analysis and Statistics. New York: Interscience Publishers, 1966.
- R. Karp, “Reducibility among combinatorial problems,” in Complexity of Computer Computations, pp. 85–103, New York: Plenum Press, 1972.
- P. W. Kasteleyn, “Dimer statistics and phase transitions,” Journal of Mathematical Physics, vol. 4, pp. 287–293, 1963.
- R. Kikuchi, “The theory of cooperative phenomena,” Physical Review, vol. 81, pp. 988–1003, 1951.
- S. Kim and M. Kojima, “Second order cone programming relaxation of nonconvex quadratic optimization problems,” Technical report, Tokyo Institute of Technology, July 2000.
- J. Kleinberg and E. Tardos, “Approximation algorithms for classification problems with pairwise relationships: Metric labeling and Markov random fields,” Journal of the ACM, vol. 49, pp. 616–639, 2002.
- R. Koetter and P. O. Vontobel, “Graph-covers and iterative decoding of finite length codes,” in Proceedings of the 3rd International Symposium on Turbo Codes, pp. 75–82, Brest, France, 2003.
- V. Kolmogorov, “Convergent tree-reweighted message-passing for energy minimization,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1568–1583, 2006.
- V. Kolmogorov and M. J. Wainwright, “On optimality properties of treereweighted message-passing,” in Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence, pp. 316–322, Arlington, VA: AUAI Press, 2005.
- N. Komodakis, N. Paragios, and G. Tziritas, “MRF optimization via dual decomposition: Message-passing revisited,” in International Conference on Computer Vision, Rio de Janeiro, Brazil, 2007.
- V. K. Koval and M. I. Schlesinger, “Two-dimensional programming in image analysis problems,” USSR Academy of Science, Automatics and Telemechanics, vol. 8, pp. 149–168, 1976.
- A. Krogh, B. Larsson, G. von Heijne, and E. L. L. Sonnhammer, “Predicting transmembrane protein topology with a hidden Markov model: Application to complete genomes,” Journal of Molecular Biology, vol. 305, no. 3, pp. 567–580, 2001.
- F. R. Kschischang and B. J. Frey, “Iterative decoding of compound codes by probability propagation in graphical models,” IEEE Journal on Selected Areas in Communications, vol. 16, no. 2, pp. 219–230, 1998.
- F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, “Factor graphs and the sum-product algorithm,” IEEE Transactions on Information Theory, vol. 47, no. 2, pp. 498–519, 2001.
- A. Kulesza and F. Pereira, “Structured learning with approximate inference,” in Advances in Neural Information Processing Systems, pp. 785–792, Cambridge, MA: MIT Press, 2008.
- P. Kumar, V. Kolmogorov, and P. H. S. Torr, “An analysis of convex relaxations for MAP estimation,” in Advances in Neural Information Processing Systems, pp. 1041–1048, Cambridge, MA: MIT Press, 2008.
- P. Kumar, P. H. S. Torr, and A. Zisserman, “Solving Markov random fields using second order cone programming,” in IEEE Conference on Computer Vision and Pattern Recognition, pp. 1045–1052, 2006.
- J. B. Lasserre, “An explicit equivalent positive semidefinite program for nonlinear 0–1 programs,” SIAM Journal on Optimization, vol. 12, pp. 756–769, 2001.
- J. B. Lasserre, “Global optimization with polynomials and the problem of moments,” SIAM Journal on Optimization, vol. 11, no. 3, pp. 796–817, 2001.
- M. Laurent, “Semidefinite relaxations for Max-Cut,” in The Sharpest Cut: Festschrift in Honor of M. Padberg’s 60th Birthday, New York: MPS-SIAM Series in Optimization, 2002.
- M. Laurent, “A comparison of the Sherali-Adams, Lovasz-Schrijver and Lasserre relaxations for 0-1 programming,” Mathematics of Operations Research, vol. 28, pp. 470–496, 2003.
- S. L. Lauritzen, Lectures on Contingency Tables. Department of Mathematics, Aalborg University, 1989.
- S. L. Lauritzen, “Propagation of probabilities, means and variances in mixed graphical association models,” Journal of the American Statistical Association, vol. 87, pp. 1098–1108, 1992.
- S. L. Lauritzen, Graphical Models. Oxford: Oxford University Press, 1996.
- S. L. Lauritzen and D. J. Spiegelhalter, “Local computations with probabilities on graphical structures and their application to expert systems,” Journal of the Royal Statistical Society, Series B, vol. 50, pp. 155–224, 1988.
- M. A. R. Leisink and H. J. Kappen, “Learning in higher order Boltzmann machines using linear response,” Neural Networks, vol. 13, pp. 329–335, 2000.
- M. A. R. Leisink and H. J. Kappen, “A tighter bound for graphical models,” in Advances in Neural Information Processing Systems, pp. 266–272, Cambridge, MA: MIT Press, 2001.
- H. A. Loeliger, “An introduction to factor graphs,” IEEE Signal Processing Magazine, vol. 21, pp. 28–41, 2004.
- L. Lovasz, “Submodular functions and convexity,” in Mathematical Programming: The State of the Art, (A. Bachem, M. Grotschel, and B. Korte, eds.), pp. 235–257, New York: Springer-Verlag, 1983.
- L. Lovasz and A. Schrijver, “Cones of matrices, set-functions and 0-1 optimization,” SIAM Journal of Optimization, vol. 1, pp. 166–190, 1991.
- M. Luby, M. Mitzenmacher, M. A. Shokrollahi, and D. Spielman, “Improved low-density parity check codes using irregular graphs,” IEEE Transactions on Information Theory, vol. 47, pp. 585–598, 2001.
- D. M. Malioutov, J. M. Johnson, and A. S. Willsky, “Walk-sums and belief propagation in Gaussian graphical models,” Journal of Machine Learning Research, vol. 7, pp. 2013–2064, 2006.
- E. Maneva, E. Mossel, and M. J. Wainwright, “A new look at survey propagation and its generalizations,” Journal of the ACM, vol. 54, no. 4, pp. 2–41, 2007.
- C. D. Manning and H. Schutze, Foundations of Statistical Natural Language Processing. Cambridge, MA: MIT Press, 1999.
- S. Maybeck, Stochastic Models, Estimation, and Control. New York: Academic Press, 1982.
- J. McAuliffe, L. Pachter, and M. I. Jordan, “Multiple-sequence functional annotation and the generalized hidden Markov phylogeny,” Bioinformatics, vol. 20, pp. 1850–1860, 2004.
- R. J. McEliece, D. J. C. MacKay, and J. F. Cheng, “Turbo decoding as an instance of Pearl’s belief propagation algorithm,” IEEE Journal on Selected Areas in Communications, vol. 16, no. 2, pp. 140–152, 1998.
- R. J. McEliece and M. Yildirim, “Belief propagation on partially ordered sets,” in Mathematical Theory of Systems and Networks, (D. Gilliam and J. Rosenthal, eds.), Minneapolis, MN: Institute for Mathematics and its Applications, 2002.
- T. Meltzer, C. Yanover, and Y. Weiss, “Globally optimal solutions for energy minimization in stereo vision using reweighted belief propagation,” in International Conference on Computer Vision, pp. 428–435, Silver Springs, MD: IEEE Computer Society, 2005.
- M. Mezard and A. Montanari, Information, Physics and Computation. Oxford: Oxford University Press, 2008.
- M. Mezard, G. Parisi, and R. Zecchina, “Analytic and algorithmic solution of random satisfiability problems,” Science, vol. 297, p. 812, 2002.
- M. Mezard and R. Zecchina, “Random K-satisfiability: From an analytic solution to an efficient algorithm,” Physical Review E, vol. 66, p. 056126, 2002.
- T. Minka, “Expectation propagation and approximate Bayesian inference,” in Proceedings of the 17th Conference on Uncertainty in Artificial Intelligence, pp. 362–369, San Francisco, CA: Morgan Kaufmann, 2001.
- T. Minka, “Power EP,” Technical Report MSR-TR-2004-149, Microsoft Research, October 2004.
- T. Minka and Y. Qi, “Tree-structured approximations by expectation propagation,” in Advances in Neural Information Processing Systems, pp. 193–200, Cambridge, MA: MIT Press, 2004.
- T. P. Minka, “A family of algorithms for approximate Bayesian inference,” PhD thesis, MIT, January 2001.
- C. Moallemi and B. van Roy, “Convergence of the min-sum message-passing algorithm for quadratic optimization,” Technical Report, Stanford University, March 2006.
- C. Moallemi and B. van Roy, “Convergence of the min-sum algorithm for convex optimization,” Technical Report, Stanford University, May 2007.
- J. M. Mooij and H. J. Kappen, “Sufficient conditions for convergence of loopy belief propagation,” in Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence, pp. 396–403, Arlington, VA: AUAI Press, 2005.
- R. Neal and G. E. Hinton, “A view of the EM algorithm that justifies incremental, sparse, and other variants,” in Learning in Graphical Models, (M. I. Jordan, ed.), Cambridge, MA: MIT Press, 1999.
- G. L. Nemhauser and L. A. Wolsey, Integer and Combinatorial Optimization. New York: Wiley-Interscience, 1999.
- Y. Nesterov, “Semidefinite relaxation and non-convex quadratic optimization,” Optimization Methods and Software, vol. 12, pp. 1–20, 1997.
- M. Opper and D. Saad, “Adaptive TAP equations,” in Advanced Mean Field Methods: Theory and Practice, (M. Opper and D. Saad, eds.), pp. 85–98, Cambridge, MA: MIT Press, 2001.
- M. Opper and O. Winther, “Gaussian processes for classification: Mean field algorithms,” Neural Computation, vol. 12, no. 11, pp. 2177–2204, 2000.
- M. Opper and O. Winther, “Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer approach,” Physical Review Letters, vol. 64, p. 3695, 2001.
- M. Opper and O. Winther, “Expectation-consistent approximate inference,” Journal of Machine Learning Research, vol. 6, pp. 2177–2204, 2005.
- J. G. Oxley, Matroid Theory. Oxford: Oxford University Press, 1992.
- M. Padberg, “The Boolean quadric polytope: Some characteristics, facets and relatives,” Mathematical Programming, vol. 45, pp. 139–172, 1989.
- P. Pakzad and V. Anantharam, “Iterative algorithms and free energy minimization,” in Annual Conference on Information Sciences and Systems, Princeton, NJ, 2002.
- P. Pakzad and V. Anantharam, “Estimation and marginalization using Kikuchi approximation methods,” Neural Computation, vol. 17, no. 8, pp. 1836–1873, 2005.
- G. Parisi, Statistical Field Theory. New York: Addison-Wesley, 1988.
- P. Parrilo, “Semidefinite programming relaxations for semialgebraic problems,” Mathematical Programming, Series B, vol. 96, pp. 293–320, 2003.
- J. Pearl, Probabilistic Reasoning in Intelligent Systems. San Francisco, CA: Morgan Kaufmann, 1988.
- J. S. Pedersen and J. Hein, “Gene finding with a hidden Markov model of genome structure and evolution,” Bioinformatics, vol. 19, pp. 219–227, 2003.
- P. Pevzner, Computational Molecular Biology: An Algorithmic Approach. Cambridge, MA: MIT Press, 2000.
- T. Plefka, “Convergence condition of the TAP equation for the infinite-ranged Ising model,” Journal of Physics A, vol. 15, no. 6, pp. 1971–1978, 1982.
- L. R. Rabiner and B. H. Juang, Fundamentals of Speech Recognition. Englewood Cliffs, NJ: Prentice Hall, 1993.
- P. Ravikumar, A. Agarwal, and M. J. Wainwright, “Message-passing for graph-structured linear programs: Proximal projections, convergence and rounding schemes,” in International Conference on Machine Learning, pp. 800–807, New York: ACM Press, 2008.
- P. Ravikumar and J. Lafferty, “Quadratic programming relaxations for metric labeling and Markov random field map estimation,” International Conference on Machine Learning, pp. 737–744, 2006.
- T. Richardson and R. Urbanke, “The capacity of low-density parity check codes under message-passing decoding,” IEEE Transactions on Information Theory, vol. 47, pp. 599–618, 2001.
- T. Richardson and R. Urbanke, Modern Coding Theory. Cambridge, UK: Cambridge University Press, 2008.
- B. D. Ripley, Spatial Statistics. New York: Wiley, 1981.
- C. P. Robert and G. Casella, Monte Carlo Statistical Methods. Springer Texts in Statistics, New York, NY: Springer-Verlag, 1999.
- G. Rockafellar, Convex Analysis. Princeton, NJ: Princeton University Press, 1970.
- T. G. Roosta, M. J. Wainwright, and S. S. Sastry, “Convergence analysis of reweighted sum-product algorithms,” IEEE Transactions on Signal Processing, vol. 56, no. 9, pp. 4293–4305, 2008.
- P. Rusmevichientong and B. Van Roy, “An analysis of turbo decoding with Gaussian densities,” in Advances in Neural Information Processing Systems, pp. 575–581, Cambridge, MA: MIT Press, 2000.
- S. Sanghavi, D. Malioutov, and A. Willsky, “Linear programming analysis of loopy belief propagation for weighted matching,” in Advances in Neural Information Processing Systems, pp. 1273–1280, Cambridge, MA: MIT Press, 2007.
- S. Sanghavi, D. Shah, and A. Willsky, “Message-passing for max-weight independent set,” in Advances in Neural Information Processing Systems, pp. 1281–1288, Cambridge, MA: MIT Press, 2007.
- L. K. Saul and M. I. Jordan, “Boltzmann chains and hidden Markov models,” in Advances in Neural Information Processing Systems, pp. 435–442, Cambridge, MA: MIT Press, 1995.
- L. K. Saul and M. I. Jordan, “Exploiting tractable substructures in intractable networks,” in Advances in Neural Information Processing Systems, pp. 486–492, Cambridge, MA: MIT Press, 1996.
- M. I. Schlesinger, “Syntactic analysis of two-dimensional visual signals in noisy conditions,” Kibernetika, vol. 4, pp. 113–130, 1976.
- A. Schrijver, Theory of Linear and Integer Programming. New York: Wiley-Interscience Series in Discrete Mathematics, 1989.
- A. Schrijver, Combinatorial Optimization: Polyhedra and Efficiency. New York: Springer-Verlag, 2003.
- M. Seeger, “Expectation propagation for exponential families,” Technical Report, Max Planck Institute, Tuebingen, November 2005.
- M. Seeger, F. Steinke, and K. Tsuda, “Bayesian inference and optimal design in the sparse linear model,” in Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, San Juan, Puerto Rico, 2007.
- P. D. Seymour, “Matroids and multi-commodity flows,” European Journal on Combinatorics, vol. 2, pp. 257–290, 1981.
- G. R. Shafer and P. P. Shenoy, “Probability propagation,” Annals of Mathematics and Artificial Intelligence, vol. 2, pp. 327–352, 1990.
- H. D. Sherali and W. P. Adams, “A hierarchy of relaxations between the continuous and convex hull representations for zero-one programming problems,” SIAM Journal on Discrete Mathematics, vol. 3, pp. 411–430, 1990.
- A. Siepel and D. Haussler, “Combining phylogenetic and hidden Markov models in biosequence analysis,” in Proceedings of the Seventh Annual International Conference on Computational Biology, pp. 277–286, 2003.
- D. Sontag and T. Jaakkola, “New outer bounds on the marginal polytope,” in Advances in Neural Information Processing Systems, pp. 1393–1400, Cambridge, MA: MIT Press, 2007.
- T. P. Speed and H. T. Kiiveri, “Gaussian Markov distributions over finite graphs,” Annals of Statistics, vol. 14, no. 1, pp. 138–150, 1986.
- R. P. Stanley, Enumerative Combinatorics. Vol. 1, Cambridge, UK: Cambridge University Press, 1997.
- R. P. Stanley, Enumerative Combinatorics. Vol. 2, Cambridge, UK: Cambridge University Press, 1997.
- E. B. Sudderth, M. J. Wainwright, and A. S. Willsky, “Embedded trees: Estimation of Gaussian processes on graphs with cycles,” IEEE Transactions on Signal Processing, vol. 52, no. 11, pp. 3136–3150, 2004.
- E. B. Sudderth, M. J. Wainwright, and A. S. Willsky, “Loop series and Bethe variational bounds for attractive graphical models,” in Advances in Neural Information Processing Systems, pp. 1425–1432, Cambridge, MA: MIT Press, 2008.
- C. Sutton and A. McCallum, “Piecewise training of undirected models,” in Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence, pp. 568–575, San Francisco, CA: Morgan Kaufmann, 2005.
- C. Sutton and A. McCallum, “Improved dynamic schedules for belief propagation,” in Proceedings of the 23rd Conference on Uncertainty in Artificial Intelligence, San Francisco, CA: Morgan Kaufmann, 2007.
- R. Szeliski, R. Zabih, D. Scharstein, O. Veskler, V. Kolmogorov, A. Agarwala, M. Tappen, and C. Rother, “Comparative study of energy minimization methods for Markov random fields,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, pp. 1068–1080, 2008.
- M. H. Taghavi and P. H. Siegel, “Adaptive linear programming decoding,” in International Symposium on Information Theory, Seattle, WA, 2006.
- K. Tanaka and T. Morita, “Cluster variation method and image restoration problem,” Physics Letters A, vol. 203, pp. 122–128, 1995.
- S. Tatikonda and M. I. Jordan, “Loopy belief propagation and Gibbs measures,” in Proceedings of the 18th Conference on Uncertainty in Artificial Intelligence, pp. 493–500, San Francisco, CA: Morgan Kaufmann, 2002.
- Y. W. Teh and M. Welling, “On improving the efficiency of the iterative proportional fitting procedure,” in Proceedings of the Ninth International Conference on Artificial Intelligence and Statistics, Key West, FL, 2003.
- A. Thomas, A. Gutin, V. Abkevich, and A. Bansal, “Multilocus linkage analysis by blocked Gibbs sampling,” Statistics and Computing, vol. 10, pp. 259–269, 2000.
- A. W. van der Vaart, Asymptotic Statistics. Cambridge, UK: Cambridge University Press, 1998.
- L. Vandenberghe and S. Boyd, “Semidefinite programming,” SIAM Review, vol. 38, pp. 49–95, 1996.
- L. Vandenberghe, S. Boyd, and S. Wu, “Determinant maximization with linear matrix inequality constraints,” SIAM Journal on Matrix Analysis and Applications, vol. 19, pp. 499–533, 1998.
- V. V. Vazirani, Approximation Algorithms. New York: Springer-Verlag, 2003.
- S. Verdu and H. V. Poor, “Abstract dynamic programming models under commutativity conditions,” SIAM Journal on Control and Optimization, vol. 25, no. 4, pp. 990–1006, 1987.
- P. O. Vontobel and R. Koetter, “Lower bounds on the minimum pseudo-weight of linear codes,” in International Symposium on Information Theory, Chicago, IL, 2004.
- P. O. Vontobel and R. Koetter, “Towards low-complexity linear-programming decoding,” in Proceedings of the International Conference on Turbo Codes and Related Topics, Munich, Germany, 2006.
- M. J. Wainwright, “Stochastic processes on graphs with cycles: Geometric and variational approaches,” PhD thesis, MIT, May 2002.
- M. J. Wainwright, “Estimating the “wrong” graphical model: Benefits in the computation-limited regime,” Journal of Machine Learning Research, vol. 7, pp. 1829–1859, 2006.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “Tree-based reparameterization framework for analysis of sum-product and related algorithms,” IEEE Transactions on Information Theory, vol. 49, no. 5, pp. 1120–1146, 2003.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “Tree-reweighted belief propagation algorithms and approximate ML estimation by pseudomoment matching,” in Proceedings of the Ninth International Conference on Artificial Intelligence and Statistics, 2003.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “Tree consistency and bounds on the max-product algorithm and its generalizations,” Statistics and Computing, vol. 14, pp. 143–166, 2004.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “Exact MAP estimates via agreement on (hyper)trees: Linear programming and message-passing,” IEEE Transactions on Information Theory, vol. 51, no. 11, pp. 3697–3717, 2005.
- M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, “A new class of upper bounds on the log partition function,” IEEE Transactions on Information Theory, vol. 51, no. 7, pp. 2313–2335, 2005.
- M. J. Wainwright and M. I. Jordan, “Treewidth-based conditions for exactness of the Sherali-Adams and Lasserre relaxations,” Technical Report 671, University of California, Berkeley, Department of Statistics, September 2004.
- M. J. Wainwright and M. I. Jordan, “Log-determinant relaxation for approximate inference in discrete Markov random fields,” IEEE Transactions on Signal Processing, vol. 54, no. 6, pp. 2099–2109, 2006.
- Y. Weiss, “Correctness of local probability propagation in graphical models with loops,” Neural Computation, vol. 12, pp. 1–41, 2000.
- Y. Weiss and W. T. Freeman, “Correctness of belief propagation in Gaussian graphical models of arbitrary topology,” in Advances in Neural Information Processing Systems, pp. 673–679, Cambridge, MA: MIT Press, 2000.
- Y. Weiss, C. Yanover, and T. Meltzer, “MAP estimation, linear programming, and belief propagation with convex free energies,” in Proceedings of the 23rd Conference on Uncertainty in Artificial Intelligence, Arlington, VA: AUAI Press, 2007.
- M. Welling, T. Minka, and Y. W. Teh, “Structured region graphs: Morphing EP into GBP,” in Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence, pp. 609–614, Arlington, VA: AUAI Press, 2005.
- M. Welling and S. Parise, “Bayesian random fields: The Bethe-Laplace approximation,” in Proceedings of the 22nd Conference on Uncertainty in Artificial Intelligence, Arlington, VA: AUAI Press, 2006.
- M. Welling and Y. W. Teh, “Belief optimization: A stable alternative to loopy belief propagation,” in Proceedings of the 17th Conference on Uncertainty in Artificial Intelligence, pp. 554–561, San Francisco, CA: Morgan Kaufmann, 2001.
- M. Welling and Y. W. Teh, “Linear response for approximate inference,” in Advances in Neural Information Processing Systems, pp. 361–368, Cambridge, MA: MIT Press, 2004.
- T. Werner, “A linear programming approach to max-sum problem: A review,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 7, pp. 1165–1179, 2007.
- N. Wiberg, “Codes and decoding on general graphs,” PhD thesis, University of Linkoping, Sweden, 1996.
- N. Wiberg, H. A. Loeliger, and R. Koetter, “Codes and iterative decoding on general graphs,” European Transactions on Telecommunications, vol. 6, pp. 513–526, 1995.
- W. Wiegerinck, “Variational approximations between mean field theory and the junction tree algorithm,” in Proceedings of the 16th Conference on Uncertainty in Artificial Intelligence, pp. 626–633, San Francisco, CA: Morgan Kaufmann Publishers, 2000.
- W. Wiegerinck, “Approximations with reweighted generalized belief propagation,” in Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics, pp. 421–428, Barbados, 2005.
- W. Wiegerinck and T. Heskes, “Fractional belief propagation,” in Advances in Neural Information Processing Systems, pp. 438–445, Cambridge, MA: MIT Press, 2002.
- A. S. Willsky, “Multiresolution Markov models for signal and image processing,” Proceedings of the IEEE, vol. 90, no. 8, pp. 1396–1458, 2002.
- J. W. Woods, “Markov image modeling,” IEEE Transactions on Automatic Control, vol. 23, pp. 846–850, 1978.
- N. Wu, The Maximum Entropy Method. New York: Springer, 1997.
- C. Yanover, T. Meltzer, and Y. Weiss, “Linear programming relaxations and belief propagation: An empirical study,” Journal of Machine Learning Research, vol. 7, pp. 1887–1907, 2006.
- C. Yanover, O. Schueler-Furman, and Y. Weiss, “Minimizing and learning energy functions for side-chain prediction,” in Eleventh Annual Conference on Research in Computational Molecular Biology, pp. 381–395, San Francisco, CA, 2007.
- J. S. Yedidia, “An idiosyncratic journey beyond mean field theory,” in Advanced Mean Field Methods: Theory and Practice, (M. Opper and D. Saad, eds.), pp. 21–36, Cambridge, MA: MIT Press, 2001.
- J. S. Yedidia, W. T. Freeman, and Y. Weiss, “Generalized belief propagation,” in Advances in Neural Information Processing Systems, pp. 689–695, Cambridge, MA: MIT Press, 2001.
- J. S. Yedidia, W. T. Freeman, and Y. Weiss, “Constructing free energy approximations and generalized belief propagation algorithms,” IEEE Transactions on Information Theory, vol. 51, no. 7, pp. 2282–2312, 2005.
- A. Yuille, “CCCP algorithms to minimize the Bethe and Kikuchi free energies: Convergent alternatives to belief propagation,” Neural Computation, vol. 14, pp. 1691–1722, 2002.
- G. M. Ziegler, Lectures on Polytopes. New York: Springer-Verlag, 1995.