Probabilistic Modelling, Machine Learning, and the Information Revolution

By Zoubin Ghahramani et al.
Published on June 10, 2012

Table of Contents

An Information Revolution?
Modelling Tools
Probabilistic Modelling
Bayes Rule
Representing Beliefs in Artificial Intelligence
Representing Beliefs II
The Dutch Book Theorem
Bayesian Machine Learning
Modeling vs toolbox views of Machine Learning
Bayesian Nonparametrics
Why nonparametrics?
Parametric vs Nonparametric Models
Gaussian and Dirichlet Processes
Nonlinear regression and Gaussian processes
Gaussian Processes and SVMs
Some Comparisons
Bayesian nonparametrics applied to models of other structured objects: Gaussian processes and Dirichlet processes
Infinite hidden Markov models (iHMMs)
Infinite HMM: Changepoint detection and video segmentation
Sparse Matrices
Indian Buffet Process
The Big Picture: Relations between some models
Nonparametric Binary Matrix Factorization
Learning Structure of Deep Sparse Graphical Models
Hierarchies
Dirichlet Diffusion Trees (DDT)
Pitman-Yor Diffusion Trees
Covariance Matrices
Generalised Wishart Processes for Covariance modelling

Summary

The document surveys probabilistic modelling and machine learning in the context of the information revolution. It covers modelling tools, Bayes rule, representing beliefs in artificial intelligence, the Dutch Book theorem, Bayesian machine learning, parametric versus nonparametric models, Gaussian and Dirichlet processes, Gaussian processes compared with SVMs, infinite hidden Markov models, sparse matrices and the Indian Buffet Process, learning the structure of deep sparse graphical models, hierarchical models such as Dirichlet and Pitman-Yor diffusion trees, and Generalised Wishart Processes for covariance modelling.
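The summary names Bayes rule as the foundation of the Bayesian machine learning the document describes. As a brief, hedged illustration (not taken from the original document; the coin-flip model, the assumed data of 7 heads in 10 flips, and the NumPy code are illustrative), the sketch below applies the update posterior proportional to likelihood times prior to infer a coin's bias:

import numpy as np

# Minimal sketch of a Bayes rule update (illustrative example, not from the original document).
# Hypotheses: a grid of possible coin biases theta in [0, 1].
thetas = np.linspace(0.0, 1.0, 101)

# Prior: uniform belief over theta before seeing any data.
prior = np.ones_like(thetas) / len(thetas)

# Assumed data: 7 heads observed in 10 flips.
heads, flips = 7, 10

# Likelihood of the data under each hypothesis (binomial, up to a constant factor).
likelihood = thetas**heads * (1.0 - thetas)**(flips - heads)

# Bayes rule: posterior = likelihood * prior / evidence (the sum normalises).
unnormalised = likelihood * prior
posterior = unnormalised / unnormalised.sum()

print("Posterior mean of theta:", np.sum(thetas * posterior))

As more data arrive, the posterior concentrates; the Bayesian nonparametric models listed in the table of contents apply the same prior-to-posterior update to models with effectively infinitely many parameters.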