Foundations and Trends® in Machine Learning

Machine learning is a branch of artificial intelligence. The history of AI research traces a natural, clear arc from an emphasis on "reasoning", to an emphasis on "knowledge", to an emphasis on "learning". Machine learning is thus one route to artificial intelligence: solving problems in AI by means of learning from data. Over the past thirty-odd years it has grown into a multidisciplinary field, drawing on probability theory, statistics, approximation theory, convex analysis, and computational complexity theory. Machine learning theory is chiefly concerned with designing and analyzing algorithms that let computers "learn" automatically, that is, algorithms that extract regularities from data and use those regularities to make predictions about unseen data. Because the field rests on a large body of statistical theory and is especially close to inferential statistics, it is also called statistical learning theory. The recent trend is toward ever larger neural networks (deep learning) and toward model interpretability.

Graph-structured data are an integral part of many application domains, including chemoinformatics, computational biology, neuroimaging, and social network analysis. Over the last fifteen years, numerous graph kernels, i.e. kernel functions between graphs, have been proposed to...

Foundations and Trends in Machine Learning, no. 4 (2019): 307-392

Variational autoencoders provide a principled framework for learning deep latent-variable models and corresponding inference models. In this work, we provide an introduction ...
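A core ingredient of the variational autoencoder framework mentioned above is the reparameterization trick: sampling a latent variable as a deterministic function of the variational parameters and exogenous noise, so gradients can flow through the sampling step. A minimal NumPy sketch of just that trick (the toy encoder outputs `mu` and `log_var` are made up here, not taken from the monograph):

```python
import numpy as np

# Reparameterization trick: draw z ~ N(mu, exp(log_var)) as a deterministic
# function of (mu, log_var) and parameter-free noise eps ~ N(0, 1).
rng = np.random.default_rng(0)
mu, log_var = 1.5, np.log(0.25)        # toy "encoder outputs" for one latent dim
eps = rng.normal(size=100_000)         # noise independent of the parameters
z = mu + np.exp(0.5 * log_var) * eps   # samples from N(1.5, 0.25)
```

The empirical mean and variance of `z` recover `mu` and `exp(log_var)`, confirming the transformed noise has the intended distribution.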

Foundations and Trends in Machine Learning, no. 1-2 (2019): 1-286

Multi-armed bandits are a simple but very powerful framework for algorithms that make decisions over time under uncertainty. An enormous body of work has accumulated over the years, covered in several books and surveys. This book provides a more introductory, textbook-like treatment ...
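As an illustrative aside (my own toy example, not code from the book), the bandit framework can be sketched with the classic epsilon-greedy strategy: with small probability explore a random arm, otherwise exploit the arm with the best empirical mean. All arm means and parameters below are invented for illustration.

```python
import random

def run_epsilon_greedy(true_means, epsilon=0.1, horizon=5000, seed=0):
    """Simulate an epsilon-greedy agent on a Bernoulli bandit."""
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms    # pulls per arm
    values = [0.0] * n_arms  # empirical mean reward per arm
    total_reward = 0.0
    for _ in range(horizon):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                        # explore
        else:
            arm = max(range(n_arms), key=lambda a: values[a])  # exploit
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]    # running mean
        total_reward += reward
    return values, total_reward

est, total = run_epsilon_greedy([0.2, 0.5, 0.8])
```

Over 5000 rounds the agent concentrates its pulls on the best arm (true mean 0.8), and its empirical estimate for that arm converges accordingly.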

Foundations and Trends in Machine Learning, no. 5-6 (2019): 393-536

Spectral methods have been the mainstay in several domains such as machine learning, applied mathematics and scientific computing. They involve finding a certain kind of spec...
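The abstract is truncated, but spectral methods generally revolve around eigenvalue and singular value decompositions. As a small illustration (my own toy example, not from the monograph), power iteration recovers the dominant eigenpair of a symmetric matrix:

```python
import numpy as np

def power_iteration(A, n_steps=200, seed=0):
    """Power iteration: repeatedly apply A and renormalize to converge to
    the eigenvector of the largest-magnitude eigenvalue."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[0])
    for _ in range(n_steps):
        v = A @ v
        v /= np.linalg.norm(v)
    eigenvalue = v @ A @ v  # Rayleigh quotient
    return eigenvalue, v

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 3 and 1
lam, v = power_iteration(A)
```

For this matrix the dominant eigenvalue is 3 with eigenvector proportional to (1, 1), and the iteration converges geometrically at rate 1/3 per step.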

Foundations and Trends in Machine Learning, no. 3 (2019): 307-392

A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations. This is the fundamental problem of Bayesian statistics and machine learning, which frames all inference as expectations with respect to the posterior distribu...
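The framing above, inference as expectations with respect to a posterior, can be illustrated with the simplest Monte Carlo estimator. The Beta-Bernoulli model below is my own toy example, not drawn from the article: with a Beta(1, 1) prior and data of 7 successes and 3 failures, the posterior is Beta(8, 4), whose mean is 8/12.

```python
import numpy as np

# Monte Carlo estimate of a posterior expectation E[theta | data]:
# draw samples from the posterior and average.
rng = np.random.default_rng(0)
samples = rng.beta(8, 4, size=200_000)   # posterior Beta(8, 4)
posterior_mean = samples.mean()          # estimates 8/12 = 0.666...
```

The same pattern, averaging a function of posterior samples, estimates any posterior expectation, which is why sampling methods are so central to Bayesian computation.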

Foundations and Trends in Machine Learning, no. 1 (2018): 1-96

Thompson sampling is an algorithm for online decision problems where actions are taken sequentially in a manner that must balance between exploiting what is known to maximize immediate performance and investing to accumulate new information that may improve future performance. The alg...
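The explore/exploit balance described above can be made concrete with a minimal Beta-Bernoulli Thompson sampler (an illustrative sketch with invented arm means, not code from the tutorial): sample a plausible mean for each arm from its Beta posterior, pull the arm with the largest draw, then update that arm's posterior with the observed reward.

```python
import random

def thompson_bernoulli(true_means, horizon=3000, seed=1):
    """Thompson sampling for Bernoulli arms with Beta(1, 1) priors."""
    rng = random.Random(seed)
    n = len(true_means)
    alpha = [1] * n  # 1 + successes per arm
    beta = [1] * n   # 1 + failures per arm
    total = 0
    for _ in range(horizon):
        # Sample one plausible mean per arm from its posterior, act greedily.
        draws = [rng.betavariate(alpha[a], beta[a]) for a in range(n)]
        arm = max(range(n), key=lambda a: draws[a])
        reward = 1 if rng.random() < true_means[arm] else 0
        alpha[arm] += reward
        beta[arm] += 1 - reward
        total += reward
    return alpha, beta, total

alpha, beta, total = thompson_bernoulli([0.3, 0.7])
```

Randomizing through the posterior means uncertain arms still get pulled occasionally, while clearly better arms dominate as their posteriors sharpen.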

Foundations and Trends in Machine Learning, no. 3-4 (2018): 219-354

Deep reinforcement learning is the combination of reinforcement learning (RL) and deep learning. This field of research has been able to solve a wide range of complex decision-making tasks that were previously out of reach for a machine. Thus, deep RL opens up many new applicatio...

Foundations and Trends in Machine Learning, no. 5-6 (2018): 337-588

Many modern methods for prediction leverage nearest neighbor search to find past training examples most similar to a test example, an idea that dates back in text to at least...
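The nearest-neighbor idea described above fits in a few lines. A minimal k-NN classifier sketch (my own toy data, not from the monograph):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training examples."""
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distances to x
    nearest = np.argsort(dists)[:k]              # indices of the k closest
    votes = y_train[nearest]
    return np.bincount(votes).argmax()           # majority label

X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
label = knn_predict(X, y, np.array([0.95, 1.0]), k=3)  # → class 1
```

The test point sits next to the two class-1 examples, so two of its three nearest neighbors vote for class 1.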

Foundations and Trends in Machine Learning, no. 2 (2018): 97-218

This article provides a comprehensive, rigorous, and self-contained introduction to the analysis of Wishart matrix moments. This article may act as an introduction to some aspects of random matrix theory, or as a self-contained exposition of Wishart matrix moments. Random matrix ...
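As a quick numerical illustration of the simplest Wishart moment (this specific check is my own, not drawn from the article): if W = X Xᵀ where X holds n i.i.d. columns from N(0, Σ), then E[W] = nΣ. Averaging many independent Wishart draws verifies this.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 50_000
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
L = np.linalg.cholesky(Sigma)             # Sigma = L @ L.T
# n * trials i.i.d. columns from N(0, Sigma); summing all outer products and
# dividing by trials averages `trials` independent Wishart(n, Sigma) draws.
X = L @ rng.normal(size=(2, n * trials))
mean_W = (X @ X.T) / trials               # should approach n * Sigma
```
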

Foundations and Trends in Machine Learning, no. 1-2 (2017)

A Hilbert space embedding of distributions---in short, kernel mean embedding---has recently emerged as a powerful machinery for probabilistic modeling, statistical inference, machine learning, and causal discovery. The basic idea behind this framework is to map distributions in...
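One workhorse built on kernel mean embeddings is the maximum mean discrepancy (MMD): the RKHS distance between the embeddings of two distributions, estimated from samples. A small sketch (toy data and bandwidth of my own choosing, not from the survey), using the biased V-statistic estimator under an RBF kernel:

```python
import numpy as np

def mmd_squared(X, Y, gamma=0.5):
    """Biased estimate of squared MMD under k(x, y) = exp(-gamma ||x - y||^2):
    the squared distance between the two empirical kernel mean embeddings."""
    def k(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(0)
same = mmd_squared(rng.normal(size=(200, 1)), rng.normal(size=(200, 1)))
diff = mmd_squared(rng.normal(size=(200, 1)), rng.normal(loc=2.0, size=(200, 1)))
```

Samples from the same distribution give an estimate near zero, while shifting one distribution by two standard deviations yields a clearly positive value.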

Foundations and Trends in Machine Learning, no. 6 (2017): 431-673

This monograph builds on Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions by discussing tensor network models for super-compressed higher-order representation of data/parameters and cost functions, together with an o...

Foundations and Trends in Machine Learning, no. 3-4 (2017): 142-336

A vast majority of machine learning algorithms train their models and perform inference by solving optimization problems. In order to capture the learning and prediction problems accurately, structural constraints such as sparsity or low rank are frequently imposed or else the objec...

Foundations and Trends in Machine Learning, no. 1 (2016): 1-118

Principal components analysis (PCA) is a well-known technique for approximating a tabular data set by a low rank matrix. Here, we extend the idea of PCA to handle arbitrary data sets consisting of numerical, Boolean, categorical, ordinal, and other data types. This framework enco...
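The low-rank approximation at the heart of PCA can be sketched via the truncated SVD, which by the Eckart-Young theorem gives the best rank-k approximation in Frobenius norm (the toy data below is my own, not from the monograph):

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of A in Frobenius norm via truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k, :]  # scale columns of U, recombine

rng = np.random.default_rng(0)
low_rank = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 30))  # exactly rank 2
noisy = low_rank + 0.01 * rng.normal(size=(50, 30))
approx = best_rank_k(noisy, 2)
err = np.linalg.norm(noisy - approx) / np.linalg.norm(noisy)
```

Since the signal is exactly rank 2 and the noise is small, the rank-2 truncation captures nearly all of the matrix, leaving only a tiny relative residual.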

Foundations and Trends in Machine Learning, no. 4-5 (2016): 249-429

Modern applications in engineering and data science are increasingly based on multidimensional data of exceedingly high volume, variety, and structural richness. However, standard machine learning algorithms typically scale exponentially with data volume and complexity of cross-modal...

Foundations and Trends in Machine Learning, no. 2-3 (2016): 119-247

Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been consid...

Foundations and Trends in Machine Learning, no. 1-2 (2015)

In recent years, random matrices have come to play a major role in computational mathematics, but most of the classical areas of random matrix theory remain the province of experts. Over the last decade, with the advent of matrix concentration inequalities, research has advance...

Foundations and Trends® in Machine Learning, no. 5-6 (2015): 359-483

Bayesian methods for machine learning have been widely investigated, yielding principled methods for incorporating prior information into inference algorithms. In this survey, we provide an in-depth review of the role of Bayesian methods for the reinforcement learning (RL) paradigm. Th...