Foundations and Trends® in Machine Learning
Machine learning is a branch of artificial intelligence. The history of AI research follows a natural, clear progression from a focus on "reasoning", to a focus on "knowledge", and then to a focus on "learning". Machine learning is thus one route to realizing artificial intelligence: it uses learning from data as the means of solving problems in AI. Over the past thirty-odd years it has grown into a multidisciplinary field drawing on probability theory, statistics, approximation theory, convex analysis, computational complexity theory, and other subjects. Machine learning theory is chiefly concerned with designing and analyzing algorithms that let computers "learn" automatically, that is, algorithms that extract regularities from data and use them to make predictions on unseen data. Because this foundation rests heavily on statistical theory, machine learning is especially closely tied to statistical inference and is also called statistical learning theory. The recent trend has been toward larger neural networks (deep learning) and toward model interpretability.
Foundations and Trends in Machine Learning, no. 5-6 (2020): 393-536
Spectral methods have been the mainstay in several domains such as machine learning, applied mathematics and scientific computing. They involve finding a certain kind of spectral decomposition to obtain basis functions that can capture important structures or directions for the p...
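As a hedged illustration of the kind of spectral decomposition the abstract refers to (not code from the monograph), the following NumPy sketch builds a Gaussian similarity matrix over a small point set and keeps its leading eigenvectors as basis functions; the data, bandwidth, and number of components are arbitrary choices for the example.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                       # 50 toy points in 2-D

# Gaussian similarity matrix; the bandwidth 1.0 is an arbitrary choice.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / (2 * 1.0 ** 2))

# Eigendecomposition of the symmetric similarity matrix.
eigvals, eigvecs = np.linalg.eigh(K)               # eigenvalues in ascending order
basis = eigvecs[:, ::-1][:, :5]                    # 5 leading eigenvectors as basis functions

print(eigvals[::-1][:5])                           # dominant part of the spectrum
print(basis.shape)                                 # (50, 5)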
Foundations and Trends in Machine Learning, no. 2-3 (2020): 158-331
Foundations and Trends in Machine Learning, no. 1 (2020): 1-157
Karsten Borgwardt, Elisabetta Ghisu, Felipe Llinares-López, Leslie O'Bray, Bastian Rieck
Foundations and Trends in Machine Learning, no. 5-6 (2020)
Graph-structured data are an integral part of many application domains, including chemoinformatics, computational biology, neuroimaging, and social network analysis. Over the last fifteen years, numerous graph kernels, i.e. kernel functions between graphs, have been proposed to...
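To make "kernel functions between graphs" concrete, here is a minimal sketch (not taken from the survey) of one of the simplest members of the family, a vertex-label histogram kernel; the two toy graphs and their node labels are invented for the example.

import numpy as np

def label_histogram(labels, vocabulary):
    """Count how often each node label occurs in one graph."""
    hist = np.zeros(len(vocabulary))
    for lab in labels:
        hist[vocabulary.index(lab)] += 1
    return hist

def vertex_histogram_kernel(labels_g1, labels_g2):
    """A very simple graph kernel: inner product of node-label histograms."""
    vocab = sorted(set(labels_g1) | set(labels_g2))
    return float(label_histogram(labels_g1, vocab) @ label_histogram(labels_g2, vocab))

# Toy graphs described only by their node labels (e.g., atom types in chemoinformatics).
g1 = ["C", "C", "O", "N"]
g2 = ["C", "O", "O"]
print(vertex_histogram_kernel(g1, g2))   # 2*1 (C) + 1*2 (O) + 1*0 (N) = 4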
Foundations and Trends in Machine Learning, no. 4 (2019): 4-89
Variational autoencoders provide a principled framework for learning deep latent-variable models and corresponding inference models. In this work, we provide an introduction to variational autoencoders and some important extensions.
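As a brief, hedged reminder of the central object in this framework (not quoted from the monograph itself): for data x, latent variable z, decoder p_theta(x|z), prior p(z), and encoder q_phi(z|x), the evidence lower bound (ELBO) maximized jointly over theta and phi is

\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z\mid x)}\!\left[\log p_\theta(x\mid z)\right] \;-\; \mathrm{KL}\!\left(q_\phi(z\mid x)\,\|\,p(z)\right),

where the first term rewards reconstruction of x and the KL term keeps the approximate posterior close to the prior.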
Foundations and Trends in Machine Learning, no. 1-2 (2019): 1-286
Multi-armed bandits are a simple but very powerful framework for algorithms that make decisions over time under uncertainty. An enormous body of work has accumulated over the years, covered in several books and surveys. This book provides a more introductory, textbook-like treatment ...
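As a small illustration of the explore/exploit trade-off this framework formalizes, here is an epsilon-greedy learner on Bernoulli arms; the arm probabilities, epsilon, and horizon are invented for the example and this is not an algorithm from the book itself.

import numpy as np

rng = np.random.default_rng(1)
true_means = np.array([0.2, 0.5, 0.7])    # unknown to the learner
eps, horizon = 0.1, 5000

counts = np.zeros(3)
values = np.zeros(3)                      # running estimate of each arm's mean reward

for t in range(horizon):
    if rng.random() < eps:                # explore a random arm
        arm = int(rng.integers(3))
    else:                                 # exploit the current best estimate
        arm = int(np.argmax(values))
    reward = float(rng.random() < true_means[arm])
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]   # incremental mean update

print(values)          # approaches [0.2, 0.5, 0.7] for sufficiently explored arms
print(counts)          # most pulls should go to the best arm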
Foundations and Trends in Machine Learning, no. 2 (2019): 97-218
These lecture notes provide a comprehensive, self-contained introduction to the analysis of Wishart matrix moments. This study may act as an introduction to some particular aspects of random matrix theory, or as a self-contained exposition of Wishart matrix moments. Random matrix...
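As a hedged numerical companion to the topic (not an excerpt from the lecture notes), the sketch below samples Wishart matrices W = X^T X with Gaussian rows and compares the empirical mean of tr(W) with the first-moment identity E[W] = n Sigma, so E[tr W] = n tr(Sigma); the dimension and Sigma are arbitrary choices.

import numpy as np

rng = np.random.default_rng(2)
p, n, reps = 3, 10, 20000
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
L = np.linalg.cholesky(Sigma)

traces = []
for _ in range(reps):
    X = rng.normal(size=(n, p)) @ L.T      # n i.i.d. rows from N(0, Sigma)
    W = X.T @ X                            # a Wishart_p(Sigma, n) sample
    traces.append(np.trace(W))

print(np.mean(traces))                     # empirical first moment of tr(W)
print(n * np.trace(Sigma))                 # theoretical value: n * tr(Sigma) = 45.0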
Foundations and Trends in Machine Learning, no. 6 (2019): 803-886
Markov switching models (MSMs) are probabilistic models that employ multiple sets of parameters to describe different dynamic regimes that a time series may exhibit at different periods of time. The switching mechanism between regimes is controlled by unobserved random variables that f...
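A minimal simulation of such a model, offered only as a sketch of the structure described above: an unobserved two-state Markov chain selects the regime, and each regime has its own mean and noise level. All parameter values are made up for the example.

import numpy as np

rng = np.random.default_rng(3)
P = np.array([[0.95, 0.05],               # transition matrix of the hidden regime chain
              [0.10, 0.90]])
means, stds = np.array([0.0, 2.0]), np.array([0.5, 1.5])

T = 500
states = np.zeros(T, dtype=int)
y = np.zeros(T)
y[0] = means[0] + stds[0] * rng.normal()
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])          # switch according to the chain
    y[t] = means[states[t]] + stds[states[t]] * rng.normal()

print(np.bincount(states) / T)    # empirical regime occupancy
print(y[:10])                     # the observed series mixes the two dynamic regimes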
Foundations and Trends in Machine Learning, no. 3 (2019): 187-306
A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations. This is the fundamental problem of Bayesian statistics and machine learning, which frames all inference as expectations with respect to the posterior distribu...
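A tiny illustration of "inference as expectation", not drawn from the monograph: with a conjugate Beta posterior for a coin's bias, any posterior quantity is an expectation that can be approximated by averaging over posterior draws. The data counts and the uniform prior are invented for the example.

import numpy as np

rng = np.random.default_rng(4)

# Beta(1, 1) prior on a coin's bias theta; 7 heads out of 10 tosses observed.
heads, tails = 7, 3
posterior_draws = rng.beta(1 + heads, 1 + tails, size=100_000)

# Inference targets are expectations under the posterior,
# approximated here by simple Monte Carlo averages.
post_mean = posterior_draws.mean()                    # E[theta | data]
prob_fair_or_less = (posterior_draws <= 0.5).mean()   # P(theta <= 0.5 | data)

print(post_mean)            # close to the exact posterior mean (1 + 7) / (2 + 10) = 0.667
print(prob_fair_or_less)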
Foundations and Trends in Machine Learning, no. 3-4 (2018): 219-354
Deep reinforcement learning is the combination of reinforcement learning (RL) and deep learning. This field of research has been able to solve a wide range of complex decision-making tasks that were previously out of reach for a machine. Thus, deep RL opens up many new applicatio...
Foundations and Trends in Machine Learning, no. 1 (2018): 1-96
Thompson sampling is an algorithm for online decision problems where actions are taken sequentially in a manner that must balance between exploiting what is known to maximize immediate performance and investing to accumulate new information that may improve future performance. Th...
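A minimal Beta-Bernoulli instance of this idea, given as a sketch rather than the tutorial's own code: each round, sample a plausible mean reward for every action from its posterior and play the argmax, which automatically balances exploration and exploitation. The arm probabilities and horizon are arbitrary.

import numpy as np

rng = np.random.default_rng(5)
true_means = np.array([0.3, 0.55, 0.6])   # unknown Bernoulli reward probabilities
alpha = np.ones(3)                        # Beta(1, 1) posterior parameters per arm
beta = np.ones(3)

for t in range(5000):
    theta = rng.beta(alpha, beta)         # one posterior sample per arm
    arm = int(np.argmax(theta))           # act greedily w.r.t. the sampled beliefs
    reward = float(rng.random() < true_means[arm])
    alpha[arm] += reward                  # conjugate posterior update
    beta[arm] += 1.0 - reward

print(alpha / (alpha + beta))             # posterior mean estimate per arm
print(alpha + beta - 2.0)                 # number of pulls per arm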
Foundations and Trends in Machine Learning, no. 5-6 (2018): 337-588
Many modern methods for prediction leverage nearest neighbor search to find past training examples most similar to a test example, an idea that dates back in text to at least the 11th century and has stood the test of time. This monograph aims to explain the success of these meth...
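The prediction rule in question can be stated in a few lines; the sketch below is a bare-bones k-nearest-neighbor classifier in NumPy with toy data and an arbitrary k, not an excerpt from the monograph.

import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    """Classify x_query by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = np.bincount(y_train[nearest])
    return int(np.argmax(votes))

rng = np.random.default_rng(6)
X_train = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y_train = np.array([0] * 20 + [1] * 20)

print(knn_predict(X_train, y_train, np.array([0.2, -0.1])))   # expected: class 0
print(knn_predict(X_train, y_train, np.array([4.1, 3.8])))    # expected: class 1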
Foundations and Trends in Machine Learning, no. 1-2 (2017)
A Hilbert space embedding of distributions---in short, kernel mean embedding---has recently emerged as a powerful machinery for probabilistic modeling, statistical inference, machine learning, and causal discovery. The basic idea behind this framework is to map distributions in...
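As a hedged sketch of the basic construction (not from the survey itself): the empirical kernel mean embedding of a sample is the average of kernel features, and the distance between two embeddings is the maximum mean discrepancy (MMD). The RBF kernel, bandwidth, and toy samples below are arbitrary.

import numpy as np

def rbf_kernel(A, B, bandwidth=1.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * bandwidth ** 2))

def mmd_squared(X, Y):
    """Squared RKHS distance between the empirical kernel mean embeddings of X and Y."""
    return (rbf_kernel(X, X).mean()
            + rbf_kernel(Y, Y).mean()
            - 2.0 * rbf_kernel(X, Y).mean())

rng = np.random.default_rng(7)
X = rng.normal(0.0, 1.0, size=(200, 1))
X2 = rng.normal(0.0, 1.0, size=(200, 1))
Y = rng.normal(1.5, 1.0, size=(200, 1))

print(mmd_squared(X, X2))   # near zero: two samples from the same distribution
print(mmd_squared(X, Y))    # clearly positive: the mean shift separates the embeddings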
Foundations and Trends in Machine Learning, no. 4-5 (2017)
Modern applications in engineering and data science are increasingly based on multidimensional data of exceedingly high volume, variety, and structural richness. However, standard machine learning algorithms typically scale exponentially with data volume and complexity of cross-m...
Foundations and Trends in Machine Learning, no. 6 (2017): 431-
Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1. It focuses on tensor network models for super-compressed higher-order representation of data/parameters and related cost functions, while providing an outline of their...
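To give a flavor of the kind of super-compressed representation meant here, the sketch below (a toy example under my own assumptions, not the monograph's code) decomposes a third-order tensor into three small tensor-train cores using two truncated SVDs and checks the reconstruction; the tensor, shape, and ranks are invented.

import numpy as np

rng = np.random.default_rng(8)
d1, d2, d3, r = 6, 7, 8, 3

# Build a tensor that genuinely has low tensor-train rank, then recover cores from it.
G1 = rng.normal(size=(d1, r))
G2 = rng.normal(size=(r, d2, r))
G3 = rng.normal(size=(r, d3))
T = np.einsum('ia,ajb,bk->ijk', G1, G2, G3)

# TT-SVD: unfold, truncate, reshape, repeat.
U, S, Vt = np.linalg.svd(T.reshape(d1, d2 * d3), full_matrices=False)
core1 = U[:, :r]                                        # shape (d1, r)
rest = (np.diag(S[:r]) @ Vt[:r]).reshape(r * d2, d3)
U2, S2, Vt2 = np.linalg.svd(rest, full_matrices=False)
core2 = U2[:, :r].reshape(r, d2, r)                     # shape (r, d2, r)
core3 = np.diag(S2[:r]) @ Vt2[:r]                       # shape (r, d3)

T_hat = np.einsum('ia,ajb,bk->ijk', core1, core2, core3)
print(np.max(np.abs(T - T_hat)))   # essentially zero, since the exact ranks were used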
Foundations and Trends in Machine Learning, no. 3-4 (2017): 142-336
A vast majority of machine learning algorithms train their models and perform inference by solving optimization problems. In order to capture the learning and prediction problems accurately, structural constraints such as sparsity or low rank are frequently imposed or else the ob...
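One concrete instance of optimizing under such a structural constraint is projected gradient descent with a sparsity constraint (iterative hard thresholding) for least squares. The sketch below is my own toy version, with invented problem sizes and an assumed step size, not code from the monograph.

import numpy as np

rng = np.random.default_rng(9)
n, d, s = 400, 100, 5

A = rng.normal(size=(n, d)) / np.sqrt(n)           # roughly unit-norm columns
support = rng.choice(d, s, replace=False)
x_true = np.zeros(d)
x_true[support] = rng.choice([-1.0, 1.0], s) * (1.0 + rng.random(s))
b = A @ x_true

def hard_threshold(x, s):
    """Project onto the set of s-sparse vectors: keep the s largest-magnitude entries."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    out[keep] = x[keep]
    return out

x = np.zeros(d)
step = 1.0                                         # assumed step size for this toy problem
for _ in range(200):
    grad = A.T @ (A @ x - b)                       # gradient of 0.5 * ||Ax - b||^2
    x = hard_threshold(x - step * grad, s)         # gradient step, then projection

print(np.linalg.norm(x - x_true))                  # near zero when recovery succeeds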
Foundations and Trends in Machine Learning, no. 1 (2016): 1-118
Principal components analysis (PCA) is a well-known technique for approximating a tabular data set by a low rank matrix. Here, we extend the idea of PCA to handle arbitrary data sets consisting of numerical, Boolean, categorical, ordinal, and other data types. This framework enco...
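The PCA-as-low-rank-approximation view that this framework starts from can be written in a few lines via a truncated SVD of a centered data matrix; the toy data and rank below are my own choices, and the generalized, mixed-data-type extension is what the monograph itself develops.

import numpy as np

rng = np.random.default_rng(10)

# Toy tabular data that is approximately rank 2 plus noise.
U = rng.normal(size=(100, 2))
V = rng.normal(size=(2, 8))
A = U @ V + 0.05 * rng.normal(size=(100, 8))

A_centered = A - A.mean(axis=0)
Uc, S, Vt = np.linalg.svd(A_centered, full_matrices=False)

k = 2
A_hat = Uc[:, :k] @ np.diag(S[:k]) @ Vt[:k]      # best rank-k approximation (Eckart-Young)

rel_err = np.linalg.norm(A_centered - A_hat) / np.linalg.norm(A_centered)
print(S[:4])       # the first two singular values dominate
print(rel_err)     # small: most of the variation is captured by two components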
Foundations and Trends® in Machine Learning, no. 5-6 (2016): 359-483
Bayesian methods for machine learning have been widely investigated, yielding principled methods for incorporating prior information into inference algorithms. In this survey, we provide an in-depth review of the role of Bayesian methods for the reinforcement learning (RL) paradigm. Th...
Foundations and Trends in Machine Learning, no. 2-3 (2016): 119-247
Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been consid...