Approximation results regarding the multiple-output Gaussian gated mixture of linear experts model.
arXiv: Methodology (2019)
Abstract
Mixture of experts (MoE) models are a class of artificial neural networks that can be used for functional approximation and probabilistic modeling. An important subclass of MoE models is the class of mixture of linear experts (MoLE) models, where the expert functions map to real topological output spaces. Recently, Gaussian gated MoLE models have become popular in applied research. A number of powerful approximation results are available for Gaussian gated MoLE models when the output space is univariate. These results guarantee that Gaussian gated MoLE mean functions can approximate arbitrary continuous functions, and that Gaussian gated MoLE models themselves can approximate arbitrary conditional probability density functions. We utilize and extend the univariate approximation results in order to prove a pair of useful results for situations where the output spaces are multivariate. We do this by proving a pair of lemmas regarding the combination of univariate MoLE models, which are interesting in their own right.
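To make the model class concrete, the following is a minimal sketch of a univariate Gaussian gated MoLE mean function. All names and parameter values here are illustrative assumptions, not taken from the paper: each expert is an affine function a_z + b_z * x, and the gates are normalized Gaussian densities weighted by mixing proportions pi_z.

```python
import numpy as np

def gaussian_gates(x, pis, mus, sigmas):
    """Gating probabilities: pi_z * N(x; mu_z, sigma_z^2), normalized over z.

    All arguments beyond x are length-K arrays (K = number of experts);
    these parameter names are hypothetical, chosen for illustration.
    """
    dens = pis * np.exp(-0.5 * ((x - mus) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    return dens / dens.sum()

def mole_mean(x, pis, mus, sigmas, a, b):
    """Mean function of a Gaussian gated MoLE: gate-weighted sum of affine experts."""
    g = gaussian_gates(x, pis, mus, sigmas)
    return float(np.sum(g * (a + b * x)))
```

Because the gates sum to one at every input, the mean function is a pointwise convex combination of the affine experts; the approximation results described above concern how well such combinations can track arbitrary continuous target functions as the number of experts grows.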
Keywords
Artificial neural network, Conditional model, Gaussian distribution, Mean function, Multiple-output, Multivariate analysis