Towards an Algebraic Framework For Approximating Functions Using Neural Network Polynomials
CoRR (2024)
Abstract
We make the case for neural network objects and extend an already existing
neural network calculus, explained in detail in Chapter 2. Our aim is to show
that it does indeed make sense to talk about neural network polynomials,
neural network exponentials, sines, and cosines, in the sense that they
approximate their real-valued counterparts subject to limitations on certain
of their parameters, q and ε. Along the way, we show that the parameter count
and depth grow only polynomially in the desired accuracy (defined as a 1-norm
difference over ℝ), thereby showing that this approach to approximation, in
which a neural network in some sense shares the structural properties of the
function it is approximating, is not entirely intractable.
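The polynomial-growth claim can be illustrated with a toy sketch. This is not the paper's construction: it approximates x² on [0, 1] by a piecewise-linear interpolant written as a sum of ReLU kinks (so it is exactly realizable by a one-hidden-layer ReLU network with O(n) parameters) and checks that the 1-norm error shrinks as the width n grows. The function and variable names here are illustrative, not from the paper.

```python
# Illustrative sketch (assumption: not the paper's construction).
# Approximate x^2 on [0, 1] with a piecewise-linear interpolant expressed
# as a sum of ReLU terms, and measure the 1-norm approximation error.

def relu(x):
    return max(x, 0.0)

def pwl_square(x, n):
    """Piecewise-linear interpolation of x^2 on [0, 1] with n segments.

    Written as a sum of shifted ReLU terms, so it is realizable by a
    one-hidden-layer ReLU network with O(n) parameters.
    """
    h = 1.0 / n
    total = 0.0
    prev_slope = 0.0
    for i in range(n):
        # Slope of the interpolant on segment [i*h, (i+1)*h]:
        # (((i+1)*h)^2 - (i*h)^2) / h = (2*i + 1) * h
        slope = (2 * i + 1) * h
        total += (slope - prev_slope) * relu(x - i * h)
        prev_slope = slope
    return total

def l1_error(n, samples=1000):
    """Approximate the 1-norm error over [0, 1] by midpoint sampling."""
    return sum(
        abs(pwl_square((j + 0.5) / samples, n) - ((j + 0.5) / samples) ** 2)
        for j in range(samples)
    ) / samples

# Interpolation error for x^2 decays like O(1/n^2), so reaching accuracy
# eps needs only on the order of eps^(-1/2) neurons -- polynomial, not
# exponential, growth in the accuracy parameter.
print(l1_error(4) > l1_error(8) > l1_error(16))  # → True
```

The quadratic decay here mirrors the general point of the abstract: structural (network-shaped) approximants of simple functions can reach a target 1-norm accuracy with only polynomially many parameters.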