SymbolNet: Neural Symbolic Regression with Adaptive Dynamic Pruning
CoRR (2024)
Abstract
Unlike the genetic programming approach, the neural network approach to
symbolic regression can scale well with high input dimension and leverage
gradient methods for faster equation search. Common ways of constraining
expression complexity have relied on multistage pruning methods with
fine-tuning, but these often lead to significant performance loss. In this
work, we propose SymbolNet, a neural network approach to symbolic regression in
a novel framework that enables dynamic pruning of model weights, input
features, and mathematical operators in a single training run, where both
training loss and expression complexity are optimized simultaneously. We
introduce a sparsity regularization term per pruning type, which can adaptively
adjust its own strength and drive convergence to a target sparsity level. In
contrast to most existing symbolic regression methods that cannot efficiently
handle datasets with more than O(10) inputs, we demonstrate the effectiveness
of our model on the LHC jet tagging task (16 inputs), MNIST (784 inputs), and
SVHN (3072 inputs).
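The adaptive sparsity regularization described above can be illustrated with a small sketch: each pruning type gets a penalty strength that is strengthened while the model is too dense and relaxed once the target sparsity is reached. The update rule and the toy "achieved sparsity" response function below are illustrative assumptions standing in for a real training loop, not the paper's actual formulation.

```python
# Illustrative sketch (not the paper's code): an adaptive regularization
# strength `lam` that converges so the achieved sparsity meets a target.

def update_strength(lam: float, achieved: float, target: float,
                    rate: float = 0.5) -> float:
    """Strengthen the penalty while too dense; relax it once past the target."""
    error = target - achieved           # positive -> not sparse enough yet
    return max(1e-8, lam * (1.0 + rate * error))

def achieved_sparsity(lam: float) -> float:
    """Toy stand-in for training: a stronger penalty prunes more weights."""
    return lam / (lam + 1.0)

lam = 0.1                               # initial regularization strength
target = 0.8                            # aim to prune 80% of the weights
for _ in range(300):                    # stands in for training epochs
    lam = update_strength(lam, achieved_sparsity(lam), target)

print(f"lambda = {lam:.3f}, sparsity = {achieved_sparsity(lam):.3f}")
```

In this toy setting the feedback loop has a stable fixed point where the achieved sparsity equals the target, mirroring the convergence behavior the abstract claims for each pruning type (weights, input features, and operators).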