AutoML in NLP

Neural architecture search (NAS) has become a hot topic in machine learning. Commercial services (such as Google's AutoML) and open-source libraries (such as Auto-Keras) have made NAS available to a broader machine-learning audience. AutoML also finds wide application in natural language processing, and researchers are particularly interested in combining multi-task learning with AutoML for NLP.
Daoyuan Chen, Yaliang Li, Minghui Qiu, Zhen Wang, Bofang Li, Bolin Ding, Hongbo Deng, Jun Huang, Wei Lin, Jingren Zhou
IJCAI 2020, pp. 2463-2469
Extensive experiments demonstrate that AdaBERT achieves comparable performance while significantly improving efficiency, with a 12.7x to 29.3x speedup in inference time and an 11.5x to 17.0x compression ratio in parameter size
Cited by 17 · 481 views
CVPR 2020, pp. 10294-10303
We propose a neural architecture search algorithm to optimize the configuration of Lightweight Non-Local blocks
Cited by 12 · 181 views
Our experimental results show that the proposed DC-NAS can significantly improve the performance of the searched architecture with only a slight increase in evaluation cost compared to traditional NAS methods
Cited by 1 · 45 views
Yao Shu, Wei Wang, Shaofeng Cai
ICLR 2020
Given that popular neural architecture search cells share a common connection pattern, we explore the impact of this pattern from the optimization perspective to answer the question: why are wide and shallow cells selected during the architecture search? ...
Cited by 0 · 220 views
Yuqiao Liu, Yanan Sun, Bing Xue, Mengjie Zhang, Gary Yen
Most ENAS methods define the fitness function as a criterion based on the task at hand; e.g., classification accuracy or error rate suits image classification tasks and represents the performance of the corresponding architecture well
Cited by 0 · 21 views
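The entry above describes the usual evolutionary-NAS choice of fitness function: a task metric such as validation accuracy. Below is a minimal, self-contained sketch of that idea; the list-of-widths encoding and the proxy_val_accuracy stub are illustrative assumptions, not the survey's notation.

```python
import random

# Minimal sketch of fitness evaluation in evolutionary NAS: each candidate is
# encoded here as a list of layer widths, and its fitness is the validation
# accuracy a cheap proxy evaluation assigns to it. proxy_val_accuracy is a
# deterministic stub so the loop runs end to end; a real system would train
# each candidate network briefly and measure accuracy on held-out data.

def proxy_val_accuracy(arch):
    """Stand-in for 'train briefly, then measure validation accuracy'."""
    random.seed(sum(arch) + len(arch))      # deterministic stub per architecture
    return min(0.99, 0.5 + 0.01 * len(arch) + random.uniform(0.0, 0.2))

def fitness(arch):
    # Most ENAS methods use exactly this kind of task metric as fitness.
    return proxy_val_accuracy(arch)

population = [[random.choice([16, 32, 64]) for _ in range(random.randint(2, 6))]
              for _ in range(10)]
best = max(population, key=fitness)
print("best architecture:", best, "fitness:", round(fitness(best), 3))
```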
Shoukang Hu, Xurong Xie, Shansong Liu, Mengzhe Geng, Xunying Liu, Helen Meng
Experimental results suggest that neural architecture search techniques can be used for the automatic configuration of DNN-based speech recognition systems, allowing their wider application to different tasks
Cited by 0 · 36 views
Yujing Wang, Yaming Yang, Yiren Chen, Jing Bai, Ce Zhang, Guinan Su, Xiaoyu Kou, Yunhai Tong, Mao Yang, Lidong Zhou
AAAI 2020
We propose a novel architecture search space specialized for text representation by leveraging multi-path ensemble and a mixture of convolutional, recurrent, pooling, and self-attention layers
Cited by 0 · 19 views
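Since the entry above describes a multi-path search space mixing convolutional, recurrent, pooling, and self-attention layers, here is a hedged PyTorch sketch of what such a candidate-op set might look like; the hidden size, op choices, and sum-combination of paths are assumptions, not the paper's actual space.

```python
import random

import torch
import torch.nn as nn

# Illustrative multi-path cell for text: each sampled path applies one
# candidate op to the input sequence, and path outputs are summed.

DIM = 32

CANDIDATES = {
    "conv3": lambda: nn.Conv1d(DIM, DIM, kernel_size=3, padding=1),
    "gru": lambda: nn.GRU(DIM, DIM, batch_first=True),
    "maxpool": lambda: nn.MaxPool1d(kernel_size=3, stride=1, padding=1),
    "self_attn": lambda: nn.MultiheadAttention(DIM, num_heads=4, batch_first=True),
}

def apply_op(name, op, x):                 # x: (batch, seq_len, DIM)
    if name in ("conv3", "maxpool"):       # 1-D ops expect (batch, DIM, seq_len)
        return op(x.transpose(1, 2)).transpose(1, 2)
    if name == "gru":
        return op(x)[0]                    # keep the output sequence, drop h_n
    return op(x, x, x)[0]                  # self-attention: query = key = value

x = torch.randn(2, 10, DIM)                # a batch of 2 sequences of length 10
path_names = random.sample(list(CANDIDATES), k=2)   # sample a two-path cell
out = sum(apply_op(name, CANDIDATES[name](), x) for name in path_names)
print(path_names, out.shape)               # summed paths keep shape (2, 10, DIM)
```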
Journal of Machine Learning Research, no. 55 (2019)
Neural Architecture Search can be seen as a subfield of AutoML and has significant overlap with hyperparameter optimization and meta-learning
Cited by 764 · 107 views
ICLR 2019
We introduced ProxylessNAS, which can directly learn neural network architectures on the target task and target hardware without any proxy
Cited by 676 · 105 views
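Hardware-aware methods in the spirit of ProxylessNAS are commonly summarized as adding an expected-latency term, computed from per-op latency estimates weighted by architecture probabilities, to the task loss. A minimal sketch of that idea follows; the latency table and penalty weight lam are made-up values, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

# The expected latency of one mixed operation is the softmax-weighted sum of
# per-op latency estimates, added to the task loss as a differentiable penalty.

op_latency_ms = torch.tensor([1.2, 2.7, 0.4])  # assumed per-op latencies (ms)
alpha = torch.zeros(3, requires_grad=True)     # architecture params for one edge

def expected_latency(alpha):
    probs = torch.softmax(alpha, dim=0)        # probability of choosing each op
    return (probs * op_latency_ms).sum()       # differentiable expected latency

logits = torch.randn(8, 10)                    # stand-in for network predictions
labels = torch.randint(0, 10, (8,))
task_loss = F.cross_entropy(logits, labels)

lam = 0.1                                      # latency penalty weight (assumed)
loss = task_loss + lam * expected_latency(alpha)
loss.backward()                                # gradients reach the arch params
print("expected latency (ms):", expected_latency(alpha).item())
```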
CVPR 2019, pp. 82-92
We present one of the first attempts to extend Neural Architecture Search beyond image classification to dense image prediction problems
Cited by 402 · 372 views
UAI 2019, p. 129
An analogous baseline could be useful for neural architecture search, where the impact of a novel NAS method can be quantified in terms of a multiplicative speedup relative to a standard hyperparameter optimization method such as random search with early-stopping
Cited by 125 · 38 views
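The baseline mentioned above, random search with early-stopping, is easy to sketch: evaluate many random architectures cheaply, then spend the full budget only on the best few. The train_for stub below fakes a saturating learning curve; every name and budget here is illustrative, not the paper's implementation.

```python
import random

# Two-stage random search: a small training budget for every sample, the full
# budget only for the top performers from the first stage.

def train_for(arch, epochs):
    """Stand-in for partial training; returns validation accuracy."""
    random.seed(sum(map(ord, "".join(arch))))       # deterministic per arch
    ceiling = random.uniform(0.6, 0.95)             # hidden quality of this arch
    return ceiling * (1 - 0.5 ** (epochs / 10))     # accuracy grows with budget

def random_search_early_stop(n_samples=20, low_budget=5, high_budget=50, top_k=3):
    archs = [tuple(random.choice("ABCD") for _ in range(4))
             for _ in range(n_samples)]
    # Stage 1: cheap, early-stopped evaluation of every sample.
    ranked = sorted(archs, key=lambda a: train_for(a, low_budget), reverse=True)
    # Stage 2: full training budget only for the survivors.
    return max(ranked[:top_k], key=lambda a: train_for(a, high_budget))

print("selected architecture:", random_search_early_stop())
```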
ICLR 2019
We propose the Graph HyperNetwork, a composition of graph neural networks and hypernetworks that generates the weights of any architecture by operating directly on their computation graph representation
Cited by 104 · 128 views
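The graph hypernetwork idea in the entry above, a graph neural network that consumes a computation graph and emits weights for each node's operation, can be sketched compactly. The embedding size, two propagation steps, and single linear decoder below are arbitrary illustrative choices, not the paper's architecture.

```python
import torch
import torch.nn as nn

# Message passing over the node embeddings of a computation graph, followed by
# a decoder mapping each node's embedding to flat weights for its operation.

class TinyGraphHyperNet(nn.Module):
    def __init__(self, n_ops, emb=16, weight_numel=3 * 3 * 8 * 8):
        super().__init__()
        self.op_embed = nn.Embedding(n_ops, emb)    # one embedding per op type
        self.msg = nn.Linear(emb, emb)              # message transformation
        self.decode = nn.Linear(emb, weight_numel)  # embedding -> flat weights

    def forward(self, op_ids, adj):
        h = self.op_embed(op_ids)                   # (n_nodes, emb)
        for _ in range(2):                          # two propagation steps
            h = torch.relu(self.msg(adj @ h) + h)   # aggregate neighbor features
        return self.decode(h)                       # (n_nodes, weight_numel)

op_ids = torch.tensor([0, 1, 2, 1])                 # op type of each graph node
adj = torch.tensor([[0., 1., 0., 0.],               # directed computation graph
                    [0., 0., 1., 1.],
                    [0., 0., 0., 1.],
                    [0., 0., 0., 0.]])
weights = TinyGraphHyperNet(n_ops=3)(op_ids, adj)
print(weights.shape)                                # one weight vector per node
```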
Christian Sciuto, Kaicheng Yu, Martin Jaggi, Claudiu Musat, Mathieu Salzmann
arXiv cs.LG, 2019
The search policies of state-of-the-art neural architecture search techniques are no better than random, and we trace the reason for this to the use of a constrained search space and weight sharing, which shuffles the architecture ranking during the search, negatively impacting...
Cited by 99 · 114 views
CVPR 2019, pp. 9126-9135
We demonstrate how to efficiently search for these architectures within limited time and computational budgets
Cited by 62 · 41 views
Martin Wistuba, Ambrish Rawat, Tejaswini Pedapati
arXiv cs.LG, 2019
We reviewed various optimization algorithms based on methods such as reinforcement learning, evolutionary algorithms, surrogate model-based optimization, and one-shot models
Cited by 43 · 10 views
Journal of Machine Learning Research, no. 243 (2019), pp. 1-18
We proposed 14 best practices for scientific research on neural architecture search methods
Cited by 30 · 40 views
We present automated graph neural networks to find the optimal neural architecture given a node classification task
Cited by 22 · 37 views
Linnan Wang, Saining Xie, Teng Li, Rodrigo Fonseca, Yuandong Tian
CoRR, 2019
We propose Latent Action Neural Architecture Search (LaNAS), which learns the action space to maximize search efficiency for a given performance metric
Cited by 2 · 91 views
George Adam, Jonathan Lorraine
arXiv cs.LG, 2019
The method we presented to regularize the Efficient Neural Architecture Search controller and to condition on past actions can be further improved via multitask training rather than experience replay, though we leave this for future work
Cited by 0 · 7 views
Jixiang Li, Chuming Liang, Bo Zhang, Zhao Wang, Fei Xiang, Xiangxiang Chu
INTERSPEECH 2019, pp. 1171-1175
We present a novel and efficient network for Acoustic Scene Classification tasks where its feature extractor is inspired by MobileNetV2
Cited by 0 · 396 views