AutoML in CV

In May 2017, researchers at Google Brain announced the creation of AutoML, an AI capable of generating other AIs. They have since set AutoML its biggest challenge yet: letting an AI "parent" create a "child" AI that outperforms its human-designed counterparts. The researchers used a method called reinforcement learning to automate the design of machine learning models: AutoML acts as a controller neural network that develops a child network for a specific task. For this particular child AI, which the researchers call NASNet, the task was recognizing objects in video in real time: people, cars, traffic lights, handbags, backpacks, and so on. AutoML evaluates NASNet's performance and uses that feedback to improve its child AI, repeating the process thousands of times (a minimal sketch of this loop follows below). When tested on the ImageNet image classification and COCO object detection datasets, which the Google researchers call "two of the most respected large-scale academic datasets in computer vision," NASNet outperformed all other computer vision systems.
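As a rough illustration of the controller-child loop described above, here is a minimal Python sketch. The operation names, the random stand-in reward, and the naive policy update are illustrative assumptions, not Google's actual implementation, which trains each child network and applies a REINFORCE-style gradient:

```python
import random

# Hypothetical search space: an architecture is one op choice per layer.
OPS = ["conv3x3", "conv5x5", "sep_conv3x3", "maxpool", "identity"]
NUM_LAYERS = 6

def sample_architecture(policy):
    """Controller step: sample a child architecture from per-layer op weights."""
    return [random.choices(OPS, weights=policy[i])[0] for i in range(NUM_LAYERS)]

def evaluate(arch):
    """Stand-in for training the child and measuring validation accuracy;
    a real system would train on ImageNet/COCO here."""
    return random.random()

# Controller "policy": one weight per op per layer, initially uniform.
policy = [[1.0] * len(OPS) for _ in range(NUM_LAYERS)]

best_arch, best_reward = None, -1.0
for step in range(1000):  # the article's loop, "repeated thousands of times"
    arch = sample_architecture(policy)
    reward = evaluate(arch)
    # Crude update: make ops used in well-scoring children more likely.
    for layer, op in enumerate(arch):
        policy[layer][OPS.index(op)] += 0.1 * reward
    if reward > best_reward:
        best_arch, best_reward = arch, reward

print(best_arch, best_reward)
```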
CVPR, pp.12962-12971, (2020)
These contributions target the main bottleneck of differentiable neural architecture search, the high memory cost that constrains the size of the search space, and yield state-of-the-art performance
Cited by 38 · Views 174
ECCV, pp.702-717, (2020)
We presented a novel paradigm for neural architecture search: training a single-stage model from which high-quality child models of different sizes can be induced for instant deployment, without retraining or finetuning (a weight-slicing sketch follows below)
Cited by 26 · Views 157
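As a toy illustration of how one set of trained weights can serve child models of different sizes, here is a sketch of a width-sliceable convolution in PyTorch. The `SlimmableConv` class and the take-the-leading-filters rule are assumptions for illustration, not the paper's actual training or slicing procedure:

```python
import torch
import torch.nn as nn

class SlimmableConv(nn.Module):
    """Conv layer whose output width can be sliced at inference time;
    child models reuse the leading filters of the full weight tensor."""
    def __init__(self, in_ch, max_out_ch):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(max_out_ch, in_ch, 3, 3) * 0.01)

    def forward(self, x, out_ch):
        # Keep the first `out_ch` filters: the child needs no retraining.
        return nn.functional.conv2d(x, self.weight[:out_ch], padding=1)

layer = SlimmableConv(3, max_out_ch=64)
x = torch.randn(1, 3, 32, 32)
big = layer(x, 64)    # full single-stage model
small = layer(x, 16)  # induced child shares the same trained weights
print(big.shape, small.shape)
```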
CVPR, pp.11393-11401, (2020)
We systematically investigate some widely adopted reduction factors and report our observations
Cited by 13 · Views 105
CVPR, pp.10294-10303, (2020)
We propose a neural architecture search algorithm to optimize the configuration of Lightweight Non-Local blocks (a generic sketch of such a block follows below)
Cited by 12 · Views 181
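For context, here is a generic sketch of a channel-reduced ("lightweight") non-local block in PyTorch. The exact block design and the configurations searched in the paper differ, so treat the reduction factor and layout here as assumptions:

```python
import torch
import torch.nn as nn

class LightweightNonLocal(nn.Module):
    """Pared-down non-local block: channel-reduced embeddings (theta, phi, g)
    and pairwise-attention aggregation over all spatial positions."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        inner = channels // reduction  # channel reduction keeps the block cheap
        self.theta = nn.Conv2d(channels, inner, 1)
        self.phi = nn.Conv2d(channels, inner, 1)
        self.g = nn.Conv2d(channels, inner, 1)
        self.out = nn.Conv2d(inner, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)  # (b, hw, inner)
        k = self.phi(x).flatten(2)                    # (b, inner, hw)
        v = self.g(x).flatten(2).transpose(1, 2)      # (b, hw, inner)
        attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), dim=-1)
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)  # residual connection

block = LightweightNonLocal(32)
print(block(torch.randn(1, 32, 14, 14)).shape)
```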
ECCV, pp.798-813, (2020)
The Unsupervised Neural Architecture Search variants with rotation prediction [12], colorization [34], and jigsaw-puzzle [21] objectives all perform very well, closely approaching the results of the supervised counterpart (a rotation pretext sketch follows below)
Cited by 12 · Views 372
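Here is a minimal PyTorch sketch of the rotation-prediction pretext objective that such unsupervised search can use to score architectures; the toy network and data are placeholders:

```python
import torch
import torch.nn as nn

def rotation_batch(images):
    """Build the rotation pretext task: rotate each image by 0/90/180/270
    degrees and label it with the rotation index (0-3)."""
    rotated, labels = [], []
    for k in range(4):
        rotated.append(torch.rot90(images, k, dims=(2, 3)))
        labels.append(torch.full((images.shape[0],), k, dtype=torch.long))
    return torch.cat(rotated), torch.cat(labels)

# Toy candidate network, scored by how well it learns the pretext task.
net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 4))
x, y = rotation_batch(torch.randn(16, 3, 32, 32))
loss = nn.functional.cross_entropy(net(x), y)
loss.backward()  # architectures are then ranked by pretext performance
print(loss.item())
```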
CVPR, pp.11990-11999, (2020)
MiLeNAS can alleviate the gradient error caused by approximation in bilevel optimization while benefiting from the first-order efficiency of single-level methods (a toy mixed-level update follows below)
Cited by 5 · Views 139
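Here is a toy sketch of a mixed-level update in the spirit of MiLeNAS: the architecture parameters take first-order steps on a combination of training and validation losses instead of solving the full bilevel problem. The quadratic losses and the mixing coefficient `lam` are illustrative assumptions:

```python
import torch

# Toy scalar problem: weights w and architecture parameter alpha.
w = torch.tensor(1.0, requires_grad=True)
alpha = torch.tensor(0.5, requires_grad=True)
lam = 1.0  # mixing coefficient between train and val signals

def loss(target):
    # Stand-in for a loss evaluated on one data split.
    return (w * alpha - target) ** 2

opt_w = torch.optim.SGD([w], lr=0.1)
opt_a = torch.optim.SGD([alpha], lr=0.1)

for _ in range(100):
    # Weights: plain single-level step on the training loss.
    opt_w.zero_grad(); loss(1.0).backward(); opt_w.step()
    # Architecture: first-order step on L_train + lam * L_val,
    # avoiding the bilevel approximation error.
    opt_a.zero_grad(); (loss(1.0) + lam * loss(1.2)).backward(); opt_a.step()

print(w.item(), alpha.item())
```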
european conference on computer vision, pp.660-676, (2020)
On NAS-Bench-101, the Neural Predictor and Regularized Evolution are clearly better than random search (a minimal regularized-evolution sketch follows below)
Cited by 3 · Views 54
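For reference, a minimal sketch of regularized (aging) evolution on a toy encoding; the fitness function is a stand-in for a NAS-Bench-101 accuracy lookup, and the population/tournament sizes are illustrative:

```python
import random
from collections import deque

OPS = list(range(5))   # toy op vocabulary
ARCH_LEN = 8           # toy architecture encoding length

def fitness(arch):
    """Stand-in for a NAS-Bench-101 accuracy lookup."""
    return -sum((a - 2) ** 2 for a in arch)

population = deque(maxlen=50)  # aging: appending evicts the oldest model
history = []
for _ in range(50):            # random initial population
    arch = [random.choice(OPS) for _ in range(ARCH_LEN)]
    population.append(arch)
    history.append(arch)

for _ in range(500):
    parent = max(random.sample(list(population), 10), key=fitness)  # tournament
    child = list(parent)
    child[random.randrange(ARCH_LEN)] = random.choice(OPS)  # single mutation
    population.append(child)
    history.append(child)

print(max(history, key=fitness))
```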
Zhihang Li, Teng Xi, Jiankang Deng, Gang Zhang, Shengzhao Wen, Ran He
CVPR, pp.11930-11939, (2020)
We propose Gaussian Process based Neural Architecture Search, a theoretical model for neural architecture search (a toy GP-surrogate sketch follows below)
Cited by 0 · Views 57
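As a loose illustration of Gaussian-process modeling for architecture search, here is a toy GP surrogate over fixed-length architecture encodings with a UCB-style acquisition. The kernel choice, the encodings, and the data are assumptions, not the paper's formulation:

```python
import numpy as np

def rbf_kernel(A, B, length=1.0):
    """RBF kernel over fixed-length architecture encodings."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length ** 2))

# Toy observations: encoded architectures and their measured accuracies.
rng = np.random.default_rng(0)
X = rng.random((20, 8))
y = rng.random(20)

K_inv = np.linalg.inv(rbf_kernel(X, X) + 1e-6 * np.eye(len(X)))

def gp_predict(X_new):
    """GP posterior mean and variance of accuracy for unseen encodings."""
    k = rbf_kernel(X_new, X)
    mean = k @ K_inv @ y
    cov = rbf_kernel(X_new, X_new) - k @ K_inv @ k.T
    return mean, np.maximum(np.diag(cov), 0.0)

candidates = rng.random((100, 8))
mu, var = gp_predict(candidates)
best = candidates[np.argmax(mu + np.sqrt(var))]  # UCB-style acquisition
print(best)
```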
CVPR, pp.12081-12089, (2020)
Discrete Stochastic Neural Architecture Search is orthogonal to the random wiring solution, which focuses on graph topology search
Cited by 0 · Views 99
Xuelian Cheng, Yiran Zhong, Mehrtash T Harandi, Yuchao Dai, Xiaojun Chang, Hongdong Li, Tom Drummond, Zongyuan Ge
NeurIPS, (2020)
We proposed the first end-to-end hierarchical Neural Architecture Search framework for deep stereo matching, incorporating task-specific human knowledge into the search
Cited by 0 · Views 43
CVPR, pp.2820-2828, (2019)
This paper presents an automated neural architecture search approach for designing resource-efficient mobile convolutional neural network models using reinforcement learning (the latency-aware reward is sketched below)
Cited by 912 · Views 469
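This paper's multi-objective reward balances accuracy against measured mobile latency; in its soft-constraint setting it is ACC(m) * (LAT(m)/T)^w with w = -0.07. A direct transcription (the numbers below are illustrative):

```python
def mnas_reward(accuracy, latency_ms, target_ms=80.0, w=-0.07):
    """MnasNet-style soft-constraint reward: accuracy scaled by a
    latency penalty relative to the target latency T."""
    return accuracy * (latency_ms / target_ms) ** w

# A 2x-slower model needs a real accuracy edge to win:
print(mnas_reward(0.75, 80.0))    # on target: reward = 0.750
print(mnas_reward(0.76, 160.0))   # 2x slower: reward ~= 0.724
```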
Journal of Machine Learning Research, no. 55, (2019)
Neural Architecture Search can be seen as a subfield of AutoML and has significant overlap with hyperparameter optimization and meta-learning
Cited by 764 · Views 107
ICLR, (2019)
We introduced ProxylessNAS, which directly learns neural network architectures on the target task and target hardware without any proxy
Cited by 676 · Views 105
CVPR, pp.10734-10742, (2019)
We present a differentiable neural architecture search framework (the mixed-operation relaxation is sketched below)
Cited by 426 · Views 287
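A minimal sketch of the continuous relaxation at the heart of differentiable NAS: each layer is a softmax-weighted mixture of candidate operations, so the architecture parameters receive gradients from the task loss. The candidate op set here is an illustrative assumption:

```python
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    """Continuous relaxation: the layer output is a softmax-weighted sum of
    candidate ops, so the architecture logits `alpha` get task-loss gradients."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

layer = MixedOp(8)
out = layer(torch.randn(2, 8, 16, 16))
out.mean().backward()
print(layer.alpha.grad)  # after search, keep the op with the largest alpha
```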
CVPR, pp.82-92, (2019)
We present one of the first attempts to extend Neural Architecture Search beyond image classification to dense image prediction problems
Cited by 402 · Views 372
ICLR, (2019)
We presented Stochastic Neural Architecture Search, a novel and economical end-to-end neural architecture search framework (a Gumbel-softmax sampling sketch follows below)
Cited by 376 · Views 86
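A minimal sketch of the stochastic relaxation used by SNAS-style methods: operations are sampled with Gumbel-softmax so the (near) one-hot selection stays differentiable in the architecture logits. The toy op outputs are placeholders:

```python
import torch
import torch.nn.functional as F

alpha = torch.zeros(3, requires_grad=True)  # logits over 3 candidate ops

# Sample a one-hot op selection; hard=True uses the straight-through trick
# so gradients still flow back into alpha.
z = F.gumbel_softmax(alpha, tau=1.0, hard=True)

ops_out = torch.tensor([[1.0], [2.0], [3.0]])  # toy outputs of the 3 ops
out = (z.unsqueeze(1) * ops_out).sum(0)        # only the sampled op contributes
out.sum().backward()
print(alpha.grad)  # architecture logits receive gradient despite sampling
```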
Christian Sciuto, Kaicheng Yu, Martin Jaggi, Claudiu Musat, Mathieu Salzmann
arXiv: Learning, (2019)
The search policies of state-of-the-art Neural Architecture Search techniques are no better than random; we trace this to the use of a constrained search space and of weight sharing, which shuffles the architecture ranking during the search, negatively impacting...
Cited by 99 · Views 114
ICML, (2019)
The test error rate of BayesNAS is competitive with state-of-the-art techniques, and BayesNAS finds convolutional cells with fewer parameters than DARTS and SNAS
Cited by 67 · Views 72
CVPR, pp.4787-4796, (2019)
We have proposed a method for neural architecture search that integrates evolutionary algorithms and reinforcement learning into a unified framework
Cited by 53 · Views 99
Martin Wistuba, Ambrish Rawat, Tejaswini Pedapati
arXiv: Learning, (2019)
We reviewed various architecture-search optimization algorithms, including reinforcement learning, evolutionary algorithms, surrogate-model-based optimization, and one-shot models
Cited by 43 · Views 10