Multi-level differentiable architecture search with heuristic entropy
Authors: LI Jianming, CHEN Bin, SUN Xiaofei
Affiliations:

(1. Chengdu Institute of Computer Applications, Chinese Academy of Sciences, Chengdu 610041, China; 2. University of Chinese Academy of Sciences, Beijing 100049, China; 3. International Research Institute of Artificial Intelligence, Harbin Institute of Technology (Shenzhen), Shenzhen 518055, Guangdong, China)

Author biographies:

LI Jianming (b. 1989), male, Ph.D. candidate; CHEN Bin (b. 1970), male, professor and doctoral supervisor

Corresponding authors:

LI Jianming, lugeeljm@gmail.com; CHEN Bin, chenbin2020@hit.edu.cn

CLC number:

TP183

Fund project:

Major Science and Technology Special Project of Guangdong Province on Cloud Computing and Big Data Management Technology (2017B030306017)


Abstract:

Network architecture is a key factor in the performance of convolutional neural networks. Because traditional manual design is inefficient, methods that design network architectures automatically through algorithms have attracted increasing attention. Differentiable architecture search (DARTS) can design architectures automatically and efficiently, but its super network construction and architecture derivation strategy still have shortcomings. This paper proposes an improved algorithm to address them. First, by quantifying how the number of skip operations changes during the search process, it is shown that sharing architecture parameters causes a coupling problem in the DARTS super network. Second, to resolve this coupling, a super network with multi-level cells is designed so that cells at different levels do not influence one another. Third, to narrow the performance "gap" between the super network and the derived architecture, the entropy of the architecture parameters is introduced as a loss term of the objective function to guide the training of the super network. Finally, architecture search experiments were conducted on CIFAR-10, and the resulting architecture was evaluated on both CIFAR-10 and ImageNet. On CIFAR-10, the proposed algorithm decouples cells at different levels and improves the performance of the automatically designed architecture, achieving a classification error rate of only 2.69%. The same architecture reaches a classification error rate of 25.9% on ImageNet, demonstrating good transferability.
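To make the search objective described above concrete, the following is a minimal sketch rather than the paper's exact formulation: it restates the standard DARTS continuous relaxation and bilevel objective, and shows one plausible way an architecture-entropy loss term could be attached to it. The weighting coefficient \lambda, and the assumption that the entropy is minimized so that each softmax is pushed toward a one-hot choice (shrinking the gap between the super network and the derived architecture), are illustrative assumptions rather than details given in the abstract.

    % Standard DARTS mixed operation on edge (i, j): a softmax over the candidate operation set O
    \bar{o}^{(i,j)}(x) = \sum_{o \in \mathcal{O}} \frac{\exp\big(\alpha_o^{(i,j)}\big)}{\sum_{o' \in \mathcal{O}} \exp\big(\alpha_{o'}^{(i,j)}\big)}\, o(x)

    % Architecture entropy summed over edges, with p^{(i,j)} = \mathrm{softmax}(\alpha^{(i,j)})
    H(\alpha) = -\sum_{(i,j)} \sum_{o \in \mathcal{O}} p_o^{(i,j)} \log p_o^{(i,j)}

    % Bilevel objective with the (assumed) entropy loss term weighted by \lambda
    \min_{\alpha} \; \mathcal{L}_{\mathrm{val}}\big(w^{*}(\alpha), \alpha\big) + \lambda H(\alpha)
    \quad \text{s.t.} \quad w^{*}(\alpha) = \arg\min_{w} \mathcal{L}_{\mathrm{train}}(w, \alpha)

A code-level sketch of the same entropy term is given below; the function name, tensor layout, and the coefficient lam are hypothetical.

    import torch
    import torch.nn.functional as F

    def architecture_entropy(alphas: torch.Tensor) -> torch.Tensor:
        # alphas: (num_edges, num_ops) raw architecture parameters of one group of cells
        p = F.softmax(alphas, dim=-1)
        entropy_per_edge = -(p * torch.log(p + 1e-8)).sum(dim=-1)  # Shannon entropy per edge
        return entropy_per_edge.mean()

    # Hypothetical use inside the architecture-update step of the search loop:
    # loss = criterion(logits, target) + lam * architecture_entropy(alphas)

Note that in vanilla DARTS a single set of \alpha is shared by all normal cells (and another by all reduction cells); the multi-level design described above presumably gives the cells at each level their own \alpha, which is what removes the coupling between levels.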

Cite this article

LI Jianming, CHEN Bin, SUN Xiaofei. Multi-level differentiable architecture search with heuristic entropy[J]. Journal of Harbin Institute of Technology, 2021, 53(8): 22. DOI: 10.11918/202011092

History
  • Received: 2020-11-20
  • Published online: 2021-08-10