Supervised by: Ministry of Industry and Information Technology of the People's Republic of China
Sponsored by: Harbin Institute of Technology    Editor-in-Chief: LI Longqiu    ISSN 0367-6234    CN 23-1235/T

Citation: LI Jianming, CHEN Bin, SUN Xiaofei. Multi-level differentiable architecture search with heuristic entropy[J]. Journal of Harbin Institute of Technology, 2021, 53(8): 22. DOI: 10.11918/202011092
DOI: 10.11918/202011092
CLC number: TP183
Document code: A
Fund: Major Science and Technology Project of Guangdong Province on Cloud Computing and Big Data Management Technology (2017B030306017)
Multi-level differentiable architecture search with heuristic entropy
LI Jianming1,2, CHEN Bin2,3, SUN Xiaofei1,2
(1. Chengdu Institute of Computer Applications, Chinese Academy of Sciences, Chengdu 610041, China; 2. University of Chinese Academy of Sciences, Beijing 100049, China; 3. International Research Institute of Artificial Intelligence, Harbin Institute of Technology (Shenzhen), Shenzhen 518055, Guangdong, China)
Abstract:
Network architecture is an important factor affecting the performance of convolutional neural networks. Because manual design of network architectures is inefficient, methods that design architectures automatically through algorithms have attracted increasing attention. Differentiable architecture search (DARTS) can design network architectures automatically and efficiently, but its supernetwork construction and architecture derivation strategy have shortcomings. An improved algorithm was proposed to overcome them. First, by quantifying the change in the number of skip operations during the search process, it was revealed that sharing architecture parameters couples the cells of the DARTS supernetwork. Second, to address this coupling problem, a supernetwork with multi-level cells was designed so that cells at different levels do not influence one another. Then, in view of the performance “gap” between the supernetwork and the derived architecture, the entropy of the architecture parameters was introduced as a loss term of the objective function to guide the training of the supernetwork. Finally, architecture search experiments were conducted on the CIFAR-10 dataset, and architecture evaluation experiments were conducted on CIFAR-10 and ImageNet. Experimental results on CIFAR-10 show that the proposed algorithm removes the coupling between cells at different levels and improves the performance of the automatically designed architecture, achieving a classification error rate of only 2.69%. The searched architecture achieved a classification error rate of 25.9% on ImageNet, demonstrating good transferability.
Key words: neural architecture search; differentiable architecture search; super network with multi-level cells; architecture entropy
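
The abstract describes two concrete mechanisms: separate architecture parameters per cell level (rather than the single shared set used in DARTS), and an architecture-entropy loss term that steers the supernetwork toward discrete operation choices. The following is a minimal PyTorch-style sketch of how such an entropy regularizer could be computed and added to the training objective; the tensor shapes, the weight lambda_ent, and the per-level parameter list are illustrative assumptions, not the paper's exact implementation.

    import torch
    import torch.nn.functional as F

    def architecture_entropy(alpha: torch.Tensor) -> torch.Tensor:
        # alpha: architecture parameters for one cell level,
        # shape [num_edges, num_candidate_ops].
        probs = F.softmax(alpha, dim=-1)          # per-edge operation mixing weights
        log_probs = F.log_softmax(alpha, dim=-1)  # numerically stable log of the same
        # Shannon entropy per edge, averaged over edges: low entropy means each
        # edge concentrates on one candidate operation, shrinking the gap between
        # the supernetwork and the derived discrete architecture.
        return -(probs * log_probs).sum(dim=-1).mean()

    # Illustrative: one alpha tensor per cell level instead of a single tensor
    # shared by all cells, so levels no longer couple through shared parameters.
    num_levels, num_edges, num_ops = 3, 14, 8     # assumed sizes, not from the paper
    alphas = [torch.randn(num_edges, num_ops, requires_grad=True)
              for _ in range(num_levels)]

    lambda_ent = 0.01                             # assumed loss weight
    task_loss = torch.tensor(0.0)                 # stand-in for the cross-entropy task loss
    loss = task_loss + lambda_ent * sum(architecture_entropy(a) for a in alphas)
    loss.backward()

Minimizing the entropy term alongside the task loss pushes each edge's softmax weights toward a one-hot choice, which is one way to narrow the supernetwork-to-derived-architecture gap the abstract refers to.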
