Journal of Harbin Institute of Technology
Supervised by: Ministry of Industry and Information Technology of the People's Republic of China
Sponsored by: Harbin Institute of Technology
Editor-in-Chief: LI Longqiu
ISSN 0367-6234    CN 23-1235/T

Cite this article: LI Yuqing, JIANG Feilong, CHEN Zhuo, WANG Rixin, HUANG Shengquan, WANG Ruixing, CUI Hutao, XU Minqiang. Recognition method for unmanned swarm adversarial situation elements using Transformer[J]. Journal of Harbin Institute of Technology, 2022, 54(12): 1. DOI: 10.11918/202102061
DOI: 10.11918/202102061
CLC number: TP183
Document code: A
Fund projects: National Natural Science Foundation of China (52075117); Equipment Development Department field fund (JZX7Y20190243001201); Open Fund of the National Defense Key Discipline Laboratory of Deep Space Exploration Landing and Return Control Technology, Harbin Institute of Technology (HIT.KLOF.2016.077, HIT.KLOF.2017.076, HIT.KLOF.2018.076, HIT.KLOF.2018.074)
Recognition method for unmanned swarm adversarial situation elements using Transformer
Yuqing LI¹, Feilong JIANG¹, Zhuo CHEN², Rixin WANG¹, Shengquan HUANG¹, Ruixing WANG¹, Hutao CUI¹, Minqiang XU¹
1. School of Astronautics, Harbin Institute of Technology, Harbin 150090, China; 2. Systems Engineering Research Institute of CSSC, Beijing 100094, China
Abstract:
Since the raw situation information in unmanned swarm confrontation is complex, it is difficult to accurately identify situation elements such as swarm formation and swarm movement trend. To improve the identification of unmanned swarm situation elements, a recognition method for unmanned swarm adversarial situation elements using Transformer was designed. On the basis of the Transformer model, a Transformer-Decoder attention layer model applicable to the unmanned swarm confrontation situation element recognition problem was constructed to achieve good recognition of swarm situation elements, and an inter-layer attention structure was designed to strengthen the feature expression ability of the Transformer-Decoder and further improve recognition accuracy. First, the situation sequence information of the swarm was input into an LSTM recurrent neural network and encoded into temporal feature information. Then, the Transformer-Decoder attention module and the inter-layer attention module were used to extract the comprehensive high-order situation information of the swarm. Finally, a multi-dimensional classification network and a softmax layer were adopted to classify multiple situation elements. The experimental results showed that the recognition method using Transformer and inter-layer attention exhibited good performance on the situation element classification problem and could accurately classify multiple situation elements simultaneously. Compared with the baseline method, it achieved higher accuracy in recognizing swarm formation and movement trend, and its advantage was most pronounced in classifying situation elements that reflect relative trends within the swarm.
Key words: swarm situation recognition; swarm formation; swarm movement trend; attention mechanism; inter-layer attention
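
For readers who want a concrete picture of the pipeline the abstract describes (LSTM temporal encoding, stacked Transformer attention layers, inter-layer attention over the layer outputs, and parallel softmax classification heads), a minimal PyTorch sketch follows. It is illustrative only: all dimensions and class counts are hypothetical, PyTorch's encoder layers stand in for the paper's Transformer-Decoder blocks, and the inter-layer attention here is one plausible reading (a learned query attending over pooled per-layer outputs), not the authors' exact design.

```python
# Minimal sketch (not the authors' code): LSTM temporal encoder -> stacked
# Transformer attention layers -> inter-layer attention over per-layer outputs
# -> parallel softmax heads, one per situation element (e.g., formation,
# movement trend). All sizes below are hypothetical.
import torch
import torch.nn as nn

class SwarmSituationClassifier(nn.Module):
    def __init__(self, in_dim=32, d_model=128, n_heads=4, n_layers=3,
                 n_classes=(4, 4)):  # one class count per situation element
        super().__init__()
        # LSTM encodes the raw situation sequence into temporal features.
        self.lstm = nn.LSTM(in_dim, d_model, batch_first=True)
        # Encoder layers stand in for the paper's Transformer-Decoder blocks.
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=256,
                                       batch_first=True)
            for _ in range(n_layers)
        )
        # Inter-layer attention: a learned query attends over the stacked
        # per-layer representations (an assumption about the design).
        self.layer_query = nn.Parameter(torch.randn(1, 1, d_model))
        self.inter_layer_attn = nn.MultiheadAttention(d_model, n_heads,
                                                      batch_first=True)
        # One classification head per situation element.
        self.heads = nn.ModuleList(nn.Linear(d_model, c) for c in n_classes)

    def forward(self, x):
        # x: (batch, time, in_dim) raw situation sequence
        h, _ = self.lstm(x)                      # (B, T, d_model)
        layer_outputs = []
        for layer in self.layers:
            h = layer(h)
            layer_outputs.append(h.mean(dim=1))  # pool each layer over time
        stacked = torch.stack(layer_outputs, dim=1)        # (B, n_layers, d_model)
        q = self.layer_query.expand(x.size(0), -1, -1)
        fused, _ = self.inter_layer_attn(q, stacked, stacked)  # (B, 1, d_model)
        fused = fused.squeeze(1)
        # One softmax distribution per situation element.
        return [torch.softmax(head(fused), dim=-1) for head in self.heads]

model = SwarmSituationClassifier()
probs = model(torch.randn(8, 20, 32))  # 8 sequences, 20 steps, 32 features
```

In this reading, pooling each layer's output over time and letting a learned query weight the layers is what lets shallow and deep features both contribute to the final classification; the heads share the fused representation so the multiple situation elements are classified simultaneously, as the abstract states.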
