Supervised by Ministry of Industry and Information Technology of The People's Republic of China. Sponsored by Harbin Institute of Technology. Editor-in-chief: Yu Zhou. ISSN 1005-9113. CN 23-1378/T.

Related citation: ZHAO Xue-zhi, YE Bang-yan. ART-2 neural network based on eternal term memory vector: Architecture and algorithm[J]. Journal of Harbin Institute of Technology (New Series), 2009, 16(6): 843-848. DOI: 10.11916/j.issn.1005-9113.2009.06.019.
ART-2 neural network based on eternal term memory vector: Architecture and algorithm
Authors and affiliations:
ZHAO Xue-zhi — School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou 510640, China
YE Bang-yan — School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou 510640, China
Abstract:
To address the problem that the traditional ART-2 neural network cannot recognize a gradually changing course, an eternal term memory (ETM) vector is introduced into ART-2 to simulate a function of the human brain: the lasting remembrance of an initial impression. The eternal term memory vector is determined only by the initial vector that establishes a category neuron node, and it preserves the remembrance of that vector forever. A two-stage vigilance algorithm is put forward: a subsequent input vector must first pass the vigilance test of the eternal term memory vector, and only if it succeeds is it qualified to undergo the second vigilance test of the long term memory (LTM) vector. The long term memory vector is revised only when both vigilance tests are passed. Results of recognition examples show that the improved ART-2 overcomes the defect of the traditional ART-2 and can recognize a gradually changing course effectively.
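The two-stage vigilance scheme described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class and parameter names (`ETMCategory`, `rho_etm`, `rho_ltm`, `lr`) are hypothetical, and cosine similarity is assumed as the vigilance measure. The key property from the paper is preserved: the ETM vector is fixed at node creation, while the LTM vector is revised only when both vigilance tests pass.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity measure assumed here for the vigilance tests."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class ETMCategory:
    """One category node holding a fixed ETM vector and an adaptable LTM vector."""

    def __init__(self, initial_vector, rho_etm=0.8, rho_ltm=0.9, lr=0.1):
        v = np.asarray(initial_vector, dtype=float)
        self.etm = v.copy()      # eternal term memory: set once, never modified
        self.ltm = v.copy()      # long term memory: revised on successful match
        self.rho_etm = rho_etm   # first vigilance threshold (hypothetical value)
        self.rho_ltm = rho_ltm   # second vigilance threshold (hypothetical value)
        self.lr = lr             # LTM learning rate (hypothetical value)

    def try_match(self, x):
        """Return True and revise LTM only if x passes BOTH vigilance tests."""
        x = np.asarray(x, dtype=float)
        # First vigilance: against the fixed ETM vector (the initial impression)
        if cosine_similarity(x, self.etm) < self.rho_etm:
            return False
        # Second vigilance: against the adaptable LTM vector
        if cosine_similarity(x, self.ltm) < self.rho_ltm:
            return False
        # Both passed: revise LTM toward the input; ETM stays fixed forever
        self.ltm = (1 - self.lr) * self.ltm + self.lr * x
        return True
```

Because the first vigilance is always taken against the unchanging ETM vector, the LTM vector cannot drift arbitrarily far from the initial pattern through a sequence of small changes, which is how the improved network tracks a gradually changing course without losing the category's identity.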
Key words: ART-2 neural network; eternal term memory vector; two-stage vigilance; gradually changing course; pattern recognition
DOI: 10.11916/j.issn.1005-9113.2009.06.019
CLC Number: TP183