Supervised by the Ministry of Industry and Information Technology of the People's Republic of China. Sponsored by Harbin Institute of Technology. Editor-in-chief: Yu Zhou. ISSN 1005-9113. CN 23-1378/T.

Related citation: Lasheng Yu, Xiaopeng Zheng. Research on Deep Knowledge Tracking Incorporating Rich Features and Forgetting Behaviors[J]. Journal of Harbin Institute of Technology (New Series), 2022, 29(4): 1-6. DOI: 10.11916/j.issn.1005-9113.2020081.
Research on Deep Knowledge Tracking Incorporating Rich Features and Forgetting Behaviors
Author Name     Affiliation
Lasheng Yu      School of Computer Science and Engineering, Central South University, Changsha 410083, China
Xiaopeng Zheng  School of Computer Science and Engineering, Central South University, Changsha 410083, China
Abstract:
The individualization of education and teaching through computer-aided education systems provides students with personalized learning, so that each student can obtain the knowledge they need. Many intelligent tutoring systems now exist. In these systems, students' learning actions are tracked in real time, producing large amounts of available data from which personalized education suited to each student can be mined. To improve the quality of education, models have been developed to predict a student's next practice, such as Bayesian Knowledge Tracing (BKT), Performance Factor Analysis (PFA), and, with the development of deep learning, Deep Knowledge Tracing (DKT). However, the DKT model considers only the knowledge component and correctness of a problem, ignoring the breadth of other features collected by intelligent tutoring systems, the lag time since the previous interaction, the number of past attempts at a problem, and the fact that students forget knowledge. Although some studies consider forgetting and rich information when modeling student knowledge, they often ignore student learning sequences. The main contribution of this paper is twofold. One is to transform the multi-feature input into a low-dimensional feature vector by introducing a stacked auto-encoder network layer and to evaluate multiple feature combinations. The other is to use repeated time intervals, sequence time intervals, and the number of attempts to model forgetting behavior. This paper proposes an adaptation of the original DKT model: by using the stacked auto-encoder network, the input dimension is reduced to half of the original while the original features are retained, and forgetting behavior is modeled according to the time sequence of students' learning. Experiments on two public datasets show that the proposed model improves on the accuracy of the original DKT model.
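As a rough illustration of the architecture the abstract describes, the sketch below combines a stacked auto-encoder that halves the dimension of the rich input features with a DKT-style LSTM whose per-step input is augmented with three forgetting-related features (repeated time interval, sequence time interval, and number of past attempts). This is not the authors' code; the framework (PyTorch), layer sizes, and all names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class StackedAutoEncoder(nn.Module):
    """Two encoder layers compress the input to half its size; the decoder
    mirrors them so the module can be pre-trained with a reconstruction loss."""
    def __init__(self, in_dim: int):
        super().__init__()
        mid_dim, out_dim = (in_dim * 3) // 4, in_dim // 2
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, mid_dim), nn.Sigmoid(),
            nn.Linear(mid_dim, out_dim), nn.Sigmoid(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(out_dim, mid_dim), nn.Sigmoid(),
            nn.Linear(mid_dim, in_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)          # low-dimensional code (half the input size)
        return z, self.decoder(z)    # code and reconstruction

class DKTWithForgetting(nn.Module):
    """DKT-style LSTM over the encoded interaction features concatenated with
    forgetting features; outputs per-skill probabilities of a correct answer."""
    def __init__(self, feat_dim: int, n_skills: int, hidden: int = 128, n_forget: int = 3):
        super().__init__()
        self.sae = StackedAutoEncoder(feat_dim)
        self.lstm = nn.LSTM(feat_dim // 2 + n_forget, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_skills)

    def forward(self, feats, forget_feats):
        # feats:        (batch, seq_len, feat_dim)  rich interaction features
        # forget_feats: (batch, seq_len, 3)         repeated gap, sequence gap, attempts
        z, _ = self.sae(feats)
        h, _ = self.lstm(torch.cat([z, forget_feats], dim=-1))
        return torch.sigmoid(self.out(h))            # (batch, seq_len, n_skills)

# Toy forward pass with made-up shapes.
model = DKTWithForgetting(feat_dim=220, n_skills=110)
pred = model(torch.rand(8, 50, 220), torch.rand(8, 50, 3))   # -> (8, 50, 110)
```

In practice the auto-encoder would typically be pre-trained on its reconstruction loss and the LSTM trained with a binary cross-entropy loss on next-step predictions; those training loops are omitted here.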
Key words: LSTM; knowledge tracing; DKT; stacked autoencoder; forgetting behavior; feature information
DOI: 10.11916/j.issn.1005-9113.2020081
CLC Number: TP301.6
Fund:
Descriptions in Chinese:

Research on Deep Knowledge Tracking Incorporating Rich Features and Forgetting Behaviors

Lasheng Yu, Xiaopeng Zheng

(School of Computer Science and Engineering, Central South University, Changsha 410083, China)

Description in Chinese:

Computer-aided education systems provide each student with personalized learning and the personalized knowledge they need, so that every student can obtain the knowledge they require. In many intelligent tutoring systems, students' learning processes are tracked in real time, and models have therefore been proposed that predict a student's next practice in order to assist teaching. These models include Bayesian Knowledge Tracing (BKT), Performance Factor Analysis (PFA), and, more recently with the development of deep learning, Deep Knowledge Tracing (DKT). The DKT model based on recurrent neural networks has shown good results; however, it considers only the knowledge component and correctness of a problem and ignores the breadth of other features collected by intelligent tutoring systems, as well as the fact that students forget, which is driven by the lag time since the previous interaction and the number of past attempts at a problem. This paper proposes an adaptation of the original DKT model structure: a stacked auto-encoder network is introduced to transform the multi-feature input into low-dimensional feature vectors, and the forgetting-related interaction intervals and attempt counts are integrated. Experiments on two public knowledge tracing datasets show that the proposed model improves on the original accuracy.
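The description above integrates two kinds of forgetting-related signals: interaction intervals and attempt counts. As a small worked example, the hypothetical helper below derives the repeated time gap, sequence time gap, and past attempt count from one student's log of (skill_id, timestamp) pairs; the helper and its time unit are illustrative assumptions, not code from the paper.

```python
from collections import defaultdict
from typing import List, Tuple

def forgetting_features(log: List[Tuple[int, float]]) -> List[Tuple[float, float, int]]:
    """For each interaction return (repeated_gap, sequence_gap, past_attempts).

    repeated_gap  -- time since this student's last attempt on the same skill
    sequence_gap  -- time since the student's previous interaction of any kind
    past_attempts -- how many times the skill has been attempted before
    """
    last_seen = {}                   # skill_id -> timestamp of last attempt
    attempts = defaultdict(int)      # skill_id -> number of past attempts
    prev_time = None
    feats = []
    for skill, t in log:
        repeated_gap = t - last_seen[skill] if skill in last_seen else 0.0
        sequence_gap = t - prev_time if prev_time is not None else 0.0
        feats.append((repeated_gap, sequence_gap, attempts[skill]))
        last_seen[skill] = t
        attempts[skill] += 1
        prev_time = t
    return feats

# Example: timestamps in minutes for one student.
print(forgetting_features([(3, 0.0), (3, 5.0), (7, 6.0), (3, 30.0)]))
# -> [(0.0, 0.0, 0), (5.0, 5.0, 1), (0.0, 1.0, 0), (25.0, 24.0, 2)]
```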

Key words: LSTM; knowledge tracing; DKT; stacked autoencoder; forgetting behavior; feature information
