Last Query Transformer RNN for knowledge tracing
2021-02-10
SeungKee Jeon
Code
- github.com/arshadshk/Last_Query_Transformer_RNN-PyTorch (PyTorch, ★ 44)
- github.com/bcaitech1/p4-dkt-no_caffeine_no_gain (PyTorch, ★ 16)
- github.com/pwc-1/Paper-9/tree/main/4/Last-Query-Transformer-RNN (MindSpore, ★ 0)
- github.com/MindSpore-scientific-2/code-9/tree/main/Last-Query-Transformer-RNN (MindSpore, ★ 0)
- github.com/pwc-1/Paper-9/tree/main/7/Last-Query-Transformer-RNN (MindSpore, ★ 0)
- github.com/MindSpore-scientific/code-1/tree/main/Last-Query-Transformer-RNN (MindSpore, ★ 0)
Abstract
This paper presents an efficient model that predicts whether a student will answer a question correctly, given their past learning activities. I combine a transformer encoder with an RNN to handle the time-series input. The novel point of the model is that the transformer encoder uses only the last input as the query, instead of the whole sequence, which reduces the QK matrix multiplication from O(L^2) to O(L) time complexity. This allows the model to take longer input sequences. Using this model, I achieved 1st place in the 'Riiid! Answer Correctness Prediction' competition hosted on Kaggle.
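The core idea, attending with only the last position's query, can be sketched in a few lines. The linked implementations are in PyTorch and MindSpore; the NumPy version below is a minimal illustration with hypothetical shapes and weight names, not the authors' code. The point is that the score matrix becomes a single (1, L) row instead of (L, L).

```python
import numpy as np

def last_query_attention(x, W_q, W_k, W_v):
    """Attention where only the LAST input position forms the query.

    x: (L, d) sequence of input embeddings.
    Standard self-attention computes an (L, L) score matrix: O(L^2 * d).
    Using one query row makes the scores (1, L): O(L * d).
    """
    q = x[-1:] @ W_q                         # (1, d) — query from the last step only
    K = x @ W_k                              # (L, d) — keys over the full sequence
    V = x @ W_v                              # (L, d) — values over the full sequence
    scores = (q @ K.T) / np.sqrt(K.shape[1]) # (1, L) instead of (L, L)
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ V                       # (1, d) context vector for the last step

# Toy usage with made-up dimensions.
rng = np.random.default_rng(0)
L, d = 8, 4
x = rng.standard_normal((L, d))
W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))
out = last_query_attention(x, W_q, W_k, W_v)
print(out.shape)  # → (1, 4)
```

In the full model this single-query attention output would then feed into the RNN; because the encoder cost is linear in L, the sequence length can be increased without a quadratic blow-up in compute.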