Knowledge Distillation for Recurrent Neural Network Language Modeling with Trust Regularization

Authors: Yangyang Shi, Mei-Yuh Hwang, Xin Lei, Haoyu Sheng