Knowledge distillation-based performance transferring for LSTM-RNN model acceleration
Crossref DOI link: https://doi.org/10.1007/s11760-021-02108-9
Published Online: 2022-01-29
Published Print: 2022-09
Update policy: https://doi.org/10.1007/springer_crossmark_policy
Authors
Ma, Hongbin (ORCID: http://orcid.org/0000-0003-1931-7203)
Yang, Shuyuan
Wu, Ruowu
Hao, Xiaojun
Long, Huimin
He, Guangjun
License (Text and Data Mining): valid from 2022-01-29
License (Version of Record): valid from 2022-01-29
Article History
Received: 12 June 2020
Revised: 24 October 2021
Accepted: 27 October 2021
First Online: 29 January 2022