Dan, Xi
Yang, Wenjie
Zhang, Fuyan
Zhou, Yihang
Yu, Zhuojun
Qiu, Zhen
Zhao, Boyuan
Dong, Zeyu
Huang, Libo
Yang, Chuanguang
Funding for this research was provided by:
Beijing Natural Science Foundation (4244098)
Article History
Received: 12 March 2024
Accepted: 12 August 2024
First Online: 31 August 2024
Declarations
Competing interests: The authors declare no competing interests.
Conflict of interest: The authors declare that there is no conflict of interest regarding the publication of this paper. No financial or personal relationships with other people or organizations have influenced the work reported in this manuscript, titled “PDD: Pruning Neural Networks During Knowledge Distillation” (Submission ID dccb382d-1a4b-4ee3-a28e-dd6137112c47). This research was conducted purely from an academic perspective, and all affiliations and financial support are disclosed in the manuscript. Our study has been carried out with a commitment to transparency and honesty in all research findings and discussions related to this work.