Fine-Tuning Channel-Pruned Deep Model via Knowledge Distillation
Crossref DOI link: https://doi.org/10.1007/s11390-023-2386-8
Published Online: 2025-01-16
Published Print: 2024-11
Update policy: https://doi.org/10.1007/springer_crossmark_policy
Zhang, Chong
Wang, Hong-Zhi
Liu, Hong-Wei
Chen, Yi-Lin
Text and Data Mining valid from 2024-11-01
Version of Record valid from 2024-11-01
Article History
Received: 4 April 2022
Accepted: 7 November 2023
First Online: 16 January 2025
Ethics
Conflict of Interest: The authors declare that they have no conflict of interest.