DistilCypherGPT: enhancing large language models for knowledge graph question answering in Cypher through knowledge distillation
Crossref DOI link: https://doi.org/10.1007/s10618-025-01157-9
Published Online: 2025-08-23
Published Print: 2025-11
Update policy: https://doi.org/10.1007/springer_crossmark_policy
Chong, You Li
Lee, Chin Poo
Lim, Kian Ming
Text and Data Mining valid from 2025-08-23
Version of Record valid from 2025-08-23
Article History
Received: 26 November 2024
Accepted: 6 August 2025
First Online: 23 August 2025
Declarations
Conflict of interest: The authors declare no conflict of interest.