Knowledge distillation in deep learning and its applications

Deep learning-based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation, whereby a smaller model (student model) is trained by utilizing the information from a larger...
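The teacher-student idea summarized in the abstract is commonly realized with temperature-scaled soft targets, as in Hinton et al.'s distillation loss. The sketch below is illustrative only, not code from the surveyed paper; the function names, logits, and temperature value are assumptions chosen for the example.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher temperature yields a
    softer (more uniform) probability distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between teacher and student soft targets,
    scaled by T^2 as in the standard distillation formulation."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# Illustrative logits (hypothetical values): a student whose logits
# are close to the teacher's incurs a small positive loss.
teacher = [3.0, 1.0, 0.2]
student = [2.5, 1.2, 0.1]
print(distillation_loss(teacher, student))
```

In practice the student is trained on a weighted sum of this soft-target loss and the ordinary cross-entropy on the hard labels; the survey discusses variations on this scheme.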

Bibliographic Details
Published in: PeerJ Comput Sci
Main authors: Alkhulaifi, Abdolmaged; Alsahli, Fahad; Ahmad, Irfan
Format: Article
Language: English
Published: PeerJ Inc., 2021
Online access: https://ncbi.nlm.nih.gov/pmc/articles/PMC8053015/
https://ncbi.nlm.nih.gov/pubmed/33954248
http://dx.doi.org/10.7717/peerj-cs.474