Year: 2018
Direct or indirect relation to Private Intelligence: DIRECT
Conference/Journal: arXiv preprint arXiv:1809.01921
Author(s): TN Hoang
How the work relates to Private Intelligence: Develops a knowledge distillation method that improves a predictive model trained on poor-quality data by transferring knowledge distilled from a high-complexity model trained on rich, private data (see the sketch after this entry).
Gained Experience:
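
To make the teacher-student setup described above concrete, here is a minimal distillation sketch. It is a generic illustration, not the paper's actual method: the MLP architectures, the temperature-scaled KL-divergence plus cross-entropy loss, and all hyperparameters are assumptions chosen only for illustration.

```python
# Minimal knowledge-distillation sketch: a high-complexity teacher (assumed
# already trained on rich, private data) guides a smaller student trained on
# poor/limited data. All names and sizes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLP(nn.Module):
    def __init__(self, in_dim, hidden, out_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, out_dim)
        )

    def forward(self, x):
        return self.net(x)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets from the teacher, softened by temperature T.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label loss on the student's own (poor-quality) data.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Teacher: high-complexity model, assumed pre-trained on rich, private data.
teacher = MLP(in_dim=20, hidden=256, out_dim=5).eval()
# Student: lower-complexity model trained on the available (poor) data.
student = MLP(in_dim=20, hidden=32, out_dim=5)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

# Toy batch standing in for the student's training data.
x = torch.randn(64, 20)
y = torch.randint(0, 5, (64,))

with torch.no_grad():
    t_logits = teacher(x)  # only teacher predictions cross over; private data never shared

optimizer.zero_grad()
loss = distillation_loss(student(x), t_logits, y)
loss.backward()
optimizer.step()
```

The design point the entry highlights is that the private training data stays with the teacher; only its (softened) predictions are transferred to the student.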