RDPD: rich data helps poor data via imitation

Year
2018
Private Intelligence relation
DIRECT
Link
https://scholar.google.com/citations?view_op=view_citation&hl=en&user=E-kZZeQAAAAJ&cstart=20&pagesize=80&citation_for_view=E-kZZeQAAAAJ:aqlVkmm33-oC
Conference/Journal
arXiv preprint arXiv:1809.01921
Author(s) from Distilled Foundation
TN Hoang
How the work relates to Private Intelligence
Develops a knowledge distillation method that enhances a predictive model trained on poor data using knowledge distilled from a high-complexity model trained on rich, private data.
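The distillation setup described above can be sketched generically: a teacher trained on rich data supplies temperature-softened targets that a student trained on poor data learns to imitate alongside its hard labels. This is only a minimal, standard Hinton-style distillation-loss sketch, not RDPD's actual objective (which also involves imitation of internal representations); all function and variable names here are hypothetical.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T yields a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels, T=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with a soft-label imitation term.

    The teacher (trained on rich data) provides softened targets that the
    student (trained on poor data) imitates; the T**2 factor rescales the
    soft term's gradients, as is conventional in distillation.
    """
    p_student = softmax(student_logits)
    p_student_T = softmax(student_logits, T)
    p_teacher_T = softmax(teacher_logits, T)
    n = student_logits.shape[0]
    # Standard cross-entropy against the ground-truth labels.
    hard_ce = -np.log(p_student[np.arange(n), hard_labels] + 1e-12).mean()
    # KL divergence from the teacher's softened distribution to the student's.
    soft_kl = (p_teacher_T * (np.log(p_teacher_T + 1e-12)
                              - np.log(p_student_T + 1e-12))).sum(axis=-1).mean()
    return alpha * hard_ce + (1 - alpha) * (T ** 2) * soft_kl

# Toy example: 2 samples, 3 classes.
teacher_logits = np.array([[4.0, 1.0, 0.0], [0.5, 3.0, 0.2]])
student_logits = np.array([[2.0, 1.5, 0.1], [0.3, 1.0, 0.4]])
labels = np.array([0, 1])
loss = distillation_loss(student_logits, teacher_logits, labels)
```

In this sketch the student is trained by minimizing `distillation_loss`; the poorer the student's data, the more weight (`1 - alpha`) one would typically shift toward the teacher's soft targets.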

Gained Experience