Distilled AI

Enabling Hierarchical Dirichlet Processes to Work Better for Short Texts at Large Scale

Year
2016
Private Intelligence relation
INDIRECT
Link
https://scholar.google.com/citations?view_op=view_citation&hl=en&user=tZ78MoQAAAAJ&citation_for_view=tZ78MoQAAAAJ:hqOjcs7Dif8C
Conference/Journal

Advances in Knowledge Discovery and Data Mining: 20th Pacific-Asia Conference, PAKDD 2016, Auckland, New Zealand, April 19-22, 2016, Proceedings, Part II

Author(s) from Distilled Foundation
Linh Ngo Van
How the work relates to Private Intelligence

Gained Experience

A method for modeling and classifying short texts at large scale