
# LightLDA

LightLDA is a distributed system for large-scale topic modeling. It implements a distributed sampler that enables very large data and model sizes. LightLDA improves sampling throughput and convergence speed via a fast O(1) Metropolis-Hastings algorithm, and allows a small cluster to tackle very large data and model sizes through model scheduling and a data-parallelism architecture. LightLDA is implemented in C++ for performance.
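The core O(1) sampling idea can be sketched as follows. This is illustrative Python, not the actual C++ implementation, and the function names are ours: topic proposals are drawn in O(1) from a slightly stale distribution using Walker's alias method, and a cheap Metropolis-Hastings accept/reject test corrects the bias so the sampler still converges to the true posterior.

```python
import random

def build_alias_table(probs):
    """Walker's alias method: O(K) construction, then O(1) per draw."""
    k = len(probs)
    scaled = [p * k for p in probs]
    accept = [0.0] * k
    alias = [0] * k
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        accept[s] = scaled[s]          # probability of keeping bucket s
        alias[s] = l                   # otherwise redirect to topic l
        scaled[l] -= 1.0 - scaled[s]   # l donates its excess mass to s
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:            # leftovers are exactly full buckets
        accept[i] = 1.0
    return accept, alias

def alias_draw(accept, alias, rng):
    """Draw one topic in O(1): pick a bucket, keep it or take its alias."""
    i = rng.randrange(len(accept))
    return i if rng.random() < accept[i] else alias[i]

def mh_step(current, target, stale, accept, alias, rng):
    """One Metropolis-Hastings step: propose from the stale alias
    distribution, then accept/reject to correct for the staleness."""
    t = alias_draw(accept, alias, rng)
    ratio = (target[t] * stale[current]) / (target[current] * stale[t])
    return t if rng.random() < min(1.0, ratio) else current
```

Because the alias table is rebuilt only occasionally while every draw and acceptance test is O(1), the per-token sampling cost is independent of the number of topics — the property that makes trillion-parameter models tractable.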

We have successfully trained big topic models (with trillions of parameters) on big data (the top 10% of Bing-indexed web pages by PageRank, containing billions of documents) at Microsoft. For more technical details, please refer to our WWW'15 paper.

For documentation, please visit our website http://www.dmtk.io.

## Why LightLDA

The highlight features of LightLDA are:

  • Scalable: LightLDA can train models with trillions of parameters on big data with billions of documents, a scale that previous implementations could not handle.
  • Fast: the sampler can sample millions of tokens per second per multi-core node.
  • Lightweight: such big tasks can be trained with as few as tens of machines.

## Quick Start

Run `$ sh build.sh` to build LightLDA. Run `$ sh example/nytimes.sh` for a simple example.

## Reference

Please cite LightLDA if it helps in your research:

```
@inproceedings{yuan2015lightlda,
  title={LightLDA: Big Topic Models on Modest Computer Clusters},
  author={Yuan, Jinhui and Gao, Fei and Ho, Qirong and Dai, Wei and Wei, Jinliang and Zheng, Xun and Xing, Eric Po and Liu, Tie-Yan and Ma, Wei-Ying},
  booktitle={Proceedings of the 24th International Conference on World Wide Web},
  pages={1351--1361},
  year={2015},
  organization={International World Wide Web Conferences Steering Committee}
}
```
