IMDB movie reviews binary classification

🌟 Acknowledgments:

  • BERT and its preprocessing were originally published by Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova: "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", 2018.
  • The original dataset comes from "Learning Word Vectors for Sentiment Analysis" by Andrew L. Maas, Raymond E. Daly, Peter T. Pham, Dan Huang, Andrew Y. Ng, and Christopher Potts (2011), presented at the 49th Annual Meeting of the Association for Computational Linguistics (ACL 2011).
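
For orientation, the task named in the title (binary sentiment classification of movie reviews) can be sketched with a simple bag-of-words baseline. This is not the BERT pipeline this project uses, only a minimal illustrative stand-in; the review strings below are invented examples, not drawn from the IMDB dataset:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: 1 = positive review, 0 = negative review (invented examples).
texts = [
    "a wonderful, moving film with brilliant performances",
    "I loved every minute of this movie",
    "an absolute delight from start to finish",
    "a dull, predictable plot with wooden acting",
    "I was bored the entire time",
    "a tedious mess that wastes its cast",
]
labels = [1, 1, 1, 0, 0, 0]

# TF-IDF features feeding a logistic-regression classifier.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

# Predict sentiment for two unseen (also invented) reviews.
preds = clf.predict(["an absolute joy to watch", "a boring waste of time"])
print(preds)
```

A BERT-based model replaces the TF-IDF features with contextual token embeddings and fine-tunes the whole network on the same kind of labeled review/sentiment pairs.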