State-of-the-art transformer models for natural language processing tasks, implemented using PyTorch. This repository includes pre-trained models and fine-tuning scripts for various NLP applications.

Models included:
- BERT (Bidirectional Encoder Representations from Transformers)
- GPT (Generative Pre-trained Transformer)
- RoBERTa (Robustly Optimized BERT Pretraining Approach)
- T5 (Text-to-Text Transfer Transformer)
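All of the models above are built on the transformer encoder/decoder architecture. As a rough illustration of what a BERT-style encoder looks like in plain PyTorch, here is a minimal sketch; the class name, hyperparameters, and vocabulary size are illustrative assumptions, not this repository's actual defaults or API:

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Minimal BERT-style encoder sketch (illustrative, not the repo's implementation)."""

    def __init__(self, vocab_size=30522, d_model=128, nhead=4, num_layers=2, max_len=64):
        super().__init__()
        # Token and learned positional embeddings, summed before encoding.
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, ids):
        # ids: (batch, seq_len) integer token IDs.
        pos = torch.arange(ids.size(1), device=ids.device).unsqueeze(0)
        x = self.tok_emb(ids) + self.pos_emb(pos)
        # Returns contextual embeddings of shape (batch, seq_len, d_model).
        return self.encoder(x)

model = TinyEncoder()
ids = torch.randint(0, 30522, (2, 16))  # batch of 2 sequences, 16 tokens each
out = model(ids)
print(out.shape)
```

Fine-tuning typically adds a task-specific head (e.g. a linear classifier over the first token's embedding) on top of such an encoder and trains end-to-end on labeled data.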