Attention Is All You Need
Paper • 1706.03762 • Published • 120

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Paper • 1810.04805 • Published • 26

RoBERTa: A Robustly Optimized BERT Pretraining Approach
Paper • 1907.11692 • Published • 10

DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Paper • 1910.01108 • Published • 22
Taufiq Dwi Purnomo
taufiqdp
AI & ML interests
SLM, VLM
Recent Activity
upvoted a collection about 6 hours ago: Gemma 4
liked a model 1 day ago: Jackrong/Qwen3.5-27B-Claude-4.6-Opus-Reasoning-Distilled
upvoted an article 4 days ago: State of Open Source on Hugging Face: Spring 2026