- Attention Is All You Need
  Paper • 1706.03762 • Published • 120
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 26
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 10
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 22
Taufiq Dwi Purnomo
taufiqdp
AI & ML interests
SLM, VLM
Recent Activity
- liked a model about 20 hours ago: netflix/void-model
- upvoted a collection 2 days ago: Gemma 4
- liked a model 3 days ago: Jackrong/Qwen3.5-27B-Claude-4.6-Opus-Reasoning-Distilled