Meta APO (a collection by jmyang)
Updated 23 days ago
Models of MetaAPO: https://arxiv.org/abs/2509.23371
Upvotes: 2
- jmyang/llama3.1-8b-rm-ultrafeedback • 8B • Updated Nov 15, 2025 • 2
- jmyang/llama3.1-8b-dpo-ultrafeedback • 8B • Updated Nov 15, 2025 • 3
- jmyang/MetaAPO-Llama3.1-8B • 0.5B • Updated Jan 2 • 13 • 2
- jmyang/Qwen2.5-7B-DPO • 8B • Updated Jan 6 • 6
- jmyang/Qwen2.5-7B-rm • 1B • Updated 23 days ago • 12
- jmyang/MetaAPO-Qwen2.5-7B • 0.5B • Updated 23 days ago • 18 • 1