AmanPriyanshu/gpt-oss-4.2b-specialized-all-pruned-moe-only-4-experts Text Generation • 4B • Updated Aug 13 • 65 • 8
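A minimal sketch of loading this pruned model for text generation, assuming it exposes the standard Hugging Face transformers causal-LM interface (the model ID is from the card above; the generation settings are illustrative):

```python
# Sketch: run the pruned GPT-OSS model with transformers,
# assuming a standard causal-LM checkpoint layout.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AmanPriyanshu/gpt-oss-4.2b-specialized-all-pruned-moe-only-4-experts"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native dtype
    device_map="auto",    # place weights on available GPU(s)/CPU
)

prompt = "Explain mixture-of-experts pruning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```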
GPT-OSS General (4.2B to 20B) Collection Collection of pruned GPT-OSS models spanning 1-32 experts, maintaining general capabilities across domains while reducing computational requirements. • 29 items • Updated Aug 13 • 9
GPT-OSS Pruned Experts (4.2B-20B) [IF, Science, Math, etc.] Collection Complete collection of domain-specialized GPT-OSS models (1-32 experts) optimized for science, math, medicine, law, safety, and instruction following. • 8 items • Updated Aug 13 • 10
Kimi-Linear-A3B Collection Moonshot's experimental MoE model with Kimi Delta Attention • 3 items • Updated Nov 1 • 17
Kimi-K2 Collection Moonshot's MoE LLMs with 1 trillion parameters, exceptional at agentic intelligence • 5 items • Updated Nov 14 • 160