```python
# To use the model
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Shengkun/Qwen3-16B-A2B-Pruned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
```

## 16B

| Model | Method | Param. | SciQ | PIQA | WG | ArcE | ArcC | HS | LogiQA | BoolQ | MMLU | Avg |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Qwen3-30B-A3B | Dense | 30B A3B | 97.0 | 79.7 | 71.5 | 79.7 | 68.8 | 77.8 | 34.7 | 88.8 | 79.6 | 75.2 |
| | Uniform | 16B A2B | 94.9 | 71.4 | 60.2 | 73.2 | 52.6 | 47.0 | 33.2 | 75.0 | 55.6 | 62.5 |
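From the averages above, the uniformly pruned 16B-A2B model retains roughly 83% of the dense 30B-A3B model's average benchmark score. A quick sketch of that arithmetic (values copied from the table):

```python
# Average benchmark scores taken from the table above.
dense_avg = 75.2   # Qwen3-30B-A3B (dense)
pruned_avg = 62.5  # 16B-A2B (uniform pruning)

# Fraction of the dense model's average performance retained after pruning.
retention = pruned_avg / dense_avg
print(f"Retained performance: {retention:.1%}")  # → Retained performance: 83.1%
```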
Safetensors · 16B params · F16