mixtao/MixTAO-7Bx2-MoE-Instruct-v5.0
MixTAO Labs
Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · text-generation-inference
License: apache-2.0
MixTAO-7Bx2-MoE-Instruct-v5.0
MixTAO-7Bx2-MoE-Instruct-v5.0 is a Mixture of Experts (MoE) model.
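As a toy illustration of the mixture-of-experts idea (this is generic sketch code, not the model's actual routing implementation, which lives in the mixtral architecture inside Transformers), a top-k gate scores each expert, keeps the best k, renormalizes their weights, and mixes the chosen experts' outputs:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x to the top_k experts by gate score and mix their outputs.

    experts: list of callables, each mapping the input vector to a scalar.
    gate_weights: one weight vector per expert; the gate logit is a dot product.
    """
    logits = [sum(wi * xi for wi, xi in zip(w, x)) for w in gate_weights]
    probs = softmax(logits)
    # Keep only the top_k highest-scoring experts.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    # Renormalize the surviving probabilities and mix the expert outputs.
    norm = sum(probs[i] for i in top)
    return sum(probs[i] / norm * experts[i](x) for i in top)
```

In a "7Bx2" configuration the gate mixes two expert feed-forward networks per layer, which is why the combined checkpoint weighs in near 13B parameters while only part of the network is active per token.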
💻 Usage
text-generation-webui - Model Tab
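Besides the text-generation-webui Model Tab, the checkpoint can be loaded with the Transformers library. A minimal sketch, assuming the standard `AutoModelForCausalLM` API; the repo id is taken from this page, and the BF16 dtype matches the card's tensor type, but the prompt and generation settings are illustrative:

```python
# Repo id as listed on this model page.
MODEL_ID = "mixtao/MixTAO-7Bx2-MoE-Instruct-v5.0"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Heavy dependencies are imported lazily so the module can be
    # imported without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # card lists BF16 tensors
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Explain mixture-of-experts models in one sentence."))
```

Note that loading requires roughly 26 GB of memory for the 13B parameters at BF16 (2 bytes each), before activation overhead.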
Downloads last month: 86
Model size: 13B params (Safetensors)
Tensor type: BF16