Active filters: sea
Model • task • size • downloads • likes ("…" marks a model name lost from the capture)

… • Text Generation • 8B • 71 • 8
… • Text Generation • 7B • 8.95k • 68
mlx-community/SeaLLM-7B-v2-4bit-mlx • 4 • 3
LoneStriker/SeaLLM-7B-v2-GGUF • 7B • 135 • 6
LoneStriker/SeaLLM-7B-v2-3.0bpw-h6-exl2 • Text Generation
LoneStriker/SeaLLM-7B-v2-4.0bpw-h6-exl2 • Text Generation • 2
LoneStriker/SeaLLM-7B-v2-5.0bpw-h6-exl2 • Text Generation • 2
LoneStriker/SeaLLM-7B-v2-6.0bpw-h6-exl2 • Text Generation
LoneStriker/SeaLLM-7B-v2-8.0bpw-h8-exl2 • Text Generation • 1
LoneStriker/SeaLLM-7B-v2-AWQ • Text Generation • 7B • 3
… • Text Generation • 8B • 94 • 28
… • Text Generation • 4B • 98 • 6
… • Text Generation • 2B • 72 • 8
… • Text Generation • 0.6B • 66 • 9
… • Text Generation • 4B • 93 • 2
… • Text Generation • 2B • 60 • 6
… • Text Generation • 0.6B • 169 • 7
sail/Sailor-1.8B-Chat-gguf • 2B • 306 • 3
sail/Sailor-0.5B-Chat-gguf • 0.6B • 565 • 4
… • 4B • 342 • 3
… • 8B • 400 • 5
… • Text Generation • 9B • 12.6k • 50
SeaLLMs/SeaLLM-7B-v2.5-GGUF • 9B • 123 • 8
SeaLLMs/SeaLLM-7B-v2.5-mlx-quantized • Text Generation • 2B • 2 • 2
NikolayKozloff/Sailor-7B-Q8_0-GGUF • 8B • 19 • 1
QuantFactory/SeaLLM-7B-v2.5-GGUF • Text Generation • 9B • 54 • 1
QuantFactory/SeaLLM-7B-v2-GGUF • Text Generation • 7B • 116 • 1
… • Image-Text-to-Text • 8B • 5 • 5
NghiemAbe/SeaLLM-7B-v2.5-AWQ • Text Generation • 2