>> mistralai/Ministral-3-8B-Instruct-2512

[QUANT :: UNSTABLELLAMA]

// REPO

EXL3 quantizations of mistralai's Ministral-3-8B-Instruct-2512 (BF16 base).

Quantized with exllamav3 0.0.18.

// QUANTS

[BRANCH]   [GiB]   [K/L_DIV]     [PPL]
2.10bpw    4.40    no measurements
3.00bpw    4.77    0.05864883    7.89712533
3.15bpw    4.89    0.04512541    7.81142826
4.00bpw    5.63    0.0162187     7.61291338
6.00bpw    7.36    0.0162187     7.51715039
bf16       16.00   0             x

// DOWNLOAD

Use huggingface-cli to pull a specific branch to your local machine:
huggingface-cli download UnstableLlama/Ministral-3-8B-Instruct-2512-exl3 --revision "3.00bpw" --local-dir ./
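To keep several quants side by side, each branch from the table above can be pulled into its own directory; a sketch (the directory name is arbitrary, and huggingface-cli is assumed to be installed via pip install -U "huggingface_hub[cli]"):

```shell
# Pull the 4.00bpw branch into a dedicated directory so multiple
# quants of the same model can coexist on disk:
huggingface-cli download UnstableLlama/Ministral-3-8B-Instruct-2512-exl3 \
    --revision "4.00bpw" \
    --local-dir ./Ministral-3-8B-Instruct-2512-exl3-4.00bpw
```

Swap the --revision value for any branch listed under // QUANTS.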
