OsirisCortex-v6

Sovereign AGI core — an abliterated build of Qwen/Qwen3.5-9B, using the mlabonne abliteration datasets (256 + 256 prompts), the mean-difference refusal-direction method, 1.5x ablation strength, 4 passes, and a layer blacklist of [0, 1, 30, 31].
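
The mean-diff recipe above can be sketched roughly as follows. This is an illustrative NumPy implementation under stated assumptions — the function names, the fixed activation sets, and the weight shapes are all hypothetical, and the real pipeline details (activation collection from the model, the mlabonne dataset loading, which weight matrices are edited) are omitted.

```python
import numpy as np

def refusal_direction(harmful_acts, harmless_acts):
    """Mean-difference method: the refusal direction is the normalized
    difference between mean activations on harmful vs. harmless prompts."""
    direction = harmful_acts.mean(axis=0) - harmless_acts.mean(axis=0)
    return direction / np.linalg.norm(direction)

def ablate(weight, direction, strength=1.5):
    """Remove the refusal direction from a weight matrix.
    strength=1.0 is an exact orthogonal projection; the 1.5x setting
    above deliberately overshoots it."""
    d = direction[:, None]  # column vector, shape (hidden, 1)
    return weight - strength * (d @ (d.T @ weight))

def run_passes(weights, harmful_acts, harmless_acts, passes=4,
               blacklist=(0, 1, 30, 31)):
    """Ablate every layer except the blacklisted ones, for `passes` rounds.
    (In a real pipeline the activations would be re-collected each pass.)"""
    for _ in range(passes):
        direction = refusal_direction(harmful_acts, harmless_acts)
        for i, w in enumerate(weights):
            if i not in blacklist:
                weights[i] = ablate(w, direction)
    return weights
```

At strength 1.0, `ablate` leaves the weights with zero component along the refusal direction; 1.5x pushes past orthogonality, which is a common over-ablation trick.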

Architecture

  • Qwen3.5-9B hybrid: 3:1 ratio of GatedDeltaNet to full-attention layers, 32 layers, hidden size 3584
  • Thinking model (supports <think> tags)
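
The 3:1 hybrid layout can be illustrated with a small helper that assigns each of the 32 layers a type. The exact interleaving shown here (every fourth layer being full attention) is an assumption for illustration, not a published schedule:

```python
def layer_types(n_layers=32, ratio=3):
    """Hypothetical 3:1 interleaving: after every `ratio` GatedDeltaNet
    (linear-attention) layers, one full-attention layer follows."""
    return [
        "full_attention" if (i + 1) % (ratio + 1) == 0 else "gated_deltanet"
        for i in range(n_layers)
    ]
```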

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# torch_dtype="auto" loads the weights in their stored BF16 precision
model = AutoModelForCausalLM.from_pretrained(
    "osirisbrain/OsirisCortex-v6", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained("osirisbrain/OsirisCortex-v6")
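
Because this is a thinking model, generations may open with a <think>…</think> block. A small helper to separate the reasoning trace from the final answer — the tag names follow the card above, but the helper itself is an illustrative sketch:

```python
import re

def split_thinking(text):
    """Split an optional leading <think>...</think> block from the answer.
    Returns (thinking, answer); thinking is "" when no block is present."""
    match = re.match(r"\s*<think>(.*?)</think>\s*(.*)", text, flags=re.DOTALL)
    if match:
        return match.group(1).strip(), match.group(2).strip()
    return "", text.strip()
```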

Based on Qwen/Qwen3.5-9B by Alibaba (Apache 2.0).

Safetensors · 9B params · BF16
