mistralai/Codestral-22B-v0.1

📊 Model Parameters

Total Parameters 22,247,282,688
Context Length 32,768
Hidden Size 6144
Layers 56
Attention Heads 48
KV Heads 8
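
The headline total can be reproduced from these values. Below is a minimal back-of-the-envelope sketch in Python, assuming the standard Mistral-style decoder layout (untied input/output embeddings, bias-free projections, a SwiGLU MLP with gate/up/down matrices, and two RMSNorms per layer plus a final norm); those structural details are assumptions, not stated on this page.

```python
# Rough parameter count for a Mistral-style decoder, rebuilt from the values above.
vocab_size   = 32_768
hidden       = 6_144
intermediate = 16_384
layers       = 56
heads        = 48
kv_heads     = 8
head_dim     = hidden // heads          # 128
kv_dim       = kv_heads * head_dim      # 1,024 (grouped-query attention)

embeddings = 2 * vocab_size * hidden    # input embedding + untied LM head

attention = (hidden * hidden            # Q projection
             + 2 * hidden * kv_dim      # K and V projections (GQA)
             + hidden * hidden)         # output projection

mlp = (2 * hidden * intermediate        # gate and up projections
       + intermediate * hidden)         # down projection

norms = 2 * hidden                      # two RMSNorm weight vectors per layer

per_layer = attention + mlp + norms
total = layers * per_layer + embeddings + hidden   # + final RMSNorm

print(f"{total:,}")                     # 22,247,282,688
```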

💾 Memory Requirements

FP32 (Full) 82.88 GiB
FP16 (Half) 41.44 GiB
INT8 (Quantized) 20.72 GiB
INT4 (Quantized) 10.36 GiB
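
These figures are simply parameters × bytes per weight, reported in binary units (GiB). A quick check, assuming INT8 and INT4 store one byte and half a byte per weight respectively, with no quantization overhead, activations, or KV cache included:

```python
# Weight memory ≈ parameter count × bytes per parameter.
PARAMS = 22_247_282_688
GIB = 1024 ** 3

for name, bytes_per_param in [("FP32", 4), ("FP16/BF16", 2), ("INT8", 1), ("INT4", 0.5)]:
    print(f"{name:>9}: {PARAMS * bytes_per_param / GIB:6.2f} GiB")
# FP32 ≈ 82.88, FP16 ≈ 41.44, INT8 ≈ 20.72, INT4 ≈ 10.36
```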

🔑 KV Cache (Inference)

Per Token (FP16) 229.38 KB
Max Context FP32 14.00 GiB
Max Context FP16 7.00 GiB
Max Context INT8 3.50 GiB
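
The KV cache grows linearly with sequence length: each token stores one key and one value vector per layer per KV head. A sketch of the arithmetic, assuming a plain contiguous cache with no paging overhead (the per-token figure above is in decimal KB, the full-context figures in GiB):

```python
# Per-token KV cache = 2 (K and V) × layers × kv_heads × head_dim × bytes per element.
layers, kv_heads, head_dim = 56, 8, 128
context = 32_768

def kv_cache_bytes(num_tokens: int, bytes_per_elem: int = 2) -> int:
    return 2 * layers * kv_heads * head_dim * bytes_per_elem * num_tokens

print(kv_cache_bytes(1) / 1e3, "KB per token (FP16)")                   # 229.376
print(kv_cache_bytes(context) / 1024**3, "GiB at full context (FP16)")  # 7.0
```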

⚙️ Model Configuration

Core Architecture

Vocabulary Size 32,768
Hidden Size 6,144
FFN Intermediate Size 16,384
Number of Layers 56
Attention Heads 48
Head Dimension 128
KV Heads 8

Context & Position

Max Context Length 32,768
Sliding Window Size Not set

Attention Configuration

Attention Dropout 0%
Tied Embeddings No

Activation & Normalization

Activation Function silu
RMSNorm Epsilon 1e-05

Special Tokens

Pad Token ID Not set
BOS Token ID 1
EOS Token ID 2

Data Type

Model Dtype bfloat16
Layer Types: Attention, MLP/FFN, Normalization, Embedding
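
To double-check any of the configuration values above, the model's config can be read directly with Hugging Face transformers. A sketch, assuming the repository follows the standard Mistral config schema; the repo is gated, so accepting the license and authenticating may be required first.

```python
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("mistralai/Codestral-22B-v0.1")

print(cfg.vocab_size)               # 32768
print(cfg.hidden_size)              # 6144
print(cfg.intermediate_size)        # 16384
print(cfg.num_hidden_layers)        # 56
print(cfg.num_attention_heads)      # 48
print(cfg.num_key_value_heads)      # 8
print(cfg.max_position_embeddings)  # 32768
print(cfg.hidden_act)               # "silu"
print(cfg.rms_norm_eps)             # 1e-05
print(cfg.torch_dtype)              # torch.bfloat16
```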