Microsoft Phi - Small language models with strong reasoning: phi-1, phi-1_5, phi-2, Phi-3-mini-4k-instruct, Phi-4
microsoft/Phi-3-mini-4k-instruct
📊 Model Parameters
Total Parameters: 3,821,079,552
Context Length: 4,096
Hidden Size: 3,072
Layers: 32
Attention Heads: 32
KV Heads: 32
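The total follows directly from the configuration values below. A minimal sketch of the count, assuming the standard Phi-3 layout (fused QKV and gate-up projections, bias-free projections, untied input/output embeddings, RMSNorm weight vectors only):

```python
# Parameter count for Phi-3-mini-4k-instruct from the listed configuration.
# Projections carry no bias (Attention Bias: No) and embeddings are untied
# (Tied Embeddings: No), so the LM head is counted separately.
vocab, hidden, ffn, layers = 32_064, 3_072, 8_192, 32

embed      = vocab * hidden                          # token embedding matrix
lm_head    = vocab * hidden                          # untied output projection
attn       = hidden * 3 * hidden + hidden * hidden   # fused qkv_proj + o_proj (KV heads == heads)
mlp        = hidden * 2 * ffn + ffn * hidden         # fused gate_up_proj + down_proj (SiLU gating)
norms      = 2 * hidden                              # two RMSNorm weight vectors per layer
per_layer  = attn + mlp + norms
final_norm = hidden

total = embed + lm_head + layers * per_layer + final_norm
print(f"{total:,}")   # 3,821,079,552
```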
💾 Memory Requirements
FP32 (Full): 14.23 GB
FP16 (Half): 7.12 GB
INT8 (Quantized): 3.56 GB
INT4 (Quantized): 1.78 GB
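These figures are the parameter count times the bytes per parameter, reported in GiB (bytes / 1024³), and cover the weights only; activations and the KV cache are extra. A quick sketch:

```python
# Approximate weight memory per precision; matches the table above.
params = 3_821_079_552
for name, bytes_per_param in [("FP32", 4), ("FP16", 2), ("INT8", 1), ("INT4", 0.5)]:
    gib = params * bytes_per_param / 1024**3
    print(f"{name}: {gib:.2f} GB")   # 14.23 / 7.12 / 3.56 / 1.78
```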
🔑 KV Cache (Inference)
Per Token (FP16): 393.22 KB
Max Context FP32: 3.00 GB
Max Context FP16: 1.50 GB
Max Context INT8: 768.0 MB
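The cache stores one key and one value vector per layer per token; with 32 KV heads of dimension 96 the cache width equals the hidden size, which yields the figures above. A sketch of the arithmetic (the per-token size is reported in decimal kB, the full-context sizes in GiB/MiB):

```python
# KV cache sizing: 2 tensors (K and V) per layer, each of width kv_heads * head_dim.
layers, kv_heads, head_dim, context = 32, 32, 96, 4_096

per_token_fp16 = 2 * layers * kv_heads * head_dim * 2        # bytes; 2 bytes per FP16 value
print(per_token_fp16 / 1000)                                  # ~393.22 KB per token

for name, bytes_per_value in [("FP32", 4), ("FP16", 2), ("INT8", 1)]:
    full = 2 * layers * kv_heads * head_dim * bytes_per_value * context
    print(f"Max context {name}: {full / 1024**3:.2f} GiB")    # 3.00 / 1.50 / 0.75 (= 768 MiB)
```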
⚙️ Model Configuration

Core Architecture
Vocabulary Size: 32,064
Hidden Size: 3,072
FFN Intermediate Size: 8,192
Number of Layers: 32
Attention Heads: 32
KV Heads: 32

Context & Position
Max Context Length: 4,096
RoPE Base Frequency: 10000.0
RoPE Scaling: Not set
Sliding Window Size: 2,047

Attention Configuration
Attention Dropout: 0%
Tied Embeddings: No
Attention Bias: No

Activation & Normalization
Activation Function: silu
RMSNorm Epsilon: 1e-05

Dropout (Training)
Residual Dropout: 0%
Embedding Dropout: 0%

Special Tokens
BOS Token ID: 1
Pad Token ID: 32000
EOS Token ID: 32000

Data Type
Model Dtype: bfloat16
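These fields can be read back from the published model configuration. A minimal sketch using Hugging Face transformers (assumes a transformers version with Phi-3 support; attribute names follow the Phi3Config convention):

```python
# Load the configuration and check a few of the values listed above.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("microsoft/Phi-3-mini-4k-instruct")
print(cfg.vocab_size)               # 32064
print(cfg.hidden_size)              # 3072
print(cfg.intermediate_size)        # 8192
print(cfg.num_hidden_layers)        # 32
print(cfg.num_attention_heads)      # 32
print(cfg.num_key_value_heads)      # 32
print(cfg.max_position_embeddings)  # 4096
print(cfg.rope_theta)               # 10000.0
print(cfg.sliding_window)           # 2047
print(cfg.torch_dtype)              # torch.bfloat16
```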
Layer Types: Attention, MLP/FFN, Normalization, Embedding