Commit 53b14c7

Update README.md
1 parent 942b4ad commit 53b14c7

1 file changed: 1 addition & 1 deletion

File tree

README.md

```diff
@@ -126,7 +126,7 @@ All pre-trained checkpoints are hosted on the [Hugging Face Hub](https://hugging
 
 | Model Artifact | Step | Description | Download |
 | :--- | :--- | :--- | :--- |
-| **Aetheris-Base** | 10k | Early convergence checkpoint (Loss ~3.66). Good for analyzing router behavior. | [🤗 Hugging Face](https://huggingface.co/Pomilon-Intelligence-Lab-lab/Aetheris-MoE-300M-A125M-base) |
+| **Aetheris-Base** | 10k | Early convergence checkpoint (Loss ~3.66). Good for analyzing router behavior. | [🤗 Hugging Face](https://huggingface.co/pomilon-lab/Aetheris-MoE-300M-A125M-base) |
 | **Aetheris-Chat** | -- | *Coming Soon (Post-SFT)* | -- |
 
 > **⚠️ Important:** Aetheris uses a custom Hybrid Mamba-MoE architecture. You **cannot** load it directly with `transformers.AutoModel`. You must use the interface provided in this repository.
```
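For reference, files in the corrected Hugging Face repo are fetched via the Hub's standard `resolve` URL scheme. A minimal sketch of building such a URL for the repo id this commit fixes — the helper name `hf_resolve_url` and the filename `config.json` are assumptions for illustration, not part of the Aetheris repository:

```python
def hf_resolve_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Build the direct-download URL for one file in a Hugging Face repo,
    using the Hub's standard /resolve/<revision>/<filename> URL layout."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

# Corrected repo id from this commit:
url = hf_resolve_url("pomilon-lab/Aetheris-MoE-300M-A125M-base", "config.json")
print(url)
```

The old repo id (`Pomilon-Intelligence-Lab-lab/...`) would produce a dead link under the same scheme, which is what this one-line diff corrects.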
