diff --git a/README.md b/README.md
index ff712e2..44d072e 100644
--- a/README.md
+++ b/README.md
@@ -37,8 +37,8 @@
- [Groups](#-Groups)
- [News](#-news)
- [Installation](#%EF%B8%8F-installation)
-- [Quick Start](#-quick-Start)
- [Model List](#-Model-List)
+- [Quick Start](#-quick-Start)
- [License](#-License)
@@ -75,6 +75,27 @@ pip install -e .
uv pip install -e . --torch-backend=auto
```
+## ✨ Model List
+
+The following is the list of models supported by MCore-Bridge:
+
+| Series | model_type |
+| -------- | ------------------------------------------------------------ |
+| Qwen | qwen2, qwen2_moe<br>qwen2_vl, qwen2_5_vl, qwen2_5_omni<br>qwen3, qwen3_moe<br>qwen3_vl, qwen3_vl_moe, qwen3_omni_moe<br>qwen3_next, qwen3_5, qwen3_5_moe |
+| DeepSeek | deepseek_v3, deepseek_v32 |
+| GLM | glm4, glm4_moe, glm4_moe_lite<br>glm4v, glm4v_moe<br>glm_moe_dsa |
+| MiniMax | minimax_m2 |
+| Kimi | kimi_k2, kimi_vl |
+| InternLM | internlm3, internvl_chat, internvl |
+| Ovis | ovis2_5 |
+| Llama | llama, llama4 |
+| GPT-OSS | gpt_oss |
+| ERNIE | ernie4_5, ernie4_5_moe |
+| MiMo | mimo |
+| Dots | dots1 |
+| OLMoE | olmoe |
+
## 🚀 Quick Start
How to use MCore-Bridge for training is covered in the [ms-swift project](https://swift.readthedocs.io/en/latest/Megatron-SWIFT/Mcore-Bridge.html). Here we introduce how to use MCore-Bridge programmatically.
@@ -217,26 +238,6 @@ model = Qwen3_5ForConditionalGeneration.from_pretrained(model_dir)
peft_model = PeftModel.from_pretrained(model, 'Qwen3.5-4B-LoRA')
```
-## ✨ Model List
-
-The following is the list of models supported by MCore-Bridge:
-
-| Series | model_type |
-| -------- | ------------------------------------------------------------ |
-| Qwen | qwen2, qwen2_moe<br>qwen2_vl, qwen2_5_vl, qwen2_5_omni<br>qwen3, qwen3_moe<br>qwen3_vl, qwen3_vl_moe, qwen3_omni_moe<br>qwen3_next, qwen3_5, qwen3_5_moe |
-| DeepSeek | deepseek_v3, deepseek_v32 |
-| GLM | glm4, glm4_moe, glm4_moe_lite<br>glm4v, glm4v_moe,<br>glm_moe_dsa |
-| MiniMax | minimax_m2 |
-| Kimi | kimi_k2, kimi_vl |
-| InternLM | internlm3, internvl_chat, internvl |
-| Ovis | ovis2_5 |
-| Llama | llama, llama4 |
-| GPT-OSS | gpt_oss |
-| ERNIE | ernie4_5, ernie4_5_moe |
-| MiMo | mimo |
-| Dots | dots1 |
-| OLMoE | olmoe |
-
## 🏛 License
This framework is licensed under the [Apache License (Version 2.0)](https://github.com/modelscope/mcore-bridge/blob/master/LICENSE). For models and datasets, please refer to the original resource page and follow the corresponding License.
diff --git a/README_zh.md b/README_zh.md
index c502ad4..801b4af 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -37,8 +37,8 @@
- [用户群](#-用户群)
- [新闻](#-新闻)
- [安装](#%EF%B8%8F-安装)
-- [快速开始](#-快速开始)
- [模型列表](#-模型列表)
+- [快速开始](#-快速开始)
- [License](#-license)
## ☎ 用户群
@@ -74,6 +74,26 @@ pip install -e .
uv pip install -e . --torch-backend=auto
```
+## ✨ 模型列表
+
+以下为MCore-Bridge支持的模型列表:
+
+| 系列 | model_type |
+| -------- | ------------------------------------------------------------ |
+| Qwen | qwen2, qwen2_moe<br>qwen2_vl, qwen2_5_vl, qwen2_5_omni<br>qwen3, qwen3_moe<br>qwen3_vl, qwen3_vl_moe, qwen3_omni_moe<br>qwen3_next, qwen3_5, qwen3_5_moe |
+| DeepSeek | deepseek_v3, deepseek_v32 |
+| GLM | glm4, glm4_moe, glm4_moe_lite<br>glm4v, glm4v_moe<br>glm_moe_dsa |
+| MiniMax | minimax_m2 |
+| Kimi | kimi_k2, kimi_vl |
+| InternLM | internlm3, internvl_chat, internvl |
+| Ovis | ovis2_5 |
+| Llama | llama, llama4 |
+| GPT-OSS | gpt_oss |
+| ERNIE | ernie4_5, ernie4_5_moe |
+| MiMo | mimo |
+| Dots | dots1 |
+| OLMoE | olmoe |
+
## 🚀 快速开始
如何使用MCore-Bridge进行训练可以参考[ms-swift项目](https://swift.readthedocs.io/zh-cn/latest/Megatron-SWIFT/Mcore-Bridge.html)。这里介绍如何使用代码方式使用Mcore-Bridge。
@@ -214,26 +234,6 @@ model = Qwen3_5ForConditionalGeneration.from_pretrained(model_dir)
peft_model = PeftModel.from_pretrained(model, 'Qwen3.5-4B-LoRA')
```
-## ✨ 模型列表
-
-以下为MCore-Bridge支持的模型列表:
-
-| 系列 | model_type |
-| -------- | ------------------------------------------------------------ |
-| Qwen | qwen2, qwen2_moe<br>qwen2_vl, qwen2_5_vl, qwen2_5_omni<br>qwen3, qwen3_moe<br>qwen3_vl, qwen3_vl_moe, qwen3_omni_moe<br>qwen3_next, qwen3_5, qwen3_5_moe |
-| DeepSeek | deepseek_v3, deepseek_v32 |
-| GLM | glm4, glm4_moe, glm4_moe_lite<br>glm4v, glm4v_moe,<br>glm_moe_dsa |
-| MiniMax | minimax_m2 |
-| Kimi | kimi_k2, kimi_vl |
-| InternLM | internlm3, internvl_chat, internvl |
-| Ovis | ovis2_5 |
-| Llama | llama, llama4 |
-| GPT-OSS | gpt_oss |
-| ERNIE | ernie4_5, ernie4_5_moe |
-| MiMo | mimo |
-| Dots | dots1 |
-| OLMoE | olmoe |
-
## 🏛 License