From de794e6c96a306cf6bfa730da39c9c97c75693db Mon Sep 17 00:00:00 2001
From: Chen Meng
Date: Tue, 15 Jul 2025 21:26:11 +0800
Subject: [PATCH] remove python dependencies

---
 README.md    | 18 +++++++++---------
 README_CN.md | 16 ++++++++--------
 2 files changed, 17 insertions(+), 17 deletions(-)

diff --git a/README.md b/README.md
index d0b62a3f..352e6b68 100644
--- a/README.md
+++ b/README.md
@@ -201,14 +201,14 @@ For more details, please see API document and [examples](examples/README.md).
 ### Python API(PyTorch)
 Firstly, please install the dependencies.
 - Python dependencies
-  ```bash
-cmake==3.26.1
-sentencepiece==0.2.0
-torch==2.7.0+cpu
-transformers==4.50.0
-accelerate==1.5.1
-protobuf==5.29.3
-tiktoken==0.9.0
+  ```
+  cmake==3.26.1
+  sentencepiece==0.2.0
+  torch==2.7.0+cpu
+  transformers==4.50.0
+  accelerate==1.5.1
+  protobuf==5.29.3
+  tiktoken==0.9.0
   ```
 ***PS: Due to the potential compatibility issues between the model file and the `transformers` version, please select the appropriate `transformers` version.***
 - oneCCL (For multi ranks)
@@ -455,4 +455,4 @@ and
 ***A***: Try downgrading `transformer` to an appropriate version. This is because different versions of Transformer may change the names of certain variables.
 
 - ***Q***: I encountered an error saying that `mkl.h` could not be found during compilation. What should I do?
-***A***: Please check if the `onednn` folder under `3rdparty/` is empty. If it is, delete it and rerun CMake. Additionally, if the `3rdparty/mkl/` folder contains only a `local` directory, move all contents from `mkl/local/*` to `mkl/`.
\ No newline at end of file
+***A***: Please check if the `onednn` folder under `3rdparty/` is empty. If it is, delete it and rerun CMake. Additionally, if the `3rdparty/mkl/` folder contains only a `local` directory, move all contents from `mkl/local/*` to `mkl/`.
diff --git a/README_CN.md b/README_CN.md
index a99b85f5..12e4b63e 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -201,14 +201,14 @@ xFasterTransformer 支持的模型格式与 Huggingface 有所不同，但与 FasterTransformer 兼容。
 ### Python API(PyTorch)
 首先，请安装依赖项。
 - Python 依赖项
-  ```bash
-cmake==3.26.1
-sentencepiece==0.2.0
-torch==2.7.0+cpu
-transformers==4.50.0
-accelerate==1.5.1
-protobuf==5.29.3
-tiktoken==0.9.0
+  ```
+  cmake==3.26.1
+  sentencepiece==0.2.0
+  torch==2.7.0+cpu
+  transformers==4.50.0
+  accelerate==1.5.1
+  protobuf==5.29.3
+  tiktoken==0.9.0
   ```
 ***PS: 由于模型文件和 `transformers` 版本之间可能存在兼容性问题，请选择适当的 `transformers` 版本。***
 - oneCCL (用于多进程)
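The hunks above pin seven Python packages as plain `pkg==version` lines. Outside the patch itself, a minimal sketch of collecting those pins into a `requirements.txt` (the filename is an assumption, not part of the patch) so they can be installed in one step:

```python
# Write the version pins from the patched README into a requirements file.
# The pins are copied verbatim from the hunks above; "requirements.txt" is
# an illustrative filename, not something the patch introduces.
pins = [
    "cmake==3.26.1",
    "sentencepiece==0.2.0",
    "torch==2.7.0+cpu",
    "transformers==4.50.0",
    "accelerate==1.5.1",
    "protobuf==5.29.3",
    "tiktoken==0.9.0",
]

with open("requirements.txt", "w") as f:
    f.write("\n".join(pins) + "\n")
```

The file can then be installed with `pip install -r requirements.txt`; note that `+cpu` builds of torch are typically served from the PyTorch CPU wheel index (e.g. via `--extra-index-url https://download.pytorch.org/whl/cpu`) rather than PyPI.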