From fb0876f9076ba1cd48fc19b81a266ca4e47bd4ea Mon Sep 17 00:00:00 2001
From: Chen Meng
Date: Tue, 15 Jul 2025 21:26:11 +0800
Subject: [PATCH] remove python dependencies

---
 README.md        | 10 ++++++++--
 README_CN.md     | 10 ++++++++--
 requirements.txt |  8 --------
 3 files changed, 16 insertions(+), 12 deletions(-)
 delete mode 100644 requirements.txt

diff --git a/README.md b/README.md
index 281e1662..d0b62a3f 100644
--- a/README.md
+++ b/README.md
@@ -202,7 +202,13 @@ For more details, please see API document and [examples](examples/README.md).
 Firstly, please install the dependencies.
 - Python dependencies
   ```bash
-  pip install -r requirements.txt
+cmake==3.26.1
+sentencepiece==0.2.0
+torch==2.7.0+cpu
+transformers==4.50.0
+accelerate==1.5.1
+protobuf==5.29.3
+tiktoken==0.9.0
   ```
   ***PS: Due to the potential compatibility issues between the model file and the `transformers` version, please select the appropriate `transformers` version.***
 - oneCCL (For multi ranks)
@@ -446,7 +452,7 @@ and
 ***A***:This is because the program launched through MPI reads `OMP_NUM_THREADS=1`, which cannot correctly retrieve the appropriate value from the environment. It is necessary to manually set the value of `OMP_NUM_THREADS` based on the actual situation.
 
 - ***Q***: Why do I still encounter errors when converting already supported models?
-***A***: Try downgrading `transformer` to an appropriate version, such as the version specified in the `requirements.txt`. This is because different versions of Transformer may change the names of certain variables.
+***A***: Try downgrading `transformers` to an appropriate version. This is because different versions of `transformers` may change the names of certain variables.
 
 - ***Q***: I encountered an error saying that `mkl.h` could not be found during compilation. What should I do?
 ***A***: Please check if the `onednn` folder under `3rdparty/` is empty. If it is, delete it and rerun CMake. Additionally, if the `3rdparty/mkl/` folder contains only a `local` directory, move all contents from `mkl/local/*` to `mkl/`.
\ No newline at end of file
diff --git a/README_CN.md b/README_CN.md
index 75d4b67c..a99b85f5 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -202,7 +202,13 @@ xFasterTransformer 支持的模型格式与 Huggingface 有所不同,但与 Fa
 首先,请安装依赖项。
 - Python 依赖项
   ```bash
-  pip install -r requirements.txt
+cmake==3.26.1
+sentencepiece==0.2.0
+torch==2.7.0+cpu
+transformers==4.50.0
+accelerate==1.5.1
+protobuf==5.29.3
+tiktoken==0.9.0
   ```
   ***PS: 由于模型文件和 `transformers`版本之间可能存在兼容性问题,请选择适当的 `transformers`版本。***
 - oneCCL (用于多进程)
@@ -449,7 +455,7 @@ and
 ***答***:这是因为通过 MPI 启动的程序读取的是 `OMP_NUM_THREADS=1`,无法从环境中正确获取相应的值。有必要根据实际情况手动设置 `OMP_NUM_THREADS` 的值。
 
 - ***问***: 为什么在转换已支持的模型时仍会遇到错误?
-***答***: 尝试将 `transformer` 降级到合适的版本,如 `requirements.txt` 中指定的版本。这是因为不同版本的 Transformer 可能会更改某些变量的名称。
+***答***: 尝试将 `transformers` 降级到合适的版本。这是因为不同版本的 `transformers` 可能会更改某些变量的名称。
 
 - ***问***:编译时遇到错误,提示找不到 `mkl.h`,我该怎么办?
 ***答***:请检查 `3rdparty/` 目录下的 `onednn` 文件夹是否为空。如果为空,请将其删除并重新运行 CMake。此外,如果 `3rdparty/mkl/` 文件夹内仅包含 `local` 目录,请将 `mkl/local/*` 中的所有内容移动到 `mkl/` 目录下。
diff --git a/requirements.txt b/requirements.txt
deleted file mode 100644
index 3dd34fc0..00000000
--- a/requirements.txt
+++ /dev/null
@@ -1,8 +0,0 @@
--f https://download.pytorch.org/whl/torch_stable.html
-cmake==3.26.1
-sentencepiece==0.2.0
-torch==2.7.1
-transformers==4.50.0
-accelerate==1.5.1
-protobuf==5.29.3
-tiktoken==0.9.0
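Since this patch deletes `requirements.txt`, the pinned versions now live only as a plain list in the READMEs. A minimal sketch of recreating and installing that list locally; the temp-file path is arbitrary, and the PyTorch CPU wheel index is an assumption based on the `torch==2.7.0+cpu` pin (the `+cpu` builds are not on PyPI):

```shell
# Recreate the pinned list from the README (path is illustrative, not part of the repo)
cat > /tmp/xft-requirements.txt <<'EOF'
cmake==3.26.1
sentencepiece==0.2.0
torch==2.7.0+cpu
transformers==4.50.0
accelerate==1.5.1
protobuf==5.29.3
tiktoken==0.9.0
EOF

# Install in one shot; the extra index is where the +cpu torch wheels are served from
# (commented out here so the snippet has no network side effects):
# pip install -r /tmp/xft-requirements.txt --extra-index-url https://download.pytorch.org/whl/cpu

# Sanity check: every entry is an exact version pin
grep -c '==' /tmp/xft-requirements.txt
```

The `grep -c` line should report 7, one pinned package per line.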