Avoiding pitfalls when installing transformers

1.4 Download the Rust toolchain

At this point you are probably wondering: aren't we supposed to be using Python?

Just download it anyway; without it, the transformers install later on will fail. The reason is that transformers depends on the tokenizers package, which is written in Rust, and when pip cannot find a prebuilt wheel it has to compile tokenizers from source, which requires the Rust toolchain.

Since this is Windows, go to the official site and use the recommended download method: https://www.rust-lang.org/tools/install

Run the downloaded installer and enter 1 to proceed with the default installation.

Test whether the installation succeeded.
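
For example, open a new command prompt (so the updated PATH is picked up) and check the versions:

rustc --version
cargo --version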

Pitfall 2: every repository must be fetched with git clone; do not download the archives manually

This is because git clone brings down the repository's Git history, so the commit (tree) hashes that the WebUI later checks out exist locally. If you download the code as a ZIP instead, there is no Git metadata at all, and webui-user.bat will error out when it tries to check out the pinned commit.

The error looks like:

reference is not a tree: 24268930bf1dce879235a7fddd0b2355b84d7ea6
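
If you have already extracted a ZIP and hit this error, a minimal recovery sketch is to delete that folder and re-clone it so the pinned commit exists locally (the repository below is only an example; substitute whichever one your error points at):

rem hypothetical example -- replace the path and URL with the repository named in your error
rmdir /s /q "D:\SD\repositories\stable-diffusion-stability-ai"
git clone https://github.com/Stability-AI/stablediffusion.git "D:\SD\repositories\stable-diffusion-stability-ai"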

2.1 After completing the steps above, run webui-user.bat

Wait a while; when the console shows it is installing GFPGAN, press Ctrl+C to stop the batch script. By this point a \venv\Scripts directory will have been created.
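
You can confirm the virtual environment really was created before continuing (the path assumes the D:\SD install directory used in this guide):

dir D:\SD\venv\Scripts\python.exe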

2.2 The git clone commands and corresponding steps (adjust the paths to your own install directory)

Use the Git Bash tool:

git clone https://github.com/TencentARC/GFPGAN.git "D:\SD\venv\Scripts\GFPGAN"

Then open a command prompt in D:\SD\venv\Scripts\GFPGAN and run the following commands.

D:\SD\venv\Scripts\python.exe -m pip install basicsr facexlib 
D:\SD\venv\Scripts\python.exe -m pip install -r requirements.txt 
D:\SD\venv\Scripts\python.exe setup.py develop
D:\SD\venv\Scripts\python.exe -m pip install realesrgan
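
A quick sanity check after these commands, assuming the GFPGAN and Real-ESRGAN installs expose the gfpgan and realesrgan modules:

D:\SD\venv\Scripts\python.exe -c "import gfpgan, realesrgan; print('gfpgan/realesrgan import OK')"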

Next, download open_clip, again using Git Bash:

git clone https://github.com/mlfoundations/open_clip "D:\SD\venv\Scripts\open_clip"

From a command prompt in D:\SD\venv\Scripts\open_clip, run the following command.

D:\SD\venv\Scripts\python.exe setup.py build install
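
Optionally, a quick check that the open_clip module is now importable from the venv:

D:\SD\venv\Scripts\python.exe -c "import open_clip; print('open_clip import OK')"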

Do the same for CLIP:

git clone https://github.com/openai/CLIP.git "D:\SD\venv\Scripts\CLIP"

In the corresponding directory (D:\SD\venv\Scripts\CLIP), run the following commands:

D:\SD\venv\Scripts\python.exe -m pip install ftfy regex tqdm
D:\SD\venv\Scripts\python.exe setup.py build install
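
To confirm CLIP installed correctly, you can list the model names it knows about (clip.available_models() only reads the installed package; it does not download anything):

D:\SD\venv\Scripts\python.exe -c "import clip; print(clip.available_models())"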

2.3 git clone the repositories and install their dependencies

The procedure is basically the same as in 2.2. The combined commands are listed below; if the repositories folder does not exist yet, create it yourself:

git clone https://github.com/Stability-AI/stablediffusion.git "D:\SD\repositories\stable-diffusion-stability-ai" 

git clone https://github.com/CompVis/taming-transformers.git "D:\SD\repositories\taming-transformers" 

git clone https://github.com/crowsonkb/k-diffusion.git "D:\SD\repositories\k-diffusion"

git clone https://github.com/sczhou/CodeFormer.git "D:\SD\repositories\CodeFormer" 

git clone https://github.com/salesforce/BLIP.git "D:\SD\repositories\BLIP"

Then go into each of the directories cloned above and check whether it contains a requirements.txt file; if it does, open a command prompt in that directory and run:

D:\SD\venv\Scripts\python.exe -m pip install -r requirements.txt 
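
If you prefer not to check each folder by hand, here is a one-liner sketch for an interactive cmd prompt (paths assume the D:\SD layout above; double the % signs if you put it in a .bat file):

for /d %D in (D:\SD\repositories\*) do if exist "%D\requirements.txt" D:\SD\venv\Scripts\python.exe -m pip install -r "%D\requirements.txt"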

2.4 The troublesome tokenizers error when pip-installing transformers
Solution: it is a network problem; just retry a number of times (it took me roughly 3-4 hours before it finally succeeded).

Even though the Rust toolchain was installed earlier, the build can still fail with all sorts of strange errors, such as the one below.

Building wheels for collected packages: tokenizers
  Building wheel for tokenizers (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for tokenizers (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [62 lines of output]
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib.win-amd64-cpython-310
      creating build\lib.win-amd64-cpython-310\tokenizers
      copying py_src\tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\models
      copying py_src\tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\models
      creating build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying py_src\tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      creating build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying py_src\tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      creating build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying py_src\tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      creating build\lib.win-amd64-cpython-310\tokenizers\processors
      copying py_src\tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\processors
      creating build\lib.win-amd64-cpython-310\tokenizers\trainers
      copying py_src\tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      creating build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\sentencepiece_unigram.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      copying py_src\tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
      creating build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\tools\visualizer.py -> build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\tools\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\tools
      copying py_src\tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers
      copying py_src\tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\models
      copying py_src\tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\decoders
      copying py_src\tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
      copying py_src\tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
      copying py_src\tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\processors
      copying py_src\tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\trainers
      copying py_src\tokenizers\tools\visualizer-styles.css -> build\lib.win-amd64-cpython-310\tokenizers\tools
      running build_ext
      running build_rust
      cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --
      warning: unused manifest key: target.x86_64-apple-darwin.rustflags
          Updating crates.io index
      warning: spurious network error (2 tries remaining): failed to send request: 操作超时
      ; class=Os (2)
      warning: spurious network error (1 tries remaining): failed to send request: 操作超时
      ; class=Os (2)
      error: failed to get `env_logger` as a dependency of package `tokenizers-python v0.10.3 (C:\Users\LENOVO\AppData\Local\Temp\pip-install-bw47mt33\tokenizers_3b1650888a634bdab4e4c98d147e7833)`

      Caused by:
        failed to load source for dependency `env_logger`

      Caused by:
        Unable to update registry `crates-io`

      Caused by:
        failed to fetch `https://github.com/rust-lang/crates.io-index`

      Caused by:
        failed to send request: 操作超时
        ; class=Os (2)
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers

Afterwards I searched for quite a few answers online and found that they did not agree with each other at all, so I asked ChatGPT, and its answer is below:

To put it plainly: ChatGPT doesn't really know why this happens either.

Note

It really is just a network problem; it has nothing to do with your tool versions. Retry a few times, or look for a faster download channel.
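
If plain retries keep timing out, one hedged workaround is to make Cargo fetch the crates.io index through the git command line, which often copes better with proxies and unstable connections, and then retry the failed install (the pip command below is just an example; rerun whatever command failed for you):

rem set for the current cmd session only; pip's build subprocess (cargo) inherits it
set CARGO_NET_GIT_FETCH_WITH_CLI=true
D:\SD\venv\Scripts\python.exe -m pip install transformers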


The error message indicates that building the wheel for the tokenizers package failed during installation. The failure occurred while running the cargo rustc command, which compiles the Rust code.

The specific error message is:

error: failed to get `env_logger` as a dependency of package `tokenizers-python v0.10.3`

This error shows that the build process could not fetch the env_logger dependency required by the tokenizers package.

The error further explains the following:

Caused by:
  Unable to update registry `crates-io`

Caused by:
  failed to fetch `https://github.com/rust-lang/crates.io-index`

Caused by:
  failed to send request: 操作超时
; class=Os (2)

These messages show that updating the registry and fetching the required dependency failed. The message failed to send request: 操作超时 (操作超时 is Windows' "operation timed out" string, which appears garbled in the log because of a code-page mismatch) points to a network problem or a temporary issue with the crates.io index.

Here are a few steps to resolve the problem:

  1. Check your network connection: make sure you have a stable internet connection and can reach external resources.

  2. Retry the installation: network problems are sometimes transient. Run the install command again and see whether it succeeds.

  3. Clear the pip cache: pip caches packages to speed up installs, but the cached files can occasionally cause problems. You can try clearing the cache with:

    pip cache purge
    
  4. Specify a different package source: you can try installing the package from another index, for example with the --index-url option:

    pip install --index-url https://pypi.python.org/simple/ tokenizers
    
  5. Upgrade pip and setuptools: outdated versions of pip and setuptools can sometimes cause problems. Try upgrading them with:

    pip install --upgrade pip
    pip install --upgrade setuptools
    
  6. Install Rust: the tokenizers package depends on Rust. Make sure Rust is installed on your system and up to date. You can follow the instructions on the official Rust website (https://www.rust-lang.org/tools/install).

If the problem persists after trying these steps, search for similar issues or error messages related to the tokenizers package to see whether there is a known fix or workaround. Forums and the project's issue tracker are good places to look.


Reposted from blog.csdn.net/chaishen10000/article/details/134354776