Solving the problem of downloading models when Hugging Face is blocked
First, install or upgrade huggingface_hub, which provides the huggingface-cli tool:
pip install -U huggingface_hub
# Only effective in the current terminal; add it to your shell startup file (e.g. ~/.bashrc) to make it permanent
export HF_ENDPOINT=https://hf-mirror.com
huggingface-cli download --resume-download utrobinmv/t5_translate_en_ru_zh_large_1024
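If you prefer to stay in Python, the same mirror can be used through huggingface_hub's snapshot_download. This is a minimal sketch; note that HF_ENDPOINT is read when huggingface_hub is imported, so the environment variable has to be set first.
import os
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # must be set before importing huggingface_hub
from huggingface_hub import snapshot_download
# Download the full model repository into the local Hugging Face cache
snapshot_download(repo_id="utrobinmv/t5_translate_en_ru_zh_large_1024")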
This downloads the model to the local cache, after which it can be loaded locally:
from transformers import T5ForConditionalGeneration, T5Tokenizer
print("t5_translate_en_ru_zh_large_1024 模型加载中...")
device = 'cuda'
model_name = 'utrobinmv/t5_translate_en_ru_zh_large_1024'
model = T5ForConditionalGeneration.from_pretrained(model_name)
model.to(device)
tokenizer = T5Tokenizer.from_pretrained(model_name)
# ...
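For completeness, here is a minimal inference sketch. The "translate to zh: " task prefix and the example sentence are assumptions for illustration; check the model card for the exact prompt format this checkpoint expects.
prefix = 'translate to zh: '  # assumed target-language prefix; verify against the model card
src_text = prefix + 'The quick brown fox jumps over the lazy dog.'
input_ids = tokenizer(src_text, return_tensors="pt").input_ids.to(device)
generated_ids = model.generate(input_ids)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))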
Reference
https://blog.csdn.net/weixin_43303286/article/details/134342476