
Python code for downloading huggingface models

好邻居 2024-12-25 18:22:05

###################################################################################
# Use a pipeline as a high-level helper
# from transformers import pipeline
#
# messages = [
#     {"role": "user", "content": "Who are you?"},
# ]
# pipe = pipeline("text-generation", model="meta-llama/Meta-Llama-3.1-405B-Instruct")
# pipe(messages)
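On the first run, the pipeline call above downloads the checkpoint into the local Hugging Face cache (~/.cache/huggingface by default). A minimal sketch of redirecting that cache to a larger disk; /data/hf_cache is only an illustrative path:

import os

# Point the Hugging Face cache at another disk before transformers is imported.
# HF_HOME is the standard environment variable; the path below is a placeholder.
os.environ["HF_HOME"] = "/data/hf_cache"

from transformers import pipeline  # imported after HF_HOME is set

pipe = pipeline("text-generation", model="meta-llama/Meta-Llama-3.1-405B-Instruct")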

#########################################################################

# Load model directly
# from transformers import AutoTokenizer, AutoModelForCausalLM
#
# tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-405B-Instruct")
# model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-405B-Instruct")
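Loading this way pulls the full weights into memory, so large checkpoints are usually loaded with a reduced dtype and a device map. A minimal sketch, assuming torch and accelerate are installed (the gated Meta-Llama repo additionally requires huggingface-cli login or an access token):

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "meta-llama/Meta-Llama-3.1-405B-Instruct"  # same repo as above; gated on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # load weights in half precision instead of float32
    device_map="auto",           # requires accelerate; spreads layers over available devices
)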


##########################################################################

from huggingface_hub import snapshot_download

snapshot_download(repo_id="baichuan-inc/Baichuan2-7B-Chat-4bits",   # model ID on the Hub
                  local_dir="./models/Baichuan2-7B-Chat-4bits")     # local directory to save the model to
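If only part of a repository is needed, or the repository is gated (as the Meta-Llama one above is), huggingface_hub offers further options. A minimal sketch; the filename, token value, and patterns below are placeholders:

from huggingface_hub import hf_hub_download, snapshot_download

# Download a single file instead of the whole snapshot.
config_path = hf_hub_download(
    repo_id="baichuan-inc/Baichuan2-7B-Chat-4bits",
    filename="config.json",                     # any file listed on the repo page
    local_dir="./models/Baichuan2-7B-Chat-4bits",
)

# For gated repos, pass an access token (or run `huggingface-cli login` first),
# and optionally limit which files are fetched.
snapshot_download(
    repo_id="meta-llama/Meta-Llama-3.1-405B-Instruct",
    local_dir="./models/Meta-Llama-3.1-405B-Instruct",
    token="hf_xxx",                             # placeholder; use your own token
    allow_patterns=["*.json", "*.safetensors"],
)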

