书生·浦语 (InternLM) Large Model - Lesson 2 Notes / Homework

Notes


Experiment 1

  • cli-demo
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM


# Path to the local InternLM chat checkpoint (adjust as needed)
model_name_or_path = "../models"

# trust_remote_code is required because InternLM ships its own modeling code;
# device_map only applies to the model, not the tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name_or_path, trust_remote_code=True, torch_dtype=torch.bfloat16, device_map='cuda:0')
model = model.eval()


system_prompt = """You are an AI assistant whose name is InternLM (书生·浦语).
- InternLM (书生·浦语) is a conversational language model that is developed by Shanghai AI Laboratory (上海人工智能实验室). It is designed to be helpful, honest, and harmless.
- InternLM (书生·浦语) can understand and communicate fluently in the language chosen by the user such as English and 中文.
"""

# Seed the chat history with the system prompt as a (query, response) pair
messages = [(system_prompt, '')]

print("=============Welcome to InternLM chatbot, type 'exit' to exit.=============")

while True:
    input_text = input("\nUser  >>> ")
    # Trim surrounding whitespace only; stripping every space would mangle English input
    input_text = input_text.strip()
    if input_text == "exit":
        break

    # Stream the reply, printing only the newly generated part on each yield
    length = 0
    response = ""
    for response, _ in model.stream_chat(tokenizer, input_text, messages):
        if response is not None:
            print(response[length:], flush=True, end="")
            length = len(response)
    # Keep the finished turn so later queries retain the conversation context
    messages.append((input_text, response))
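Saved as, say, cli_demo.py (the file name is arbitrary), the script is started with python cli_demo.py; typing exit ends the session.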


  • web_demo (a minimal sketch of such a web front end is given below)
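The web demo serves the same model through a browser page. The sketch below is a minimal, hypothetical version assuming a Streamlit front end and the same stream_chat streaming interface as the CLI demo above; the script name web_demo.py, the port, and the checkpoint path are placeholders rather than the course's exact script.

# Minimal Streamlit front end for the same local InternLM checkpoint (sketch).
# Launch with: streamlit run web_demo.py --server.address 127.0.0.1 --server.port 6006
import streamlit as st
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name_or_path = "../models"  # same local checkpoint as the CLI demo

@st.cache_resource  # load the model once per session, not on every page rerun
def load_model():
    tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_name_or_path, trust_remote_code=True,
        torch_dtype=torch.bfloat16, device_map='cuda:0')
    return tokenizer, model.eval()

tokenizer, model = load_model()

if "history" not in st.session_state:
    st.session_state.history = []  # list of (query, response) pairs

# Replay earlier turns so the page shows the whole conversation
for query, reply in st.session_state.history:
    st.chat_message("user").write(query)
    st.chat_message("assistant").write(reply)

if prompt := st.chat_input("Ask InternLM something"):
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        placeholder = st.empty()
        response = ""
        for response, _ in model.stream_chat(tokenizer, prompt, st.session_state.history):
            if response is not None:
                placeholder.markdown(response)  # overwrite with the growing reply
    st.session_state.history.append((prompt, response))

The --server.address and --server.port flags are standard Streamlit server options; they are what make the page reachable through an SSH port forward from the development machine.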

Experiment 2

Hands-on: Agent


How it works
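The agent demo follows the common ReAct-style loop: the LLM decides whether to call a tool (for example a Python interpreter), the tool's output is fed back into the prompt as an observation, and the model keeps going until it can give a final answer. The sketch below is a simplified, framework-agnostic illustration of that loop, not the actual Lagent implementation; the tag markup, run_python helper, and chat callable are invented for clarity.

# Simplified ReAct-style agent loop (illustrative only; the markup format,
# run_python helper and chat() callable are assumptions, not a real agent API).
import re

def run_python(code: str) -> str:
    """Toy 'code interpreter' tool: run a snippet and return its `result` variable."""
    scope = {}
    exec(code, scope)  # a real agent would sandbox this
    return str(scope.get("result", ""))

def agent(chat, question: str, max_turns: int = 5) -> str:
    """chat(prompt) -> str is any LLM call, e.g. a thin wrapper around model.chat()."""
    prompt = (
        "Answer the question. If you need calculation, reply with\n"
        "<python>...code that stores its answer in `result`...</python>\n"
        "Otherwise reply with <final>your answer</final>.\n"
        f"Question: {question}\n"
    )
    for _ in range(max_turns):
        reply = chat(prompt)
        if m := re.search(r"<python>(.*?)</python>", reply, re.S):
            observation = run_python(m.group(1))                  # act: run the tool
            prompt += f"{reply}\nObservation: {observation}\n"    # feed the result back
        elif m := re.search(r"<final>(.*?)</final>", reply, re.S):
            return m.group(1).strip()                             # model is done
        else:
            prompt += reply + "\n"                                # let it keep reasoning
    return "No answer within the turn limit."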

Hands-on: deploying the 浦语·灵笔2 (InternLM-XComposer2) model

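Besides the web demos shipped with the model's repository, 浦语·灵笔2 can also be queried directly from Python. The sketch below follows the usage published on the internlm/internlm-xcomposer2-vl-7b model card; the checkpoint name, the example image path, and the chosen dtype are placeholders, and the exact chat() signature may differ for other checkpoints.

# Minimal sketch: asking InternLM-XComposer2 (浦语·灵笔2) about a local image,
# based on the internlm/internlm-xcomposer2-vl-7b model card usage.
import torch
from transformers import AutoModel, AutoTokenizer

ckpt = "internlm/internlm-xcomposer2-vl-7b"  # or a locally downloaded copy
tokenizer = AutoTokenizer.from_pretrained(ckpt, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt, torch_dtype=torch.float16,  # fp16 to save GPU memory
                                  trust_remote_code=True).cuda().eval()

# <ImageHere> marks where the image is spliced into the prompt
query = "<ImageHere>Please describe this image in detail."
image = "./example.jpg"  # placeholder path
with torch.cuda.amp.autocast():
    response, _ = model.chat(tokenizer, query=query, image=image,
                             history=[], do_sample=False)
print(response)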
How it works


Reposted from blog.csdn.net/weixin_38812492/article/details/137246262