Building a Question-and-Answer Chatbot with LangChain (Part 3)


Analyzing the Complete LangChain Development Process
Next, let's return to the get_prompt() method. This method works with two prompts: a system prompt and a user prompt, each read from its own file. The system prompt (system_template) is read from the "system.prompt" file, and the user prompt (human_template) is read from the "user.prompt" file. The code also defines two input variables, "query" and "df_info". The prompt texts and input variables are then wrapped in a ChatPromptTemplate object by calling the SystemMessagePromptTemplate.from_template() and HumanMessagePromptTemplate.from_template() methods, and that object is returned.
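
A minimal sketch of what such a get_prompt() might look like is shown below. The file names "system.prompt" and "user.prompt" and the variables "query" and "df_info" come from the description above; the rest is illustrative and not the author's exact code:

```python
from langchain.prompts import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)


def get_prompt() -> ChatPromptTemplate:
    # Read the system prompt (system_template) from the "system.prompt" file.
    with open("system.prompt", "r", encoding="utf-8") as f:
        system_template = f.read()
    # Read the user prompt (human_template) from the "user.prompt" file.
    with open("user.prompt", "r", encoding="utf-8") as f:
        human_template = f.read()

    # Wrap each prompt string in a message prompt template; the templates
    # are expected to reference the input variables "query" and "df_info".
    system_message = SystemMessagePromptTemplate.from_template(system_template)
    human_message = HumanMessagePromptTemplate.from_template(human_template)

    # Combine both messages into a single ChatPromptTemplate and return it.
    return ChatPromptTemplate.from_messages([system_message, human_message])
```

Calling prompt.format_messages(query=..., df_info=...) on the returned object would then fill in the two input variables before the messages are sent to the chat model.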
The from_template method is a class method of BaseStringMessagePromptTemplate. It accepts a string template ("template"), a template format string ("template_format"), and additional keyword arguments, and returns an object of type "MessagePromptTemplateT". Internally it first creates a PromptTemplate from the string template using the PromptTemplate.from_template() method, and then uses that PromptTemplate to construct the MessagePromptTemplateT object, which holds the parsed content of the template. The source itself is very simple.
Code implementation of the from_template method (NLP_Matrix_Space, chat.py):
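
The following is an approximate excerpt of that method as it appears in LangChain's langchain/prompts/chat.py; the exact signature and class body may differ between LangChain versions:

```python
class BaseStringMessagePromptTemplate(BaseMessagePromptTemplate, ABC):
    prompt: StringPromptTemplate

    @classmethod
    def from_template(
        cls: Type[MessagePromptTemplateT],
        template: str,
        template_format: str = "f-string",
        **kwargs: Any,
    ) -> MessagePromptTemplateT:
        # Parse the raw string into a PromptTemplate; this is where the
        # template text and its input variables are resolved.
        prompt = PromptTemplate.from_template(template, template_format=template_format)
        # Wrap the parsed PromptTemplate in the concrete message prompt
        # template class (e.g. SystemMessagePromptTemplate).
        return cls(prompt=prompt, **kwargs)
```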

 

Source: blog.csdn.net/duan_zhihua/article/details/132003405