This article walks you through building a chatbot using a low-code frontend, LangChain for conversation management, and a Bedrock-hosted LLM for generating responses.

In the ever-evolving AI landscape, chatbots have become an indispensable tool for boosting user engagement and streamlining information delivery. This article walks step by step through building an interactive chatbot, using Streamlit as the frontend, LangChain to orchestrate the interactions, and the Anthropic Claude model on Amazon Bedrock as the large language model (LLM) backend. We will dig into the frontend and backend code snippets and explain the key components that make this chatbot work.
**Components**

- Streamlit frontend: renders the chat UI and keeps the chat history in session state
- LangChain: orchestrates the conversation chain and the summary-buffer memory
- Amazon Bedrock (Anthropic Claude): the LLM backend that generates responses
Figure 1. Chatbot architecture

Architecture overview

Code snippets

Frontend (Streamlit)
```python
# 1 Import Streamlit and the backend module
import streamlit
import chatbot_backend as demo

# 2 Set the title for the chatbot
streamlit.title("Hi, this is your Chatbot")

# 3 Add LangChain memory to the session cache (Session State)
if 'memory' not in streamlit.session_state:
    streamlit.session_state.memory = demo.demo_memory()

# 4 Add the UI chat history to the session cache (Session State)
if 'chat_history' not in streamlit.session_state:
    streamlit.session_state.chat_history = []

# 5 Re-render the chat history on every rerun
for message in streamlit.session_state.chat_history:
    with streamlit.chat_message(message["role"]):
        streamlit.markdown(message["text"])

# 6 Render the chatbot input box and handle new messages
input_text = streamlit.chat_input("Powered by Bedrock")
if input_text:
    with streamlit.chat_message("user"):
        streamlit.markdown(input_text)
    streamlit.session_state.chat_history.append({"role": "user", "text": input_text})

    chat_response = demo.demo_conversation(
        input_text=input_text,
        memory=streamlit.session_state.memory,
    )

    with streamlit.chat_message("assistant"):
        streamlit.markdown(chat_response)
    streamlit.session_state.chat_history.append({"role": "assistant", "text": chat_response})
```
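Because Streamlit re-executes the whole script on every interaction, the chat history must live in `session_state` or it is lost between turns. The pattern above can be sketched framework-free; the `render` function below is a hypothetical stand-in for one Streamlit rerun, with a plain dict playing the role of `session_state`:

```python
# Hypothetical stand-in for Streamlit's session_state: a dict that
# survives "reruns" of the render function.
session_state = {}

def render(new_input, reply_fn):
    """Simulates one Streamlit rerun: restore history, then append new turns."""
    if 'chat_history' not in session_state:
        session_state['chat_history'] = []
    # "Re-render" the stored history, as step 5 in the frontend does.
    rendered = [m["text"] for m in session_state['chat_history']]
    if new_input:
        session_state['chat_history'].append({"role": "user", "text": new_input})
        reply = reply_fn(new_input)  # in the real app: demo.demo_conversation(...)
        session_state['chat_history'].append({"role": "assistant", "text": reply})
    return rendered

echo = lambda text: f"echo: {text}"
render("hello", echo)           # first rerun: the user types "hello"
history = render(None, echo)    # next rerun: history is re-rendered intact
print(history)                  # ['hello', 'echo: hello']
```

Without the session-state check, each rerun would reset `chat_history` to an empty list, which is exactly the bug the `if 'chat_history' not in ...` guard prevents.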
Backend (LangChain and the LLM)
```python
import time

import boto3
from langchain.chains import ConversationChain
from langchain.memory import ConversationSummaryBufferMemory
from langchain_aws import ChatBedrock

# 2a Function for invoking the model: a client connection to Bedrock
#    via a boto3 session (profile/credentials) and a model_id
def demo_chatbot():
    boto3_session = boto3.Session(
        # aws_access_key_id=...,      # or rely on your AWS profile / env vars
        # aws_secret_access_key=...,
        region_name='us-east-1',
    )
    llm = ChatBedrock(
        model_id="anthropic.claude-3-sonnet-20240229-v1:0",
        client=boto3_session.client('bedrock-runtime'),
        model_kwargs={
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 20000,
            "temperature": 0.3,
            "top_p": 0.3,
            "stop_sequences": ["\n\nHuman:"],
        },
    )
    return llm

# 3 Function for ConversationSummaryBufferMemory (llm + max token limit)
def demo_memory():
    llm_data = demo_chatbot()
    memory = ConversationSummaryBufferMemory(llm=llm_data, max_token_limit=20000)
    return memory

# 4 Function for the conversation chain: input text + memory
def demo_conversation(input_text, memory):
    llm_chain_data = demo_chatbot()
    # Initialize ConversationChain with the LLM and the shared memory.
    # The chain saves each exchange to memory itself, so no extra
    # memory.save_context() call is needed afterwards.
    llm_conversation = ConversationChain(llm=llm_chain_data, memory=memory, verbose=True)

    # Call the invoke method; the chain's prompt already formats the
    # Human/AI turns, so we pass the raw input text.
    llm_start_time = time.time()
    chat_reply = llm_conversation.invoke({"input": input_text})
    llm_elapsed_time = time.time() - llm_start_time  # could be logged for latency monitoring

    return chat_reply.get('response', 'No Response')
```
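To make the memory behavior concrete, here is an illustrative, stdlib-only sketch of the idea behind `ConversationSummaryBufferMemory`: keep recent turns verbatim, and once a token budget is exceeded, fold the oldest turns into a running summary. The class and the `stub` summarizer are hypothetical simplifications, not LangChain's actual implementation (which uses the LLM itself to summarize and real token counting):

```python
# Illustrative sketch only: a buffer memory that, when the transcript
# exceeds a token budget, folds the oldest turns into a running summary.
class SummaryBufferMemory:
    def __init__(self, max_token_limit, summarize):
        self.max_token_limit = max_token_limit
        self.summarize = summarize  # normally an LLM call; here a stub
        self.summary = ""
        self.buffer = []            # recent (human, ai) turns kept verbatim

    def _tokens(self):
        # Crude token estimate: whitespace word count over the buffer.
        return sum(len(h.split()) + len(a.split()) for h, a in self.buffer)

    def save_context(self, human, ai):
        self.buffer.append((human, ai))
        while self.buffer and self._tokens() > self.max_token_limit:
            oldest = self.buffer.pop(0)
            self.summary = self.summarize(self.summary, oldest)

    def load_memory(self):
        return {"summary": self.summary, "recent": list(self.buffer)}

# Stub "summarizer" that just concatenates; a real LLM would compress.
stub = lambda summary, turn: (summary + " | " + turn[0]).strip(" |")

mem = SummaryBufferMemory(max_token_limit=8, summarize=stub)
mem.save_context("hi there", "hello")                  # 3 tokens, kept verbatim
mem.save_context("tell me a long story", "once upon")  # pushes total over budget
state = mem.load_memory()
# The oldest turn is now summarized; the newest stays in the buffer.
```

This is why `demo_memory()` passes both the LLM and `max_token_limit` to `ConversationSummaryBufferMemory`: the limit decides when summarization kicks in, and the LLM performs it.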
Conclusion

We have walked through the basic building blocks of an interactive chatbot built with Streamlit, LangChain, and a powerful LLM backend. This foundation opens the door to endless possibilities, from customer-support automation to personalized learning experiences. Feel free to experiment with, refine, and deploy this chatbot to fit your own needs and use cases.
Original title: , Author: Karan Bansal