Welcome back! 🚀 Now that we’ve laid the foundation for your chatbot—setting up Langchain, Streamlit, and crafting the processing chain—it’s time to dive into the next steps: handling user input and fetching responses from the language model. 💬
In this part, we’ll show you how to:
🖊️ Capture user queries and seamlessly integrate them into the chatbot's flow.
🧠 Leverage the power of the LLM to generate concise, user-friendly responses.
🗂️ Ensure that the chatbot maintains context using session states and the memory buffer.
With these steps, your chatbot will truly come to life, ready to engage and assist users with clarity. Let’s get coding and bring your chatbot to full functionality! 🚀✨
4: Dynamically Updating the Chat History
Now, we’re setting up the chat interface! This part displays the conversation history dynamically, with user messages and bot responses, and adds an input box for users to type their queries.
Line 78
st.subheader("Chat with the Support Bot")
- `st.subheader`: Displays a subheading in the Streamlit app.
Line 81: Loop Over the Chat History
for i, chat in enumerate(st.session_state["chat_history"]):
- `for i, chat in enumerate(...)`: Loops through the `chat_history` stored in the session state.
- `chat_history`: A list of dictionaries, where each dictionary represents one exchange (user message and bot response).
- `chat`: The dictionary holding the current exchange's content.
Line 82: User Bubble
message(chat["user"], is_user=True, key=f"user_{i}")  # User's message
- `message(chat["user"], ...)`: Displays the user's message in the chat interface.
- `chat["user"]`: Retrieves the user's input message from the current chat dictionary.
- `is_user=True`: Marks this as a user message, so it is rendered as a user bubble in the UI.
- `key=f"user_{i}"`: Assigns a unique key to each user message. Unique keys prevent Streamlit from re-rendering the wrong elements when the session state updates.
Line 83: Bot Bubble
message(chat["agent"], is_user=False, key=f"agent_{i}")  # Bot's response
- `message(chat["agent"], ...)`: Displays the chatbot's response in the chat interface.
- `chat["agent"]`: Retrieves the bot's response from the current chat dictionary.
- `is_user=False`: Marks this as the agent's message, so it is rendered as a bot bubble in the UI.
- `key=f"agent_{i}"`: Assigns a unique key to each bot response to maintain proper UI state.
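Because the loop uses `enumerate`, every bubble gets a distinct key. Here is a quick stdlib-only sketch of that key scheme (the sample messages are made up for illustration; no Streamlit required):

```python
# Simulate the chat history stored in st.session_state["chat_history"]
chat_history = [
    {"user": "My phone won't charge", "agent": "Try a different cable."},
    {"user": "Still not working", "agent": "It may need a battery replacement."},
]

# The same f-string key scheme used for the chat bubbles:
keys = []
for i, chat in enumerate(chat_history):
    keys.append(f"user_{i}")   # key for the user bubble
    keys.append(f"agent_{i}")  # key for the bot bubble

print(keys)  # ['user_0', 'agent_0', 'user_1', 'agent_1']
assert len(keys) == len(set(keys))  # every widget key is unique
```

If two widgets ever shared a key, Streamlit would raise a duplicate-key error, which is exactly what the index-based suffix prevents.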
Line 86: Input Box
user_input = st.text_input("Type your message:", key="chat_input", label_visibility="collapsed")
- `st.text_input`: Creates a single-line text input field for user interaction.
- `"Type your message:"`: The label for the input box, guiding users to enter their message.
- `key="chat_input"`: Ties the input field to a specific session-state entry, allowing persistence and dynamic updates.
- `label_visibility="collapsed"`: Hides the label in the UI while still associating it with the input box, keeping the field clean and minimalistic.
5: Processing the Query After Sending
This is the most crucial part of the code, because it is responsible for the bot's key functionality.
Line 88: Button Trigger for Sending a Message
if st.button("Send Message"):
- A Streamlit button is created to trigger the message-sending process.
- When the "Send Message" button is clicked, the logic inside this block executes.
Line 89: Validate User Input
if user_input.strip():
- Checks that the user input is not empty or just whitespace.
- If the input is valid, processing proceeds; otherwise, a warning is displayed (handled below in Lines 108-109).
Lines 90-93: Handle the "quit" Command
if user_input.lower() == "quit":
    st.success("Conversation ended. Thank you for reaching out!")
    st.session_state["chat_history"].append({"user": user_input, "agent": "Goodbye!"})
    memory.clear()
If the user types "quit":
- A success message is displayed, acknowledging the end of the conversation.
- "Goodbye!" is added to the chat history as the bot's response.
- The memory buffer (`memory`) is cleared, resetting the context for the next session.
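One small edge case worth noting: the check compares `user_input.lower()` directly, so an input like " Quit " with stray spaces would not end the conversation. A minimal helper that normalizes the input first (a hypothetical hardening, not part of the original code) could look like this:

```python
def is_quit(user_input: str) -> bool:
    # Hypothetical helper: strip whitespace before lowercasing,
    # so "  QUIT  " also ends the chat.
    return user_input.strip().lower() == "quit"

assert is_quit("quit")
assert is_quit("  QUIT  ")
assert not is_quit("quit please")
```

The original `user_input.strip()` guard already filters out all-whitespace input, so this only matters for inputs that contain "quit" plus padding.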
Lines 94-105: Generate a Response from the LLM
else:
    try:
        chat_history_text = "\n".join(
            [f"User: {item['user']}\nAgent: {item['agent']}" for item in st.session_state["chat_history"]]
        )
        response = chain.run(
            chat_history=chat_history_text,
            user_input=user_input,
            product_problem=st.session_state["product_problem"]
        )
        st.session_state["chat_history"].append({"user": user_input, "agent": response.strip()})
If the user input is valid and not "quit", the chatbot generates a response:
- Format the chat history: Combines the existing chat history into a formatted string in which each exchange is labeled with "User" and "Agent".
- Call the LLM chain: Passes the formatted history, the user input, and the product problem description to the `chain.run` method to generate a response from the LLM.
- Update the chat history: Appends the user's message and the LLM's response to the chat history stored in the session state.
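To see exactly what string the LLM receives as `{chat_history}`, here is the same join run on a toy history (the sample exchanges are made up for illustration):

```python
# Simulated contents of st.session_state["chat_history"]
chat_history = [
    {"user": "My laptop overheats", "agent": "Clean the vents and update the BIOS."},
    {"user": "How long will a repair take?", "agent": "Typically 3-5 business days."},
]

# The same formatting used before calling chain.run:
chat_history_text = "\n".join(
    [f"User: {item['user']}\nAgent: {item['agent']}" for item in chat_history]
)
print(chat_history_text)
# User: My laptop overheats
# Agent: Clean the vents and update the BIOS.
# User: How long will a repair take?
# Agent: Typically 3-5 business days.
```

On the first message the history list is empty, so the join simply produces an empty string and the prompt's "Chat History:" section is blank.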
Lines 106-107: Handle Errors
except Exception as e:
    st.error(f"An error occurred: {e}")
- If an error occurs during the LLM response generation, it is caught, and a descriptive error message is displayed.
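The same catch-and-report pattern can be exercised without Streamlit; `st.error` simply receives the formatted message. A small sketch with a stand-in for the chain call (`run_chain_safely` and `broken_run` are hypothetical names, not part of the original code):

```python
def run_chain_safely(run_fn, **inputs):
    # Mirrors the try/except block: return (response, error_message).
    try:
        return run_fn(**inputs), None
    except Exception as e:
        return None, f"An error occurred: {e}"

# A failing stand-in for chain.run (e.g. a missing API key):
def broken_run(**inputs):
    raise KeyError("GOOGLE_API_KEY")

response, error = run_chain_safely(broken_run, user_input="hi")
print(error)  # An error occurred: 'GOOGLE_API_KEY'
```

Catching the broad `Exception` here is deliberate: a chatbot UI should degrade to an error banner rather than crash the whole Streamlit script on a network or quota failure.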
Lines 108-109: Handle Empty Input
else:
    st.warning("Please enter a message.")
- If the user clicks "Send Message" but the input is empty or whitespace, a warning prompts them to provide a valid message.
6: Adding a Reset Button
Lines 111-116
# Reset chat
if st.button("Reset Chat"):
    st.session_state["product_problem"] = ""
    st.session_state["chat_history"] = []
    memory.clear()
    st.success("Chat reset successfully!")
- A Streamlit button labeled "Reset Chat" is created; when clicked, it executes the logic within this block.
- Resets the `product_problem` session state to an empty string, clearing any previously stored problem description.
- Resets the `chat_history` session state to an empty list, removing all previous user and agent messages from the session.
- Clears the `ConversationBufferMemory` to reset the conversational context for the chatbot.
- Displays a success message in the UI to inform the user that the chat session has been reset.
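If more session keys are added later, one way to keep the reset logic in sync with initialization is a single defaults dict. This is a hypothetical refactor, sketched with a plain dict standing in for `st.session_state`:

```python
# Single source of truth for the tracked session keys (hypothetical refactor)
DEFAULTS = {"product_problem": "", "chat_history": []}

# Stand-in for st.session_state mid-conversation:
session_state = {
    "product_problem": "Broken screen",
    "chat_history": [{"user": "hi", "agent": "hello"}],
}

def reset_chat(state):
    # Restore every tracked key to its default, copying mutable
    # defaults so DEFAULTS itself is never mutated.
    for key, default in DEFAULTS.items():
        state[key] = default.copy() if isinstance(default, list) else default

reset_chat(session_state)
print(session_state)  # {'product_problem': '', 'chat_history': []}
```

Copying the list default matters: assigning `DEFAULTS["chat_history"]` directly would let later `append` calls silently mutate the shared default.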
Conclusion
That was all about building a fully functional, efficient customer care chatbot. I hope you liked the implementation and that the explanation was clear.
Here is the complete code, for those who want to cut to the chase and try it themselves ^_^
import streamlit as st
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain.memory import ConversationBufferMemory
from langchain_google_genai import ChatGoogleGenerativeAI
from dotenv import load_dotenv
import os
from streamlit_chat import message

# Load environment variables
load_dotenv()

# Initialize the LLM with Google Generative AI
llm = ChatGoogleGenerativeAI(
    temperature=0.7,
    model="gemini-1.5-flash",
    google_api_key=os.environ["GOOGLE_API_KEY"]
)

# Initialize conversation memory (cleared on reset; the chat history
# itself is passed to the chain through the prompt)
memory = ConversationBufferMemory()

# Define the prompt template for the chatbot
prompt = PromptTemplate(
    input_variables=["chat_history", "user_input", "product_problem"],
    template=(
        "You are a highly skilled customer care representative dedicated to resolving users' product-related issues. "
        "The user has described the following problem with their product: {product_problem}. "
        "Provide clear and practical solutions to address their concerns, including troubleshooting steps, potential timelines for resolution, warranty information, and any additional support they might need. "
        "Feel free to invent plausible details where necessary to offer a seamless customer service experience. "
        "Engage with the user in a polite, professional, and conversational tone, ensuring their satisfaction. "
        "Keep your answers short and in brief points, so they are not too long to read. "
        "\n\n"
        "Chat History:\n{chat_history}\n\n"
        "User: {user_input}\n\n"
        "Agent:"
    )
)

# Create the chain
chain = LLMChain(
    llm=llm,
    prompt=prompt
)

# Streamlit UI
st.set_page_config(page_title="Customer Care Chatbot", layout="centered")
st.title("Customer Care Chatbot")
st.write(
    """
    **Disclaimer**
    We appreciate your engagement! Please note, this bot is designed to assist with product-related issues.
    Type your queries below to get started.
    """
)

# Initialize session state
if "chat_history" not in st.session_state:
    st.session_state["chat_history"] = []
if "product_problem" not in st.session_state:
    st.session_state["product_problem"] = ""

# Step 1: Input the product and problem
if not st.session_state["product_problem"]:
    st.subheader("Step 1: Describe Your Problem")
    product_problem_input = st.text_area("What issue are you facing with your product?")
    if st.button("Submit Problem"):
        if product_problem_input.strip():
            st.session_state["product_problem"] = product_problem_input.strip()
            st.success(f"Problem noted: {st.session_state['product_problem']}")
        else:
            st.warning("Please describe your problem.")
else:
    st.subheader(f"Problem: {st.session_state['product_problem']}")

# Step 2: Chat UI
st.subheader("Chat with the Support Bot")

# Display chat history dynamically
for i, chat in enumerate(st.session_state["chat_history"]):
    message(chat["user"], is_user=True, key=f"user_{i}")  # User's message
    message(chat["agent"], is_user=False, key=f"agent_{i}")  # Bot's response

# Input box at the bottom
user_input = st.text_input("Type your message:", key="chat_input", label_visibility="collapsed")

if st.button("Send Message"):
    if user_input.strip():
        if user_input.lower() == "quit":
            st.success("Conversation ended. Thank you for reaching out!")
            st.session_state["chat_history"].append({"user": user_input, "agent": "Goodbye!"})
            memory.clear()
        else:
            # Generate response from the LLM
            try:
                chat_history_text = "\n".join(
                    [f"User: {item['user']}\nAgent: {item['agent']}" for item in st.session_state["chat_history"]]
                )
                response = chain.run(
                    chat_history=chat_history_text,
                    user_input=user_input,
                    product_problem=st.session_state["product_problem"]
                )
                st.session_state["chat_history"].append({"user": user_input, "agent": response.strip()})
            except Exception as e:
                st.error(f"An error occurred: {e}")
    else:
        st.warning("Please enter a message.")

# Reset chat
if st.button("Reset Chat"):
    st.session_state["product_problem"] = ""
    st.session_state["chat_history"] = []
    memory.clear()
    st.success("Chat reset successfully!")
Hope you like the information shared in this article, and stay connected for more.
Connect with me on LinkedIn: https://www.linkedin.com/in/bebandhshrivastava/
Visit my Linktree: https://linktr.ee/bebandh
Visit my GitHub profile for more LLM applications like this one: https://github.com/BEBANDH/Streamlit-Apps/tree/main
Link to Part 1 of this article: https://candyman5757.hashnode.dev/how-to-build-a-simple-chatbot-with-langchain-part-1-of-2
THANK YOU!