Chatbot Combining Knowledge Base, Chat Session, and LLM with Langchain and Streamlit

I’m new to LangChain. Just a question:
I want to create an LLM chatbot that can both read from a knowledge base and remember the chat session.

Currently I’m using LangChain and Streamlit, but I’m having difficulty combining reading from the knowledge base AND keeping the session history. I can only get one of the two to work.

Here’s the code snippet for session mode:

import streamlit as st
from streamlit_chat import message
from langchain.schema import AIMessage, HumanMessage

def generate_response(query):
    # Look up relevant documents in the knowledge base
    # (retrieve_info and chain are defined elsewhere in the app)
    similar_responses = retrieve_info(query)
    response = chain.run(question=query, rag_text=similar_responses)
    return response

if user_input:
    message(user_input, is_user=True)
    # Add user message to chat history
    st.session_state.messages.append(HumanMessage(content=user_input))
    # response = generate_response(message)
    response = generate_response(st.session_state.messages[-1].content)
    print(response)
    message(response, is_user=False)
    st.session_state.messages.append(AIMessage(content=response))

I’m expecting the LLM to check the chat history first, then the knowledge base, then its general knowledge. Example:

User: ‘My name is Alex.’
Bot : ‘Hi Alex, How can I help you?’
User: ‘Who is the best-performing salesman this week?’
Bot : ‘Alejandro is the best-performing salesman this week.’
User: ‘What is my name?’
Bot : ‘Your name is Alex.’
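
One direction that seems promising is to feed both the chat history and the retrieved knowledge-base text into a single prompt. Below is a minimal sketch of that idea, assuming retrieve_info and an OpenAI chat model as in the snippet above; the prompt wording and variable names are only illustrative.

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.schema import HumanMessage

template = """Answer using the conversation history first, then the
knowledge base excerpts, then your general knowledge.

Conversation history:
{chat_history}

Knowledge base excerpts:
{rag_text}

Question: {question}
Answer:"""

prompt = PromptTemplate(
    input_variables=["chat_history", "rag_text", "question"],
    template=template,
)
chain = LLMChain(llm=ChatOpenAI(temperature=0), prompt=prompt)

def generate_response(query, history):
    # Flatten the stored HumanMessage/AIMessage objects into plain text
    history_text = "\n".join(
        f"{'User' if isinstance(m, HumanMessage) else 'Bot'}: {m.content}"
        for m in history
    )
    similar_responses = retrieve_info(query)  # knowledge-base lookup, defined elsewhere
    return chain.run(
        question=query,
        rag_text=similar_responses,
        chat_history=history_text,
    )

The idea is that the chain sees the running conversation and the KB excerpts in one prompt, so it can answer ‘What is my name?’ from the history and the salesman question from the knowledge base.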
