Chatbot Memory with LangChain

Conversational memory is what allows a chatbot to respond to multiple queries in a coherent, chat-like manner. Without it, every query would be treated as an entirely independent input, with no regard for past interactions.

The LLM with and without conversational memory. The blue boxes are user prompts; the grey boxes are the LLM's responses. Without conversational memory (right), the LLM cannot respond using knowledge of previous interactions.
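The difference the figure describes can be sketched in a few lines of plain Python. This is an illustrative toy, not LangChain's API: `fake_llm` stands in for a real model, and `BufferMemory` mimics the idea of prepending the full transcript to each new query.

```python
def fake_llm(prompt: str) -> str:
    """Toy stand-in for an LLM: it can only answer the name question
    if the name actually appears in the prompt it receives."""
    if "my name is Alice" in prompt:
        return "Your name is Alice."
    return "I don't know your name."

class BufferMemory:
    """Minimal sketch of buffer-style conversational memory:
    keep the whole transcript and prepend it to every new query."""
    def __init__(self):
        self.history = []

    def ask(self, query: str) -> str:
        # The model sees the accumulated history plus the new query.
        prompt = "\n".join(self.history + [query])
        answer = fake_llm(prompt)
        self.history += [query, answer]
        return answer

# Without memory: each query is an independent input.
print(fake_llm("What is my name?"))   # -> I don't know your name.

# With memory: the earlier message is carried into the new prompt.
chat = BufferMemory()
chat.ask("Hi, my name is Alice.")
print(chat.ask("What is my name?"))   # -> Your name is Alice.
```

The memory types the article covers (buffer, summary, windowed, and so on) differ mainly in *what* gets prepended: the raw transcript, a running summary, or only the last few turns.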


This is a companion discussion topic for the original entry at https://www.pinecone.io/learn/langchain-conversational-memory/

The markdown on this page appears to be broken near:

There are several ways that we can implement conversational memory. In the context of [LangChain](/learn/langchain-intro/,


This text block has a typo:

print(conversation_sum.memory.prompt.template)

Should be:

print(conversation.memory.prompt.template)

By far the best article on this topic. I finally understand the different types of memory.

Please write about templates in the same style :slight_smile: