Custom memory in Langchain
Implementing custom memory in Langchain is dead simple using the ChatMessageHistory class.
How to implement custom memory in Langchain (including LCEL)
One of the easiest methods for storing and retrieving messages with Langchain is the ChatMessageHistory class provided by the langchain.memory module.
It's simple to get started with:
```python
from langchain.memory import ChatMessageHistory

history = ChatMessageHistory()

history.add_user_message("Hello!")
history.add_ai_message("Yo!")

print(history.messages)
# [HumanMessage(content='Hello!'), AIMessage(content='Yo!')]
```
The core interface for ChatMessageHistory consists of two methods: add_user_message and add_ai_message.
For the sake of this example, I'm using a .json file with preloaded messages. A database, an external API, etc. may be used instead.
```json
{
  "messages": [
    {
      "sender": "human",
      "body": "Hello!"
    },
    {
      "sender": "ai",
      "body": "Yo!"
    }
  ]
}
```
```python
import json

from langchain.memory import ChatMessageHistory

# Load the preloaded conversation from disk
with open("messages.json") as json_file:
    content = json.load(json_file)

messages = content["messages"]

# Replay each stored message into a fresh ChatMessageHistory
chat_history = ChatMessageHistory()
for m in messages:
    if m["sender"] == "human":
        chat_history.add_user_message(m["body"])
    elif m["sender"] == "ai":
        chat_history.add_ai_message(m["body"])

print(chat_history.messages)
# [HumanMessage(content='Hello!'), AIMessage(content='Yo!')]
```
The chat_history may be used for instantiating other types of memory!
Here is an example from the Langchain documentation using ChatMessageHistory with the Langchain Expression Language (LCEL):
```python
import json

from operator import itemgetter

from langchain.chat_models import ChatOpenAI
from langchain.memory import ChatMessageHistory, ConversationBufferMemory
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.schema.runnable import RunnablePassthrough, RunnableLambda

# Rebuild the chat history from the JSON file, as before
with open("messages.json") as json_file:
    content = json.load(json_file)

messages = content["messages"]

chat_history = ChatMessageHistory()
for m in messages:
    if m["sender"] == "human":
        chat_history.add_user_message(m["body"])
    elif m["sender"] == "ai":
        chat_history.add_ai_message(m["body"])

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful chatbot"),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{input}"),
    ]
)

# Wrap the preloaded history so the chain can read it
memory = ConversationBufferMemory(return_messages=True, chat_memory=chat_history)

chain = (
    RunnablePassthrough.assign(
        history=RunnableLambda(memory.load_memory_variables) | itemgetter("history")
    )
    | prompt
    | model
)

inputs = {"input": "hi im bob"}
response = chain.invoke(inputs)
```