Long-term Memory

Long-term memory provides persistent knowledge storage using LangGraph’s Store system. It enables agents to remember important information across sessions through memory tools that the AI can proactively call to save and retrieve information.

from langcrew import Crew, Agent
from langcrew.memory import MemoryConfig, LongTermMemoryConfig

# Enable long-term memory
memory_config = MemoryConfig(
    provider="sqlite",
    connection_string="sqlite:///long_term.db",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app",  # RECOMMENDED for production
    ),
)
crew = Crew(agents=[agent], memory=memory_config)
| Parameter | Type | Description | Default |
| --- | --- | --- | --- |
| enabled | bool | Enable long-term memory | False |
| provider | str or None | Storage provider override (inherits if None) | None |
| connection_string | str or None | Connection string override (inherits if None) | None |
| app_id | str or None | Application identifier (RECOMMENDED for production) | None |
| index | IndexConfig or None | Vector search configuration | None |
| user_memory | MemoryScopeConfig | User-specific memory configuration | enabled |
| app_memory | MemoryScopeConfig | Application-wide memory (⚠️ experimental) | disabled |
| search_response_format | str | Search result format ("content" or "content_and_artifact") | "content" |

Long-term memory operates in two scopes, each with dedicated memory tools that agents can call:

User Memory (Default: Enabled)

Stores personal user preferences, information, and context.

Automatically created tools:

  • manage_user_memory: Save, update, or delete user memories
  • search_user_memory: Search and retrieve user memories

How it works:

from langcrew.memory import MemoryScopeConfig

memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://localhost/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app-prod",
        user_memory=MemoryScopeConfig(
            enabled=True,
            manage_instructions="""Save memories when user:
1. Expresses preferences (I like/love/prefer)
2. Shares personal info (job, location, hobbies)
3. Explicitly asks you to remember something
""",
            search_instructions="""Search when you need to:
1. Recall user preferences or context
2. Provide personalized recommendations
""",
        ),
    ),
)
crew = Crew(agents=[agent], memory=memory_config)

# When user says "I'm vegetarian and I love Italian food"
# -> AI automatically calls manage_user_memory tool
crew.kickoff(
    inputs={"user_input": "I'm vegetarian and I love Italian food"},
    thread_id="user_alice",
    config={"configurable": {"user_id": "alice"}},
)

# Later session - AI calls search_user_memory when needed
crew.kickoff(
    inputs={"user_input": "Recommend a restaurant"},
    thread_id="user_alice_new_session",
    config={"configurable": {"user_id": "alice"}},
)

Memory namespace isolation:

  • With app_id: ("user_memories", "my-app-prod", "alice")
  • Without app_id: ("user_memories", "alice")

App Memory (Default: Disabled, ⚠️ Experimental)

Stores application-wide insights shared across all users.

Automatically created tools:

  • manage_app_memory: Save application-level insights
  • search_app_memory: Search shared insights

memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://localhost/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="saas-app-v1",
        app_memory=MemoryScopeConfig(
            enabled=True,  # ⚠️ Experimental
            manage_instructions="Store application-wide patterns and insights only",
            search_instructions="Search for common patterns to improve assistance",
        ),
    ),
)

Memory namespace isolation:

  • With app_id: ("app_memories", "saas-app-v1")
  • Without app_id: ("app_memories",)

⚠️ Important: App memory is experimental and should be carefully monitored to ensure it only stores aggregated insights, not personal user data.
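One way to act on that caution is to guard content before it reaches the manage tool. The sketch below is a hypothetical keyword heuristic, not part of LangCrew; a real deployment would need a far stronger PII and attribution check.

```python
import re

# Hypothetical guard: reject content that looks user-specific before it is
# stored as app memory. This keyword heuristic is illustrative only.
PERSONAL_MARKERS = re.compile(r"\b(I|my|me|user \w+|alice|bob)\b", re.IGNORECASE)

def looks_aggregated(insight: str) -> bool:
    """Return True if the insight appears application-wide, not personal."""
    return PERSONAL_MARKERS.search(insight) is None

print(looks_aggregated("Most sessions ask for restaurant recommendations"))  # True
print(looks_aggregated("Alice loves Italian food"))  # False
```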

Long-term memory uses LangMem tools that are automatically added to your agents. When you enable long-term memory, the following tools are created and attached to each agent:

# When you configure:
memory_config = MemoryConfig(
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app",
        user_memory=MemoryScopeConfig(enabled=True),
    )
)

# Behind the scenes, agents get these tools:
# - manage_user_memory: For saving/updating/deleting memories
# - search_user_memory: For retrieving memories

The AI automatically calls these tools based on the instructions you configure:

When to save (manage_instructions):

manage_instructions="""Call this tool when user:
1. Expresses preferences (I like/love/prefer)
2. Shares personal information
3. Explicitly asks you to remember something"""

When to search (search_instructions):

search_instructions="""Call this tool when:
1. User asks about their preferences
2. You need to personalize responses
3. User asks 'What do you know about me?'"""

Example flow:

1. User: "I love pizza"
2. AI detects a preference expression
3. AI calls manage_user_memory(content="User loves pizza")
4. Memory is saved to the Store under namespace ("user_memories", "my-app", "{user_id}")
5. AI responds: "Got it! I'll remember you love pizza."

In a later session:

1. User: "What food do I like?"
2. AI detects the need to recall a preference
3. AI calls search_user_memory(query="food preferences")
4. Store returns: "User loves pizza"
5. AI responds: "You mentioned you love pizza!"
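The save/search loop above can be mimicked with a toy in-memory store. The class below is purely illustrative: the real tools are LangMem tools backed by LangGraph's Store, and search there can be semantic rather than keyword-based.

```python
from dataclasses import dataclass, field

@dataclass
class ToyMemoryStore:
    """Toy stand-in for the Store behind the manage/search memory tools."""
    namespace: tuple
    memories: list = field(default_factory=list)

    def manage_user_memory(self, content: str) -> str:
        # The real tool also supports update and delete; this only saves.
        self.memories.append(content)
        return f"Saved to {self.namespace}"

    def search_user_memory(self, query: str) -> list:
        # Naive keyword match; the real store can use vector search.
        terms = query.lower().split()
        return [m for m in self.memories if any(t in m.lower() for t in terms)]

store = ToyMemoryStore(("user_memories", "my-app", "alice"))
store.manage_user_memory("User loves pizza")
print(store.search_user_memory("pizza preferences"))  # ['User loves pizza']
```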

Enable semantic search for better memory retrieval:

from langgraph.store.base import IndexConfig

memory_config = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://localhost/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app",
        index=IndexConfig(
            dims=1536,
            embed="openai:text-embedding-3-small",
        ),
    ),
)

Supported Embedding Models:

  • openai:text-embedding-3-small (1536 dims)
  • openai:text-embedding-3-large (3072 dims)
  • openai:text-embedding-ada-002 (1536 dims)
  • Custom models (see LangGraph IndexConfig docs)
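Note that dims must match the embedding model's output size; a mismatch typically surfaces as a vector-dimension error at search time. A small sanity check (hypothetical helper; the mapping mirrors the list above):

```python
# Output dimensions for the embedding models listed above.
EMBEDDING_DIMS = {
    "openai:text-embedding-3-small": 1536,
    "openai:text-embedding-3-large": 3072,
    "openai:text-embedding-ada-002": 1536,
}

def validate_index(embed: str, dims: int) -> None:
    """Raise if dims disagrees with the model's known output size."""
    expected = EMBEDDING_DIMS.get(embed)
    if expected is not None and expected != dims:
        raise ValueError(f"{embed} produces {expected}-dim vectors, got dims={dims}")

validate_index("openai:text-embedding-3-small", 1536)  # OK
```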

The app_id parameter provides namespace isolation when multiple applications share the same database:

# App 1
memory_config_app1 = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://shared-db/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="chatbot-v1",  # Isolated namespace
    ),
)

# App 2 (same database, different namespace)
memory_config_app2 = MemoryConfig(
    provider="postgres",
    connection_string="postgresql://shared-db/memory",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="assistant-v1",  # Different isolated namespace
    ),
)

Data Isolation:

  • With app_id: User “alice” in App 1 is completely separate from user “alice” in App 2
  • Without app_id: All applications share the same user namespace (not recommended for production)
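Why this isolates data: entries are keyed by the full namespace tuple, so the same user_id never collides across applications that share one database. A minimal sketch, with a plain dict standing in for the shared store:

```python
# A plain dict standing in for the shared database.
shared_db = {}

def save(app_id, user_id, memory):
    # Same namespace rule as above: app_id becomes part of the key.
    ns = ("user_memories", app_id, user_id) if app_id else ("user_memories", user_id)
    shared_db.setdefault(ns, []).append(memory)

save("chatbot-v1", "alice", "prefers dark mode")
save("assistant-v1", "alice", "prefers light mode")
print(sorted(shared_db))
# [('user_memories', 'assistant-v1', 'alice'), ('user_memories', 'chatbot-v1', 'alice')]
```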

memory_config = MemoryConfig(
    provider="sqlite",
    connection_string="sqlite:///assistant.db",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="personal-assistant",
    ),
)

assistant = Agent(
    role="Personal Assistant",
    goal="Provide personalized assistance based on user preferences",
    backstory="You remember user preferences and provide tailored recommendations",
)

crew = Crew(agents=[assistant], memory=memory_config)

# First interaction
crew.kickoff(
    inputs={"user_input": "I prefer morning meetings and hate Mondays"},
    thread_id="user_123_session_1",
)

# Later - agent remembers preferences
crew.kickoff(
    inputs={"user_input": "Schedule a team meeting"},
    thread_id="user_123_session_2",
)
# Agent will suggest Tuesday-Friday mornings

memory_config = MemoryConfig(
    provider="postgresql",
    connection_string="postgresql://localhost/crm",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="crm-system",
        index=IndexConfig(
            dims=1536,
            embed="openai:text-embedding-3-small",
        ),
    ),
)

crm_agent = Agent(
    role="Customer Success Manager",
    goal="Build lasting customer relationships",
    backstory="You remember customer history, preferences, and past interactions",
)

crew = Crew(agents=[crm_agent], memory=memory_config)

Long-term memory supports all LangCrew storage providers:

SQLite:

memory_config = MemoryConfig(
    provider="sqlite",
    connection_string="sqlite:///long_term.db",
    long_term=LongTermMemoryConfig(enabled=True, app_id="my-app"),
)

PostgreSQL:

memory_config = MemoryConfig(
    provider="postgresql",
    connection_string="postgresql://user:pass@localhost:5432/memory_db",
    long_term=LongTermMemoryConfig(
        enabled=True,
        app_id="my-app-prod",
        index=IndexConfig(dims=1536, embed="openai:text-embedding-3-small"),
    ),
)

MySQL:

memory_config = MemoryConfig(
    provider="mysql",
    connection_string="mysql://user:pass@localhost:3306/memory_db",
    long_term=LongTermMemoryConfig(enabled=True, app_id="my-app"),
)
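A common pattern is to switch provider per environment while keeping the same config shape. The helper below is hypothetical (`memory_settings` and the `MEMORY_DB_URL` variable are not part of LangCrew); its output could be splatted into `MemoryConfig(**memory_settings(env), long_term=...)`.

```python
import os

def memory_settings(env: str) -> dict:
    """Hypothetical helper: dev uses SQLite, prod uses PostgreSQL."""
    if env == "prod":
        return {
            "provider": "postgresql",
            "connection_string": os.environ.get(
                "MEMORY_DB_URL", "postgresql://localhost/memory"
            ),
        }
    return {"provider": "sqlite", "connection_string": "sqlite:///long_term.db"}

print(memory_settings("dev")["provider"])  # sqlite
```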

Memories not being saved:

  • Verify the long-term config has enabled=True
  • Check database connection and permissions
  • Ensure the store is properly configured

Memory search not returning results:

  • Verify an index configuration is set for semantic search
  • Check that the embedding model is accessible
  • Ensure memories contain relevant content

Namespace conflicts between applications:

  • Always set app_id in production environments
  • Use a unique app_id for each application
  • Verify app_id is consistent across sessions