The Supermemory AI SDK provides native integration with Vercel’s AI SDK through two approaches: User Profiles for automatic personalization and Memory Tools for agent-based interactions.
containerTag — who the memories belong to. Use a stable identifier per user, workspace, or tenant (e.g. "user-123", "acme-workspace"). Memory search and writes are scoped to this tag.
customId — which conversation this turn belongs to. Use it to group messages from the same chat session into a single document (e.g. "chat-2026-04-25", a thread ID, or a UUID per session).
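As a sketch of how the two identifiers fit together (assuming the profile wrapper is exported as withSupermemory and takes the container tag as its second argument and an options object as its third — neither signature is shown in this section):

```typescript
import { streamText } from "ai"
import { openai } from "@ai-sdk/openai"
// Assumption: the profile wrapper is named withSupermemory.
import { withSupermemory } from "@supermemory/tools/ai-sdk"

// Scope memory reads/writes to one user, and group every turn of this
// chat session into a single document via customId.
const model = withSupermemory(openai("gpt-5"), "user-123", {
  customId: "chat-2026-04-25", // reuse the same id for the whole session
})

const result = await streamText({
  model,
  prompt: "What did we talk about earlier in this chat?",
})
```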
Memory saving is enabled by default (addMemory: "always"). New conversations are persisted automatically. To opt out, set addMemory: "never":
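A minimal opt-out sketch, assuming addMemory is passed through the same options object as the other wrapper settings (the withSupermemory name and option placement are assumptions, not shown above):

```typescript
import { createAnthropic } from "@ai-sdk/anthropic"
// Assumption: the profile wrapper is named withSupermemory.
import { withSupermemory } from "@supermemory/tools/ai-sdk"

const anthropic = createAnthropic({ apiKey: "YOUR_ANTHROPIC_KEY" })

// Read existing memories for context, but never persist new ones.
const model = withSupermemory(anthropic("claude-3-sonnet"), "user-123", {
  addMemory: "never",
})
```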
Customize how memories are formatted. The template receives userMemories, generalSearchMemories, and searchResults (raw array for filtering by metadata):
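For illustration, such a template could be a plain function over those three fields. The field types below and the idea of filtering searchResults by a metadata.category key are assumptions for the sketch; only the three field names come from this section:

```typescript
// Hypothetical shape of the values handed to the template; the real SDK
// types may differ.
interface TemplateArgs {
  userMemories: string
  generalSearchMemories: string
  searchResults: Array<{ memory: string; metadata?: Record<string, unknown> }>
}

// Keep only results tagged as preferences, then emit a compact context block.
function memoryTemplate({ userMemories, searchResults }: TemplateArgs): string {
  const preferences = searchResults
    .filter((r) => r.metadata?.category === "preference")
    .map((r) => `- ${r.memory}`)
    .join("\n")
  return `Known user profile:\n${userMemories}\n\nRelevant preferences:\n${preferences}`
}
```

The function would then be passed wherever the wrapper accepts a template override; the exact option name is not shown in this section.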
When Supermemory errors (default: continue without memories)
If the Supermemory API returns an error, is unreachable, or retrieval hits the internal time limit, memory injection is skipped. skipMemoryOnError defaults to true, so the LLM call still runs with the original prompt (no injected memories). Use verbose: true if you want console output when that happens. To fail the call instead when memory retrieval fails, set skipMemoryOnError: false:
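A sketch of the fail-fast configuration, assuming skipMemoryOnError and verbose both live in the wrapper's options object (the withSupermemory name is also an assumption, not shown above):

```typescript
import { openai } from "@ai-sdk/openai"
// Assumption: the profile wrapper is named withSupermemory.
import { withSupermemory } from "@supermemory/tools/ai-sdk"

const model = withSupermemory(openai("gpt-5"), "user-123", {
  skipMemoryOnError: false, // surface retrieval failures instead of skipping
  verbose: true,            // log what went wrong during retrieval
})
```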
Add memory capabilities to AI agents with search, add, and fetch operations.
```typescript
import { streamText } from "ai"
import { createAnthropic } from "@ai-sdk/anthropic"
import { supermemoryTools } from "@supermemory/tools/ai-sdk"

const anthropic = createAnthropic({ apiKey: "YOUR_ANTHROPIC_KEY" })

const result = await streamText({
  model: anthropic("claude-3-sonnet"),
  prompt: "Remember that my name is Alice",
  tools: supermemoryTools("YOUR_SUPERMEMORY_KEY"),
})
```
Search Memories - Semantic search through user memories:
```typescript
const result = await streamText({
  model: openai("gpt-5"),
  prompt: "What are my dietary preferences?",
  tools: supermemoryTools("API_KEY"),
})
// AI will call: searchMemories({ informationToGet: "dietary preferences" })
```
Add Memory - Store new information:
```typescript
const result = await streamText({
  model: anthropic("claude-3-sonnet"),
  prompt: "Remember that I'm allergic to peanuts",
  tools: supermemoryTools("API_KEY"),
})
// AI will call: addMemory({ memory: "User is allergic to peanuts" })
```