Choose from direct updates, idempotent upserts, single deletions, and powerful bulk operations.
## Direct Updates

Update existing memories by their ID when you know the specific memory you want to modify. Changes trigger reprocessing through the full pipeline.

```typescript
import Supermemory from 'supermemory';

const client = new Supermemory({
  apiKey: process.env.SUPERMEMORY_API_KEY!
});

// Update by memory ID
const updated = await client.memories.update('memory_id_123', {
  content: 'Updated content here',
  metadata: { version: 2, updated: true }
});

console.log(updated.status); // "queued" for reprocessing
console.log(updated.id);     // "memory_id_123"
```
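Since an update returns with status `"queued"`, callers that need the reprocessed memory must wait for the pipeline to finish. A minimal polling sketch, assuming the SDK exposes a `client.memories.get(id)` endpoint returning a `status` field (verify the method name and status values against your SDK version); `backoffMs` is an illustrative helper:

```typescript
// Exponential backoff delay in ms: base * 2^attempt, capped
function backoffMs(attempt: number, baseMs = 500, capMs = 8000): number {
  return Math.min(baseMs * 2 ** attempt, capMs);
}

// Minimal client shape this sketch relies on (not the full SDK type)
type PollClient = {
  memories: { get(id: string): Promise<{ status: string }> };
};

// Poll until the memory leaves the processing queue
async function waitForProcessing(client: PollClient, id: string, maxAttempts = 8) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const memory = await client.memories.get(id);
    if (memory.status !== 'queued') return memory;
    await new Promise(resolve => setTimeout(resolve, backoffMs(attempt)));
  }
  throw new Error(`Memory ${id} still processing after ${maxAttempts} checks`);
}
```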
## Upserts Using customId

Use `customId` for idempotent operations: calling `add()` with the same `customId` updates the existing memory instead of creating a duplicate.

```typescript
import Supermemory from 'supermemory';

const client = new Supermemory({
  apiKey: process.env.SUPERMEMORY_API_KEY!
});

const customId = 'user-note-001';

// First call creates the memory
const created = await client.memories.add({
  content: 'Initial content',
  customId: customId,
  metadata: { version: 1 }
});
console.log('Created memory:', created.id);

// Second call with the same customId updates the existing memory
const updated = await client.memories.add({
  content: 'Updated content',
  customId: customId, // Same customId = upsert
  metadata: { version: 2 }
});
```
The `customId` enables idempotency across all endpoints. The `memoryId` doesn't support idempotency; only the `customId` does.
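Because `add()` with the same `customId` is an upsert, a failed request can be retried without risk of creating duplicates. A minimal retry sketch, assuming transient SDK errors expose an HTTP-style `status` field (`isRetryable` and `withRetry` are illustrative helpers, not part of the SDK — check the actual error shape before relying on it):

```typescript
// Retry only on transient HTTP failures (429 and 5xx)
function isRetryable(status: number): boolean {
  return status === 429 || (status >= 500 && status < 600);
}

// Generic retry wrapper: because add() with a customId is an upsert,
// re-sending the same request cannot create duplicate memories
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      const status = (error as { status?: number }).status ?? 0;
      if (!isRetryable(status)) throw error;
      await new Promise(resolve => setTimeout(resolve, 500 * (i + 1)));
    }
  }
  throw lastError;
}

// Usage sketch:
// const memory = await withRetry(() =>
//   client.memories.add({ content: 'Initial content', customId: 'user-note-001' })
// );
```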
## Single Delete

Delete individual memories by their ID. This is a permanent hard delete with no recovery mechanism.

```typescript
// Hard delete - permanently removes the memory
await client.memories.delete('memory_id_123');
console.log('Memory deleted successfully');
```
## Bulk Delete by IDs

Delete multiple memories at once by providing an array of memory IDs. Maximum of 100 IDs per request.

```typescript
// Bulk delete by memory IDs
const result = await client.memories.bulkDelete({
  ids: [
    'memory_id_1',
    'memory_id_2',
    'memory_id_3',
    'non_existent_id' // This will be reported in errors
  ]
});

console.log('Bulk delete result:', result);
// Output: {
//   success: true,
//   deletedCount: 3,
//   errors: [
//     { id: "non_existent_id", error: "Memory not found" }
//   ]
// }
```
## Bulk Delete by Container Tags

Delete all memories within specific container tags. This is useful for cleaning up entire projects or user data.

```typescript
// Delete all memories in specific container tags
const result = await client.memories.bulkDelete({
  containerTags: ['user-123', 'project-old', 'archived-content']
});

console.log('Bulk delete by tags result:', result);
// Output: {
//   success: true,
//   deletedCount: 45,
//   containerTags: ["user-123", "project-old", "archived-content"]
// }
```
## Advanced Patterns

### Soft Delete Implementation

For applications requiring audit trails or recovery mechanisms, implement a soft delete pattern using metadata:

```typescript
// Soft delete pattern using metadata
await client.memories.update('memory_id', {
  metadata: {
    deleted: true,
    deletedAt: new Date().toISOString(),
    deletedBy: 'user_123'
  }
});

// Filter out deleted memories in searches
const activeMemories = await client.memories.list({
  filters: JSON.stringify({
    AND: [
      { key: "deleted", value: "true", negate: true }
    ]
  })
});

console.log('Active memories:', activeMemories.results.length);
```
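To undo a soft delete, clear the flag with another metadata update. A small sketch: `activeFilter` just packages the same filter object used above so `list()` calls stay consistent, and `restoreMemory` assumes (per the Best Practices note below) that updates replace the specified metadata keys — both helpers are illustrative, not part of the SDK:

```typescript
// Minimal client shape this sketch relies on
type SoftDeleteClient = {
  memories: { update(id: string, body: object): Promise<unknown> };
};

// Build the "not deleted" filter used above
function activeFilter(): string {
  return JSON.stringify({
    AND: [{ key: 'deleted', value: 'true', negate: true }]
  });
}

// Restore a soft-deleted memory by flipping the metadata flag back
async function restoreMemory(client: SoftDeleteClient, id: string) {
  return client.memories.update(id, {
    metadata: { deleted: false, restoredAt: new Date().toISOString() }
  });
}
```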
### Batch Processing for Large Operations

```typescript
// Batch delete large numbers of memories safely,
// staying under the 100-ID bulk delete limit
async function batchDeleteMemories(memoryIds: string[], batchSize = 100) {
  const results = [];

  for (let i = 0; i < memoryIds.length; i += batchSize) {
    const batch = memoryIds.slice(i, i + batchSize);
    const batchNumber = Math.floor(i / batchSize) + 1;
    console.log(`Processing batch ${batchNumber} of ${Math.ceil(memoryIds.length / batchSize)}`);

    try {
      const result = await client.memories.bulkDelete({ ids: batch });
      results.push(result);

      // Brief delay between batches to avoid rate limiting
      if (i + batchSize < memoryIds.length) {
        await new Promise(resolve => setTimeout(resolve, 1000));
      }
    } catch (error) {
      console.error(`Batch ${batchNumber} failed:`, error);
      const message = error instanceof Error ? error.message : String(error);
      results.push({ success: false, error: message, batch });
    }
  }

  // Aggregate results
  const totalDeleted = results
    .filter(r => r.success)
    .reduce((sum, r) => sum + (r.deletedCount || 0), 0);

  console.log(`Total deleted: ${totalDeleted} out of ${memoryIds.length}`);
  return { totalDeleted, results };
}
```
## Best Practices

### Update Operations

- **Use `customId` for idempotent updates**: prevents duplicate memories and enables safe retries
- **Monitor processing status**: updates trigger the full reprocessing pipeline
- **Handle metadata carefully**: updates replace the specified metadata keys
- **Implement proper error handling**: a memory may be deleted between operations
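The last point above can be sketched concretely: an update can race with a delete, so wrap it and treat "not found" as a signal rather than a crash. This assumes SDK errors carry an HTTP-style `status` field (`isNotFound` and `safeUpdate` are illustrative helpers — verify the real error shape):

```typescript
// Narrow an unknown error to "memory no longer exists"
function isNotFound(error: unknown): boolean {
  return (error as { status?: number }).status === 404;
}

// Minimal client shape this sketch relies on
type UpdateClient = {
  memories: { update(id: string, body: object): Promise<unknown> };
};

async function safeUpdate(client: UpdateClient, id: string, body: object) {
  try {
    return await client.memories.update(id, body);
  } catch (error) {
    if (isNotFound(error)) {
      console.warn(`Memory ${id} was deleted before the update landed`);
      return null; // caller decides whether to recreate via add() + customId
    }
    throw error;
  }
}
```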
### Delete Operations

- **Hard delete is permanent**: no recovery mechanism exists
- **Use bulk operations efficiently**: maximum 100 IDs per bulk delete request
- **Consider soft delete patterns**: use metadata flags for recoverable deletion
- **Batch large operations**: avoid rate limits with proper batching
- **Clean up application state**: update your UI/cache after deletions
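The last bullet deserves a concrete shape: once a delete succeeds, the same IDs should disappear from any local cache. A sketch assuming the application keeps an in-memory `Map` of memories (the cache and `Memory` type are assumptions, not part of the SDK):

```typescript
type Memory = { id: string; content: string };

// Evict deleted IDs from a local cache; pure, so it is easy to test.
// Returns the number of entries actually removed.
function evictFromCache(cache: Map<string, Memory>, ids: string[]): number {
  let evicted = 0;
  for (const id of ids) {
    if (cache.delete(id)) evicted++;
  }
  return evicted;
}

// Usage sketch after a bulk delete:
// const result = await client.memories.bulkDelete({ ids });
// if (result.success) evictFromCache(cache, ids);
```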