Big update for Claude Desktop and Cursor users! You can now connect all your AI apps via a common memory layer in a minute. I used the Graphiti MCP server, which runs 100% locally, to move between AI apps like Claude Desktop and Cursor without losing context. (setup below)

Nov 5, 2025 · 6:31 AM UTC

1) Docker Setup

Deploy the Graphiti MCP server using Docker Compose. The compose setup starts the MCP server with Server-Sent Events (SSE) transport and includes a Neo4j container, which runs the database as a local instance. This also lets you query and visualize the knowledge graph in the Neo4j browser preview. FalkorDB works as an alternative backend. (A command sketch follows after the steps.)

2) Connect the MCP server to Cursor

With the server running, integrate it with the Cursor IDE. Go to: File → Preferences → Cursor Settings → MCP → Add new global MCP server. In the JSON file, add the config sketched below.

3) Connect the MCP server to Claude Desktop

Similarly, go to: File → Settings → Developer → Edit Config. In the JSON file, add the config sketched below.

Done! Now you can chat with Claude Desktop, share facts, store the responses in memory, and retrieve them from Cursor (and vice versa), as demonstrated in the video earlier.

Why Graphiti?

Agents forget everything after each task. Graphiti builds live temporal knowledge graphs so your AI agents always reason over real-time info. Integrating its MCP server with Claude/Cursor adds a powerful memory layer to all your interactions. It's 100% open source with 20k+ stars. I have shared the GitHub repo below.
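For step 1, a minimal sketch, assuming the mcp_server directory layout from recent versions of the Graphiti repo and an OpenAI key for entity extraction (directory names, env vars, and ports may differ in your version):

```sh
# Clone Graphiti and start the MCP server + Neo4j via Docker Compose
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server

# Assumption: the server reads the LLM key from the environment or a .env file
export OPENAI_API_KEY=your-key-here

# Starts the MCP server (SSE transport) and a local Neo4j instance
docker compose up -d

# Neo4j ships its browser on port 7474; open it to inspect the knowledge graph
# http://localhost:7474
```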
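For step 2, a sketch of the Cursor config (the server name graphiti-memory, port 8000, and /sse path are assumptions; match them to whatever your compose file actually exposes):

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```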
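For step 3, a sketch of the Claude Desktop config. Claude Desktop talks to MCP servers over stdio, so one common pattern is bridging to the local SSE endpoint with the mcp-remote package (server name and URL are the same assumptions as above):

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:8000/sse"]
    }
  }
}
```

Restart both apps, ask Claude Desktop to save a fact to memory, then ask Cursor to recall it; if both clients list the Graphiti tools, the shared layer is working.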
If you found it insightful, reshare it with your network. Find me → @_avichawla. Every day, I share tutorials and insights on DS, ML, LLMs, and RAG.
Replying to @_avichawla
Sounds like a solid upgrade! Keeping context across apps is a game changer. Kudos for making it seamless!
Thanks. The steps are pretty straightforward as well.
Replying to @_avichawla
This is indeed very powerful! It lets you switch apps without switching context! 🔥
Replying to @_avichawla
AI apps talking to each other like old friends is the tech version of a group chat finally making sense. It’s memory sharing without the gossip. Your tools syncing context means less copy-paste, more brain power, and maybe fewer existential crises mid-project.
Replying to @_avichawla
This is a game changer for AI integration
Replying to @_avichawla
Sounds like a game changer! Love seeing tools that simplify the process while keeping everything local. Smart move!
Replying to @_avichawla
Us lay folk would love to understand this now that it's all becoming accessible.
Replying to @_avichawla
the dream of all my tabs sharing their collective braincells is finally real
Replying to @_avichawla
Oh wow, that’s brilliant! Is it a community MCP server?
Replying to @_avichawla
But why?
Replying to @_avichawla
Connecting AI apps without losing context feels like a missing puzzle piece for productivity workflows. Excited to see this in action.
Replying to @_avichawla
That's a clever setup, Avi! Using a local server for cross-app AI integration is quite a jugaad, no?
Replying to @_avichawla
real build hours, let's see the magic
Replying to @_avichawla
This is huge for workflow continuity. The friction of context switching between AI tools is finally disappearing. Local memory layers that persist across apps = actual stateful automation instead of isolated conversations.
Replying to @_avichawla
Brilliant MCP implementation for cross app memory! Local execution with persistent context is game changing for AI workflows. Ship it!
Replying to @_avichawla
Does this require a Docker container running?
Replying to @_avichawla
Local processing helps address some privacy concerns.
Replying to @_avichawla
This is a game-changer for AI workflows! Having a common memory layer that runs locally means no context loss between tools. Graphiti MCP server sounds like the perfect solution for seamless AI collaboration! 🚀🧠
Replying to @_avichawla
Who cares
Replying to @_avichawla
local context sharing across tools without token bloat is the real unlock here
Replying to @_avichawla
Your AI apps just became best friends with shared memories, making context switching feel like magic ✨
Replying to @_avichawla
Next up: a tool that auto-appends "use MCP" and "save into memory using MCP" to your prompts. Amazing