## Getting Started

### Quick Installation
For standard usage without the advanced worker node, install the core package:
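The install command itself is not shown here; assuming the package is published under the same name as its CLI entry point, `marvin` (substitute the project's actual published name if it differs), it would look like:

```shell
# Hypothetical: install the core package from PyPI
# (the package name is an assumption based on the CLI entry point)
pip install marvin
```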
### Starting the Local MCP Gateway
If your AI agent supports configuring MCP servers via standard I/O streams:
```json
{
  "mcpServers": {
    "marvin": {
      "command": "marvin",
      "args": [
        "--vault-path",
        "~/.marvin_vault",
        "--transport",
        "stdio"
      ]
    }
  }
}
```
### Running the Advanced Cluster (Docker)

To use the Background Brain Worker (for automatic consolidation and deep knowledge graph extraction via Google's langextract):
- Clone the repository:
- Start the cluster:
- Download the NLP model into your local instance (only required on first boot!):
- Load the Marvin Skill. Marvin works best when your agent knows when to use it. Copy the contents of `src/marvin/skill.md` into your agent's custom system prompt or instructions field. This teaches the agent the K-Lines philosophy and instructs it to autonomously trigger sleep cycles and log episodes.
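The first three steps above might look like the following; note that the repository URL and the model-download command are placeholders, not the project's actual values:

```shell
# Hypothetical sketch of the cluster bring-up; substitute real values.
git clone <repository-url> marvin && cd marvin   # 1. clone (URL is a placeholder)
docker compose up -d                             # 2. start the cluster
# 3. first-boot model download; the actual command ships with the repository
docker compose exec worker <model-download-command>
```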
### Configuring Your Agent (MCP Clients)
Marvin communicates via the Model Context Protocol (MCP). Here is how to configure the most popular agentic harnesses to connect to the Dockerized Marvin cluster (which runs on http://localhost:8421/sse by default).
#### Goose
Add the following to your ~/.config/goose/config.yaml:
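A sketch of that entry, assuming Goose's `extensions` config schema with an SSE extension type (verify the exact keys against your Goose version):

```yaml
extensions:
  marvin:
    enabled: true
    type: sse
    uri: http://localhost:8421/sse
    timeout: 300
```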
#### Claude Desktop
Add the following to your Claude configuration file (claude_desktop_config.json):
Claude Desktop only speaks the stdio transport, so bridge the SSE endpoint through a stdio-to-SSE proxy such as the `mcp-remote` npm package:

```json
{
  "mcpServers": {
    "marvin": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:8421/sse"],
      "env": {}
    }
  }
}
```
(Alternatively, install Marvin locally and use the stdio command method shown in the Quick Installation above.)
#### Cursor
In Cursor, go to Settings > Features > MCP Servers and add a new server:
1. Click Add New MCP Server
2. Type: sse
3. URL: http://localhost:8421/sse
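Recent Cursor versions also read a project-level `.cursor/mcp.json` (or a global `~/.cursor/mcp.json`); an equivalent entry there would look roughly like:

```json
{
  "mcpServers": {
    "marvin": {
      "url": "http://localhost:8421/sse"
    }
  }
}
```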
#### OpenCode
For OpenCode CLI agents, provide the server via the configuration block or CLI flags depending on your version:
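A sketch of the configuration-file route, assuming OpenCode's `opencode.json` format with a `remote` MCP entry (check your version's documentation for the exact schema):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "marvin": {
      "type": "remote",
      "url": "http://localhost:8421/sse",
      "enabled": true
    }
  }
}
```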
#### Gemini
If you are using a Gemini-powered agent loop that supports MCP:
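For example, the Gemini CLI reads MCP servers from `~/.gemini/settings.json`; an SSE entry would look roughly like the following (verify the key name for your version — some releases use `url` for SSE and `httpUrl` for streamable HTTP):

```json
{
  "mcpServers": {
    "marvin": {
      "url": "http://localhost:8421/sse"
    }
  }
}
```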