
g1-brain 🧬

The memory and intelligence layer — graph memory, 4-stream knowledge retrieval, and structured reasoning for the Generate One platform.



Overview

g1-brain provides the platform's memory and intelligence layer. It combines graph-based persistent memory (Graphiti + Neo4j) for context across sessions, vector search (Qdrant) with 4-stream Reciprocal Rank Fusion for document retrieval, and structured reasoning tools for multi-step analysis. This is what gives the Generate One platform long-term memory, deep knowledge retrieval, and the ability to reason over accumulated facts as they change over time.

Two compose stacks are managed from this repo: memory-stack (Neo4j, Graphiti API, Redis, FalkorDB) and knowledge-mcp (knowledge-mcp server, async worker, Valkey queue).


🏗️ Architecture

```mermaid
graph TD
    subgraph "Memory Stack (z0ww84wkwwss8sw4kgsw88gc)"
        Neo4j["Neo4j 5.26\n(graph DB, port 7474/7687)"]
        GA["Graphiti API\n:8000 → memory.generate.one"]
        Redis["Redis 7-alpine\n(Graphiti cache, 256MB LRU)"]
        FalkorDB["FalkorDB\n(graph query engine)"]
        GA --> Neo4j
        GA --> Redis
    end

    subgraph "Knowledge Stack (kp3basi7wsdztrq1fgm7t543)"
        KM["knowledge-mcp\n:8000"]
        KW["knowledge-worker\n(async ingest pipeline)"]
        KV["Valkey 9.0.1\n(task queue)"]
        KW --> KV
        KM --> KV
    end

    subgraph "MCP Tools"
        GMCP["graphiti-mcp"]
        RMCP["reasoning-tools"]
        LMCP["logic-lm"]
        CMCP["cms-tools\n(Directus)"]
    end

    GMCP --> GA
    KM -.-> Qdrant["g1-llm / Qdrant"]
    KW -.-> Qdrant
    KW -.-> LiteLLM["g1-llm / LiteLLM"]

    Client["MCP Clients"] --> GMCP
    Client --> KM
    Client --> RMCP
    Client --> LMCP
    Client --> CMCP
```

📦 Services

Memory Stack (compose/memory-stack.yml)

| Service | Image | Port | Description |
|---------|-------|------|-------------|
| graphiti-api | Custom build | 8000 | REST API for Graphiti graph operations → memory.generate.one |
| neo4j | `neo4j:5.26-community` | 7474, 7687 | Graph database for entity/relationship storage |
| redis | `redis:7-alpine` | 6379 | Graphiti internal cache (256 MB LRU) |
| falkordb | `falkordb/falkordb:latest` | | Graph query engine (Cypher-compatible) |

Knowledge Stack (docker-compose.yml — service kp3basi7wsdztrq1fgm7t543)

| Service | Image | Port | Description |
|---------|-------|------|-------------|
| knowledge-mcp | `git.generate.one/generate-one/knowledge-mcp:latest` | 8000 | MCP server with 7 search/ingest tools |
| worker | `git.generate.one/generate-one/knowledge-worker:latest` | | Document ingest pipeline (OCR, chunking, embedding) |
| knowledge-valkey | `valkey/valkey:9.0.1-alpine` | 6379 | Task queue for async ingestion |

🔍 Knowledge Search: 4-Stream RRF

Queries are processed through 4 parallel retrieval streams, fused via Reciprocal Rank Fusion:

  1. Dense vectors — Semantic similarity via Qdrant
  2. BM25 sparse vectors — Keyword matching with IDF weighting
  3. Graphiti — Entity/relationship graph traversal
  4. Reranker — Cross-encoder rescoring (mxbai-rerank-large-v2)

Post-fusion temporal decay (30-day half-life) prioritizes recent information.
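The fusion step above can be sketched in Python. This is an illustrative reconstruction, not the actual knowledge-mcp implementation: the RRF damping constant `RRF_K = 60` is an assumed conventional value, and the function names are hypothetical; only the four-stream fusion and the 30-day half-life come from this README.

```python
import time

RRF_K = 60            # conventional RRF damping constant (assumed, not from the repo)
HALF_LIFE_DAYS = 30   # post-fusion temporal decay half-life (from this README)

def rrf_fuse(streams):
    """Fuse ranked result lists. Each stream is a list of (doc_id, timestamp)
    pairs in rank order; RRF scores each doc 1 / (K + rank) per stream."""
    scores, timestamps = {}, {}
    for ranked in streams:
        for rank, (doc_id, ts) in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (RRF_K + rank)
            timestamps[doc_id] = ts
    return scores, timestamps

def apply_temporal_decay(scores, timestamps, now=None):
    """Multiply each fused score by 0.5 ** (age_days / half_life),
    so a document one half-life old keeps 50% of its fused score."""
    now = now if now is not None else time.time()
    return {
        doc_id: score * 0.5 ** (((now - timestamps[doc_id]) / 86400) / HALF_LIFE_DAYS)
        for doc_id, score in scores.items()
    }
```

With this decay, two documents that tie on fused score diverge by recency: a 60-day-old document ends up at a quarter of the score of a fresh one (two half-lives).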


🔧 Configuration

| Variable | Description | Default |
|----------|-------------|---------|
| `NEO4J_AUTH` | Neo4j authentication (`neo4j/password` format) | |
| `GRAPHITI_API_KEY` | API key for Graphiti REST API | |
| `LITELLM_API_KEY` | LiteLLM key for LLM operations | |
| `VLM_OCR_ENABLED` | Enable VLM-based OCR for scanned PDFs | `true` |
| `VLM_OCR_MODEL` | Model tier for OCR | `g1-vlm` |
| `CLASSIFY_LLM_MODEL` | Model for document classification | `g1-llm-turbo` |
| `CONTEXTUAL_CHUNKING_ENABLED` | LLM-enriched chunk context | `true` |
| `CONTEXTUAL_LLM_MODEL` | Model for chunk enrichment | `g1-llm-micro` |
| `GRAPH_DB_PASSWORD` | Neo4j database password | |
| `FALKORDB_PASSWORD` | FalkorDB auth password | |
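A minimal sketch of how these variables could be read with the defaults from the table. The helper name is hypothetical; only the variable names and default values come from this README.

```python
import os

def load_knowledge_config():
    """Read the knowledge-stack env vars, falling back to the documented defaults.
    Boolean flags are compared case-insensitively against "true"."""
    return {
        "vlm_ocr_enabled": os.getenv("VLM_OCR_ENABLED", "true").lower() == "true",
        "vlm_ocr_model": os.getenv("VLM_OCR_MODEL", "g1-vlm"),
        "classify_llm_model": os.getenv("CLASSIFY_LLM_MODEL", "g1-llm-turbo"),
        "contextual_chunking_enabled":
            os.getenv("CONTEXTUAL_CHUNKING_ENABLED", "true").lower() == "true",
        "contextual_llm_model": os.getenv("CONTEXTUAL_LLM_MODEL", "g1-llm-micro"),
    }
```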

🏠 Multi-Tenant Isolation

| Service | Parameter | Default |
|---------|-----------|---------|
| graphiti-mcp | `group_id` | `main` |
| reasoning-tools | `group_id` | `main` |
| knowledge-mcp | `tenant_id` | `shared` |

Group IDs use dashes only (colons are rejected): `main`, `tenant-{id}`, `org-{org_id}-user-{user_id}`
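A validator matching the rule above could look like the following sketch. The regex and function are assumptions inferred from the documented patterns (lowercase dash-separated segments, colons rejected), not the actual graphiti-mcp code.

```python
import re

# Dash-separated alphanumeric segments, e.g. "main", "tenant-42",
# "org-7-user-3". Assumed shape; the real validator may differ.
GROUP_ID_RE = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

def validate_group_id(group_id: str) -> str:
    """Reject group IDs that contain colons or fall outside the dash-only format."""
    if ":" in group_id:
        raise ValueError(f"colons are not allowed in group_id: {group_id!r}")
    if not GROUP_ID_RE.match(group_id):
        raise ValueError(f"invalid group_id: {group_id!r}")
    return group_id
```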


🚀 Quick Start

```bash
# Memory stack (managed by Coolify)
cd /data/coolify/services/z0ww84wkwwss8sw4kgsw88gc
docker compose up -d

# Knowledge stack (locally-built images)
cd /data/coolify/services/kp3basi7wsdztrq1fgm7t543
docker build -t git.generate.one/generate-one/knowledge-worker:latest worker/
docker build -t git.generate.one/generate-one/knowledge-mcp:latest .
docker compose up -d

# Health checks
curl https://memory.generate.one/health
```

Note: knowledge-mcp and knowledge-worker use locally-built images with pull_policy: never. Always rebuild before docker compose up -d.
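For scripted deployments, the curl health check above can be wrapped in a small Python helper. This is an illustrative sketch; the endpoint URL comes from the Quick Start, while the helper itself and its timeout default are assumptions.

```python
import urllib.request

def check_health(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers HTTP 200 within the timeout.
    urllib's URLError subclasses OSError, so network failures land here too."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```

Usage: `check_health("https://memory.generate.one/health")` after bringing the memory stack up.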


🔗 Dependencies

Depends on:

  • g1-llm — Qdrant (vector storage), LiteLLM (inference for classification, enrichment, query rewriting)

Depended on by:

  • g1-mcp — graphiti-mcp and reasoning-tools connect to memory/knowledge backends
  • g1-gpt — LibreChat file search routes through knowledge-mcp RAG compatibility layer

| Repo | Relationship |
|------|--------------|
| g1-llm | Qdrant + LiteLLM for embeddings and inference |
| g1-mcp | graphiti-mcp, reasoning-tools hosted in mcp-stack |
| g1-gpt | LibreChat file search via knowledge-mcp |
| g1-core | Valkey for task queueing |

🛡️ Part of Generate One

Generate One — AI infrastructure that answers to you.

Self-hosted, sovereign AI platform. generate.one

Licensed under AGPL-3.0.