# Enterprise Knowledge Platform - Full Integration Plan

**Date:** 2025-11-05
**Decision:** Full Integration (Option 3)
**Estimated Effort:** 2-3 days
**Goal:** Merge the web UI with the Memory MCP for the complete feature set
## Executive Summary
**Objective:** Create a unified system with dual access modes (CLI via MCP + Web UI) sharing the same memory infrastructure.
**Target Architecture:**

```
                      User Interfaces
 ┌───────────────┐                 ┌───────────────┐
 │  Claude Code  │                 │  Web Browser  │
 │     (CLI)     │                 │   (Visual)    │
 └───────┬───────┘                 └───────┬───────┘
         │                                 │
         ▼                                 ▼
 ┌───────────────┐      Uses      ┌───────────────┐
 │ Memory Agent  │◄───────────────│    Next.js    │
 │  MCP Server   │                │   Frontend    │
 │ (FastMCP 2.0) │                │  (Port 3002)  │
 └───────┬───────┘                └───────┬───────┘
         │                                │
         └───────────────┬────────────────┘
                         ▼
              ┌─────────────────────┐
              │   Unified Backend   │
              │  (Hybrid FastAPI +  │
              │     MCP Bridge)     │
              │     (Port 8002)     │
              └──────────┬──────────┘
                         │
           ┌─────────────┼─────────────┐
           ▼             ▼             ▼
      ┌─────────┐   ┌─────────┐   ┌──────────┐
      │ Qdrant  │   │  Files  │   │ Postgres │
      │  978k   │   │  15.6k  │   │ Trading  │
      │ vectors │   │entities │   │ (shared) │
      │ SHARED  │   │ SHARED  │   │          │
      └─────────┘   └─────────┘   └──────────┘
```
**Key Changes:**
- ✅ Enterprise Platform runs on new ports (3002, 8002)
- ✅ Shares the existing Qdrant (6333)
- ✅ Shares the existing file storage (`~/Documents/memory/`)
- ✅ Uses the existing trading-postgres
- ✅ Memory MCP becomes the bridge between CLI and web
- ✅ Web uploads create markdown entities automatically
## Phase 1: Infrastructure Setup (Day 1, Morning - 4 hours)

### 1.1 Port Conflict Resolution

**Changes to `enterprise-knowledge-platform/docker-compose.yml`:**
```yaml
version: '3.8'

services:
  # REMOVED: postgres (use existing trading-postgres)
  # REMOVED: qdrant (use existing qdrant)
  # REMOVED: redis (use existing trading-redis)

  # KEEP: MinIO (unique to the enterprise platform)
  minio:
    image: minio/minio:latest
    container_name: knowledge-platform-minio
    ports:
      - "9000:9000"
      - "9001:9001"
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    volumes:
      - minio_data:/data
    command: server /data --console-address ":9001"
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"]
      interval: 10s
      timeout: 5s
      retries: 5

  # MODIFIED: backend (use external services)
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: knowledge-platform-backend
    ports:
      - "8002:8000"  # CHANGED: host port 8000 -> 8002
    environment:
      # Use existing trading-postgres
      DATABASE_URL: postgresql://trading_user:trading_pass@trading-postgres:5432/trading_db
      # Use existing Qdrant on the host
      QDRANT_HOST: host.docker.internal
      QDRANT_PORT: 6333
      # Use existing trading-redis
      REDIS_URL: redis://host.docker.internal:6379
      # Use MinIO from this stack
      MINIO_ENDPOINT: minio:9000
      MINIO_ACCESS_KEY: minioadmin
      MINIO_SECRET_KEY: minioadmin
      # NEW: memory system integration
      MEMORY_ENTITIES_PATH: /memory-entities
      MCP_SERVER_URL: http://host.docker.internal:8003  # Memory MCP bridge
    depends_on:
      minio:
        condition: service_healthy
    volumes:
      - ./backend:/app
      - ~/Documents/memory/entities:/memory-entities:ro  # read-only access to entities
    extra_hosts:
      - "host.docker.internal:host-gateway"
    command: uvicorn main:app --host 0.0.0.0 --port 8000 --reload

  # MODIFIED: frontend
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    container_name: knowledge-platform-frontend
    ports:
      - "3002:3000"  # CHANGED: host port 3000 -> 3002
    environment:
      NEXT_PUBLIC_API_URL: http://localhost:8002  # point to the new backend port
    depends_on:
      - backend
    volumes:
      - ./frontend:/app
      - /app/node_modules
      - /app/.next
    command: npm run dev

volumes:
  minio_data:

networks:
  default:
    name: knowledge-platform-network
```
**Verification Commands:**

```bash
# After changes
cd enterprise-knowledge-platform
docker-compose down -v
docker-compose up -d

# Verify no port conflicts
lsof -i :3002  # should show the frontend
lsof -i :8002  # should show the backend
lsof -i :9000  # should show MinIO

# Verify external services are reachable from the backend container
docker-compose exec backend curl http://host.docker.internal:6333/collections
docker-compose exec backend redis-cli -h host.docker.internal ping  # requires redis-cli in the image
```
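The same port checks can be scripted as a cross-platform alternative to `lsof`; a minimal stdlib sketch (the port list mirrors the remapped services above):

```python
import socket


def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # After `docker-compose up -d`, all three remapped ports should answer.
    for name, port in [("frontend", 3002), ("backend", 8002), ("minio", 9000)]:
        state = "open" if port_open("localhost", port) else "CLOSED"
        print(f"{name:8} :{port} -> {state}")
```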
### 1.2 Create MCP Bridge Service

**New file: `mcp_bridge/server.py`**
"""
MCP Bridge - FastAPI service that exposes Memory MCP functionality
to the Enterprise Platform backend via HTTP.
This allows the web UI to access memory functions without direct MCP integration.
"""
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
import subprocess
import json
import sys
from pathlib import Path
app = FastAPI(title="Memory MCP Bridge", version="1.0.0")
class MemoryQuery(BaseModel):
question: str
limit: int = 10
class MemoryStore(BaseModel):
content: str
entity_type: str = "note"
metadata: dict = {}
class SearchResult(BaseModel):
score: float
entity_id: str
content: str
metadata: dict
@app.get("/health")
async def health_check():
"""Health check endpoint"""
return {"status": "healthy", "service": "mcp-bridge"}
@app.post("/search", response_model=list[SearchResult])
async def search_memory(query: MemoryQuery):
"""
Search memory using MCP agent
Uses the mcp__memory-agent-stdio__use_memory_agent tool
to perform semantic search.
"""
try:
# Call the memory query tool
cmd = [
"uv", "run", "python3",
"tools/query_memory.py",
query.question,
"--limit", str(query.limit),
"--format", "json"
]
result = subprocess.run(
cmd,
capture_output=True,
text=True,
timeout=30,
cwd="/Users/bertfrichot/mem-agent-mcp"
)
if result.returncode != 0:
raise HTTPException(
status_code=500,
detail=f"Memory search failed: {result.stderr}"
)
results = json.loads(result.stdout)
return results
except subprocess.TimeoutExpired:
raise HTTPException(status_code=504, detail="Memory search timeout")
except json.JSONDecodeError as e:
raise HTTPException(
status_code=500,
detail=f"Failed to parse memory results: {str(e)}"
)
except Exception as e:
raise HTTPException(
status_code=500,
detail=f"Unexpected error: {str(e)}"
)
@app.post("/store")
async def store_memory(item: MemoryStore):
"""
Store content to memory system
Creates a new markdown entity with YAML frontmatter.
"""
try:
cmd = [
"uv", "run", "python3",
"tools/store_memory.py",
"--type", item.entity_type,
"--content", item.content,
"--metadata", json.dumps(item.metadata)
]
result = subprocess.run(
cmd,
capture_output=True,
text=True,
timeout=30,
cwd="/Users/bertfrichot/mem-agent-mcp"
)
if result.returncode != 0:
raise HTTPException(
status_code=500,
detail=f"Memory store failed: {result.stderr}"
)
return {"status": "success", "message": result.stdout.strip()}
except Exception as e:
raise HTTPException(
status_code=500,
detail=f"Store failed: {str(e)}"
)
@app.get("/stats")
async def get_stats():
"""Get memory system statistics"""
try:
entities_path = Path("~/Documents/memory/entities").expanduser()
stats = {
"total_entities": sum(1 for _ in entities_path.rglob("*.md")),
"entity_types": {},
"recent_updates": []
}
# Count by type
for entity_type_dir in entities_path.iterdir():
if entity_type_dir.is_dir():
count = sum(1 for _ in entity_type_dir.glob("*.md"))
stats["entity_types"][entity_type_dir.name] = count
return stats
except Exception as e:
raise HTTPException(
status_code=500,
detail=f"Stats failed: {str(e)}"
)
if __name__ == "__main__":
import uvicorn
uvicorn.run(app, host="0.0.0.0", port=8003)
**Start the MCP Bridge:**

```bash
# Create directory
mkdir -p mcp_bridge

# Save server.py, then start as a background service
uv run python3 mcp_bridge/server.py &

# Or create a LaunchAgent for a persistent service:
# ~/Library/LaunchAgents/com.mem-agent.mcp-bridge.plist
```

**Test the MCP Bridge:**

```bash
# Health check
curl http://localhost:8003/health

# Test search
curl -X POST http://localhost:8003/search \
  -H "Content-Type: application/json" \
  -d '{"question": "trading strategies", "limit": 5}'

# Test stats
curl http://localhost:8003/stats
```
## Phase 2: Backend Integration (Day 1, Afternoon - 4 hours)

### 2.1 Modify Enterprise Backend to Use MCP Bridge

**File: `enterprise-knowledge-platform/backend/services/memory_integration.py` (NEW)**
"""
Memory Integration Service
Connects Enterprise Platform backend to Memory MCP Bridge,
enabling web UI to access the existing memory system.
"""
import httpx
from typing import List, Dict, Optional
from pydantic import BaseModel
import os
class MemorySearchResult(BaseModel):
score: float
entity_id: str
content: str
metadata: Dict
class MemoryIntegrationService:
"""Service to interact with Memory MCP Bridge"""
def __init__(self):
self.bridge_url = os.getenv("MCP_SERVER_URL", "http://localhost:8003")
self.client = httpx.AsyncClient(timeout=30.0)
async def search(
self,
query: str,
limit: int = 10
) -> List[MemorySearchResult]:
"""
Search memory system via MCP bridge
Args:
query: Search query
limit: Maximum results
Returns:
List of search results with scores
"""
try:
response = await self.client.post(
f"{self.bridge_url}/search",
json={"question": query, "limit": limit}
)
response.raise_for_status()
results = response.json()
return [MemorySearchResult(**r) for r in results]
except httpx.HTTPError as e:
print(f"Memory search failed: {e}")
return []
async def store(
self,
content: str,
entity_type: str = "document",
metadata: Optional[Dict] = None
) -> bool:
"""
Store content to memory system
Args:
content: Content to store
entity_type: Type of entity (document, note, lesson, etc.)
metadata: Additional metadata
Returns:
True if successful
"""
try:
response = await self.client.post(
f"{self.bridge_url}/store",
json={
"content": content,
"entity_type": entity_type,
"metadata": metadata or {}
}
)
response.raise_for_status()
return True
except httpx.HTTPError as e:
print(f"Memory store failed: {e}")
return False
async def get_stats(self) -> Dict:
"""Get memory system statistics"""
try:
response = await self.client.get(f"{self.bridge_url}/stats")
response.raise_for_status()
return response.json()
except httpx.HTTPError:
return {}
async def close(self):
"""Close HTTP client"""
await self.client.aclose()
# Singleton instance
memory_service = MemoryIntegrationService()
### 2.2 Add Hybrid Search Endpoint

**File: `enterprise-knowledge-platform/backend/main.py` (MODIFY)**

Add a hybrid search endpoint that queries BOTH Qdrant (enterprise) and the Memory MCP:
```python
from typing import List, Optional

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

from services.memory_integration import memory_service
# ... existing imports ...


class HybridSearchRequest(BaseModel):
    query: str
    limit: int = 10
    min_score: float = 0.7
    search_scope: str = "both"  # "enterprise", "memory", "both"


class HybridSearchResult(BaseModel):
    source: str  # "enterprise" or "memory"
    score: float
    text: str
    filename: Optional[str] = None
    entity_id: Optional[str] = None
    metadata: Optional[dict] = None


@app.post("/search/hybrid", response_model=List[HybridSearchResult])
async def hybrid_search(request: HybridSearchRequest):
    """
    Hybrid search across the Enterprise Platform and Memory System.

    Combines results from:
    1. Enterprise Platform (uploaded documents in Qdrant)
    2. Memory System (markdown entities via MCP)
    """
    results = []

    # Search the enterprise platform (if the scope includes it)
    if request.search_scope in ["enterprise", "both"]:
        try:
            enterprise_results = await search_service.search(
                query=request.query,
                limit=request.limit,
                min_score=request.min_score,
            )
            for result in enterprise_results:
                results.append(HybridSearchResult(
                    source="enterprise",
                    score=result.score,
                    text=result.text,
                    filename=result.filename,
                    metadata=result.metadata,
                ))
        except Exception as e:
            print(f"Enterprise search failed: {e}")

    # Search the memory system (if the scope includes it)
    if request.search_scope in ["memory", "both"]:
        try:
            memory_results = await memory_service.search(
                query=request.query,
                limit=request.limit,
            )
            for result in memory_results:
                if result.score >= request.min_score:
                    results.append(HybridSearchResult(
                        source="memory",
                        score=result.score,
                        text=result.content,
                        entity_id=result.entity_id,
                        metadata=result.metadata,
                    ))
        except Exception as e:
            print(f"Memory search failed: {e}")

    # Sort by score (descending) and return the top N results
    results.sort(key=lambda x: x.score, reverse=True)
    return results[:request.limit]


@app.get("/stats/complete")
async def complete_stats():
    """Combined statistics from both systems."""
    enterprise_stats = await metrics_service.get_metrics()
    memory_stats = await memory_service.get_stats()
    return {
        "enterprise": enterprise_stats,
        "memory": memory_stats,
        "combined": {
            "total_documents": (
                enterprise_stats.get("total_documents", 0)
                + memory_stats.get("total_entities", 0)
            )
        },
    }


@app.on_event("shutdown")
async def shutdown_event():
    """Clean shutdown."""
    await memory_service.close()
```
### 2.3 Document Upload → Markdown Entity Conversion

**File: `enterprise-knowledge-platform/backend/services/document_service.py` (MODIFY)**

Add functionality to create markdown entities for uploaded documents:
```python
import hashlib
from datetime import datetime
from typing import List

from services.memory_integration import memory_service


class DocumentService:
    # ... existing code ...

    async def process_document_to_memory(
        self,
        document_id: str,
        filename: str,
        text_content: str,
        chunks: List[str],
    ) -> bool:
        """
        Create a markdown entity for an uploaded document.

        Stores in ~/Documents/memory/entities/documents/
        with proper YAML frontmatter.
        """
        try:
            metadata = {
                "source": "enterprise_platform_upload",
                "document_id": document_id,
                "filename": filename,
                "upload_date": datetime.utcnow().isoformat(),
                "chunk_count": len(chunks),
                "content_hash": hashlib.sha256(text_content.encode()).hexdigest()[:16],
            }

            # Format the content with frontmatter; the original text is
            # truncated to the first 5000 chars for the summary.
            content = f"""---
title: {filename}
type: document
source: enterprise_upload
document_id: {document_id}
uploaded: {metadata['upload_date']}
chunks: {len(chunks)}
---

# {filename}

## Original Content

{text_content[:5000]}

## Chunks

This document was automatically chunked into {len(chunks)} segments for vector search.

## Metadata

- **Document ID**: {document_id}
- **Filename**: {filename}
- **Upload Date**: {metadata['upload_date']}
- **Total Chunks**: {len(chunks)}
- **Content Hash**: {metadata['content_hash']}
"""

            # Store to the memory system
            return await memory_service.store(
                content=content,
                entity_type="document",
                metadata=metadata,
            )
        except Exception as e:
            print(f"Failed to create memory entity: {e}")
            return False
```
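Entities written in this format can later be read back by splitting on the `---` delimiters. A minimal stdlib sketch (an illustrative helper, not the platform's actual parser; it only handles the flat `key: value` pairs emitted above):

```python
def split_frontmatter(text: str) -> tuple[dict, str]:
    """Split a '---'-delimited frontmatter block from a markdown body.

    Handles only flat `key: value` pairs, like those written by
    process_document_to_memory; no general YAML parser is needed.
    """
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}, text  # no frontmatter block
    # Find the closing delimiter
    end = next((i for i, line in enumerate(lines[1:], start=1)
                if line.strip() == "---"), None)
    if end is None:
        return {}, text
    meta = {}
    for line in lines[1:end]:
        key, sep, value = line.partition(":")
        if sep:
            meta[key.strip()] = value.strip()
    return meta, "\n".join(lines[end + 1:])
```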
## Phase 3: Frontend Enhancement (Day 2, Morning - 4 hours)

### 3.1 Add Memory System Toggle to Search UI

**File: `enterprise-knowledge-platform/frontend/app/search/page.tsx` (MODIFY)**
```tsx
'use client';

import { useState } from 'react';
import { Search, FileText, Brain } from 'lucide-react';

type SearchScope = 'enterprise' | 'memory' | 'both';

interface HybridResult {
  source: 'enterprise' | 'memory';
  score: number;
  text: string;
  filename?: string;
  entity_id?: string;
}

export default function SearchPage() {
  const [query, setQuery] = useState('');
  const [scope, setScope] = useState<SearchScope>('both');
  const [results, setResults] = useState<HybridResult[]>([]);
  const [isLoading, setIsLoading] = useState(false);

  const handleSearch = async () => {
    if (!query.trim()) return;
    setIsLoading(true);
    try {
      const response = await fetch('http://localhost:8002/search/hybrid', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          query,
          limit: 20,
          min_score: 0.7,
          search_scope: scope
        })
      });
      const data = await response.json();
      setResults(data);
    } catch (error) {
      console.error('Search failed:', error);
    } finally {
      setIsLoading(false);
    }
  };

  return (
    <div className="max-w-6xl mx-auto p-8">
      <h1 className="text-4xl font-bold mb-8">Hybrid Search</h1>

      {/* Search Scope Selector */}
      <div className="mb-6 flex gap-4">
        <button
          onClick={() => setScope('both')}
          className={`flex items-center gap-2 px-4 py-2 rounded-lg ${
            scope === 'both'
              ? 'bg-blue-600 text-white'
              : 'bg-gray-200 hover:bg-gray-300'
          }`}
        >
          <FileText className="w-5 h-5" />
          <Brain className="w-5 h-5" />
          Both Systems
        </button>
        <button
          onClick={() => setScope('enterprise')}
          className={`flex items-center gap-2 px-4 py-2 rounded-lg ${
            scope === 'enterprise'
              ? 'bg-blue-600 text-white'
              : 'bg-gray-200 hover:bg-gray-300'
          }`}
        >
          <FileText className="w-5 h-5" />
          Uploaded Documents
        </button>
        <button
          onClick={() => setScope('memory')}
          className={`flex items-center gap-2 px-4 py-2 rounded-lg ${
            scope === 'memory'
              ? 'bg-blue-600 text-white'
              : 'bg-gray-200 hover:bg-gray-300'
          }`}
        >
          <Brain className="w-5 h-5" />
          Memory System
        </button>
      </div>

      {/* Search Input */}
      <div className="flex gap-4 mb-8">
        <input
          type="text"
          value={query}
          onChange={(e) => setQuery(e.target.value)}
          onKeyDown={(e) => e.key === 'Enter' && handleSearch()}
          placeholder="Search across all knowledge..."
          className="flex-1 px-4 py-3 border rounded-lg text-lg"
        />
        <button
          onClick={handleSearch}
          disabled={isLoading}
          className="px-8 py-3 bg-blue-600 text-white rounded-lg hover:bg-blue-700 disabled:opacity-50"
        >
          <Search className="w-6 h-6" />
        </button>
      </div>

      {/* Results */}
      <div className="space-y-4">
        {results.map((result, idx) => (
          <div
            key={idx}
            className="p-6 border rounded-lg hover:shadow-lg transition-shadow"
          >
            <div className="flex items-center gap-3 mb-3">
              {result.source === 'enterprise' ? (
                <FileText className="w-5 h-5 text-blue-600" />
              ) : (
                <Brain className="w-5 h-5 text-purple-600" />
              )}
              <span className="text-sm font-semibold text-gray-600 uppercase">
                {result.source}
              </span>
              <span className="ml-auto text-sm font-mono bg-gray-100 px-3 py-1 rounded">
                {(result.score * 100).toFixed(1)}% match
              </span>
            </div>
            <p className="text-gray-800 leading-relaxed">{result.text}</p>
            {result.filename && (
              <p className="mt-3 text-sm text-gray-500">
                From: {result.filename}
              </p>
            )}
            {result.entity_id && (
              <p className="mt-3 text-sm text-gray-500">
                Entity: {result.entity_id}
              </p>
            )}
          </div>
        ))}
      </div>

      {results.length === 0 && !isLoading && (
        <div className="text-center py-12 text-gray-500">
          No results yet. Try searching for something!
        </div>
      )}
    </div>
  );
}
```
### 3.2 Add Memory Stats to Dashboard

**File: `enterprise-knowledge-platform/frontend/app/page.tsx` (MODIFY)**

Add memory system stats (entity counts from the `/stats/complete` endpoint) to the home dashboard.
## Phase 4: Testing & Validation (Day 2, Afternoon - 4 hours)

### 4.1 Integration Test Suite

**File: `tests/integration/test_hybrid_system.py` (NEW)**
"""
Integration tests for hybrid Enterprise Platform + Memory MCP system
"""
import pytest
import httpx
from pathlib import Path
BASE_URL = "http://localhost:8002"
MCP_BRIDGE_URL = "http://localhost:8003"
@pytest.mark.asyncio
async def test_mcp_bridge_health():
"""Test MCP bridge is accessible"""
async with httpx.AsyncClient() as client:
response = await client.get(f"{MCP_BRIDGE_URL}/health")
assert response.status_code == 200
assert response.json()["status"] == "healthy"
@pytest.mark.asyncio
async def test_enterprise_backend_health():
"""Test enterprise backend is accessible"""
async with httpx.AsyncClient() as client:
response = await client.get(f"{BASE_URL}/health")
assert response.status_code == 200
@pytest.mark.asyncio
async def test_hybrid_search():
"""Test hybrid search across both systems"""
async with httpx.AsyncClient() as client:
response = await client.post(
f"{BASE_URL}/search/hybrid",
json={
"query": "trading strategies",
"limit": 10,
"search_scope": "both"
}
)
assert response.status_code == 200
results = response.json()
assert isinstance(results, list)
# Verify both sources present (if data exists)
sources = {r["source"] for r in results}
# Should have results from at least one source
@pytest.mark.asyncio
async def test_document_upload_creates_entity():
"""Test that document upload creates memory entity"""
# Upload test document
test_file = Path("tests/fixtures/test.txt")
test_file.write_text("This is a test document for integration testing.")
async with httpx.AsyncClient() as client:
with open(test_file, "rb") as f:
response = await client.post(
f"{BASE_URL}/documents/upload",
files={"file": f}
)
assert response.status_code == 200
doc_id = response.json()["document_id"]
# Wait for processing
await asyncio.sleep(2)
# Verify entity created in memory system
memory_stats = await client.get(f"{MCP_BRIDGE_URL}/stats")
stats = memory_stats.json()
assert "document" in stats["entity_types"]
# Cleanup
test_file.unlink()
@pytest.mark.asyncio
async def test_combined_stats():
"""Test combined stats from both systems"""
async with httpx.AsyncClient() as client:
response = await client.get(f"{BASE_URL}/stats/complete")
assert response.status_code == 200
stats = response.json()
assert "enterprise" in stats
assert "memory" in stats
assert "combined" in stats
### 4.2 Manual Testing Checklist

**Test Scenario 1: Web Upload → Claude Access**
1. ✅ Upload a PDF via the web UI (http://localhost:3002/upload)
2. ✅ Verify the document appears in the web UI documents list
3. ✅ Open the Claude Code CLI
4. ✅ Query the memory MCP: "What documents were uploaded recently?"
5. ✅ Verify the PDF content appears in the results

**Test Scenario 2: Claude Store → Web Access**
1. ✅ In Claude Code: store a new entity via MCP
2. ✅ Open the web UI search (http://localhost:3002/search)
3. ✅ Search for the entity content
4. ✅ Verify the entity appears in the search results
5. ✅ Verify the source shows as "memory"

**Test Scenario 3: Hybrid Search**
1. ✅ Upload a document via the web UI
2. ✅ Store an entity via Claude Code
3. ✅ Search the web UI with a query matching both
4. ✅ Verify results from both sources appear
5. ✅ Verify correct source labels
6. ✅ Verify score sorting

**Test Scenario 4: Analytics**
1. ✅ Perform multiple searches
2. ✅ Upload multiple documents
3. ✅ Check http://localhost:3002/metrics
4. ✅ Verify the combined stats show both systems
5. ✅ Verify entity counts are accurate
## Phase 5: Documentation & Rollout (Day 3, Morning - 3 hours)

### 5.1 Update System Documentation

**Files to Update:**
1. `CLAUDE.md` - Add hybrid system usage
2. `ARCHITECTURE.md` - Update architecture diagrams
3. `enterprise-knowledge-platform/README.md` - Add integration notes
4. `tools/CLAUDE.md` - Document new workflows

**New Commands Section for CLAUDE.md:**
````markdown
## Hybrid Knowledge System

### Access Modes

**Claude Code (CLI)**:

```bash
# Semantic search via MCP
uv run python3 tools/query_memory.py "trading strategies"

# Store a new entity
uv run python3 tools/store_memory.py --type lesson --content "Always paper test first"
```

**Web UI**:
- Upload: http://localhost:3002/upload
- Search: http://localhost:3002/search
- Browse: http://localhost:3002/documents
- Analytics: http://localhost:3002/metrics

### Workflows

**Upload Document (Web → Claude Access)**:
1. Drag a PDF to http://localhost:3002/upload
2. The document is auto-indexed to Qdrant and a markdown entity is created
3. Accessible via Claude Code memory queries

**Create Entity (Claude → Web Access)**:
1. Store via MCP: `mcp__memory-agent-stdio__use_memory_agent`
2. Auto-indexed to Qdrant
3. Searchable in the web UI immediately

**Hybrid Search**:
- Toggle search scope: Both | Uploaded Docs | Memory System
- Results show their source (enterprise/memory)
- Sorted by relevance score
````
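The merge-and-rank behavior behind the hybrid scope toggle can also be shown in isolation; a sketch of what the `/search/hybrid` endpoint does with the two result sets (function and field names are illustrative):

```python
def merge_results(enterprise: list[dict], memory: list[dict],
                  limit: int = 10, min_score: float = 0.7) -> list[dict]:
    """Tag each hit with its source, drop low-scoring hits,
    sort by score descending, and keep the top `limit`."""
    combined = (
        [{"source": "enterprise", **r} for r in enterprise]
        + [{"source": "memory", **r} for r in memory]
    )
    kept = [r for r in combined if r["score"] >= min_score]
    kept.sort(key=lambda r: r["score"], reverse=True)
    return kept[:limit]
```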
### 5.2 Create Startup Script
**File: `scripts/start_hybrid_system.sh` (NEW)**
```bash
#!/bin/bash
# Start Hybrid Knowledge System
# Starts both Enterprise Platform and MCP Bridge
set -e
echo "🚀 Starting Hybrid Knowledge System..."
# Check prerequisites
echo "Checking prerequisites..."
if ! command -v docker &> /dev/null; then
echo "❌ Docker not found. Please install Docker Desktop."
exit 1
fi
if ! docker info &> /dev/null; then
echo "❌ Docker not running. Please start Docker Desktop."
exit 1
fi
# Start MCP Bridge
echo "Starting MCP Bridge (port 8003)..."
cd ~/mem-agent-mcp
uv run python3 mcp_bridge/server.py &
MCP_PID=$!
echo "✅ MCP Bridge started (PID: $MCP_PID)"
# Wait for MCP bridge to be ready
sleep 2
# Start Enterprise Platform
echo "Starting Enterprise Platform (ports 3002, 8002)..."
cd ~/mem-agent-mcp/enterprise-knowledge-platform
docker-compose up -d
# Wait for services
echo "Waiting for services to be ready..."
sleep 10
# Health checks
echo "Running health checks..."
curl -f http://localhost:8003/health > /dev/null 2>&1 && echo "✅ MCP Bridge healthy" || echo "❌ MCP Bridge failed"
curl -f http://localhost:8002/health > /dev/null 2>&1 && echo "✅ Backend healthy" || echo "❌ Backend failed"
curl -f http://localhost:3002 > /dev/null 2>&1 && echo "✅ Frontend healthy" || echo "❌ Frontend failed"
echo ""
echo "🎉 Hybrid Knowledge System is running!"
echo ""
echo "Access Points:"
echo " - Web UI: http://localhost:3002"
echo " - API Docs: http://localhost:8002/docs"
echo " - MCP Bridge: http://localhost:8003/health"
echo " - Claude Code: Use Memory MCP as usual"
echo ""
echo "To stop: docker-compose -f ~/mem-agent-mcp/enterprise-knowledge-platform/docker-compose.yml down"
echo " kill $MCP_PID"
**Make executable:**

```bash
chmod +x scripts/start_hybrid_system.sh
```
### 5.3 Create LaunchAgent for MCP Bridge

**File: `~/Library/LaunchAgents/com.mem-agent.mcp-bridge.plist`**
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.mem-agent.mcp-bridge</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Users/bertfrichot/.local/bin/uv</string>
        <string>run</string>
        <string>python3</string>
        <string>mcp_bridge/server.py</string>
    </array>
    <key>WorkingDirectory</key>
    <string>/Users/bertfrichot/mem-agent-mcp</string>
    <key>StandardOutPath</key>
    <string>/tmp/mcp_bridge.log</string>
    <key>StandardErrorPath</key>
    <string>/tmp/mcp_bridge_error.log</string>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```
**Load the LaunchAgent:**

```bash
launchctl load ~/Library/LaunchAgents/com.mem-agent.mcp-bridge.plist
launchctl start com.mem-agent.mcp-bridge
```
## Phase 6: Optimization (Day 3, Afternoon - 2 hours)

### 6.1 Caching Strategy

Add Redis caching for frequent queries:
```python
# In memory_integration.py
import json

import redis


class MemoryIntegrationService:
    def __init__(self):
        self.redis = redis.Redis(host='localhost', port=6379, decode_responses=True)
        self.cache_ttl = 300  # 5 minutes

    async def search(self, query: str, limit: int = 10):
        # Check the cache first
        cache_key = f"memory_search:{query}:{limit}"
        cached = self.redis.get(cache_key)
        if cached:
            return [MemorySearchResult(**r) for r in json.loads(cached)]

        # Perform the search (_do_search wraps the original bridge call)
        results = await self._do_search(query, limit)

        # Cache the results as plain dicts (pydantic models are not
        # directly JSON-serializable)
        self.redis.setex(
            cache_key,
            self.cache_ttl,
            json.dumps([r.dict() for r in results]),
        )
        return results
```
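Using the raw query string in the key works, but keys grow with query length and "Trading Strategies" vs "trading strategies" get separate cache entries. A possible refinement is hashing a normalized form (a sketch; the `memory_search` prefix matches the keys above):

```python
import hashlib


def cache_key(query: str, limit: int, prefix: str = "memory_search") -> str:
    """Deterministic, bounded-length cache key for a search request.

    Whitespace and case are normalized first, so trivially different
    spellings of the same query share one cache entry.
    """
    normalized = " ".join(query.lower().split())
    digest = hashlib.sha256(f"{normalized}:{limit}".encode()).hexdigest()[:16]
    return f"{prefix}:{digest}"
```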
### 6.2 Performance Monitoring

Add Prometheus metrics:

```python
from prometheus_client import Counter, Histogram

# Metrics
hybrid_searches = Counter('hybrid_searches_total', 'Total hybrid searches')
search_duration = Histogram('search_duration_seconds', 'Search duration')
memory_mcp_calls = Counter('memory_mcp_calls_total', 'MCP bridge calls')
```
## Rollback Plan

### If Integration Fails

**Step 1: Stop New Services**

```bash
# Stop the MCP bridge
launchctl stop com.mem-agent.mcp-bridge

# Stop the Enterprise Platform
cd enterprise-knowledge-platform
docker-compose down -v
```

**Step 2: Restore Original Configuration**

```bash
# Restore docker-compose.yml
git checkout enterprise-knowledge-platform/docker-compose.yml

# Remove integration code
rm mcp_bridge/server.py
rm enterprise-knowledge-platform/backend/services/memory_integration.py
```

**Step 3: Verify Original System**

```bash
# Test that the memory MCP still works
uv run python3 tools/query_memory.py "test query"

# Test that Qdrant is still accessible
curl http://localhost:6333/collections
```
## Success Criteria

### Must Have (Day 3 End)
- ✅ Web UI accessible on port 3002
- ✅ Backend API accessible on port 8002
- ✅ MCP Bridge running on port 8003
- ✅ Hybrid search returns results from both systems
- ✅ Document upload creates markdown entities
- ✅ Claude Code can access web-uploaded documents
- ✅ Web UI can search memory entities
- ✅ No port conflicts
- ✅ All health checks pass
### Nice to Have
- ⚠️ Redis caching implemented
- ⚠️ Prometheus metrics added
- ⚠️ Advanced analytics dashboard
- ⚠️ Real-time sync (currently batch)
## Timeline Summary

**Day 1 (8 hours)**:
- Morning: Infrastructure setup, port resolution (4h)
- Afternoon: Backend integration, MCP bridge (4h)

**Day 2 (8 hours)**:
- Morning: Frontend enhancements (4h)
- Afternoon: Testing & validation (4h)

**Day 3 (5 hours)**:
- Morning: Documentation & rollout (3h)
- Afternoon: Optimization (2h)

**Total: 21 hours (~2.5 days)**
## Next Steps

- **NOW**: Review this plan
- **After approval**: Execute Phase 1 (infrastructure)
- **Then**: Phase 2 (backend integration)
- **Then**: Phases 3-6

Ready to proceed with Phase 1 infrastructure setup?

---

**Version:** 1.0
**Status:** Ready for execution
**Estimated Completion:** 2-3 days
**Risk Level:** Medium (well-defined rollback plan)