Welcome to kinetic-context
An MCP server for getting information about open-source dependencies
kinetic-context is an MCP (Model Context Protocol) server that helps developers get information about open-source dependencies. It provides a Docker image that you can run locally, exposing an MCP server that AI coding tools can connect to for querying dependency information.
What is kinetic-context?
kinetic-context allows you to:
- Query dependencies - Ask questions about how to use open-source packages
- Manage projects - Organize dependencies for different projects
- Version control - Use specific versions/tags of dependencies per project
- AI-powered answers - Get intelligent answers about dependencies, powered by OpenCode
The server clones each dependency's repository and uses OpenCode (an AI coding agent) to answer questions about how to use that dependency.
Quick Start
Get started with kinetic-context in minutes. See the Installation Guide for detailed manual setup instructions (recommended), or use the automated script:
# Manual installation is recommended - see Installation guide
# Alternative: Automated script (use at your own risk)
bash <(curl -fsSL https://raw.githubusercontent.com/christopher-kapic/kinetic-context/master/setup.sh)
# Authenticate to GitHub Container Registry (optional - for pre-built OpenCode image)
docker login ghcr.io
# Start kinetic-context
kctx start
Then access the web UI at http://localhost:7167.
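Once the server is running, you can point an MCP-capable client at it. The sketch below shows the common `mcpServers` shape used by many MCP clients; the exact config keys depend on your client, and the `/mcp` endpoint path is an assumption here - check the MCP Server page for the actual endpoint.

```json
{
  "mcpServers": {
    "kinetic-context": {
      "url": "http://localhost:7167/mcp"
    }
  }
}
```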
Getting Started
Learn how to set up and use kinetic-context
Installation
Detailed installation instructions
CLI Reference
Complete kctx command reference
Configuration
Configure OpenCode, packages, and projects
MCP Server
Learn about the MCP server tools
Features
- MCP Server - Model Context Protocol server for dependency queries
- Web Interface - Full-featured browser-based UI with:
  - Chat Interface - Interactive chat for querying dependencies with conversation history
  - Chat History - Persistent chat sessions stored locally in your browser
  - Model Management - Configure AI providers and models for dependency queries
  - Settings - Customize agent prompts and system behavior
- Package Management - Create, edit, import, and export package configurations
- Project Management - Organize dependencies across multiple projects
- Project Scanning - Automatically discover dependencies from your projects
- Docker Support - Easy deployment with Docker
- Version Management - Pin specific versions of dependencies per project
- AI Integration - Powered by OpenCode for intelligent code analysis
- Conversation Continuity - Maintain context across multiple queries using session IDs
How It Works
- Configure Packages - Add package configurations to define which dependencies are available (via web UI or JSON files)
- Create Projects - Set up projects and associate them with specific dependencies (via web UI or JSON files)
- Configure Models - Set up AI providers and models in the Models page (web UI) or opencode.json config file
- Query Dependencies - Use the MCP server or web UI chat interface to ask questions about dependencies
- Get Answers - OpenCode analyzes the dependency code and provides intelligent answers
- Continue Conversations - Use session IDs to maintain context across multiple queries for follow-up questions
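For the model-configuration step, a minimal opencode.json might select a default model. The keys below follow OpenCode's published config format (a `$schema` reference and a `provider/model` string); the model ID shown is just an example - substitute a provider and model you have credentials for.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "anthropic/claude-sonnet-4-20250514"
}
```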
Architecture
- Server - Hono-based API server with MCP endpoint
- Web Frontend - React application with TanStack Router
- OpenCode Integration - AI-powered code analysis
- Docker - Containerized deployment for easy setup