Context Engine for AI Coding Tools · Open Source

Your AI reads 10,000 lines
to find 1 function.

Indexa gives it exactly what it needs.

Index your codebase once. Get back only the relevant symbols, dependencies, and execution flows. 50-70% fewer tokens. Better answers.

50-70%
Token reduction
19
MCP tools
22
CLI commands
<2s
Query time
Terminal
$ npx indexa-mcp setup
Project detected: my-app (typescript/react)
Indexed 8,500 chunks in 15s
MCP configured for Claude Code
Ready! Try: "explain the auth flow"

One query. Full system understanding.

You ask:
"trace the login flow"
Indexa returns:
VendorAuthGuard
  → VendorFlowGuard
    → verifyPkceSession
      → getAppSessionCookie
        → useVcAuthStore
9 steps · 5 files · 2,500 tokens · Done.
You ask:
"what breaks if I change UserService?"
Indexa returns:
12 references across 8 files
Blast radius: LoginController, AuthMiddleware,
SessionManager, TokenValidator, +4 more
Impact mapped before you refactor.
You ask:
"explain the theme system"
Indexa returns:
ThemeSwitcher → useUIStore → ThemeApplier
4 themes: default, cyberpunk, minimal, matrix
CSS variables swap via data-theme attribute
3 symbols · 1,789 tokens · Complete picture.

Most tools help you find code. Indexa helps AI understand it.

AI agents waste up to 70% of their tokens reading full files

Every time an AI reads your code, it consumes thousands of tokens on imports, boilerplate, and irrelevant functions. Indexa changes that.

Without Indexa

Full File Reading

AI agent reads auth.ts (450 lines) just to understand one function.

Reads 12 files to answer a single question.

Burns through context window in minutes.

~10,000 tokens
With Indexa

Smart Code Retrieval

Returns only the relevant validateToken() function and its dependencies.

Semantic search finds exactly what the AI needs.

Context window stays efficient and focused.

~3,000 tokens

Three steps to smarter AI coding

From install to intelligent code retrieval in under 30 seconds.

1

Install

One command installs Indexa globally and configures it for your AI tools.

$ npm install -g indexa-mcp
2

Index

Automatically parses your codebase into semantic chunks with ML embeddings.

$ npx indexa-mcp setup
3

Query

AI agents get exactly the code they need through 19 specialized MCP tools.

"explain the auth flow"

Everything you need for intelligent code retrieval

Built for developers who want their AI agents to work smarter, not harder.

50-70% Token Reduction

Returns only relevant code chunks instead of full files. Your AI agents use fewer tokens and produce better results.

Local-First

Runs entirely on your machine. No cloud dependency, no API keys, no recurring costs. Your code never leaves your device.

ML Embeddings

Powered by Transformers.js with gte-small (384-dim) for genuine semantic understanding of your code.

Hybrid Search

Combines 35% semantic, 25% BM25, 15% name matching, and 25% path scoring for precise, relevant results.
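The blend above can be sketched as a simple weighted sum. This is a minimal illustration of the stated weights, not Indexa's internal API; the `CandidateScores` interface and `hybridScore` function are hypothetical names.

```typescript
// Hypothetical per-candidate scores, each normalized to [0, 1].
interface CandidateScores {
  semantic: number; // embedding cosine similarity
  bm25: number;     // keyword relevance
  name: number;     // symbol-name match
  path: number;     // file-path match
}

// Blend the four signals with the weights stated above (they sum to 1).
function hybridScore(s: CandidateScores): number {
  return 0.35 * s.semantic + 0.25 * s.bm25 + 0.15 * s.name + 0.25 * s.path;
}

// Example: a chunk that matches semantically and by path, but not by name.
const score = hybridScore({ semantic: 0.9, bm25: 0.4, name: 0.0, path: 0.8 });
// 0.35*0.9 + 0.25*0.4 + 0.15*0 + 0.25*0.8 = 0.615
```

Because the weights sum to 1, the final score stays in the same [0, 1] range as its inputs, which keeps results from different signals directly comparable.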

Query Intent Classification

Automatically detects whether you want flow analysis, explanations, references, debugging, or general search.
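A classifier like this can be sketched with keyword heuristics. The rules below are illustrative only; Indexa's actual intent detection may work differently.

```typescript
type Intent = "flow" | "explain" | "references" | "debug" | "search";

// Hypothetical keyword rules, checked in priority order.
const rules: Array<[Intent, RegExp]> = [
  ["flow", /\b(trace|flow|call chain|execution)\b/i],
  ["references", /\b(references?|usages?|who calls|what breaks)\b/i],
  ["debug", /\b(bug|error|broken|crash|fix)\b/i],
  ["explain", /\b(explain|how does|what does|understand)\b/i],
];

// First matching rule wins; anything else falls back to general search.
function classify(query: string): Intent {
  for (const [intent, pattern] of rules) {
    if (pattern.test(query)) return intent;
  }
  return "search";
}
```

For example, `classify("trace the login flow")` yields `"flow"`, while an unmatched query like `"find all TODO comments"` falls through to `"search"`.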

VS Code Extension

Inline AI commands: Explain This, Fix This, Refactor, Generate Tests. Right-click menu + keyboard shortcuts. Auto-index on save.

Zero-Config Setup

npx indexa-mcp setup — indexes, configures MCP, creates CLAUDE.md, verifies. Claude Code uses Indexa automatically.

Code Quality Built In

Dead code detection, circular dependencies, unused exports, duplicate code finder, security scan — all in one tool.

Battle-Tested Reliability

Binary file protection, corrupt index recovery, embedding fallback, buffer overflow guards. 23 integration tests. Built for daily use.

Real numbers, real savings

Measured across real-world codebases. Indexa consistently delivers 50-70% token reduction.

Query Type          | Without Indexa  | With Indexa    | Savings
--------------------|-----------------|----------------|--------
Explain a function  | ~8,500 tokens   | ~2,800 tokens  | 67%
Trace auth flow     | ~15,000 tokens  | ~5,200 tokens  | 65%
Find references     | ~12,000 tokens  | ~4,100 tokens  | 66%
Debug an issue      | ~10,000 tokens  | ~3,500 tokens  | 65%
Search for patterns | ~6,000 tokens   | ~2,100 tokens  | 65%
Average             | ~10,300 tokens  | ~3,540 tokens  | 66%
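The averages in the last row follow directly from the five query types above:

```typescript
const without = [8500, 15000, 12000, 10000, 6000];
const withIndexa = [2800, 5200, 4100, 3500, 2100];

const avg = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;

const avgWithout = avg(without);          // 10300
const avgWith = avg(withIndexa);          // 3540
const savings = 1 - avgWith / avgWithout; // ≈ 0.656 → 66%
```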

19 specialized tools for every query

Search, trace, analyze, and review code. Dead code detection, circular deps, duplicate finder, security scans, PR review, and more.

context_bundle
Smart context packaging that bundles only the relevant code chunks for a query.
search
Hybrid semantic + keyword search across your entire codebase with ranked results.
explain
Retrieves code with surrounding context optimized for explanation queries.
flow
Traces execution flow across files, following function calls and data paths.
symbol
Looks up specific symbols — functions, classes, interfaces, types — by name.
file
Returns the full content of a specific file when complete context is needed.
references
Finds all references and usages of a symbol across your project files.
index
Re-indexes your codebase to pick up new changes and update embeddings.
stats
Shows indexing statistics — chunk counts, file coverage, embedding status.
dead_code
Find unreferenced functions, methods, and classes across the entire codebase.
blast_radius
What breaks if you change a symbol? Direct refs, transitive impact, affected files.
circular_deps
Detect circular dependencies between files. Break import cycles before they break you.
duplicates
Find near-duplicate code via embedding similarity. Spot copy-paste patterns instantly.
code_grep
Regex pattern search across all indexed source files. Find console.log, TODO, hardcoded URLs.
security_scan
Deep security analysis grouped by OWASP domains — auth, XSS, secrets, crypto.
review_pr
Context-aware PR review — changed files, blast radius, connections, review checklist.
impact_chain
Full transitive impact analysis — trace every symbol affected across all depths.
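Tools like blast_radius and impact_chain boil down to a walk over a reverse-dependency graph. Here is a minimal sketch of that idea; the `RefGraph` type and `blastRadius` function are illustrative names, not Indexa's implementation.

```typescript
// Reverse-dependency graph: symbol -> symbols that reference it.
type RefGraph = Map<string, string[]>;

// Breadth-first walk collecting everything that transitively depends on `root`.
// The visited set also guards against cycles in the graph.
function blastRadius(graph: RefGraph, root: string): Set<string> {
  const affected = new Set<string>();
  const queue = [root];
  while (queue.length > 0) {
    const current = queue.shift()!;
    for (const dependent of graph.get(current) ?? []) {
      if (!affected.has(dependent)) {
        affected.add(dependent);
        queue.push(dependent);
      }
    }
  }
  return affected;
}

// Toy graph echoing the UserService example earlier on this page.
const graph: RefGraph = new Map([
  ["UserService", ["LoginController", "AuthMiddleware"]],
  ["AuthMiddleware", ["SessionManager"]],
]);
// blastRadius(graph, "UserService") -> LoginController, AuthMiddleware, SessionManager
```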

Up and running in 30 seconds

Three commands. That is all it takes.

bash
# Install globally
$ npm install -g indexa-mcp

# Setup in your project (detects, indexes, configures MCP)
$ npx indexa-mcp setup

# Or run directly without installing
$ npx indexa-mcp setup
Claude Code MCP config (~/.mcp.json)
{
  "mcpServers": {
    "indexa": {
      "command": "npx",
      "args": ["-y", "indexa-mcp", "stdio"]
    }
  }
}

How Indexa processes your code

A lightweight pipeline that transforms raw source files into semantically searchable chunks.

Input
Source Files
Parse
AST Chunker
Embed
Transformers.js
Store
Local Index
Serve
MCP Server
Smart Parsing
AST-aware chunking preserves function boundaries and semantic meaning
384-dim Embeddings
GTE-small model via Transformers.js for semantic understanding
Hybrid Ranking
Semantic + BM25 + name + path scoring for best results
100% Local
Everything runs on your machine. Zero network calls required
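The parse step can be illustrated with a deliberately naive chunker. This is a sketch only: Indexa uses real AST-aware parsing, not the regex shortcut below, and the `chunkByFunction` name is hypothetical.

```typescript
interface Chunk {
  name: string; // symbol the chunk covers
  text: string; // source text of the chunk
}

// Naive illustration: split a TS/JS source on top-level function declarations.
// Real AST chunking also handles classes, arrow functions, nesting, and
// exact brace matching instead of a lazy regex.
function chunkByFunction(source: string): Chunk[] {
  const chunks: Chunk[] = [];
  const pattern = /function\s+(\w+)[\s\S]*?\n\}/g;
  let match: RegExpExecArray | null;
  while ((match = pattern.exec(source)) !== null) {
    chunks.push({ name: match[1], text: match[0] });
  }
  return chunks;
}
```

Each chunk then gets an embedding and lands in the local index, where the MCP server can serve it per-symbol instead of per-file.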

Why not just use RAG / Copilot / Grep?

Different tools. Different capabilities. See where Indexa wins.

Capability            | Generic RAG     | Copilot         | Grep / Glob  | Indexa
----------------------|-----------------|-----------------|--------------|------------------------------
Returns               | Raw text chunks | Full file reads | Line matches | Symbols + deps + connections
Understands structure | ✗               | ✗               | ✗            | ✓ AST-parsed
Traces execution      | ✗               | ✗               | ✗            | ✓ Call chains across files
Blast radius          | ✗               | ✗               | ✗            | ✓ What breaks if you change X
Dead code detection   | ✗               | ✗               | ✗            | ✓ Unreferenced symbols
Circular dependencies | ✗               | ✗               | ✗            | ✓ Import cycle detection
Duplicate code finder | ✗               | ✗               | ✗            | ✓ Embedding similarity
Token efficiency      | Poor            | Poor            | Medium       | 50-70% reduction
Runs locally          | Sometimes       | ✗               | ✓            | Always
Free                  | Sometimes       | ✗               | ✓            | Free forever

Built by Prashant Singh

Module Lead & Frontend–AI Specialist with 6+ years building scalable systems.

Prashant Singh

Module Lead | Frontend – AI & Migration Specialist

Experienced Senior Frontend Engineer skilled in full-stack development, API integration, and Agile methodologies. Proficient in Angular, React, Next.js, TypeScript, and modern DevOps. Expert in building scalable, high-performance web applications with a focus on AI-powered tooling.

6+ Years Exp.
50+ Features
35+ Projects
847 Contributions
AngularJS → React migrations at scale
AI tools that cut dev effort by 40%
Performance wins: LCP 4.2s → 1.8s
Systems serving 100K+ users

Like Indexa? Give it a star

Help other developers discover Indexa. A star on GitHub goes a long way.

Star on GitHub

Start saving tokens today

One command. Zero configuration. Free forever.