XiFanCoder

An extensible AI agent CLI coding tool that provides a unified AI coding experience: swappable LLM backends, a growing plugin ecosystem, and MCP (Model Context Protocol) as the universal tool bus.

Node.js 18+ · TypeScript · MCP · MIT License

Core Features

Agent Loop

Context build → request construction → tool dispatch → result feedback. A complete, inspectable loop with sub-agent parallelism and prompt-injection detection.
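The four phases above can be sketched as a single loop. This is a minimal illustration, not XiFanCoder's actual API: the `Backend` interface, `runAgentLoop`, and the tool map are all hypothetical names.

```typescript
// Hypothetical sketch of the four-phase agent loop.
type Message = { role: "user" | "assistant" | "tool"; content: string };
type ToolCall = { name: string; args: Record<string, unknown> } | null;

interface Backend {
  // Returns either a final answer (toolCall === null) or a tool call to dispatch.
  complete(context: Message[]): { text: string; toolCall: ToolCall };
}

function runAgentLoop(
  task: string,
  backend: Backend,
  tools: Record<string, (args: unknown) => string>,
  maxTurns = 8,
): string {
  const context: Message[] = [{ role: "user", content: task }]; // 1. context build
  for (let turn = 0; turn < maxTurns; turn++) {
    const reply = backend.complete(context);                    // 2. request construction
    if (!reply.toolCall) return reply.text;                     // model is done
    const result = tools[reply.toolCall.name](reply.toolCall.args); // 3. tool dispatch
    context.push({ role: "assistant", content: reply.text });
    context.push({ role: "tool", content: result });            // 4. result feedback
  }
  return "max turns reached";
}
```

The `maxTurns` bound keeps a misbehaving model from looping forever; sub-agent parallelism would fan out multiple such loops from a parent context.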

100+ LLM Backends

Built-in drivers for Anthropic, OpenAI, and Ollama, plus an optional LiteLLM Proxy driver routing to 100+ models. Switch backends with a single /model command.
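One way such routing could work, with the LiteLLM proxy as the catch-all fallback. The driver names match the list above; the registry shape and the prefix heuristics are assumptions for illustration.

```typescript
// Hypothetical driver registry behind the /model command.
interface LLMDriver {
  name: string;
  supports(model: string): boolean;
}

const drivers: LLMDriver[] = [
  { name: "anthropic", supports: (m) => m.startsWith("claude") },
  { name: "openai",    supports: (m) => m.startsWith("gpt") },
  { name: "ollama",    supports: (m) => m.includes(":") },  // e.g. "qwen2.5-coder:7b"
  { name: "litellm",   supports: () => true },              // proxy catches everything else
];

// First matching driver wins, so the LiteLLM proxy acts as the fallback route.
function resolveDriver(model: string): string {
  return drivers.find((d) => d.supports(model))!.name;
}
```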

Plugins & MCP Bus

First-class plugins: Aider (Git-aware multi-file edits), Open Interpreter (code execution), smol-dev (project scaffolding). MCP tool bus exposes tools to any IDE via stdio or WebSocket.

Cross-Session Memory

xifan-mem auto-captures observations, performs three-step progressive retrieval, and generates session summaries. SQLite + FTS5 for local, private persistence.
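The three retrieval steps can be illustrated with a toy in-memory index. The real xifan-mem uses SQLite + FTS5; this sketch only mimics the progressive widening (exact match → token match → recency fallback) with assumed function and field names.

```typescript
// Illustrative three-step progressive retrieval (not the actual FTS5 schema).
interface Observation {
  id: number;
  text: string;
  ts: number; // capture timestamp
}

function retrieve(query: string, store: Observation[], limit = 3): Observation[] {
  const q = query.toLowerCase();
  // Step 1: exact phrase match (cheapest, highest precision)
  let hits = store.filter((o) => o.text.toLowerCase().includes(q));
  // Step 2: per-token match (approximates an FTS5 MATCH query)
  if (hits.length === 0) {
    const tokens = q.split(/\s+/);
    hits = store.filter((o) => tokens.some((t) => o.text.toLowerCase().includes(t)));
  }
  // Step 3: fall back to the most recent observations
  if (hits.length === 0) hits = [...store].sort((a, b) => b.ts - a.ts);
  return hits.slice(0, limit);
}
```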

Token & Cost Tracking

Real-time token metering, per-session cost summaries, and budget limits with warn / pause / stop policies. Know what every request costs before you run it.
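The escalation from warn to pause to stop might look like the sketch below. The policy names come from the feature list; the specific thresholds (75%, 90%, 100%) are assumptions for illustration.

```typescript
// Hypothetical budget-policy check; thresholds are illustrative.
type BudgetAction = "ok" | "warn" | "pause" | "stop";

function budgetAction(spentUsd: number, limitUsd: number): BudgetAction {
  const ratio = spentUsd / limitUsd;
  if (ratio >= 1.0) return "stop";   // hard cap reached: refuse further requests
  if (ratio >= 0.9) return "pause";  // near the cap: require confirmation
  if (ratio >= 0.75) return "warn";  // notify, but keep going
  return "ok";
}
```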

Tiered Permissions

Four-tier tool permissions (L0 read → L1 write → L2 shell → L3 dangerous). Headless mode denies elevated tools by default; API keys are read from env only, never written to disk.
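A minimal sketch of the tier check, assuming the L0–L3 ordering above and the headless default-deny. The tool-to-tier mapping and function names are illustrative, not XiFanCoder's internals.

```typescript
// Illustrative four-tier permission gate (L0 read → L3 dangerous).
enum Tier {
  L0_Read = 0,
  L1_Write = 1,
  L2_Shell = 2,
  L3_Dangerous = 3,
}

const toolTier: Record<string, Tier> = {
  read_file: Tier.L0_Read,
  list_dir: Tier.L0_Read,
  write_file: Tier.L1_Write,
  bash_execute: Tier.L2_Shell,
};

// Headless mode caps the allowed tier at L0, matching the default-deny policy;
// unknown tools are treated as L3 and therefore denied unless explicitly allowed.
function isAllowed(tool: string, maxTier: Tier, headless: boolean): boolean {
  const cap = headless ? Tier.L0_Read : maxTier;
  return (toolTier[tool] ?? Tier.L3_Dangerous) <= cap;
}
```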

Architecture

C4 Level 2 container view. L1 core (Node.js only) runs independently; L2 enhancements (Python-based) plug in as optional capability extensions.

graph TB
  subgraph CLI["📦 cli — User Interaction"]
    REPL["REPL / TUI (Ink)"]
    Slash["Slash Commands Router"]
    SessionMgr["Session Manager"]
  end
  subgraph Core["📦 core — Agent Loop Engine"]
    Loop["Agent Loop<br/>State Machine"]
    CB["ContextBuilder"]
    TD["ToolDispatcher<br/>+ Permissions"]
    SAM["SubAgentManager"]
    subgraph LLMLayer["LLM Drivers"]
      Builtin["BuiltinTSDriver<br/>Anthropic / OpenAI / Ollama"]
      LiteLLM["LiteLLMProxyDriver (L2)"]
    end
    subgraph ToolsLayer["Built-in Tools"]
      Tools["read_file / write_file<br/>bash_execute / web_fetch / list_dir"]
    end
    MCPClient["MCP Client"]
  end
  subgraph Mem["📦 xifan-mem — Memory"]
    MemMgr["MemoryManager"]
    MemDB["SQLite + FTS5"]
  end
  subgraph Extension["🔌 Extension Boundary (Optional)"]
    subgraph PluginBus["plugin-bus"]
      SmolDev["smol-dev (Node)"]
      Aider["Aider (Python, L2)"]
      OI["Open Interpreter (L2)"]
    end
    MCPServer["MCP Server<br/>WebSocket + stdio"]
  end
  subgraph Storage["💾 ~/.xifan/"]
    SDB["sessions.db"]
    MDB["memory.db"]
    Config["config.yaml"]
  end
  REPL --> Loop
  Slash --> Loop
  SessionMgr --> Loop
  Loop --> CB --> TD
  Loop --> SAM
  TD --> ToolsLayer
  TD --> MCPClient
  Loop --> LLMLayer
  Loop --> MemMgr
  MemMgr --> MemDB
  TD --> PluginBus
  MCPServer --> TD
  Core --> Storage
  Mem --> Storage
  style CLI fill:#1a2a4a,stroke:#00F0FF,color:#fff
  style Core fill:#2a1a3a,stroke:#B026FF,color:#fff
  style Mem fill:#1a3a2a,stroke:#00C853,color:#fff
  style Extension fill:#3a1a2a,stroke:#ff6b9d,color:#fff
  style Storage fill:#1a1a1a,stroke:#666,color:#fff

Why XiFan-Coder

Open & Extensible

  • Source-readable codebase, no minified bundles
  • First-class plugin SDK for tools, drivers, and sub-agents
  • MCP tool bus — bring your own MCP server, instantly usable
  • Cross-session memory layer (xifan-mem)

Built for Practitioners

  • Token usage and cost tracking out of the box
  • IDE integration (VS Code and beyond)
  • Documentation and version governance built-in
  • Stable in restricted networks — no Anthropic-only lock-in

Quick Start

L1 core layer runs on Node.js ≥ 18 alone — no Python environment required.

1. Install globally via npm

# npm
npm install -g @xifan-coder/cli

# or pnpm
pnpm add -g @xifan-coder/cli

2. Configure your LLM backend

# Set API key via environment variable
export ANTHROPIC_API_KEY=sk-ant-...
# or OPENAI_API_KEY, or point at a local Ollama / LiteLLM Proxy

# Generate project-level XIFAN.md
xifan-coder init --config
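
The architecture diagram also shows a user-level ~/.xifan/config.yaml. A hypothetical example is sketched below; every key name here is an assumption, not the tool's documented schema.

```yaml
# Hypothetical ~/.xifan/config.yaml — key names are illustrative.
model: claude-sonnet-4      # default model; switch at runtime with /model
driver: anthropic           # anthropic | openai | ollama | litellm
budget:
  limit_usd: 5.00
  policy: warn              # warn | pause | stop
permissions:
  max_tier: L2              # L0 read → L3 dangerous
headless:
  deny_elevated: true       # matches the documented headless default
```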

3. Start the interactive agent

# Interactive REPL
xifan-coder

# One-shot task
xifan-coder "refactor src/auth.ts"

# Use a specific model
xifan-coder --model qwen2.5-coder

# Headless mode for CI (denies write/shell by default)
xifan-coder --headless "run tests and fix failures"

Build from source or contribute:

github.com/e2ec-it/XiFanCoder

Privacy Policy

XiFan-Coder runs entirely on your local machine. Configuration, API keys, session history, and memory are stored locally — nothing is uploaded to E2E Consulting or any third-party telemetry service.

XiFan-Coder communicates only with the LLM endpoints and MCP servers you configure. No analytics, crash reporting, or telemetry SDKs are included.

Contact

If you have questions about this privacy policy, contact us at tech-support01@e2ec.biz.

Last updated: April 7, 2026