OpenFeature MCP Server
The OpenFeature Model Context Protocol (MCP) Server enables AI coding assistants to interact with OpenFeature through a standardized protocol. It provides SDK installation guidance and feature flag evaluation capabilities directly within your AI-powered development environment.
The OpenFeature MCP Server is a local tool that connects AI coding assistants (like Cursor, Claude Code, VS Code, and Windsurf) to OpenFeature functionality. It acts as a bridge between your AI assistant and OpenFeature capabilities, enabling intelligent code generation and migration, SDK installation guidance, and feature flag evaluation.
This server is published to the MCP Registry under dev.openfeature/mcp.
⚠️ AI Agent Behavior: AI agents are non-deterministic and may not complete tasks correctly. Always manually review their changes before committing. If you encounter issues, please open an issue with details about your AI agent (e.g., Claude Code + Sonnet 4.5, Cursor + gpt-5-codex) with the commands you used and the behavior you saw.
Quick Start
NPX Install
The easiest way to use the OpenFeature MCP Server is through NPX, which requires no installation:
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
NPM Global Install
You can install the MCP server globally:
npm install -g @openfeature/mcp
Then configure your AI assistant to use the global installation:
{
  "mcpServers": {
    "OpenFeature": {
      "command": "openfeature-mcp"
    }
  }
}
AI Assistant Configuration
Cursor
📦 Install in Cursor
To open Cursor and automatically add the OpenFeature MCP, click the install button above.
Alternatively, navigate to Cursor Settings -> Tools & MCP -> New MCP Server and add the following to ~/.cursor/mcp.json:
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
VS Code
📦 Install in VS Code
To open VS Code and automatically add the OpenFeature MCP, click the install button above.
Alternatively, add the following to .vscode/mcp.json in your workspace (VS Code uses a top-level "servers" key):
{
  "servers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
Claude Code
Add the server via the Claude Code CLI:
claude mcp add --transport stdio openfeature npx -y @openfeature/mcp
Then manage the connection with /mcp in the CLI.
Windsurf
Open Windsurf's Manage MCP servers raw config and add:
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
Codex CLI
Edit ~/.codex/config.toml:
[mcp_servers.openfeature]
command = "npx"
args = ["-y", "@openfeature/mcp"]
Restart Codex CLI after saving.
Gemini CLI
Edit ~/.gemini/settings.json:
{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
Restart Gemini CLI after saving.
Claude Desktop
Edit your Claude Desktop config at:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
Add the following configuration:
{
  "mcpServers": {
    "openfeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}
Restart Claude Desktop after saving.
Available Tools
The OpenFeature MCP Server provides two main tools accessible to AI assistants:
SDK Installation Guide: install_openfeature_sdk
Fetches installation instructions for OpenFeature SDKs in various languages and frameworks. Optionally includes provider-specific setup documentation.
SDK Tool Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| technology | string | Yes | Target language/framework (see supported list below) |
| providers | string[] | No | Provider identifiers whose installation instructions should be included |
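For illustration, an assistant setting up the Node.js SDK with the flagd provider might call this tool with arguments like the following (the exact provider identifier is an assumption; check the supported provider list for the correct value):
{
  "technology": "nodejs",
  "providers": ["flagd"]
}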
Supported Technologies
The technology list is built from the available prompts/*.md files and is updated automatically by scripts/build-prompts.js.
| Technology | SDK |
|---|---|
| android | Android Kotlin SDK |
| dotnet | .NET SDK |
| go | Go SDK |
| ios | iOS Swift SDK |
| java | Java SDK |
| javascript | JavaScript Web SDK |
| nestjs | NestJS SDK |
| nodejs | Node.js SDK |
| php | PHP SDK |
| python | Python SDK |
| react | React SDK |
| ruby | Ruby SDK |
Supported Providers
The provider list is automatically sourced from the OpenFeature ecosystem (open-feature/openfeature.dev repo).
See scripts/build-providers.js for details on how the provider list is maintained.
OFREP Flag Evaluation: ofrep_flag_eval
Evaluate feature flags using the OpenFeature Remote Evaluation Protocol (OFREP). Supports both single flag and bulk evaluation.
OFREP Tool Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| base_url | string | No | Base URL of your OFREP-compatible flag service |
| flag_key | string | No | Flag key for single evaluation (omit for bulk evaluation) |
| context | object | No | Evaluation context (e.g., { targetingKey: "user-123" }) |
| etag | string | No | ETag for bulk evaluation caching |
| auth | object | No | Authentication configuration |
| auth.bearer_token | string | No | Bearer token for authorization |
| auth.api_key | string | No | API key for authorization |
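As a sketch, a single-flag evaluation could be requested with arguments along these lines; omitting flag_key switches the tool to bulk evaluation (the flag key and targeting key below are illustrative):
{
  "flag_key": "new-checkout-flow",
  "context": { "targetingKey": "user-123" }
}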
OFREP Configuration
To use OFREP flag evaluation features, configure authentication and endpoint details. The server checks configuration in this priority order:
1. Environment variables:
   - OPENFEATURE_OFREP_BASE_URL or OFREP_BASE_URL
   - OPENFEATURE_OFREP_BEARER_TOKEN or OFREP_BEARER_TOKEN
   - OPENFEATURE_OFREP_API_KEY or OFREP_API_KEY
2. Configuration file: ~/.openfeature-mcp.json
Example ~/.openfeature-mcp.json:
{
  "OFREP": {
    "baseUrl": "https://flags.example.com",
    "bearerToken": "<your-token>",
    "apiKey": "<your-api-key>"
  }
}
You can override the config file path using the OPENFEATURE_MCP_CONFIG_PATH environment variable.
Note: All logs are written to stderr. The MCP protocol messages use stdout.
MCP Usage Examples
SDK Installation Example
"install the OpenFeature SDK for Node.js with the flagd provider"
The AI assistant will use the install_openfeature_sdk tool to fetch the relevant installation instructions and attempt to install the OpenFeature SDK with the correct provider.
Flag Evaluation Example
When interacting with your AI assistant:
"Can you check the value of the 'new-checkout-flow' feature flag for 'user-123'?"
The AI assistant will use the ofrep_flag_eval tool to evaluate the flag via OFREP and report the result, along with additional metadata such as the variant and reason.
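The exact fields depend on your flag service, but an OFREP single-flag evaluation result generally includes the key, value, variant, and reason; the values shown below are purely hypothetical:
{
  "key": "new-checkout-flow",
  "value": true,
  "variant": "treatment",
  "reason": "TARGETING_MATCH"
}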