OpenFeature MCP Server

The OpenFeature Model Context Protocol (MCP) Server enables AI coding assistants to interact with OpenFeature through a standardized protocol. It provides SDK installation guidance and feature flag evaluation capabilities directly within your AI-powered development environment.

The OpenFeature MCP Server is a local tool that connects AI coding assistants (like Cursor, Claude Code, VS Code, and Windsurf) to OpenFeature functionality. It acts as a bridge between your AI assistant and OpenFeature capabilities, enabling intelligent code generation and migration, SDK installation guidance, and feature flag evaluation.

This server is published to the MCP Registry under dev.openfeature/mcp.

Quick Start

NPX Install

The easiest way to use the OpenFeature MCP Server is through NPX, which requires no installation:

{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}

NPM Global Install

You can install the MCP server globally:

npm install -g @openfeature/mcp

Then configure your AI assistant to use the global installation:

{
  "mcpServers": {
    "OpenFeature": {
      "command": "openfeature-mcp"
    }
  }
}

AI Assistant Configuration

Cursor

📦 Install in Cursor

Clicking the install button above opens Cursor and adds the OpenFeature MCP server automatically.

Alternatively, navigate to Cursor Settings -> Tools & MCP -> New MCP Server, which adds the entry to ~/.cursor/mcp.json:

{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}

VS Code

📦 Install in VS Code

Clicking the install button above opens VS Code and adds the OpenFeature MCP server automatically.

Alternatively, add to .vscode/mcp.json:

{
  "servers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}

Claude Code

Add the server via the Claude Code CLI:

claude mcp add --transport stdio openfeature npx -y @openfeature/mcp

Then manage the connection with /mcp in the CLI.

Windsurf

In the Manage MCP servers raw config, add:

{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}

Codex CLI

Edit ~/.codex/config.toml:

[mcp_servers.openfeature]
command = "npx"
args = ["-y", "@openfeature/mcp"]

Restart Codex CLI after saving.

Gemini CLI

Edit ~/.gemini/settings.json:

{
  "mcpServers": {
    "OpenFeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}

Restart Gemini CLI after saving.

Claude Desktop

Edit your Claude Desktop config at:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Add the following configuration:

{
  "mcpServers": {
    "openfeature": {
      "command": "npx",
      "args": ["-y", "@openfeature/mcp"]
    }
  }
}

Restart Claude Desktop after saving.

Available Tools

The OpenFeature MCP Server provides two main tools accessible to AI assistants:

SDK Installation Guide: install_openfeature_sdk

Fetches installation instructions for OpenFeature SDKs in various languages and frameworks. Optionally includes provider-specific setup documentation.

SDK Tool Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| technology | string | Yes | Target language/framework (see supported list below) |
| providers | string[] | No | Provider identifiers whose installation instructions should be included |
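
For illustration, an assistant invoking this tool might send a tools/call payload like the following; the technology and provider values are examples, not a fixed set:

```json
{
  "method": "tools/call",
  "params": {
    "name": "install_openfeature_sdk",
    "arguments": {
      "technology": "nodejs",
      "providers": ["flagd"]
    }
  }
}
```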

Supported Technologies

The technology list is generated from the available prompts/*.md files and kept up to date automatically by scripts/build-prompts.js.

| Technology | SDK |
| --- | --- |
| android | Android Kotlin SDK |
| dotnet | .NET SDK |
| go | Go SDK |
| ios | iOS Swift SDK |
| java | Java SDK |
| javascript | JavaScript Web SDK |
| nestjs | NestJS SDK |
| nodejs | Node.js SDK |
| php | PHP SDK |
| python | Python SDK |
| react | React SDK |
| ruby | Ruby SDK |

Supported Providers

The provider list is automatically sourced from the OpenFeature ecosystem (open-feature/openfeature.dev repo).

See scripts/build-providers.js for details on how the provider list is maintained.

OFREP Flag Evaluation: ofrep_flag_eval

Evaluates feature flags using the OpenFeature Remote Evaluation Protocol (OFREP). Supports both single-flag and bulk evaluation.

OFREP Tool Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| base_url | string | No | Base URL of your OFREP-compatible flag service |
| flag_key | string | No | Flag key for single evaluation (omit for bulk evaluation) |
| context | object | No | Evaluation context (e.g., { targetingKey: "user-123" }) |
| etag | string | No | ETag for bulk evaluation caching |
| auth | object | No | Authentication configuration |
| auth.bearer_token | string | No | Bearer token for authorization |
| auth.api_key | string | No | API key for authorization |
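
As a sketch, the arguments for a single-flag evaluation call might look like this (all values are illustrative):

```json
{
  "name": "ofrep_flag_eval",
  "arguments": {
    "base_url": "https://flags.example.com",
    "flag_key": "new-checkout-flow",
    "context": { "targetingKey": "user-123" },
    "auth": { "bearer_token": "<your-token>" }
  }
}
```

Omitting flag_key switches to bulk evaluation, where the optional etag enables response caching.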

OFREP Configuration

To use OFREP flag evaluation features, configure authentication and endpoint details. The server checks configuration in this priority order:

  1. Environment Variables

    • OPENFEATURE_OFREP_BASE_URL or OFREP_BASE_URL
    • OPENFEATURE_OFREP_BEARER_TOKEN or OFREP_BEARER_TOKEN
    • OPENFEATURE_OFREP_API_KEY or OFREP_API_KEY
  2. Configuration File: ~/.openfeature-mcp.json
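
The lookup order above could be sketched roughly as follows; resolveOfrepBaseUrl is a hypothetical helper for illustration, not part of the server's API:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Resolve the OFREP base URL: environment variables win over the config file.
function resolveOfrepBaseUrl(): string | undefined {
  // 1. Environment variables (either prefix is accepted)
  const fromEnv =
    process.env.OPENFEATURE_OFREP_BASE_URL ?? process.env.OFREP_BASE_URL;
  if (fromEnv) return fromEnv;

  // 2. Config file (path overridable via OPENFEATURE_MCP_CONFIG_PATH)
  const configPath =
    process.env.OPENFEATURE_MCP_CONFIG_PATH ??
    path.join(os.homedir(), ".openfeature-mcp.json");
  try {
    const config = JSON.parse(fs.readFileSync(configPath, "utf8"));
    return config?.OFREP?.baseUrl;
  } catch {
    return undefined; // no config file, or unparsable JSON
  }
}
```

The same precedence applies to the bearer token and API key.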

Example ~/.openfeature-mcp.json:

{
  "OFREP": {
    "baseUrl": "https://flags.example.com",
    "bearerToken": "<your-token>",
    "apiKey": "<your-api-key>"
  }
}

You can override the config file path using the OPENFEATURE_MCP_CONFIG_PATH environment variable.

Note: All logs are written to stderr. The MCP protocol messages use stdout.

MCP Usage Examples

SDK Installation Example

"install the OpenFeature SDK for Node.js with the flagd provider"

The AI assistant will use the MCP server to fetch the relevant installation instructions and attempt to install the OpenFeature SDK with the correct provider.

Flag Evaluation Example

When interacting with your AI assistant:

"Can you check the value of the 'new-checkout-flow' feature flag for 'user-123'?"

The AI assistant will use the MCP server to evaluate the flag via OFREP and return the result, along with metadata such as the variant and reason.
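
Under the hood, OFREP defines single-flag evaluation as a POST to /ofrep/v1/evaluate/flags/{key}, with bulk evaluation dropping the key. A rough sketch of how such a request could be assembled; buildOfrepRequest is a hypothetical helper, not part of the MCP server's API:

```typescript
interface OfrepRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

// Build an OFREP evaluation request: single-flag when flagKey is given,
// bulk evaluation otherwise.
function buildOfrepRequest(
  baseUrl: string,
  flagKey: string | undefined,
  context: Record<string, unknown>,
  bearerToken?: string
): OfrepRequest {
  const pathPart = flagKey
    ? `/ofrep/v1/evaluate/flags/${encodeURIComponent(flagKey)}`
    : "/ofrep/v1/evaluate/flags";
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
  };
  if (bearerToken) headers["Authorization"] = `Bearer ${bearerToken}`;
  return {
    url: baseUrl.replace(/\/$/, "") + pathPart, // avoid double slashes
    method: "POST",
    headers,
    body: JSON.stringify({ context }), // evaluation context in the request body
  };
}

const req = buildOfrepRequest(
  "https://flags.example.com",
  "new-checkout-flow",
  { targetingKey: "user-123" }
);
console.log(req.url);
// https://flags.example.com/ofrep/v1/evaluate/flags/new-checkout-flow
```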