
Setup Guide

Learn how to configure OpenAI Codex to access Vaadin documentation through the Model Context Protocol server

OpenAI Codex provides native support for HTTP-based MCP servers, making it straightforward to connect to the Vaadin MCP server so that generated code can draw on current Vaadin documentation.

Requirements: Codex CLI version 0.43 or later
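
If you are unsure which version you have installed, the CLI's standard version flag (assumed here) reports it:

Source code
bash
# Should print a version number of 0.43 or later
codex --version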

Configuration

Add the following to ~/.codex/config.toml:

Source code
toml
[mcp_servers.vaadin]
url = "https://mcp.vaadin.com/docs"

After configuration, restart Codex to load the new MCP server.

Note

If you see "missing field command" errors, upgrade Codex with:

Source code
bash
npm install -g @openai/codex@latest

Configuration File Format

The Codex configuration uses TOML format:

  • Each MCP server is defined under [mcp_servers.name]

  • The url field points to the HTTP endpoint

  • Multiple servers can be configured by adding additional sections

Example with multiple servers:

Source code
toml
[mcp_servers.vaadin]
url = "https://mcp.vaadin.com/docs"

[mcp_servers.other_server]
url = "https://example.com/mcp"

Verify the Setup

After restarting Codex:

  1. Start a new coding session

  2. Ask Codex about Vaadin development (e.g., "Generate a Vaadin login form")

  3. Codex queries the Vaadin MCP server to access relevant documentation

  4. Generated code is based on current Vaadin best practices, as in the sketch below
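
For example, the login-form prompt from step 2 might produce something along these lines. This is a minimal sketch: the class name, route, and layout choices are illustrative, and the actual output depends on the model and your project.

Source code
java
import com.vaadin.flow.component.login.LoginForm;
import com.vaadin.flow.component.orderedlayout.FlexComponent;
import com.vaadin.flow.component.orderedlayout.VerticalLayout;
import com.vaadin.flow.router.Route;

// Hypothetical output: a login view built on Vaadin's LoginForm component.
@Route("login")
public class LoginView extends VerticalLayout {

    public LoginView() {
        LoginForm loginForm = new LoginForm();
        // Post submitted credentials to the application's login endpoint.
        loginForm.setAction("login");

        // Center the form on the page.
        setSizeFull();
        setAlignItems(FlexComponent.Alignment.CENTER);
        setJustifyContentMode(FlexComponent.JustifyContentMode.CENTER);
        add(loginForm);
    }
}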

You can verify active MCP servers by running:

Source code
bash
codex config list