
Overview

Codex CLI is OpenAI’s open-source command-line coding agent. It can read your codebase, propose changes, execute commands, and iterate based on feedback — all from the terminal. By pointing Codex CLI to FoxAPI, you can use it with any OpenAI-compatible model available through FoxAPI’s API.
Codex CLI supports a config.toml configuration file for defining custom model providers, making it easy to connect to FoxAPI without modifying environment variables for the base URL.

Prerequisites

FoxAPI API Key

An active FoxAPI API key with access to OpenAI-compatible models.

Node.js 22+

Codex CLI requires Node.js version 22 or later.
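You can verify your installed version meets this requirement before proceeding (this assumes node is already on your PATH):

```shell
# Extract the major version from `node --version` and compare against 22
major=$(node --version | sed 's/^v\([0-9]*\).*/\1/')
if [ "$major" -ge 22 ]; then
  echo "Node.js $major is new enough"
else
  echo "Node.js $major is too old; install version 22 or later"
fi
```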

Installation & Configuration

1. Install Codex CLI

npm install -g @openai/codex
Verify the installation:
codex --version
2. Set Your API Key

Codex CLI reads the OPENAI_API_KEY environment variable for authentication. To set it to your FoxAPI API key, add the following line to your shell profile (~/.bashrc, ~/.zshrc, etc.):
export OPENAI_API_KEY="sk-your-foxapi-api-key"
Then reload your shell configuration:
source ~/.zshrc  # or source ~/.bashrc
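To confirm the variable is actually visible in your current session before launching Codex, you can check it like this:

```shell
# Prints "set" if OPENAI_API_KEY is non-empty, "not set" otherwise
[ -n "$OPENAI_API_KEY" ] && echo "OPENAI_API_KEY is set" || echo "OPENAI_API_KEY is not set"
```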
3. Configure FoxAPI as a Model Provider

Create or edit the Codex CLI configuration file at ~/.codex/config.toml:
model = "gpt-4.1"
model_reasoning_effort = "medium"
model_provider = "foxapi"

[model_providers.foxapi]
name = "FoxAPI"
base_url = "https://api.foxapi.cc/v1"
env_key = "OPENAI_API_KEY"
wire_api = "responses"
The wire_api field must be set to "responses". The "chat" option is deprecated and may cause unexpected behavior.
Configuration fields explained:
  • model: Default model to use (can be overridden with the --model flag)
  • model_reasoning_effort: Reasoning effort level: "low", "medium", or "high"
  • model_provider: Name of the custom provider (must match the [model_providers.xxx] section name)
  • base_url: FoxAPI’s API endpoint
  • env_key: Environment variable name that holds your API key
  • wire_api: API protocol to use ("responses" is required)
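If you prefer to script the setup, the whole file can be written in one step with a heredoc. Note that this overwrites any existing ~/.codex/config.toml, so back that file up first if you have one:

```shell
# Write the FoxAPI provider configuration in a single step
# WARNING: this replaces any existing ~/.codex/config.toml
mkdir -p ~/.codex
cat > ~/.codex/config.toml <<'EOF'
model = "gpt-4.1"
model_reasoning_effort = "medium"
model_provider = "foxapi"

[model_providers.foxapi]
name = "FoxAPI"
base_url = "https://api.foxapi.cc/v1"
env_key = "OPENAI_API_KEY"
wire_api = "responses"
EOF
```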
4. Start Using Codex

Navigate to your project and launch Codex:
cd /path/to/your/project
codex
Or run a one-off command:
codex "Add input validation to the signup form"

Verify the Connection

Test the connection with a simple prompt:
codex "Say hello and confirm you are working"
If the configuration is correct, Codex will respond through FoxAPI’s API.

Recommended Models

  • gpt-4.1: Complex multi-step coding tasks (recommended default)
  • gpt-4o: General coding tasks with balanced speed and quality
  • o4-mini: Fast, cost-effective coding assistance
You can override the default model from config.toml when launching Codex:
codex --model gpt-4o "Refactor the auth module"

Troubleshooting

  • Verify your OPENAI_API_KEY is a valid FoxAPI key.
  • Ensure the key has not expired and has sufficient balance.
  • If using config.toml, confirm the env_key field matches the environment variable name you set (default: OPENAI_API_KEY).
  • Check that base_url in config.toml is set to https://api.foxapi.cc/v1 (with /v1).
  • Verify your network can reach api.foxapi.cc.
  • Confirm the model name is correct and available on FoxAPI.
  • Check your FoxAPI account for model access permissions.
  • Confirm the file is located at ~/.codex/config.toml (macOS/Linux) or C:\Users\{username}\.codex\config.toml (Windows).
  • Validate the TOML syntax — ensure strings are properly quoted and section headers use square brackets.
  • Check that model_provider value matches the [model_providers.xxx] section name exactly.