Available

Open and Proprietary Mistral models

Open and Proprietary Mistral models — a free model from Mistral (La Plateforme).

Open and Proprietary Mistral models — Free API Specifications

Context: 256K
Max Output: 8K
Modality: Text
Rate Limit: See provider page
Card Required: No
OpenAI Compatible: Yes

How to Configure Open and Proprietary Mistral models for Free

Base URL: https://api.mistral.ai/v1
Get an API key: https://console.mistral.ai/
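Because the endpoint is OpenAI-compatible, any generic HTTP client works against this base URL. Below is a minimal sketch using only the Python standard library; the `MISTRAL_API_KEY` environment variable name and the `<model-id>` placeholder are illustrative assumptions — substitute the exact model ID from the provider's model list.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.mistral.ai/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the OpenAI-compatible /chat/completions endpoint."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Only send if a key is actually configured in the environment:
if os.environ.get("MISTRAL_API_KEY"):
    req = build_chat_request(os.environ["MISTRAL_API_KEY"], "<model-id>", "Hello!")
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same request shape works with the official OpenAI SDKs by pointing their base URL at the endpoint above.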

One-Click Config for Claude Code, Cursor & More

Claude Code

# Claude Code works via OpenRouter's Anthropic-compatible API.
# Note: Only paid Anthropic Claude models are supported (e.g. claude-sonnet-4.6, claude-opus-4).
# Browse available Claude models at: https://openrouter.ai/models?q=anthropic

# Add to ~/.zshrc or ~/.bashrc
export OPENROUTER_API_KEY="<your-openrouter-api-key>"  # Get at https://openrouter.ai/settings/keys
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY=""  # Must be explicitly empty to avoid conflicts

# Optional: pin specific models for each role
# export ANTHROPIC_DEFAULT_SONNET_MODEL="anthropic/claude-sonnet-4.6"
# export ANTHROPIC_DEFAULT_HAIKU_MODEL="anthropic/claude-haiku-4.5"

# Then simply run: claude

Cursor

# Cursor → Settings (⚙️) → Models → Add Model
# Enter the model name exactly as shown, then fill in:
#   Override OpenAI Base URL: https://api.mistral.ai/v1
#   OpenAI API Key: <your-api-key>   # Get at https://console.mistral.ai/
# Click "Verify" to confirm the connection, then enable the model.
#
# Model name to add: Open and Proprietary Mistral models

Codex

# Add to ~/.zshrc or ~/.bashrc
export OPENAI_BASE_URL="https://api.mistral.ai/v1"
export OPENAI_API_KEY="<your-api-key>"  # Get at https://console.mistral.ai/

# Then run:
codex --model "Open and Proprietary Mistral models"

Gemini CLI

# ~/.gemini/settings.json
{
  "apiKey": "<your-api-key>",
  "model": "Open and Proprietary Mistral models"
}
# Get API key at https://console.mistral.ai/

OpenCode

// ~/.config/opencode/opencode.json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "free-llm": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Free LLM",
      "options": {
        "baseURL": "https://api.mistral.ai/v1",
        "apiKey": "<your-api-key>"
      },
      "models": {
        "Open and Proprietary Mistral models": { "name": "Open and Proprietary Mistral models" }
      }
    }
  }
}
// Get API key at https://console.mistral.ai/

Hermes

# Step 1 — Edit config.yaml
# Windows: C:\Users\<you>\AppData\Local\hermes\config.yaml
# macOS/Linux: ~/.config/hermes/config.yaml

model:
  default: Open and Proprietary Mistral models
  provider: custom
  base_url: ${CUSTOM_BASE_URL}
  api_key: ${CUSTOM_API_KEY}
  model_aliases:
    Open and Proprietary Mistral models:
      model: "Open and Proprietary Mistral models"
      provider: "custom"

# Step 2 — Edit .env (same directory as config.yaml)
# Windows: C:\Users\<you>\AppData\Local\hermes\.env
# macOS/Linux: ~/.config/hermes/.env

# ========================
# Custom API (OpenAI-compatible)
# ========================
CUSTOM_API_KEY=<your-api-key>        # Get at https://console.mistral.ai/
CUSTOM_BASE_URL=https://api.mistral.ai/v1
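The `${CUSTOM_BASE_URL}` and `${CUSTOM_API_KEY}` placeholders in `config.yaml` are filled in from the `.env` file. Assuming plain `${VAR}` string substitution (a guess at the interpolation behavior, not something documented here), the expansion can be sketched as:

```python
def expand_env(value: str, env: dict[str, str]) -> str:
    """Replace ${VAR} placeholders with values from an environment mapping."""
    for name, val in env.items():
        value = value.replace("${" + name + "}", val)
    return value

# Values as they would be loaded from the .env file above:
env = {
    "CUSTOM_API_KEY": "sk-example",
    "CUSTOM_BASE_URL": "https://api.mistral.ai/v1",
}
print(expand_env("${CUSTOM_BASE_URL}", env))  # → https://api.mistral.ai/v1
```

Any variable referenced in `config.yaml` but missing from `.env` would be left unexpanded, so double-check both files use the same names.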

OpenClaw

// ~/.openclaw/openclaw.json  (JSON5 format)
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "Open and Proprietary Mistral models",
      },
    },
  },
  "models": {
    "providers": {
      // Option A — Built-in provider (OpenAI, Anthropic, Google…)
      // Just add apiKey; OpenClaw handles the baseUrl automatically
      // "openai": { "apiKey": "<your-api-key>" },

      // Option B — Custom OpenAI-compatible base URL (e.g. OpenRouter, NVIDIA)
      "free-llm": {
        "baseUrl": "https://api.mistral.ai/v1",
        "apiKey": "<your-api-key>",  // Get at https://console.mistral.ai/
        "api": "openai-completions", // openai-completions | anthropic-messages | …
        "models": [
          { "id": "Open and Proprietary Mistral models", "name": "Open and Proprietary Mistral models" },
        ],
      },
    },
  },
}
// Apply: openclaw gateway restart
// Verify: openclaw doctor --fix

Frequently Asked Questions about Open and Proprietary Mistral models

Is Open and Proprietary Mistral models free to use?

Yes. Open and Proprietary Mistral models is available on a permanently free tier via Mistral (La Plateforme). No credit card is required — simply sign up and get your API key. Rate limits for the free tier are listed on the provider's page.
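Because the exact rate limit is set by the provider, clients should be prepared for HTTP 429 responses on the free tier. A minimal retry-with-exponential-backoff sketch — the `send` callable stands in for any HTTP request function, and the delays are illustrative:

```python
import time

def call_with_backoff(send, max_retries: int = 5, base_delay: float = 1.0):
    """Retry send() on HTTP 429, doubling the wait each time."""
    for attempt in range(max_retries):
        status, payload = send()
        if status != 429:
            return status, payload
        time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    raise RuntimeError("rate limited: retries exhausted")

# Usage with a stand-in request function that fails twice, then succeeds:
responses = iter([(429, None), (429, None), (200, "ok")])
status, payload = call_with_backoff(lambda: next(responses), base_delay=0.01)
print(status, payload)  # → 200 ok
```

Production clients would also honor the `Retry-After` header when the server sends one, rather than relying on fixed delays.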

What is Open and Proprietary Mistral models best for?

Open and Proprietary Mistral models is optimized for chat tasks. It supports text input and output, with a context window of 256K tokens and a maximum output of 8K tokens, and is available as a free model from Mistral (La Plateforme).

Is Open and Proprietary Mistral models OpenAI-compatible?

Yes. Open and Proprietary Mistral models uses an OpenAI-compatible API endpoint at https://api.mistral.ai/v1. You can use it with the OpenAI Python/JS SDK, or any tool that accepts a custom baseURL — including Claude Code, Cursor, Codex, and OpenCode.

How do I get an API key for Open and Proprietary Mistral models?

Visit the Mistral (La Plateforme) console at https://console.mistral.ai/ to register and generate a free API key. Once you have the key, use the configuration snippets above to set up Claude Code, Cursor, or your preferred AI coding tool.