Publishing the configuration file

The AI Translations package ships with a configuration file that lets you customize the translation behavior. To publish the configuration file to your Laravel application, run:
php artisan vendor:publish --tag=ai-translations-config
This will create a config/ai-translations.php file in your application where you can customize the package settings.

Configuration file structure

The configuration file is organized into the following sections:

LLM provider

Configure which AI provider to use (OpenAI, Anthropic, or Gemini)

Languages

Define which languages your application supports

Default configuration

Here’s the complete default configuration file:
config/ai-translations.php
<?php

return [
    // reads existing languages from the lang_path() directory
    "languages" => null,

    "provider" => "gemini",
    "openai" => [
        "model" => "gpt-5-mini",
        "fast_model" => "gpt-5-nano",
        "api_key" => env("OPENAI_API_KEY"),
    ],

    "gemini" => [
        "model" => "gemini-flash-latest",
        "fast_model" => "gemini-flash-lite-latest",
        "api_key" => env("GEMINI_API_KEY"),
    ],

    "anthropic" => [
        "model" => "claude-sonnet-4-5",
        "fast_model" => "claude-haiku-4-5",
        "api_key" => env("ANTHROPIC_API_KEY"),
    ],
];

Configuration options

languages
array|null
default: null
An array of language codes your application supports. When set to null, the package automatically detects languages from your lang directory.
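For example, to translate only into a fixed set of languages rather than relying on auto-detection, you could set an explicit array. The codes below are illustrative; use the codes that match your application's lang directory:

```php
<?php

// config/ai-translations.php
return [
    // Explicit list instead of auto-detecting from lang_path();
    // "fr", "de", and "es" are example codes — adjust to your app.
    "languages" => ["fr", "de", "es"],

    // ...remaining options unchanged
];
```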
provider
string
default: "gemini"
The LLM provider to use for translations. Available options: openai, anthropic, or gemini.
openai
array
Configuration options for the OpenAI provider.
  • model (string): The primary model to use for translations
  • fast_model (string): A faster, less expensive model used with the --fast flag
  • api_key (string): Your OpenAI API key
anthropic
array
Configuration options for the Anthropic provider.
  • model (string): The primary model to use for translations
  • fast_model (string): A faster model used with the --fast flag
  • api_key (string): Your Anthropic API key
gemini
array
Configuration options for the Gemini provider.
  • model (string): The primary model to use for translations
  • fast_model (string): A faster model used with the --fast flag
  • api_key (string): Your Gemini API key
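Putting these options together, here is a sketch of a config that switches from the default Gemini provider to OpenAI. The model names are taken from the package defaults shown above; the other provider blocks can stay as published:

```php
<?php

// config/ai-translations.php
return [
    // Auto-detect supported languages from the lang directory
    "languages" => null,

    // Use OpenAI instead of the default Gemini provider
    "provider" => "openai",

    "openai" => [
        "model" => "gpt-5-mini",        // primary translation model
        "fast_model" => "gpt-5-nano",   // used with the --fast flag
        "api_key" => env("OPENAI_API_KEY"),
    ],
];
```

Remember to set `OPENAI_API_KEY` in your `.env` file so `env("OPENAI_API_KEY")` resolves at runtime.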

Next steps

Configure LLM providers

Set up your AI provider credentials and models

Configure languages

Define which languages to translate
