
Your First Translation

Let’s translate a document in three simple steps:
Step 1: Set Your API Key

Choose your preferred LLM provider and set the API key:
export OPENAI_API_KEY="your-api-key-here"
Don’t have an API key? See the Installation guide for details on getting one.
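Before running tinbox, you can verify the key is actually set. A minimal POSIX-shell check (the `require_key` helper is our own, for illustration — tinbox itself will also report a missing key at runtime):

```shell
# require_key: succeed only if the named environment variable is non-empty.
require_key() {
  eval "val=\${$1:-}"
  [ -n "$val" ]
}

export OPENAI_API_KEY="your-api-key-here"
if require_key OPENAI_API_KEY; then
  echo "OPENAI_API_KEY is set"
fi
```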
Step 2: Run Your First Translation

Translate a text document to German:
tinbox translate --to de --model openai:gpt-5-2025-08-07 document.txt
The translation will be printed to your console.
Step 3: Save to a File

Add the --output flag to save the result:
tinbox translate --to de --model openai:gpt-5-2025-08-07 --output document_de.txt document.txt

Basic Examples

Text Documents

Translating text files is the simplest use case:
tinbox translate --to es --model openai:gpt-5-2025-08-07 story.txt

PDF Documents

PDF translation requires:
  1. Poppler installed on your system (see Installation)
  2. A vision-capable model like GPT-4o, Claude Sonnet, or Gemini Pro Vision
tinbox translate --to de --algorithm page --model openai:gpt-4o document.pdf
Use --algorithm page for PDFs to process them page-by-page, which works best for maintaining formatting and context.

Word Documents (DOCX)

# Translate a Word document
tinbox translate --to de --model openai:gpt-5-2025-08-07 report.docx

# Save to file
tinbox translate --to es --output report_es.txt --model openai:gpt-5-2025-08-07 report.docx

Model Providers

Tinbox supports multiple LLM providers, selected with the provider:model syntax:
# OpenAI GPT-5 (latest)
tinbox translate --to de --model openai:gpt-5-2025-08-07 document.txt

# OpenAI GPT-4o (vision-capable for PDFs)
tinbox translate --to es --model openai:gpt-4o document.pdf

# Anthropic Claude
tinbox translate --to de --model anthropic:claude-3-sonnet document.txt
GPT-4o and GPT-5 both support vision for PDF translation.

Real-World Example

Let’s translate a story from English to German with checkpointing:
Step 1: Preview the Cost

Use --dry-run to estimate costs before translating:
tinbox translate --to de --model openai:gpt-5-2025-08-07 --dry-run story.txt
Output:
📊 Translation Estimate:
- Input tokens: ~1,200
- Estimated output tokens: ~1,400
- Estimated cost: $0.03
- Estimated time: 15 seconds
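You can sanity-check the input-token estimate above with a common rule of thumb of roughly four characters per token (a heuristic of ours, not tinbox's actual estimator):

```shell
# Rough token estimate: ~4 characters per token (heuristic only).
estimate_tokens() {
  chars=$(wc -c < "$1")
  echo $((chars / 4))
}

printf '%s' "Hello, world! This is a test" > sample.txt
estimate_tokens sample.txt   # a 28-byte file -> 7
```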
Step 2: Run with Checkpointing

For large documents, enable checkpointing to resume if interrupted:
tinbox translate --to de \
  --model openai:gpt-5-2025-08-07 \
  --checkpoint-dir ./checkpoints \
  --output story_de.txt \
  story.txt
Step 3: Monitor Progress

Tinbox shows real-time progress:
🔄 Translating: story.txt
📄 Processing chunk 1/8...
💰 Cost so far: $0.01
⏱️  Elapsed: 5s

✅ Translation complete!
📊 Final stats:
- Total tokens: 2,600
- Total cost: $0.03
- Total time: 18s

Advanced Features

Glossary for Consistent Terminology

Maintain consistent translations of technical terms:
# Tinbox will detect and save important terms
tinbox translate --to es \
  --glossary \
  --save-glossary terms.json \
  --model openai:gpt-5-2025-08-07 \
  technical_doc.txt

Glossary File Format

Create a JSON file with term mappings:
terms.json
{
  "entries": {
    "API": "Interface de programmation",
    "CPU": "Processeur",
    "GPU": "Carte graphique",
    "Machine Learning": "Apprentissage automatique"
  }
}
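A quick way to create and sanity-check such a file from the shell (python3 is used here only as a convenient JSON validator; it is not a tinbox requirement):

```shell
# Write a starter glossary; the entries are the French mappings shown above.
cat > terms.json <<'EOF'
{
  "entries": {
    "API": "Interface de programmation",
    "CPU": "Processeur"
  }
}
EOF

# Validate the JSON before handing it to tinbox.
python3 -m json.tool terms.json > /dev/null && echo "terms.json is valid"
```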

Cost Control

Prevent unexpected expenses:
# Set a maximum cost limit
tinbox translate --to de \
  --max-cost 5.00 \
  --model openai:gpt-5-2025-08-07 \
  document.txt

# Translation will stop if cost exceeds $5.00

Reasoning Effort

Control translation quality vs. cost:
tinbox translate --to es \
  --reasoning-effort minimal \
  --model openai:gpt-5-2025-08-07 \
  document.txt
Start with minimal reasoning effort. Only increase to low or high if you need better quality for complex documents.

Output Formats

By default, Tinbox prints simple text output, which suits most use cases:
tinbox translate --to es --model openai:gpt-5-2025-08-07 document.txt

Best Practices by Document Type

PDFs

tinbox translate --to es \
  --algorithm page \
  --model openai:gpt-4o \
  document.pdf
  • Use --algorithm page
  • Requires vision-capable model
  • Consider --pdf-dpi 300 for higher quality

Large Text Files

tinbox translate --to de \
  --context-size 2000 \
  --checkpoint-dir ./checkpoints \
  --model openai:gpt-5-2025-08-07 \
  large_file.txt
  • Use context-aware algorithm (default)
  • Enable checkpointing
  • Adjust --context-size as needed

Technical Documents

tinbox translate --to fr \
  --glossary \
  --save-glossary terms.json \
  --model openai:gpt-5-2025-08-07 \
  tech_doc.pdf
  • Enable glossary support
  • Save terms for future use
  • Consider higher reasoning effort

Cost-Sensitive Projects

tinbox translate --to es \
  --dry-run \
  --max-cost 5.00 \
  --reasoning-effort minimal \
  --model openai:gpt-5-2025-08-07 \
  document.txt
  • Always use --dry-run first
  • Set --max-cost limits
  • Use minimal reasoning effort
  • Consider Ollama for local models

Common Workflows

Batch Translation

Translate multiple documents efficiently:
# Loop through all text files
for file in *.txt; do
  tinbox translate --to de \
    --model openai:gpt-5-2025-08-07 \
    --output "${file%.txt}_de.txt" \
    "$file"
done
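The `${file%.txt}_de.txt` expansion in the loop strips the `.txt` suffix and appends `_de.txt`. Isolated as a helper (our own, for illustration), the naming logic looks like this:

```shell
# out_name: derive the translated-file name used in the loop above.
out_name() {
  printf '%s\n' "${1%.txt}_de.txt"
}

out_name "story.txt"          # -> story_de.txt
out_name "notes/report.txt"   # -> notes/report_de.txt
```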

Resume Interrupted Translation

If a translation is interrupted, simply rerun with the same checkpoint directory:
# First run (interrupted)
tinbox translate --to de \
  --checkpoint-dir ./checkpoints \
  --model openai:gpt-5-2025-08-07 \
  large_document.txt

# Resume from checkpoint
tinbox translate --to de \
  --checkpoint-dir ./checkpoints \
  --model openai:gpt-5-2025-08-07 \
  large_document.txt
Tinbox automatically detects and resumes from the last checkpoint.
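Because rerunning the same command resumes from the checkpoint, a small retry wrapper (our own helper, not a tinbox feature) can restart the translation automatically after transient failures:

```shell
# retry N cmd...: run cmd, retrying until it succeeds or N attempts are used.
retry() {
  n=$1; shift
  i=1
  until "$@"; do
    if [ "$i" -ge "$n" ]; then
      return 1
    fi
    i=$((i + 1))
    sleep 1
  done
}

# Example (hypothetical): retry the checkpointed translation up to 3 times.
# retry 3 tinbox translate --to de --checkpoint-dir ./checkpoints \
#   --model openai:gpt-5-2025-08-07 large_document.txt
```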

Troubleshooting

Translation quality is poor? Try these improvements:
  1. Increase reasoning effort: --reasoning-effort high
  2. Use a better model: openai:gpt-5-2025-08-07 or anthropic:claude-3-sonnet
  3. Enable glossary: --glossary for consistent terminology
  4. Specify source language: --from en (don’t rely on auto-detect)
PDF translation fails? Check these requirements:
  1. Poppler installed: Run tinbox doctor to verify
  2. Using vision-capable model: GPT-4o, Claude Sonnet, or Gemini Pro Vision
  3. PDF extras installed: pip install tinbox[pdf]
Costs are too high? Reduce them with these options:
  1. Use local Ollama models (free)
  2. Set --reasoning-effort minimal
  3. Set --max-cost limits
  4. Use --dry-run to preview costs
  5. Smaller context size: --context-size 1500
Translation is slow or keeps getting interrupted? For large documents:
  1. Enable checkpointing: --checkpoint-dir ./checkpoints
  2. Reduce chunk size: --context-size 1500
  3. For PDFs, use: --algorithm page

Next Steps

Command Reference

Complete guide to all CLI options and flags

Translation Algorithms

Learn about different translation strategies

Model Providers

Detailed comparison of LLM providers

Advanced Usage

Custom splitting, glossaries, and optimization
Need help? Run tinbox doctor to diagnose issues or check our troubleshooting guide.
