The SDK’s tuning implementation is experimental and may change in future versions.

Method

client.tunings.tune(
    base_model: str,
    training_dataset: TuningDataset,
    config: Optional[CreateTuningJobConfig] = None
) -> TuningJob
Creates a tuning job and returns the TuningJob object. This method initiates a fine-tuning process for a specified base model using the provided training dataset.
base_model
string
required
The name of the model to tune. For Vertex AI, this can also be a pre-tuned model resource name starting with projects/.
training_dataset
TuningDataset
required
The training dataset to use for tuning. Can be one of:
  • gcs_uri: GCS bucket path (Vertex AI only)
  • vertex_dataset_resource: Vertex AI Dataset resource (Vertex AI only)
  • examples: Inline training examples (Gemini API only)
config
CreateTuningJobConfig
optional
Configuration options for the tuning job, such as tuned_model_display_name, epoch_count, learning_rate_multiplier, and batch_size (see the usage examples below).

Response

name
string
The resource name of the tuning job
state
JobState
Current state of the tuning job. One of:
  • JOB_STATE_QUEUED: Job is queued
  • JOB_STATE_PENDING: Job is pending
  • JOB_STATE_RUNNING: Job is running
  • JOB_STATE_SUCCEEDED: Job completed successfully
  • JOB_STATE_FAILED: Job failed
  • JOB_STATE_CANCELLED: Job was cancelled
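The first three states in this list are non-terminal and the last three are final. A minimal helper for telling the two groups apart might look like the sketch below; it uses plain string constants mirroring the enum names above rather than the SDK's JobState type, and is_terminal is illustrative, not an SDK function:

```python
# State names mirror the JobState values listed above.
ACTIVE_STATES = {"JOB_STATE_QUEUED", "JOB_STATE_PENDING", "JOB_STATE_RUNNING"}
TERMINAL_STATES = {"JOB_STATE_SUCCEEDED", "JOB_STATE_FAILED", "JOB_STATE_CANCELLED"}

def is_terminal(state: str) -> bool:
    """Return True once the job has reached a final state."""
    return state in TERMINAL_STATES
```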
create_time
string
Timestamp when the job was created
start_time
string
Timestamp when the job started running
end_time
string
Timestamp when the job completed
base_model
string
The base model being tuned
tuned_model
TunedModel
Information about the resulting tuned model
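The timestamp fields above are strings. Assuming they are ISO 8601 / RFC 3339 formatted (e.g. "2024-05-01T12:00:00Z" — the exact format is not specified on this page), the job's run time can be computed with a small helper; job_duration_seconds is illustrative, not part of the SDK:

```python
from datetime import datetime

def job_duration_seconds(start_time: str, end_time: str) -> float:
    """Elapsed seconds between two RFC 3339 timestamp strings."""
    # Python < 3.11 does not accept a trailing "Z" in fromisoformat,
    # so normalize it to an explicit UTC offset first.
    start = datetime.fromisoformat(start_time.replace("Z", "+00:00"))
    end = datetime.fromisoformat(end_time.replace("Z", "+00:00"))
    return (end - start).total_seconds()
```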

Usage

Basic Example

from google import genai
from google.genai import types

# gcs_uri datasets are Vertex AI only, so use the Vertex AI backend
client = genai.Client(vertexai=True, project='your-project', location='us-central1')

# Create a tuning job
tuning_job = client.tunings.tune(
    base_model='gemini-1.5-flash-002',
    training_dataset=types.TuningDataset(
        gcs_uri='gs://my-bucket/training-data.jsonl'
    ),
    config=types.CreateTuningJobConfig(
        tuned_model_display_name='my-tuned-model',
        epoch_count=5,
        learning_rate_multiplier=1.0
    )
)

print(f"Tuning job created: {tuning_job.name}")
print(f"State: {tuning_job.state}")

Polling for Completion

import time

# Create tuning job
tuning_job = client.tunings.tune(
    base_model='gemini-1.5-flash-002',
    training_dataset=types.TuningDataset(
        gcs_uri='gs://my-bucket/training-data.jsonl'
    )
)

# Poll until job completes
while tuning_job.state in [
    types.JobState.JOB_STATE_QUEUED,
    types.JobState.JOB_STATE_PENDING,
    types.JobState.JOB_STATE_RUNNING
]:
    print(f"Job state: {tuning_job.state}")
    time.sleep(60)
    tuning_job = client.tunings.get(name=tuning_job.name)

if tuning_job.state == types.JobState.JOB_STATE_SUCCEEDED:
    print(f"Tuning completed! Model: {tuning_job.tuned_model.model}")
else:
    print(f"Tuning failed with state: {tuning_job.state}")
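The polling loop above can be factored into a reusable helper. The sketch below adds a timeout to the same pattern; poll_until_done and its parameters are illustrative, not part of the SDK (pass it e.g. lambda: client.tunings.get(name=tuning_job.name).state):

```python
import time

def poll_until_done(get_state, active_states, interval_s=60, timeout_s=3600):
    """Call get_state() until the state leaves active_states or timeout_s elapses."""
    deadline = time.monotonic() + timeout_s
    state = get_state()
    while state in active_states:
        if time.monotonic() >= deadline:
            raise TimeoutError(f"job still in state {state} after {timeout_s}s")
        time.sleep(interval_s)
        state = get_state()
    return state
```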

With Inline Examples (Gemini API)

from google import genai
from google.genai import types

# Inline examples are Gemini API only, so use an API-key client
client = genai.Client(api_key='your-api-key')

tuning_job = client.tunings.tune(
    base_model='gemini-1.5-flash-002',
    training_dataset=types.TuningDataset(
        examples=[
            {
                'text_input': 'What is the capital of France?',
                'output': 'The capital of France is Paris.'
            },
            {
                'text_input': 'What is 2+2?',
                'output': '2+2 equals 4.'
            }
        ]
    ),
    config=types.CreateTuningJobConfig(
        epoch_count=10,
        batch_size=4
    )
)
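When the question/answer pairs come from elsewhere (a file, a database), the inline examples list can be built programmatically. This sketch reuses the text_input/output dict shape shown above; the to_examples helper is illustrative, not an SDK function:

```python
def to_examples(pairs):
    """Convert (input, output) pairs into the inline-example dicts shown above."""
    return [{'text_input': q, 'output': a} for q, a in pairs]

qa_pairs = [
    ('What is the capital of France?', 'The capital of France is Paris.'),
    ('What is 2+2?', '2+2 equals 4.'),
]
examples = to_examples(qa_pairs)
```

The resulting list can then be passed as types.TuningDataset(examples=examples).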
