This guide explains how to fine-tune models using TrueFoundry’s AI Gateway with OpenAI or Vertex AI providers.

Client Setup

All providers are accessed through the OpenAI SDK; the `x-tfy-provider-name` header selects which TrueFoundry provider integration to use:
from openai import OpenAI

client = OpenAI(
    api_key="your-truefoundry-api-key",
    base_url="https://{controlPlaneUrl}/api/llm",
    default_headers={
        "x-tfy-provider-name": "openai-main"  # truefoundry provider integration name
    }
)

Training File Format

Create a JSONL file with one JSON object per line. Each line represents a conversation pair for training:
{"messages":[{"role":"user","content":"What is the weather in San Francisco?"},{"role":"assistant","content":"The weather in San Francisco is currently 18°C with partly cloudy skies. It's a mild day with light winds."}]}
{"messages":[{"role":"user","content":"What is the weather in Minneapolis?"},{"role":"assistant","content":"The weather in Minneapolis is currently 5°C with overcast skies. It's a chilly day, so you might want to dress warmly."}]}
{"messages":[{"role":"user","content":"What is the weather in San Diego?"},{"role":"assistant","content":"The weather in San Diego is currently 22°C with sunny skies. It's a beautiful day with perfect beach weather."}]}
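Before uploading, it can help to sanity-check the file locally. A minimal sketch of such a check (the validation rules below are assumptions about what a well-formed example looks like, not a provider-side specification):

```python
import json

def validate_training_file(path):
    """Check that every line is valid JSON with a non-empty messages list."""
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            record = json.loads(line)  # raises on malformed JSON
            messages = record.get("messages")
            if not isinstance(messages, list) or not messages:
                raise ValueError(f"line {lineno}: missing or empty 'messages'")
            for msg in messages:
                if "role" not in msg or "content" not in msg:
                    raise ValueError(f"line {lineno}: message needs 'role' and 'content'")
    return True
```

Running this before the upload step catches malformed lines early, since a bad line otherwise surfaces only after the job is created.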

Workflow Steps

The fine-tuning process follows the same steps for all providers:
  1. Upload: Upload training file → Get file ID
  2. Create: Create finetune job → Get job ID
  3. Monitor: Check status until complete
  4. Use: Use the fine-tuned model for inference

Step-by-Step Examples

Step 1: Upload the Training File

Using the client configured above:

# Upload the training file
file = client.files.create(
    file=open("training.jsonl", "rb"),
    purpose="fine-tune"
)

print(file.id)  # Example: file-PnFGrFLN5LjjcWr4eFsStK
Step 2: Create a Fine-Tune Job

Create the job with the uploaded file's ID:

job = client.fine_tuning.jobs.create(
    training_file=file.id,
    model="gpt-3.5-turbo",
    hyperparameters={
        "n_epochs": 2
    },
    suffix="my-custom-model"
)

print(job.id)  # Example: ftjob-abc123xyz
Step 3: Monitor Job Status

Poll the job until it reaches a terminal state:

import time

# Retrieve job status until the job finishes
while True:
    job = client.fine_tuning.jobs.retrieve(job.id)
    print(f"Status: {job.status}")  # queued, running, succeeded, failed, cancelled
    if job.status in ("succeeded", "failed", "cancelled"):
        break
    time.sleep(30)

Status Reference

  • queued: Job is queued and waiting to start
  • running: Fine-tuning is in progress
  • succeeded: Fine-tuning completed successfully
  • failed: Fine-tuning failed
  • cancelled: Fine-tuning was cancelled before completion
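Step 4: Use the Fine-Tuned Model

Once the job has succeeded, the job object carries the fine-tuned model name, which can be used for inference like any other model through the gateway. A minimal sketch, assuming the `client` and `job` objects from the earlier steps (the model name in the comment is illustrative):

```python
# Read the fine-tuned model name from the completed job
model_name = job.fine_tuned_model  # e.g. "ft:gpt-3.5-turbo:org:my-custom-model:abc123"

response = client.chat.completions.create(
    model=model_name,
    messages=[{"role": "user", "content": "What is the weather in San Francisco?"}],
)
print(response.choices[0].message.content)
```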

Hyperparameters

You can configure the following fields in the hyperparameters object:
  • n_epochs: Number of passes over the training dataset
  • batch_size: Number of training examples per batch
  • learning_rate_multiplier: Scaling factor applied to the provider's base learning rate
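All three can be passed together in the same create call shown in Step 2. A sketch (the values below are illustrative placeholders, not recommendations):

```python
# Illustrative hyperparameter values; tune these for your dataset
hyperparameters = {
    "n_epochs": 3,                   # passes over the training data
    "batch_size": 8,                 # examples per training step
    "learning_rate_multiplier": 0.5, # scales the provider's base learning rate
}

# Passed to the create call from Step 2:
# client.fine_tuning.jobs.create(training_file=file.id,
#                                model="gpt-3.5-turbo",
#                                hyperparameters=hyperparameters)
```

Any field you omit falls back to the provider's default for that model.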