
Provider capabilities

The summary below reflects high-level Files API availability per provider. For per-operation details (upload, list, delete, etc.), see Provider Support Matrix below.
Providers covered: OpenAI, Azure OpenAI, Anthropic, Bedrock, Vertex, Cohere, Gemini, Groq, Cerebras, Together-AI, xAI, and DeepInfra. The Provider Support Matrix below lists which of these support the Files API through the gateway.
For every gateway endpoint and provider, see Supported APIs.

Provider Support Matrix

TrueFoundry’s Files API provides a unified OpenAI-compatible interface across multiple providers. Support for the individual operations (upload, list, retrieve, content, delete) varies by provider; the matrix below shows how each provider is integrated and where files are stored.

| Provider | Integration | Storage Backend |
| --- | --- | --- |
| OpenAI | Pass-through | OpenAI API |
| Azure OpenAI | Pass-through | Azure OpenAI API |
| Anthropic | Transformed | Anthropic Files API |
| AWS Bedrock | Transformed | AWS S3 |
| Google Vertex AI | Transformed | Google Cloud Storage |
| Groq | Transformed | Groq API |
  • Pass-through — Requests are proxied directly to the provider’s native File API without modification.
  • Transformed — The gateway translates requests and responses between OpenAI’s format and the provider’s native format. For Bedrock and Vertex AI, the gateway also manages file storage (S3 and GCS respectively) on behalf of the provider.
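The integration modes above can be mirrored in a small client-side lookup, e.g. to check whether a request will be proxied or translated before you send it. A minimal sketch; the key names are illustrative, not gateway identifiers:

```python
# Integration mode and storage backend per provider, mirroring the matrix above.
# Key names are illustrative; use whatever identifiers your code standardizes on.
FILES_SUPPORT = {
    "openai": ("pass-through", "OpenAI API"),
    "azure-openai": ("pass-through", "Azure OpenAI API"),
    "anthropic": ("transformed", "Anthropic Files API"),
    "bedrock": ("transformed", "AWS S3"),
    "vertex-ai": ("transformed", "Google Cloud Storage"),
    "groq": ("transformed", "Groq API"),
}

def integration_mode(provider: str) -> str:
    """Return 'pass-through' or 'transformed' for a supported provider."""
    try:
        return FILES_SUPPORT[provider][0]
    except KeyError:
        raise ValueError(f"Files API not supported for provider: {provider}")
```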
This guide demonstrates how to set up the client and use the file-related endpoints. Set up the OpenAI client first:
from openai import OpenAI

BASE_URL = "{GATEWAY_BASE_URL}"
API_KEY = "your-truefoundry-api-key"

# Configure OpenAI client with TrueFoundry settings
client = OpenAI(
    api_key=API_KEY,
    base_url=BASE_URL,
)

Provider Specific Extra Headers

When making requests, you’ll need to specify provider-specific headers based on which LLM provider you’re using. Choose your provider:
extra_headers = {
    "x-tfy-provider-name": "openai-provider-name"  # name of tfy provider integration
}
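Since every request carries this header, a tiny helper keeps it in one place when you switch providers. A sketch; the integration names below are placeholders for whatever you configured in TrueFoundry:

```python
def tfy_headers(provider_integration_name: str) -> dict:
    """Build the provider-routing header for TrueFoundry Gateway requests."""
    return {"x-tfy-provider-name": provider_integration_name}

# Route the same client call to different integrations (placeholder names):
openai_headers = tfy_headers("my-openai-integration")
bedrock_headers = tfy_headers("my-bedrock-integration")
```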

1. Upload File

Use this to upload files for use with the Batch or Assistants APIs.
file = client.files.create(
    file=open("request.jsonl", "rb"),
    purpose="batch",
    extra_headers=extra_headers
)

print(file.id)
Notes:
  • Max individual file size: 512 MB
  • Max org-wide storage: 100 GB
  • For Batch API: only .jsonl, max 50 MB
  • For Assistants API: up to 2M tokens
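For the Batch API, you can assemble the .jsonl locally and check the 50 MB limit before uploading. A minimal sketch, assuming OpenAI's batch request line format; the model and endpoint are example values:

```python
import json
import os

def write_batch_file(path: str, requests: list[dict],
                     max_bytes: int = 50 * 1024 * 1024) -> int:
    """Write one JSON object per line and enforce the Batch API size limit."""
    with open(path, "w") as f:
        for req in requests:
            f.write(json.dumps(req) + "\n")
    size = os.path.getsize(path)
    if size > max_bytes:
        raise ValueError(f"{path} is {size} bytes; Batch API files must be <= {max_bytes}")
    return size

requests = [
    {
        "custom_id": "req-1",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {"model": "gpt-4o-mini",
                 "messages": [{"role": "user", "content": "Hello"}]},
    }
]
write_batch_file("request.jsonl", requests)
```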
Expected Output
{
  "id": "file-abc123",
  "object": "file",
  "bytes": 120000,
  "created_at": 1677610602,
  "filename": "request.jsonl",
  "purpose": "batch"
}

2. List Files

Returns a list of files.
files = client.files.list(
    extra_headers=extra_headers
)
Expected Output
{
  "data": [
    {
      "id": "file-abc123",
      "object": "file",
      "bytes": 175,
      "created_at": 1613677385,
      "filename": "salesOverview.pdf",
      "purpose": "assistants"
    },
    {
      "id": "file-abc456",
      "object": "file",
      "bytes": 140,
      "created_at": 1613779121,
      "filename": "puppy.jsonl",
      "purpose": "fine-tune"
    }
  ],
  "object": "list"
}

3. Retrieve File

Returns information about a specific file.
retrieved_file = client.files.retrieve(
    file.id,
    extra_headers=extra_headers
)

print(retrieved_file.model_dump_json())
Expected Output
{
  "id": "file-abc123",
  "object": "file",
  "bytes": 120000,
  "created_at": 1677610602,
  "filename": "mydata.jsonl",
  "purpose": "fine-tune"
}

4. Delete File

Deletes a file permanently.
deleted_file = client.files.delete(
    file.id,
    extra_headers=extra_headers
)

print(deleted_file.model_dump_json())
Expected Output
{
  "id": "file-abc123",
  "object": "file",
  "deleted": true
}

5. Retrieve File Content

Returns the contents of the specified file.
output_content = client.files.content(
    file.id,
    extra_headers=extra_headers
)

print(output_content.text)
Anthropic: Retrieve File Content works only for files created by skills or the code execution tool. Files you upload via the Files API cannot be downloaded.
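Batch outputs downloaded this way are JSONL, one result per line. A sketch of parsing the text; the sample line's fields beyond `custom_id` are illustrative:

```python
import json

def parse_jsonl(text: str) -> list[dict]:
    """Parse newline-delimited JSON, skipping blank lines."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

sample = '{"custom_id": "req-1", "response": {"status_code": 200}}\n'
results = parse_jsonl(sample)
print(results[0]["custom_id"])  # req-1
```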