The Cloudflare SDK supports multiple file upload methods to accommodate different runtime environments and use cases.

Supported file types

Request parameters that accept file uploads support multiple formats:

File API: Web-standard File objects or compatible structures
ReadStream: Node.js fs.ReadStream for file system access
Response: fetch Response objects or compatible structures
toFile helper: the SDK's toFile() helper for buffers and arrays

File upload methods

Streaming from the file system

Use fs.createReadStream() for optimal performance in Node.js:
import fs from 'fs';
import Cloudflare from 'cloudflare';

const client = new Cloudflare();

await client.apiGateway.userSchemas.create({
  zone_id: '023e105f4ecef8ad9ca31a8372d0c353',
  file: fs.createReadStream('/path/to/schema.json'),
  kind: 'openapi_v3',
});
This is the recommended approach for Node.js environments as it streams the file without loading it entirely into memory.

The toFile helper

The toFile() helper converts raw data into an uploadable format:
import { toFile } from 'cloudflare';

// Signature
toFile(
  data: Buffer | Uint8Array | ArrayBuffer,
  filename: string,
  options?: { type?: string }
): Promise<File>

Usage examples

import { toFile } from 'cloudflare';

const file = await toFile(
  Buffer.from('file contents'),
  'document.txt',
  { type: 'text/plain' }
);
data (Buffer | Uint8Array | ArrayBuffer, required): Raw binary data to upload
filename (string, required): Name for the uploaded file
options.type (string, optional): MIME type of the file (e.g., "application/json")

Reading files from disk

Use fileFromPath() to read files from the filesystem:
import { fileFromPath } from 'cloudflare';

const file = await fileFromPath('/path/to/schema.json');

await client.apiGateway.userSchemas.create({
  zone_id: '023e105f4ecef8ad9ca31a8372d0c353',
  file: file,
  kind: 'openapi_v3',
});
fileFromPath() is only available in Node.js environments.

Complete upload examples

API Gateway schema upload

import fs from 'fs';
import Cloudflare from 'cloudflare';

const client = new Cloudflare({
  apiToken: process.env.CLOUDFLARE_API_TOKEN,
});

// Upload OpenAPI schema from file system
const schema = await client.apiGateway.userSchemas.create({
  zone_id: '023e105f4ecef8ad9ca31a8372d0c353',
  file: fs.createReadStream('./openapi.yaml'),
  kind: 'openapi_v3',
});

console.log('Schema uploaded:', schema.schema_id);

Dynamic file generation

import Cloudflare, { toFile } from 'cloudflare';

const client = new Cloudflare();

// Generate schema programmatically
const schemaData = {
  openapi: '3.0.0',
  info: {
    title: 'My API',
    version: '1.0.0',
  },
  paths: {},
};

const schemaJson = JSON.stringify(schemaData, null, 2);
const file = await toFile(
  Buffer.from(schemaJson),
  'schema.json',
  { type: 'application/json' }
);

await client.apiGateway.userSchemas.create({
  zone_id: '023e105f4ecef8ad9ca31a8372d0c353',
  file: file,
  kind: 'openapi_v3',
});

Remote file upload

// Node.js 18+ provides a global fetch; import it from 'node-fetch' only on older runtimes
import Cloudflare from 'cloudflare';

const client = new Cloudflare();

// Download and upload in one step
const remoteFile = await fetch('https://api.example.com/schema.json');

await client.apiGateway.userSchemas.create({
  zone_id: '023e105f4ecef8ad9ca31a8372d0c353',
  file: remoteFile,
  kind: 'openapi_v3',
});

Multipart form data

File uploads use multipart/form-data encoding automatically:
// The SDK handles multipart encoding for you
await client.apiGateway.userSchemas.create({
  zone_id: '023e105f4ecef8ad9ca31a8372d0c353',
  file: fs.createReadStream('./schema.json'),
  kind: 'openapi_v3',
  // Other fields are encoded alongside the file
});
The SDK automatically sets the correct Content-Type: multipart/form-data header and constructs the multipart body.

File metadata

When using the File API or toFile(), you can specify metadata:
// Using File constructor
const file = new File(
  ['content'],
  'filename.txt',
  {
    type: 'text/plain',
    lastModified: Date.now(),
  }
);

// Using toFile helper
const file = await toFile(
  Buffer.from('content'),
  'filename.txt',
  { type: 'text/plain' }
);

Best practices

1. Use streams for large files

In Node.js, prefer fs.createReadStream() to avoid loading entire files into memory:
// Good: Streams the file
file: fs.createReadStream('/large/file.json')

// Bad: Loads entire file into memory
file: await toFile(fs.readFileSync('/large/file.json'), 'file.json')
2. Set correct MIME types

Specify the correct content type for better compatibility:
const file = await toFile(
  jsonBuffer,
  'schema.json',
  { type: 'application/json' }
);
3. Handle file errors

Wrap file operations in try-catch blocks:
try {
  const file = fs.createReadStream('/path/to/file');
  await client.apiGateway.userSchemas.create({
    zone_id: zoneId,
    file,
    kind: 'openapi_v3',
  });
} catch (err) {
  if (err.code === 'ENOENT') {
    console.error('File not found');
  }
  throw err;
}
4. Validate files before upload

Check file size and format before initiating uploads:
const stats = fs.statSync('/path/to/file');
if (stats.size > 10 * 1024 * 1024) {
  throw new Error('File too large (max 10MB)');
}

TypeScript types

The SDK provides type definitions for uploadable values:
type Uploadable = 
  | File
  | Blob  
  | Response
  | ReadStream
  | { 
      name?: string;
      type?: string;
      size?: number;
      arrayBuffer(): Promise<ArrayBuffer>;
    };
Any object matching this structure can be used for file uploads.
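Because the last branch of the type is structural, any object exposing those properties qualifies — a minimal hand-rolled uploadable, for illustration only:

```typescript
// A plain object that satisfies the structural branch of Uploadable.
const uploadable = {
  name: 'schema.json',
  type: 'application/json',
  size: 2,
  async arrayBuffer(): Promise<ArrayBuffer> {
    return new TextEncoder().encode('{}').buffer as ArrayBuffer;
  },
};
```

An object like this can be passed wherever the SDK expects a file, such as the file parameter in the examples above.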
