
Overview

The StorageManager is the central facade for all storage operations in the Synapse SDK. It manages storage contexts (provider + data set pairs) with intelligent caching and provides both simple auto-managed operations and advanced split-operation workflows. Access via synapse.storage.

Architecture

StorageManager
├── Auto-managed uploads (upload, download)
├── Context management (createContext, createContexts)
└── Data set operations (findDataSets, terminateDataSet)

Simple Upload/Download

upload()

Upload data with automatic multi-copy redundancy across providers.
Parameters:
  • data (Uint8Array | ReadableStream<Uint8Array>, required) - Raw bytes or a readable stream to upload. Use streams for large files.
  • options (StorageManagerUploadOptions) - Upload configuration options.

Returns:
  • result (UploadResult) - Upload result containing:
    • pieceCid - Content identifier
    • size - Data size in bytes
    • copies - Successful provider copies
    • failures - Failed copy attempts
import { Synapse } from '@filoz/synapse-sdk'

const synapse = Synapse.create({ account, chain })

// Simple upload with 2 copies (default)
const data = new Uint8Array([1, 2, 3, 4])
const result = await synapse.storage.upload(data)

console.log('PieceCID:', result.pieceCid.toString())
console.log('Copies:', result.copies.length)
result.copies.forEach(copy => {
  console.log(`Provider ${copy.providerId}: ${copy.role}`)
})

// With progress tracking
const trackedResult = await synapse.storage.upload(data, {
  callbacks: {
    onProgress: (bytes) => console.log(`Uploaded ${bytes} bytes`),
    onStored: (providerId, pieceCid) => 
      console.log(`Stored on provider ${providerId}`),
    onPiecesConfirmed: (dataSetId, providerId) =>
      console.log(`Confirmed on chain: dataset ${dataSetId}`)
  }
})

// Streaming upload for large files
const fileStream = readableStream // ReadableStream<Uint8Array>
const streamResult = await synapse.storage.upload(fileStream, {
  pieceCid: preCalculatedCid // Required for streams
})

download()

Download data from storage with automatic provider resolution.
Parameters:
  • options (StorageManagerDownloadOptions, required) - Download options.

Returns:
  • data (Uint8Array) - Downloaded and validated data bytes.
// Auto-resolve from any available provider
const data = await synapse.storage.download({
  pieceCid: 'baga6ea4seaq...'
})

// From specific provider
const dataFromProvider = await synapse.storage.download({
  pieceCid: 'baga6ea4seaq...',
  providerAddress: '0x...'
})

// From specific context
const context = await synapse.storage.createContext()
const dataFromContext = await synapse.storage.download({
  pieceCid: 'baga6ea4seaq...',
  context
})

Context Management

createContext()

Create or retrieve a single storage context with intelligent caching.
Parameters:
  • options (StorageServiceOptions) - Context creation options.

Returns:
  • context (StorageContext) - A storage context for the selected provider.
// Auto-select provider and data set
const context = await synapse.storage.createContext()

// Specific provider
const providerContext = await synapse.storage.createContext({
  providerId: 1n
})

// Specific data set
const dataSetContext = await synapse.storage.createContext({
  dataSetId: 42n
})

getDefaultContext()

Get or create a default storage context (first endorsed provider with a new or existing data set).
Returns:
  • context (StorageContext) - A cached or newly created storage context using an endorsed provider.
// Get default context (cached after first call)
const context = await synapse.storage.getDefaultContext()
await context.upload(data)

createContexts()

Create multiple contexts for multi-provider redundancy.
Parameters:
  • options (CreateContextsOptions) - Multi-context creation options.

Returns:
  • contexts (StorageContext[]) - Array of storage contexts.
// Create 3 contexts with auto-selection
const contexts = await synapse.storage.createContexts({ count: 3 })

// Specific providers
const specificContexts = await synapse.storage.createContexts({
  providerIds: [1n, 2n, 3n]
})

Data Set Operations

findDataSets()

Query all data sets owned by the current account.
Parameters:
  • options (object) - Optional query options.

Returns:
  • dataSets (EnhancedDataSetInfo[]) - Array of data set information including:
    • dataSetId - Data set ID
    • providerId - Provider ID
    • activePieceCount - Number of pieces
    • metadata - Data set metadata
    • withCDN - CDN status
const dataSets = await synapse.storage.findDataSets()

dataSets.forEach(ds => {
  console.log(`Dataset ${ds.dataSetId}: ${ds.activePieceCount} pieces`)
  console.log(`Provider: ${ds.providerId}`)
  console.log(`CDN enabled: ${ds.withCDN}`)
})

terminateDataSet()

Terminate a data set and remove all its pieces.
Parameters:
  • options (object, required) - Termination options (e.g. dataSetId, as in the example below).

Returns:
  • txHash (Hash) - Transaction hash of the termination.
const txHash = await synapse.storage.terminateDataSet({
  dataSetId: 42n
})

getStorageInfo()

Get comprehensive storage service information including pricing, providers, and allowances.
Returns:
  • info (StorageInfo) - Complete service information:
    • pricing - Cost per TiB (per epoch/day/month)
    • providers - List of approved providers
    • serviceParameters - Epoch durations, size limits
    • allowances - Current operator approvals and usage
const info = await synapse.storage.getStorageInfo()

console.log('Price per TiB/month:', info.pricing.noCDN.perTiBPerMonth)
console.log('Approved providers:', info.providers.length)
console.log('Min upload size:', info.serviceParameters.minUploadSize)

if (info.allowances) {
  console.log('Service approved:', info.allowances.isApproved)
  console.log('Rate allowance:', info.allowances.rateAllowance)
}
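To turn the per-TiB price into an estimate for your own data, scale it by size. A hedged sketch (the helper name is ours; it assumes prices are bigints in the token's smallest unit, matching the bigint style used elsewhere in this SDK):

```typescript
// Estimate monthly cost for `sizeBytes` at a per-TiB-per-month price.
// Illustrative helper, not an SDK API; assumes bigint prices.
const TiB = 1n << 40n // 1 TiB in bytes

function estimateMonthlyCost(sizeBytes: bigint, perTiBPerMonth: bigint): bigint {
  // Round up: a partial TiB still costs its proportional share
  return (sizeBytes * perTiBPerMonth + TiB - 1n) / TiB
}
```

For example, estimateMonthlyCost(100n * 1024n ** 2n, info.pricing.noCDN.perTiBPerMonth) would price a 100 MiB upload.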

preflightUpload()

Run preflight checks for an upload without creating a context.
Parameters:
  • options (object, required) - Upload parameters to check (e.g. size, withCDN, as in the example below).

Returns:
  • info (PreflightInfo) - Preflight check results:
    • estimatedCost - Cost breakdown (per epoch/day/month)
    • allowanceCheck - Whether the user has sufficient allowance
const preflight = await synapse.storage.preflightUpload({
  size: 1024 * 1024 * 100, // 100 MiB
  withCDN: true
})

console.log('Cost per month:', preflight.estimatedCost.perMonth)

if (!preflight.allowanceCheck.sufficient) {
  console.log('Insufficient allowance:', preflight.allowanceCheck.message)
}
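The per-epoch, per-day, and per-month figures in estimatedCost are related by Filecoin's 30-second epochs (2,880 per day). A sketch of that conversion — the helper is illustrative, and the 30-day month is an assumption:

```typescript
// Expand a per-epoch cost into per-day and per-month figures,
// assuming Filecoin's 30-second epochs and a 30-day month.
const EPOCHS_PER_DAY = 2880n // 86,400 s / 30 s per epoch

function expandCost(perEpoch: bigint): { perEpoch: bigint; perDay: bigint; perMonth: bigint } {
  const perDay = perEpoch * EPOCHS_PER_DAY
  return { perEpoch, perDay, perMonth: perDay * 30n }
}
```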

Upload Flow Details

The upload() method orchestrates a multi-step flow:
  1. Store - Upload to primary provider
  2. Pull - Secondary providers fetch from primary via SP-to-SP transfer
  3. Commit - Add pieces to on-chain data sets
const result = await synapse.storage.upload(data, {
  count: 3, // 1 primary + 2 secondaries
  callbacks: {
    onStored: (providerId) => 
      console.log(`1. Stored on primary: ${providerId}`),
    onPullProgress: (providerId, cid, status) =>
      console.log(`2. Pull to ${providerId}: ${status}`),
    onPiecesAdded: (txHash, providerId) =>
      console.log(`3. Commit tx submitted: ${txHash}`),
    onPiecesConfirmed: (dataSetId, providerId) =>
      console.log(`4. Confirmed on dataset: ${dataSetId}`)
  }
})

// Check results
if (result.copies.length < 3) {
  console.warn('Only', result.copies.length, 'copies succeeded')
  result.failures.forEach(f => {
    console.error(`Provider ${f.providerId} (${f.role}): ${f.error}`)
  })
}
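The copies/failures split falls out naturally when the per-provider work runs under Promise.allSettled, so one provider's failure never rejects the whole upload. A self-contained sketch of that collection step (the provider and result shapes are simplified stand-ins for the SDK's types, and the real flow sequences store, pull, and commit rather than one flat call):

```typescript
// Simplified stand-ins; the SDK's UploadResult carries more fields.
type Role = 'primary' | 'secondary'
interface Copy { providerId: number; role: Role }
interface Failure { providerId: number; role: Role; error: string }

async function fanOut(
  providers: Array<{ id: number; store: () => Promise<void> }>
): Promise<{ copies: Copy[]; failures: Failure[] }> {
  // allSettled never rejects: each outcome is inspected individually
  const settled = await Promise.allSettled(providers.map((p) => p.store()))
  const copies: Copy[] = []
  const failures: Failure[] = []
  settled.forEach((outcome, i) => {
    const role: Role = i === 0 ? 'primary' : 'secondary'
    if (outcome.status === 'fulfilled') {
      copies.push({ providerId: providers[i].id, role })
    } else {
      failures.push({ providerId: providers[i].id, role, error: String(outcome.reason) })
    }
  })
  return { copies, failures }
}
```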

Error Handling

import { StoreError, CommitError } from '@filoz/synapse-sdk/errors'

try {
  const result = await synapse.storage.upload(data)
} catch (error) {
  if (StoreError.is(error)) {
    console.error('Primary store failed:', error.message)
    console.error('Provider:', error.providerId)
  } else if (CommitError.is(error)) {
    console.error('All commits failed:', error.message)
  } else {
    throw error // Not a storage error: rethrow
  }
}
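For transient failures such as a store timing out, wrapping upload in a simple retry helper can be enough. A generic sketch with exponential backoff — the helper, attempt count, and base delay are our own choices, and it retries on any error, so tighten the predicate (e.g. only StoreError) for real use:

```typescript
// Retry an async operation with exponential backoff.
// Illustrative helper, not part of the SDK.
async function withRetry<T>(
  op: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown
  for (let i = 0; i < attempts; i++) {
    try {
      return await op()
    } catch (error) {
      lastError = error
      if (i < attempts - 1) {
        // Backoff doubles each round: 500 ms, 1000 ms, ...
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i))
      }
    }
  }
  throw lastError
}
```

Usage: const result = await withRetry(() => synapse.storage.upload(data)).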
