Storage Adapters
Overview of storage adapters for uploading files to cloud providers.
Storage adapters handle uploading, downloading, and deleting files from remote storage providers.
Available Adapters
| Provider | Adapter | Auth Method | Import Path | Status |
|---|---|---|---|---|
| S3-Compatible | PluginS3 | Presigned URLs | nuxt-upload-kit/providers/s3 | Experimental |
| Azure Data Lake | PluginAzureDataLake | SAS URL | nuxt-upload-kit/providers/azure-datalake | Available |
| Firebase Storage | PluginFirebaseStorage | Firebase SDK | nuxt-upload-kit/providers/firebase | Experimental |
| Google Cloud Storage | - | - | - | Coming soon |
The PluginS3 adapter works with AWS S3 and any S3-compatible service, including Cloudflare R2, DigitalOcean Spaces, MinIO, Backblaze B2, Wasabi, and Supabase Storage.

All storage adapters are imported from nuxt-upload-kit/providers/*. This keeps the main package lightweight and bundles only the providers you actually use.

Using Storage Adapters
Configure a storage adapter using the storage option:
```ts
import { PluginS3 } from "nuxt-upload-kit/providers/s3"

const uploader = useUploadKit({
  storage: PluginS3({
    getPresignedUploadUrl: async (fileId, contentType, metadata) => {
      const response = await fetch("/api/s3/presign", {
        method: "POST",
        body: JSON.stringify({ key: fileId, contentType, ...metadata }),
      })
      return response.json() // { uploadUrl, publicUrl }
    },
  }),
})
```
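Under the hood, a presigned-URL adapter typically performs a two-step flow: ask your server for a short-lived signed URL, then PUT the bytes directly to storage. The sketch below illustrates that flow only; `presignedUpload`, `PresignResponse`, and the `/api/s3/presign` endpoint are hypothetical names rather than the adapter's actual internals, and `fetchImpl` is injectable purely so the flow can be exercised without a network.

```ts
// Hypothetical sketch of a presigned-URL upload flow (not the adapter's
// real implementation): 1) ask the server to sign a PUT URL for this key,
// 2) upload the bytes directly to storage.
type PresignResponse = { uploadUrl: string; publicUrl: string }

async function presignedUpload(
  data: Blob,
  key: string,
  contentType: string,
  fetchImpl: typeof fetch = fetch,
): Promise<{ url: string; storageKey: string }> {
  // Step 1: the server signs a short-lived PUT URL for this key
  const presignRes = await fetchImpl("/api/s3/presign", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ key, contentType }),
  })
  const { uploadUrl, publicUrl } = (await presignRes.json()) as PresignResponse

  // Step 2: PUT the bytes straight to storage, bypassing your server
  const putRes = await fetchImpl(uploadUrl, {
    method: "PUT",
    headers: { "Content-Type": contentType },
    body: data,
  })
  if (!putRes.ok) throw new Error(`Upload failed with status ${putRes.status}`)

  return { url: publicUrl, storageKey: key }
}
```

Because the file bytes go directly to the storage provider, your server only ever handles the small presign request, never the payload itself.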
Only one storage adapter can be active at a time. The storage option takes precedence over any storage adapters in the plugins array.

Without Storage Adapters
If you don't use a storage adapter, implement upload logic with onUpload:
```ts
const uploader = useUploadKit()

uploader.onUpload(async (file, onProgress) => {
  const formData = new FormData()
  formData.append("file", file.data as Blob)

  // Use XMLHttpRequest instead of fetch for upload progress tracking
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest()
    xhr.upload.onprogress = (e) => {
      if (e.lengthComputable) {
        onProgress(Math.round((e.loaded / e.total) * 100))
      }
    }
    xhr.onload = () => {
      if (xhr.status >= 200 && xhr.status < 300) {
        resolve(JSON.parse(xhr.response))
      } else {
        reject(new Error(`Upload failed with status ${xhr.status}`))
      }
    }
    xhr.onerror = () => reject(new Error("Network error"))
    xhr.open("POST", "/api/upload")
    xhr.send(formData)
  })
})
```
Standalone Upload
All storage adapters expose a standalone upload() method for uploading raw Blob or File data directly, bypassing the useUploadKit pipeline (validation, preprocessing, processing).
```ts
const storage = PluginS3({ getPresignedUploadUrl: /* ... */ })

// canvas.toBlob() is callback-based, so wrap it in a Promise
const blob: Blob = await new Promise((resolve) =>
  canvasElement.toBlob((b) => resolve(b!), "image/jpeg"),
)

// Upload the edited/cropped image directly
const result = await storage.upload(blob, "edited-photo.jpg", {
  contentType: "image/jpeg",
})

console.log(result.url) // Public URL
console.log(result.storageKey) // Resolved storage key
```
Standalone Upload Options
| Option | Type | Default | Description |
|---|---|---|---|
| contentType | string | application/octet-stream | MIME type of the data |
| onProgress | (percentage: number) => void | - | Progress callback (0-100) |
The storageKey parameter is treated like a filename — the adapter prepends any configured path prefix (e.g., options.path).
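As a rough illustration, the prefixing could behave like the sketch below. Note that `resolveStorageKey` is a hypothetical helper invented for this example, not the adapter's actual internals.

```ts
// Illustrative only: how an adapter might combine a configured path
// prefix (e.g. options.path) with the storageKey passed to upload().
function resolveStorageKey(pathPrefix: string | undefined, storageKey: string): string {
  if (!pathPrefix) return storageKey
  // Strip any trailing slashes from the prefix to avoid double slashes
  return `${pathPrefix.replace(/\/+$/, "")}/${storageKey}`
}

resolveStorageKey("uploads/avatars", "photo.jpg") // "uploads/avatars/photo.jpg"
resolveStorageKey(undefined, "photo.jpg") // "photo.jpg"
```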
Use Cases
- Uploading edited or cropped images
- Uploading thumbnails separately from the main file
- Any scenario where you have a Blob and just need it in storage
Storage Adapter Interface
All storage adapters implement the following interface:
```ts
interface StoragePlugin {
  id: string

  // Standalone upload (bypasses the pipeline)
  upload: (data: Blob | File, storageKey: string, options?) => Promise<{ url: string; storageKey: string; ... }>

  hooks: {
    // Required: upload a file through the pipeline
    upload: (file, context) => Promise<{ url: string; ... }>

    // Optional: get remote file metadata
    getRemoteFile?: (fileId, context) => Promise<{
      size: number
      mimeType: string
      remoteUrl: string
      preview?: string
    }>

    // Optional: delete a file
    remove?: (file, context) => Promise<void>
  }
}
```
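To make the shape concrete, here is a toy in-memory adapter. This is purely illustrative: the types are simplified, the context parameter is omitted, and a real adapter would talk to a remote provider rather than a Map.

```ts
// A toy in-memory adapter illustrating the StoragePlugin shape.
// Real adapters upload to a remote provider instead of a Map.
const store = new Map<string, { data: Blob; contentType: string }>()

const memoryStorage = {
  id: "storage:memory",

  // Standalone upload: store the blob and return a fake URL
  upload: async (
    data: Blob,
    storageKey: string,
    options?: { contentType?: string },
  ) => {
    store.set(storageKey, {
      data,
      contentType: options?.contentType ?? "application/octet-stream",
    })
    return { url: `memory://${storageKey}`, storageKey }
  },

  hooks: {
    // Pipeline upload: delegate to the standalone method
    upload: async (file: { id: string; data: Blob }) =>
      memoryStorage.upload(file.data, file.id),

    // Optional: delete a stored file
    remove: async (file: { id: string }) => {
      store.delete(file.id)
    },
  },
}
```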
Creating Custom Storage Adapters
See the Custom Storage Adapters guide for creating your own storage adapter.

