
Overview

Multi-Cloud Manager provides unified APIs for managing blobs (files) in both Azure Blob Storage and Google Cloud Storage. This guide covers upload, download, and delete operations with complete code examples.

Azure Blob Operations

Upload Blob to Azure

Upload a file to an Azure blob container.

Endpoint: POST /api/azure/storage/blobs/upload
Content-Type: multipart/form-data

Form Parameters:
  • accountName - Storage account name
  • accountKey - Storage account access key
  • containerName - Target container name
  • file - The file to upload
Implementation (azure_modules/storage.py:285-303):
from flask import jsonify, request, session
from azure.storage.blob import BlobServiceClient

def upload_blob(storage_account_id):
    if "access_token" not in session:
        return jsonify({"error": "Unauthorized"}), 401
    
    account_name = request.form.get("accountName")
    account_key = request.form.get("accountKey")
    container_name = request.form.get("containerName")
    file = request.files.get("file")
    
    try:
        if not all([account_name, account_key, container_name, file]):
            return jsonify({"error": "Missing required data"}), 400
        
        # Create blob service client
        blob_service = BlobServiceClient(
            account_url=f"https://{account_name}.blob.core.windows.net",
            credential=account_key
        )
        
        # Get blob client and upload
        blob_client = blob_service.get_blob_client(
            container=container_name,
            blob=file.filename
        )
        blob_client.upload_blob(file.stream, overwrite=True)
        
        return jsonify({"message": f"File '{file.filename}' uploaded"}), 200
    except Exception as e:
        return jsonify({"error": str(e)}), 500
Example using cURL:
curl -X POST https://api.example.com/api/azure/storage/blobs/upload \
  -F "accountName=mystorageaccount" \
  -F "accountKey=your-account-key" \
  -F "containerName=my-container" \
  -F "file=@/path/to/document.pdf"
The overwrite=True argument replaces any existing blob with the same name; without it, upload_blob raises ResourceExistsError when the blob already exists.
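Alongside cURL, a client can assemble the same multipart request in Python. A minimal sketch, reusing the api.example.com base URL from the cURL example; the helper name build_upload_form is illustrative, not part of the API:

```python
def build_upload_form(api_base, account_name, account_key, container_name):
    """Assemble the URL and form fields for the Azure upload endpoint.
    The file itself travels separately as the multipart 'file' part,
    e.g. requests.post(url, data=fields, files={"file": fh})."""
    url = f"{api_base}/api/azure/storage/blobs/upload"
    fields = {
        "accountName": account_name,
        "accountKey": account_key,
        "containerName": container_name,
    }
    return url, fields

url, fields = build_upload_form(
    "https://api.example.com", "mystorageaccount", "your-account-key", "my-container"
)
```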

Download Blob from Azure

Download a file from an Azure blob container.

Endpoint: GET /api/azure/storage/blobs/download

Request Body (note: a JSON body on a GET request is non-standard, and some HTTP clients and proxies drop it):
{
  "accountName": "mystorageaccount",
  "accountKey": "your-account-key",
  "containerName": "my-container",
  "blobName": "document.pdf"
}
Implementation (azure_modules/storage.py:305-329):
from flask import send_file
from io import BytesIO

def download_blob(storage_account_id):
    if "access_token" not in session:
        return jsonify({"error": "Unauthorized"}), 401
    
    data = request.get_json()
    account_name = data.get("accountName")
    account_key = data.get("accountKey")
    container_name = data.get("containerName")
    blob_name = data.get("blobName")
    
    try:
        blob_service = BlobServiceClient(
            account_url=f"https://{account_name}.blob.core.windows.net",
            credential=account_key
        )
        
        # Get blob client and download
        blob_client = blob_service.get_blob_client(
            container=container_name,
            blob=blob_name
        )
        stream = blob_client.download_blob()
        blob_data = stream.readall()

        # Send file to client
        return send_file(
            BytesIO(blob_data),
            download_name=blob_name,
            as_attachment=True
        )
    except Exception as e:
        return jsonify({"error": str(e)}), 500
Response: binary file data, with a Content-Disposition: attachment header set by send_file so browsers download rather than render it
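Because the endpoint reads its parameters from a JSON body even on GET, the request must be built explicitly; urllib.request can attach one. A minimal client-side sketch (the helper name and base URL are illustrative):

```python
import json
import urllib.request

def build_download_request(api_base, account_name, account_key,
                           container_name, blob_name):
    """Build the GET request for the Azure blob download endpoint,
    attaching the JSON body the endpoint expects."""
    body = json.dumps({
        "accountName": account_name,
        "accountKey": account_key,
        "containerName": container_name,
        "blobName": blob_name,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{api_base}/api/azure/storage/blobs/download",
        data=body,
        headers={"Content-Type": "application/json"},
        method="GET",
    )

req = build_download_request(
    "https://api.example.com", "mystorageaccount", "your-account-key",
    "my-container", "document.pdf",
)
# To actually fetch and save the file:
# with urllib.request.urlopen(req) as resp:
#     open("document.pdf", "wb").write(resp.read())
```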

Delete Blob from Azure

Delete a file from an Azure blob container.

Endpoint: DELETE /api/azure/storage/blobs

Request Body:
{
  "accountName": "mystorageaccount",
  "accountKey": "your-account-key",
  "containerName": "my-container",
  "blobName": "document.pdf"
}
Implementation (azure_modules/storage.py:331-349):
def delete_blob(storage_account_id):
    if "access_token" not in session:
        return jsonify({"error": "Unauthorized"}), 401
    
    data = request.get_json()
    account_name = data.get("accountName")
    account_key = data.get("accountKey")
    container_name = data.get("containerName")
    blob_name = data.get("blobName")
    
    try:
        blob_service = BlobServiceClient(
            account_url=f"https://{account_name}.blob.core.windows.net",
            credential=account_key
        )
        
        blob_client = blob_service.get_blob_client(
            container=container_name,
            blob=blob_name
        )
        blob_client.delete_blob()
        
        return jsonify({"message": f"Blob '{blob_name}' deleted"}), 200
    except Exception as e:
        return jsonify({"error": str(e)}), 500
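A matching client-side sketch for the delete call, again with illustrative names; urllib lets a DELETE request carry the JSON body the endpoint expects:

```python
import json
import urllib.request

def build_delete_request(api_base, account_name, account_key,
                         container_name, blob_name):
    """Build the DELETE request for the Azure blob endpoint."""
    body = json.dumps({
        "accountName": account_name,
        "accountKey": account_key,
        "containerName": container_name,
        "blobName": blob_name,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{api_base}/api/azure/storage/blobs",
        data=body,
        headers={"Content-Type": "application/json"},
        method="DELETE",
    )

req = build_delete_request(
    "https://api.example.com", "mystorageaccount", "your-account-key",
    "my-container", "document.pdf",
)
```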

GCP Object Operations

Upload Object to GCP

Upload a file to a GCP storage bucket.

Endpoint: POST /api/gcp/storage/blobs/upload
Content-Type: multipart/form-data

Form Parameters:
  • bucketName - Target bucket name
  • projectId - GCP project ID
  • file - The file to upload
Implementation (gcp/storage.py:179-208):
from flask import jsonify, request, send_file, session
from google.cloud import storage
import io

def upload_blob_to_bucket():
    accounts = session.get("accounts", [])
    gcp_account = next((acc for acc in accounts if acc.get("provider") == "gcp"), None)
    
    if not gcp_account:
        return jsonify({"error": "No active GCP account found in the session"}), 401

    try:
        if 'file' not in request.files:
            return jsonify({"error": "No file in the request."}), 400

        file = request.files['file']
        if file.filename == '':
            return jsonify({"error": "No file selected."}), 400

        bucket_name = request.form.get("bucketName")
        project_id = request.form.get("projectId")
        
        if not all([bucket_name, project_id]):
            return jsonify({"error": "Parameters 'bucketName' and 'projectId' are required."}), 400

        # Create storage client and upload. SessionCredentials (defined
        # elsewhere in the app) wraps the OAuth tokens stored in the session
        # as google-auth credentials.
        credentials = SessionCredentials(gcp_account)
        storage_client = storage.Client(project=project_id, credentials=credentials)
        bucket = storage_client.bucket(bucket_name)
        
        blob = bucket.blob(file.filename)
        blob.upload_from_file(file)

        return jsonify({"message": f"File '{file.filename}' uploaded successfully."}), 201

    except Exception as e:
        return jsonify({"error": f"An error occurred while uploading the file: {e}"}), 500
Example using cURL:
curl -X POST https://api.example.com/api/gcp/storage/blobs/upload \
  -F "bucketName=my-gcp-bucket" \
  -F "projectId=my-project-123" \
  -F "file=@/path/to/image.jpg"
The blob name will be set to the uploaded file’s filename. To use a different name, you would need to modify the blob() call.
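To store uploads under a different object name, derive the destination before the bucket.blob() call. A hypothetical helper (the "uploads" prefix and date layout are only an example; GCS treats '/' in object names as folders purely by convention):

```python
import posixpath
from datetime import date

def destination_name(filename, prefix="uploads"):
    """Build a dated object name such as 'uploads/2024-01-31/image.jpg'."""
    return posixpath.join(prefix, date.today().isoformat(), filename)

# In the handler, replace:
#     blob = bucket.blob(file.filename)
# with:
#     blob = bucket.blob(destination_name(file.filename))
```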

Download Object from GCP

Download a file from a GCP storage bucket.

Endpoint: GET /api/gcp/storage/blobs/download

Query Parameters:
  • bucketName - Source bucket name
  • projectId - GCP project ID
  • blobName - Name of the object to download
Implementation (gcp/storage.py:210-240):
def download_blob_from_bucket():
    accounts = session.get("accounts", [])
    gcp_account = next((acc for acc in accounts if acc.get("provider") == "gcp"), None)
    
    if not gcp_account:
        return jsonify({"error": "No active GCP account found in the session"}), 401

    try:
        bucket_name = request.args.get("bucketName")
        project_id = request.args.get("projectId")
        blob_name = request.args.get("blobName")
        
        if not all([bucket_name, project_id, blob_name]):
            return jsonify({"error": "Parameters 'bucketName', 'projectId' and 'blobName' are required."}), 400

        # Create storage client
        credentials = SessionCredentials(gcp_account)
        storage_client = storage.Client(project=project_id, credentials=credentials)
        bucket = storage_client.bucket(bucket_name)
        blob = bucket.blob(blob_name)

        # Download to memory
        file_buffer = io.BytesIO()
        blob.download_to_file(file_buffer)
        file_buffer.seek(0)  # Reset buffer position

        # Send file to client
        return send_file(
            file_buffer,
            download_name=blob_name,
            as_attachment=True,
            mimetype=blob.content_type
        )

    except Exception as e:
        return jsonify({"error": f"An error occurred while downloading the file: {e}"}), 500
Example Request:
GET /api/gcp/storage/blobs/download?bucketName=my-bucket&projectId=my-project-123&blobName=reports/2024.pdf
The mimetype is automatically detected from the blob’s content_type property, ensuring proper browser handling.
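Because blobName may itself contain slashes (as in the example above), the query string should be percent-encoded so the name survives as a single parameter. A small sketch using the standard library (the helper name is illustrative):

```python
from urllib.parse import urlencode

def build_gcp_download_url(api_base, bucket_name, project_id, blob_name):
    """Build the download URL; urlencode percent-escapes values such as
    'reports/2024.pdf' so the '/' does not split the query parameter."""
    query = urlencode({
        "bucketName": bucket_name,
        "projectId": project_id,
        "blobName": blob_name,
    })
    return f"{api_base}/api/gcp/storage/blobs/download?{query}"

url = build_gcp_download_url(
    "https://api.example.com", "my-bucket", "my-project-123", "reports/2024.pdf"
)
```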

Delete Object from GCP

Delete a file from a GCP storage bucket.

Endpoint: DELETE /api/gcp/storage/blobs

Request Body:
{
  "bucketName": "my-gcp-bucket",
  "projectId": "my-project-123",
  "blobName": "old-file.txt"
}
Implementation (gcp/storage.py:242-265):
def delete_blob_from_bucket():
    accounts = session.get("accounts", [])
    gcp_account = next((acc for acc in accounts if acc.get("provider") == "gcp"), None)
    
    if not gcp_account:
        return jsonify({"error": "No active GCP account found in the session"}), 401

    data = request.get_json()
    bucket_name = data.get("bucketName")
    project_id = data.get("projectId")
    blob_name = data.get("blobName")
    
    if not all([bucket_name, project_id, blob_name]):
        return jsonify({"error": "Fields 'bucketName', 'projectId' and 'blobName' are required."}), 400

    try:
        credentials = SessionCredentials(gcp_account)
        storage_client = storage.Client(project=project_id, credentials=credentials)
        bucket = storage_client.bucket(bucket_name)
        blob = bucket.blob(blob_name)
        blob.delete()

        return jsonify({"message": f"File '{blob_name}' deleted successfully."}), 200

    except Exception as e:
        return jsonify({"error": f"An error occurred while deleting the file: {e}"}), 500
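A client-side sketch for the delete call; the send helper also decodes the JSON error body that the endpoint returns on failure (both helper names are illustrative):

```python
import json
import urllib.error
import urllib.request

def build_gcp_delete_request(api_base, bucket_name, project_id, blob_name):
    """Build the DELETE request for the GCP object endpoint."""
    body = json.dumps({
        "bucketName": bucket_name,
        "projectId": project_id,
        "blobName": blob_name,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{api_base}/api/gcp/storage/blobs",
        data=body,
        headers={"Content-Type": "application/json"},
        method="DELETE",
    )

def send(req):
    """Send the request and return the decoded JSON response; on HTTP
    errors, decode the endpoint's JSON error body instead of raising
    (HTTPError is itself file-like)."""
    try:
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    except urllib.error.HTTPError as err:
        return json.load(err)

req = build_gcp_delete_request(
    "https://api.example.com", "my-gcp-bucket", "my-project-123", "old-file.txt"
)
# result = send(req)
```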

Error Handling Best Practices

Azure Error Handling

Common Azure blob storage errors:
from azure.core.exceptions import (
    ResourceNotFoundError,
    ResourceExistsError,
    HttpResponseError
)

try:
    blob_client.upload_blob(data)
except ResourceNotFoundError:
    # Container doesn't exist
    return jsonify({"error": "Container not found"}), 404
except ResourceExistsError:
    # Blob already exists and overwrite=False
    return jsonify({"error": "Blob already exists"}), 409
except HttpResponseError as e:
    # Other HTTP errors (403, 401, etc.)
    return jsonify({"error": str(e)}), e.status_code

GCP Error Handling

Common GCP storage errors:
from google.cloud.exceptions import NotFound, Forbidden

try:
    blob.upload_from_file(file)
except NotFound:
    # Bucket or blob doesn't exist
    return jsonify({"error": "Bucket not found"}), 404
except Forbidden:
    # Insufficient permissions
    return jsonify({"error": "Access denied"}), 403
except Exception as e:
    # Generic error handling
    return jsonify({"error": str(e)}), 500

Comparison: Azure vs GCP

| Feature | Azure Blob Storage | GCP Cloud Storage |
|---|---|---|
| Authentication | Account name + key | OAuth2 credentials |
| URL format | {account}.blob.core.windows.net | storage.googleapis.com/{bucket} |
| Client library | azure.storage.blob.BlobServiceClient | google.cloud.storage.Client |
| Upload method | upload_blob(stream, overwrite=True) | upload_from_file(file) |
| Download method | download_blob().readall() | download_to_file(buffer) |
| Delete method | delete_blob() | delete() |
| Overwrite behavior | Explicit overwrite parameter | Overwrites by default |

Best Practices

Large Files

For large files (over 100 MB), consider:
  • Using chunked uploads for better throughput
  • Implementing progress tracking
  • Setting appropriate timeouts
  • Handling network interruptions with retry logic

Security
  • Always use HTTPS endpoints
  • Store access keys securely (environment variables, key vault)
  • Implement proper authentication checks
  • Use short-lived tokens when possible
  • Enable CORS only for specific origins

Error Handling
  • Validate all input parameters before making API calls
  • Provide meaningful error messages to users
  • Log detailed errors server-side for debugging
  • Implement retry logic for transient failures
  • Handle specific exceptions (NotFound, Forbidden, etc.)

Performance
  • Use connection pooling for multiple operations
  • Cache frequently accessed blobs
  • Consider a CDN for public content
  • Enable compression for text-based files
  • Upload multiple files in parallel
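The retry advice above can be sketched as plain exponential backoff; the helper, its attempt count, and its delays are illustrative, and production code should also cap total elapsed time and honour server backoff hints:

```python
import time

def with_retries(operation, attempts=4, base_delay=0.5,
                 retryable=(ConnectionError, TimeoutError)):
    """Retry `operation` on transient errors with exponential backoff
    (base_delay, 2*base_delay, 4*base_delay, ... between attempts)."""
    for attempt in range(attempts):
        try:
            return operation()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Demo: an operation that fails twice, then succeeds on the third try
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = with_retries(flaky, base_delay=0.01)
```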
