Object Storage
Podstack provides S3-compatible object storage for storing files, datasets, model artifacts, and more.
Creating a Bucket
- Navigate to Storage > Object Storage
- Click Create Bucket
- Configure:
  - Name: Unique bucket identifier
  - Description: Optional notes
  - Region: Storage location
  - Visibility: Public or Private
  - Versioning: Enable to keep file history
- Click Create
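The same setup can be scripted against the S3 API. A minimal sketch using the standard boto3 client calls `create_bucket` and `put_bucket_versioning` (the helper name is ours; the client is passed in so you can configure it for your endpoint and credentials):

```python
def create_versioned_bucket(s3, name):
    """Create a bucket, then enable versioning on it.

    `s3` is a boto3 S3 client configured for your endpoint.
    """
    s3.create_bucket(Bucket=name)
    s3.put_bucket_versioning(
        Bucket=name,
        VersioningConfiguration={'Status': 'Enabled'},
    )
```

Call it with a client built via `boto3.client('s3', endpoint_url='https://s3.podstack.ai', ...)` as in the upload example later on this page.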
Bucket Settings
Visibility Options
- Private: Only accessible with authentication
- Public: Anyone with the URL can download files
Versioning

When enabled, uploading a file with the same name creates a new version rather than overwriting it. This is useful for backups and audit trails.
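To see what versioning buys you: the standard S3 `list_object_versions` call returns one record per stored version, and a client usually wants the newest record per key. A small sketch of that bookkeeping (the helper is ours, not a platform API):

```python
def latest_versions(versions):
    """Given records from the 'Versions' list of an S3
    list_object_versions response, return the newest VersionId per key."""
    newest = {}
    for v in versions:
        cur = newest.get(v["Key"])
        if cur is None or v["LastModified"] > cur["LastModified"]:
            newest[v["Key"]] = v
    return {key: v["VersionId"] for key, v in newest.items()}
```

Feed it `s3.list_object_versions(Bucket='my-bucket')["Versions"]` to map each key to its most recent version.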
Managing Files
Uploading Files
Via Web UI
- Open your bucket
- Click Upload
- Select files or drag and drop
- Monitor upload progress
- Files appear in the bucket listing
Via S3 API

Use any S3-compatible client, such as boto3:

```python
import boto3

s3 = boto3.client(
    's3',
    endpoint_url='https://s3.podstack.ai',
    aws_access_key_id='your_access_key',
    aws_secret_access_key='your_secret_key'
)
s3.upload_file('local_file.txt', 'my-bucket', 'remote_file.txt')
```
Downloading Files
Via Web UI
- Navigate to the file
- Click the Download button
Via S3 API

```python
s3.download_file('my-bucket', 'remote_file.txt', 'local_file.txt')
```
Via Pre-signed URL

Generate temporary download links:
- Select the file
- Click Get Link
- Set expiration time
- Share the URL
Organizing Files
Create a folder structure:
- Click Create Folder
- Enter folder name
- Upload files into folders
Navigate folders using the breadcrumb trail.
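Note that S3-style storage is actually flat: "folders" are just `/`-separated key prefixes, and the UI derives the tree from them. A sketch of that grouping (the helper is ours), mirroring how a listing with `Delimiter='/'` splits keys into prefixes and files:

```python
def top_level_folders(keys, delimiter="/"):
    """Group object keys into top-level folder prefixes and loose files,
    the way an S3 listing with Delimiter='/' presents them."""
    folders, files = set(), []
    for key in keys:
        if delimiter in key:
            folders.add(key.split(delimiter, 1)[0] + delimiter)
        else:
            files.append(key)
    return sorted(folders), files
```

For example, `top_level_folders(["a/x.txt", "a/y.txt", "b/z.bin", "readme.md"])` yields the folders `a/` and `b/` plus the loose file `readme.md`.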
Deleting Files
- Select file(s) using checkboxes
- Click Delete
- Confirm the deletion
Note: Deleted files cannot be recovered unless versioning is enabled.
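Deletions can also be batched over the API. A sketch (helper name ours) using the standard `delete_objects` call, which accepts at most 1,000 keys per request:

```python
def delete_keys(s3, bucket, keys):
    """Delete objects in batches of 1000 (the S3 DeleteObjects limit).

    `s3` is a boto3 S3 client configured for your endpoint.
    """
    for i in range(0, len(keys), 1000):
        batch = keys[i:i + 1000]
        s3.delete_objects(
            Bucket=bucket,
            Delete={"Objects": [{"Key": k} for k in batch]},
        )
```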
Bucket Operations
Editing Bucket Settings
- Go to bucket list
- Click the settings icon
- Modify visibility or description
- Save changes
Changing Visibility
- Open bucket settings
- Toggle Public/Private
- Confirm the change
Warning: Making a bucket public exposes all files to anyone with the URL.
Deleting a Bucket
- Ensure the bucket is empty (delete all files first)
- Click Delete Bucket
- Confirm deletion
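Emptying and deleting a bucket can be scripted in one pass. A sketch using standard boto3 calls (the helper name is ours; for versioned buckets you would also need to remove old versions via `list_object_versions`):

```python
def delete_bucket_completely(s3, bucket):
    """Empty a bucket page by page, then delete the bucket itself.
    S3 refuses to delete a non-empty bucket, so ordering matters."""
    while True:
        page = s3.list_objects_v2(Bucket=bucket)
        contents = page.get("Contents", [])
        if not contents:
            break
        s3.delete_objects(
            Bucket=bucket,
            Delete={"Objects": [{"Key": o["Key"]} for o in contents]},
        )
    s3.delete_bucket(Bucket=bucket)
```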
Public Bucket Access
Public buckets can be accessed via URL:
```
https://s3.podstack.ai/public/{bucket-name}/{file-path}
```
Use public buckets for:
- Sharing datasets publicly
- Hosting static assets
- Public model distribution
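Following the URL pattern above, a public file's URL can be assembled in code. This helper (ours, not a platform API) percent-encodes the key while keeping `/` separators intact:

```python
from urllib.parse import quote

def public_url(bucket, key):
    """Build the public download URL for an object, following the
    https://s3.podstack.ai/public/{bucket-name}/{file-path} pattern."""
    # quote() leaves '/' unescaped by default, preserving folder paths.
    return f"https://s3.podstack.ai/public/{bucket}/{quote(key)}"
```

For example, `public_url("demo", "models/v1 final.bin")` encodes the space but keeps the folder separator.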
Pre-signed URLs
Generate temporary URLs for private files:
Download URL

```python
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'file.txt'},
    ExpiresIn=3600  # 1 hour
)
```
Upload URL

```python
url = s3.generate_presigned_url(
    'put_object',
    Params={'Bucket': 'my-bucket', 'Key': 'upload.txt'},
    ExpiresIn=3600
)
```
Large File Uploads
For files larger than 100MB, use multipart upload:
Via Web UI
The upload manager automatically handles chunking for large files with:
- Progress tracking
- Resume capability
- Background uploading
Via API
```python
from boto3.s3.transfer import TransferConfig

config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,  # 100 MB
    multipart_chunksize=100 * 1024 * 1024
)
s3.upload_file('large_file.tar', 'my-bucket', 'large_file.tar', Config=config)
```
Using Object Storage with Pods
From Container
Install AWS CLI or boto3:
```shell
pip install awscli boto3
```
Configure credentials:
```shell
aws configure
# Enter your access key and secret
```
Download data:
```shell
aws s3 cp s3://my-bucket/data.tar.gz /data/ --endpoint-url https://s3.podstack.ai
```
Mount as Filesystem
Use s3fs for filesystem-like access:
```shell
sudo apt install s3fs
echo "ACCESS_KEY:SECRET_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
s3fs my-bucket /mnt/s3 -o passwd_file=~/.passwd-s3fs -o url=https://s3.podstack.ai
```
Billing
Object storage is billed based on:
- Storage: Amount of data stored (per GB/hour)
- Transfer: Data downloaded (egress)
View costs in your wallet expenditure breakdown.
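As a back-of-envelope estimate of the storage-plus-egress model above (the rates below are placeholders, not Podstack's actual prices; check your wallet breakdown for real figures):

```python
def monthly_cost(stored_gb, egress_gb,
                 storage_rate_gb_hour=0.0001,  # placeholder rate, not a real price
                 egress_rate_gb=0.01,          # placeholder rate, not a real price
                 hours=730):                   # approximate hours in a month
    """Storage (billed per GB/hour) plus egress (billed per GB) for one month."""
    return stored_gb * storage_rate_gb_hour * hours + egress_gb * egress_rate_gb
```

With these placeholder rates, 100 GB stored and 50 GB downloaded comes to 7.3 + 0.5 per month.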
Best Practices
- Use meaningful names: organize files with a clear, consistent naming scheme
- Enable versioning: for important data that changes over time
- Set appropriate visibility: keep sensitive data private
- Clean up unused data: delete files you no longer need
- Use pre-signed URLs: grant temporary access without making a bucket public
Next Steps
Learn about NFS Volumes for mountable persistent storage.