Set up S3 bucket operations (upload, download, presigned URLs)
You are an AWS developer setting up S3 bucket operations. The user wants to upload files, download files, and generate presigned URLs for temporary access.
What to check first
- Run `aws configure` to verify AWS credentials are set (Access Key ID, Secret Access Key, region)
- Run `aws s3 ls` to confirm S3 bucket access and list available buckets
- Verify the bucket name and region match your AWS account setup
Steps
- Install boto3 with `pip install boto3`, the official AWS SDK for Python
- Create an S3 client with `boto3.client('s3')`, specifying your region
- For uploads, use the `put_object()` method with Bucket, Key, and Body parameters
- For downloads, use the `get_object()` method and read the StreamingBody response
- For presigned URLs, use `generate_presigned_url()` with ClientMethod, Params, and ExpiresIn (seconds)
- Set ExpiresIn to control URL expiration time (3600 = 1 hour, 86400 = 1 day)
- Handle exceptions like NoCredentialsError, ClientError, and FileNotFoundError
- Test each operation with a small test file before scaling to production
Code
import boto3
from botocore.exceptions import ClientError, NoCredentialsError
import os

class S3Operations:
    def __init__(self, bucket_name, region_name='us-east-1'):
        """Initialize S3 client with bucket name and region"""
        try:
            self.s3_client = boto3.client('s3', region_name=region_name)
            self.bucket_name = bucket_name
        except NoCredentialsError:
            print("ERROR: AWS credentials not configured")
            raise

    def upload_file(self, file_path, s3_key=None):
        """Upload file to S3 bucket"""
        if not os.path.exists(file_path):
            raise FileNotFoundError(f"File {file_path} not found")
        s3_key = s3_key or os.path.basename(file_path)
        try:
            # Open inside a with block so the file handle is always closed
            with open(file_path, 'rb') as f:
                self.s3_client.put_object(
                    Bucket=self.bucket_name,
                    Key=s3_key,
                    Body=f
                )
            print(f"✓ Uploaded {file_path} to s3://{self.bucket_name}/{s3_key}")
            return f"s3://{self.bucket_name}/{s3_key}"
        except ClientError as e:
            print(f"ERROR uploading file: {e}")
            raise

    def download_file(self, s3_key, local_path):
        """Download file from S3 bucket"""
        try:
            response = self.s3_client.get_object(
                Bucket=self.bucket_name,
                Key=s3_key
            )
            # The source example was truncated here; the rest of this method
            # is reconstructed from the standard boto3 get_object pattern.
            # Read the StreamingBody and write it to disk.
            with open(local_path, 'wb') as f:
                f.write(response['Body'].read())
            print(f"✓ Downloaded s3://{self.bucket_name}/{s3_key} to {local_path}")
            return local_path
        except ClientError as e:
            print(f"ERROR downloading file: {e}")
            raise
Common Pitfalls
- Treating this skill as a one-shot solution — most workflows need iteration and verification
- Skipping the verification steps — you don't know it worked until you measure
- Applying this skill without understanding the underlying problem — read the related docs first
When NOT to Use This Skill
- When a simpler manual approach would take less than 10 minutes
- On critical production systems without testing in staging first
- When you don't have permission or authorization to make these changes
How to Verify It Worked
- Run the verification steps documented above
- Compare the output against your expected baseline
- Check logs for any warnings or errors — silent failures are the worst kind
Production Considerations
- Test in staging before deploying to production
- Have a rollback plan — every change should be reversible
- Monitor the affected systems for at least 24 hours after the change