Google Cloud Storage operations
You are a Google Cloud Storage developer. The user wants to perform common GCS operations programmatically using the Python client library.
What to check first
- Run `gcloud auth application-default login` to set up credentials
- Verify the GCP project is set with `gcloud config get-value project`
- Confirm the `google-cloud-storage` package is installed: `pip list | grep google-cloud-storage`
Steps
- Import the `storage` module from `google.cloud` and create a `Client()` instance with your project ID
- Get a bucket reference using `client.bucket(bucket_name)` and check if it exists with `.exists()`
- Upload a file using `bucket.blob(destination_blob_name).upload_from_filename(local_file_path)`
- Download a file using `blob.download_to_filename(local_file_path)` after getting the blob with `bucket.get_blob(blob_name)`
- List all blobs in a bucket by iterating over `client.list_blobs(bucket_name)`
- Delete a blob using `blob.delete()` or `bucket.delete_blob(blob_name)`
- Set the blob's content type during upload with the `content_type` parameter in `upload_from_filename()`
- Make a blob publicly readable by calling `blob.make_public()` after upload
Code
```python
from google.cloud import storage
from google.cloud.exceptions import NotFound
import os


def gcs_operations_demo(project_id, bucket_name, local_file_path):
    """Demonstrates common GCS operations."""
    # Initialize client
    client = storage.Client(project=project_id)

    # Check if bucket exists, create if needed
    bucket = client.bucket(bucket_name)
    if not bucket.exists():
        bucket = client.create_bucket(bucket_name)
        print(f"Bucket {bucket_name} created")
    else:
        print(f"Bucket {bucket_name} exists")

    # Upload file
    blob_name = os.path.basename(local_file_path)
    blob = bucket.blob(blob_name)
    blob.upload_from_filename(local_file_path, content_type="text/plain")
    print(f"Uploaded {blob_name} to {bucket_name}")

    # Download file
    download_path = f"downloaded_{blob_name}"
    blob.download_to_filename(download_path)
    print(f"Downloaded {blob_name} to {download_path}")

    # List all blobs
    print("\nBlobs in bucket:")
    for blob_obj in client.list_blobs(bucket_name):
        print(f"  - {blob_obj.name} ({blob_obj.size} bytes)")

    # Get blob metadata
    blob_metadata = bucket.get_blob(blob_name)
```
Note: this example was truncated in the source. See the GitHub repo for the latest full version.
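The remaining steps from the list above (listing, making public, deleting) can be sketched separately. This is our hedged sketch, not the truncated original: `cleanup_demo` is an illustrative name, and `client` is assumed to be a `storage.Client` instance:

```python
def cleanup_demo(client, bucket_name, blob_name):
    """Illustrative sketch: list blobs, make one public, then delete it.

    `client` is assumed to be a google.cloud.storage.Client instance.
    """
    # List every blob in the bucket with its size
    for blob_obj in client.list_blobs(bucket_name):
        print(f"  - {blob_obj.name} ({blob_obj.size} bytes)")

    bucket = client.bucket(bucket_name)
    blob = bucket.get_blob(blob_name)  # returns None if the blob is missing
    if blob is not None:
        # Make the object publicly readable and report its URL
        blob.make_public()
        print(f"Public URL: {blob.public_url}")

        # Delete the object
        blob.delete()
        print(f"Deleted {blob_name}")
```

Note that `make_public()` fails on buckets with uniform bucket-level access enabled; use IAM bindings instead in that case.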
Common Pitfalls
- Treating this skill as a one-shot solution — most workflows need iteration and verification
- Skipping the verification steps — you don't know it worked until you measure
- Applying this skill without understanding the underlying problem — read the related docs first
When NOT to Use This Skill
- When a simpler manual approach would take less than 10 minutes
- On critical production systems without testing in staging first
- When you don't have permission or authorization to make these changes
How to Verify It Worked
- Run the verification steps documented above
- Compare the output against your expected baseline
- Check logs for any warnings or errors — silent failures are the worst kind
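One concrete baseline comparison for uploads is checking that the stored object's size matches the local file. A small sketch (the `verify_upload` helper name is ours):

```python
import os


def verify_upload(blob, local_file_path):
    """Return True if the blob's stored size matches the local file's size.

    `blob` is assumed to be a google.cloud.storage.Blob for an object
    that was just uploaded.
    """
    blob.reload()  # refresh metadata from GCS
    local_size = os.path.getsize(local_file_path)
    if blob.size != local_size:
        print(f"Size mismatch: blob={blob.size}, local={local_size}")
        return False
    return True
```

A size match is a cheap sanity check; for stronger verification, compare the blob's `crc32c` or `md5_hash` metadata against a locally computed checksum.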
Production Considerations
- Test in staging before deploying to production
- Have a rollback plan — every change should be reversible
- Monitor the affected systems for at least 24 hours after the change
Related Cloud (AWS/GCP/Azure) Skills
Other Claude Code skills in the same category:
- Lambda Function — Create AWS Lambda function with handler
- S3 Operations — Set up S3 bucket operations (upload, download, presigned URLs)
- DynamoDB CRUD — Create DynamoDB CRUD operations
- SQS Setup — Set up SQS queue producer and consumer
- SNS Notifications — Configure SNS for push notifications
- CloudFront Setup — Set up CloudFront CDN distribution
- Cognito Auth — Implement AWS Cognito authentication
- RDS Setup — Configure RDS database connection