GCS Operations

Google Cloud Storage operations

You are a Google Cloud Storage developer. The user wants to perform common GCS operations programmatically using the Python client library.

What to check first

  • Run gcloud auth application-default login to set up credentials
  • Verify the GCP project is set with gcloud config get-value project
  • Confirm the google-cloud-storage package is installed: pip list | grep google-cloud-storage
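The credential check above can also be done programmatically before creating a client. This is a stdlib-only sketch that looks in the two standard Application Default Credentials locations (the GOOGLE_APPLICATION_CREDENTIALS environment variable, then the file written by gcloud auth application-default login); adjust the default path for non-Linux/macOS platforms.

```python
import os

def find_adc_path():
    """Return the path to the ADC file, or None if no credentials are found."""
    # An explicit service-account key takes precedence over the gcloud default
    explicit = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if explicit and os.path.isfile(explicit):
        return explicit
    # Default location written by `gcloud auth application-default login`
    default = os.path.expanduser(
        "~/.config/gcloud/application_default_credentials.json"
    )
    return default if os.path.isfile(default) else None

if find_adc_path() is None:
    print("No ADC found — run: gcloud auth application-default login")
```

If this returns None, storage.Client() will fail with a DefaultCredentialsError, so it is worth checking up front in scripts run on fresh machines.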

Steps

  1. Import the storage module from google.cloud and create a Client() instance with your project ID
  2. Get a bucket reference using client.bucket(bucket_name) and check if it exists with .exists()
  3. Upload a file using bucket.blob(destination_blob_name).upload_from_filename(local_file_path)
  4. Download a file using blob.download_to_filename(local_file_path) after getting the blob with bucket.get_blob(blob_name)
  5. List all blobs in a bucket by iterating over client.list_blobs(bucket_name)
  6. Delete a blob using blob.delete() or bucket.delete_blob(blob_name)
  7. Set the content type during upload with the content_type parameter of upload_from_filename(); set custom metadata by assigning a dict to blob.metadata (call blob.patch() to persist it on an existing blob)
  8. Make a blob publicly readable by calling blob.make_public() after upload

Code

from google.cloud import storage
from google.cloud.exceptions import NotFound
import os

def gcs_operations_demo(project_id, bucket_name, local_file_path):
    """Demonstrates common GCS operations."""
    
    # Initialize client
    client = storage.Client(project=project_id)
    
    # Check if bucket exists, create if needed
    bucket = client.bucket(bucket_name)
    if not bucket.exists():
        bucket = client.create_bucket(bucket_name)
        print(f"Bucket {bucket_name} created")
    else:
        print(f"Bucket {bucket_name} exists")
    
    # Upload file
    blob_name = os.path.basename(local_file_path)
    blob = bucket.blob(blob_name)
    blob.upload_from_filename(local_file_path, content_type="text/plain")
    print(f"Uploaded {blob_name} to {bucket_name}")
    
    # Download file
    download_path = f"downloaded_{blob_name}"
    blob.download_to_filename(download_path)
    print(f"Downloaded {blob_name} to {download_path}")
    
    # List all blobs
    print("\nBlobs in bucket:")
    for blob_obj in client.list_blobs(bucket_name):
        print(f"  - {blob_obj.name} ({blob_obj.size} bytes)")
    
    # Get blob metadata
    blob_metadata = bucket.get_blob(blob_name)

Note: this example was truncated in the source. See the GitHub repo for the latest full version.
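The remaining steps (7 and 8, plus deletion from step 6) can be sketched as a follow-up function. This is illustrative, not the repo's full version: finish_demo is a hypothetical name, and bucket is assumed to be the storage.Bucket from the example above. Note that make_public() only works on buckets without uniform bucket-level access enabled, since it relies on object ACLs.

```python
def finish_demo(bucket, blob_name):
    """Metadata, public access, and cleanup for an uploaded blob."""
    blob = bucket.get_blob(blob_name)  # returns None if the blob is missing
    if blob is None:
        return None
    print(f"Content type: {blob.content_type}, updated: {blob.updated}")
    # Attach custom metadata and persist it to the existing blob
    blob.metadata = {"source": "gcs-operations-demo"}
    blob.patch()
    # Make the object publicly readable (requires non-uniform bucket ACLs)
    blob.make_public()
    print(f"Public URL: {blob.public_url}")
    # Clean up
    blob.delete()
    return blob.public_url
```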

Common Pitfalls

  • Treating this skill as a one-shot solution — most workflows need iteration and verification
  • Skipping the verification steps — an upload isn't confirmed until the blob exists and its checksum matches
  • Forgetting that bucket names are globally unique across all of GCS — create_bucket() raises a 409 Conflict if the name is already taken by anyone

When NOT to Use This Skill

  • When a simpler manual approach would take less than 10 minutes
  • On critical production systems without testing in staging first
  • When you don't have permission or authorization to make these changes

How to Verify It Worked

  • Run the verification steps documented above
  • Compare the output against your expected baseline
  • Check logs for any warnings or errors — silent failures are the worst kind
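One concrete way to verify an upload: GCS exposes a base64-encoded MD5 digest on each blob (blob.md5_hash), which can be compared against the local file. The helper below is stdlib-only; the commented comparison at the end assumes a blob object from the google-cloud-storage client.

```python
import base64
import hashlib

def local_md5_b64(path):
    """Base64-encoded MD5 of a local file, matching GCS's blob.md5_hash format."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        # Read in chunks so large files don't load into memory at once
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return base64.b64encode(digest.digest()).decode("ascii")

# Usage against a real blob (sketch):
# blob.reload()  # refresh server-side properties
# assert blob.md5_hash == local_md5_b64(local_file_path)
```

For objects uploaded with parallel composite uploads, md5_hash may be unset; fall back to comparing blob.size in that case.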

Production Considerations

  • Test in staging before deploying to production
  • Have a rollback plan — every change should be reversible
  • Monitor the affected systems for at least 24 hours after the change

Quick Info

Difficulty: intermediate
Version: 1.0.0
Author: Claude Skills Hub
Tags: cloud, gcp, storage

Install command:

curl -o ~/.claude/skills/gcs-operations.md https://claude-skills-hub.vercel.app/skills/cloud/gcs-operations.md
