
How to write a file or data to an S3 object using boto3

python · boto3 · aws-credentials · data-storage
By Alex Kataev · Feb 22, 2025
TLDR

Want to store data in an S3 bucket swiftly using boto3? Here's the quickest way. First, ensure your AWS credentials are configured correctly:

import boto3

s3 = boto3.client('s3')  # Secure delivery man for your data
s3.put_object(Bucket='my-bucket', Key='example.txt', Body='Hello, World!')  # S3 bucket + key = your new cloud locker

This saves 'Hello, World!' to the object 'example.txt' in 'my-bucket'. To upload a file's contents instead, pass a file object as Body, as sketched below.
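A minimal sketch of that file variant, assuming a local 'file.txt' exists; the context manager ensures the file handle is closed after the upload:

with open('file.txt', 'rb') as f:  # 'rb' = binary mode, safe for any file type
    s3.put_object(Bucket='my-bucket', Key='file.txt', Body=f)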

Direct file upload made easy

For an efficient upload of larger files, use upload_file(), your personal data doorman: it runs a managed transfer that automatically switches to multipart uploads for big files.

import boto3

s3 = boto3.resource('s3')  # Not a bucket, but a resourceful bucket!
s3.Bucket('my-bucket').upload_file('path/to/local/file.txt', 'file.txt')  # That path ends right in the bucket

Just make sure you are knocking on the right door: check the file paths!
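If you prefer the client interface over the resource one, the same managed transfer is available there too; a minimal sketch reusing the same assumed paths:

import boto3

s3 = boto3.client('s3')
s3.upload_file('path/to/local/file.txt', 'my-bucket', 'file.txt')  # args: local path, bucket, key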

Playing with JSON data storage

Got loads of JSON data? Python's got you! json.dumps() helps convert Python objects to JSON format:

import json
import boto3

s3 = boto3.resource('s3')  # Calling our bucket resource again
data = {'hello': 'world'}  # You can't say it enough, can you?

# Give your data a new home in the cloud
s3.Object('my-bucket', 'data.json').put(Body=json.dumps(data))

Swing the other way and bring your JSON data back to Python:

json_content = s3.Object('my-bucket', 'data.json').get()['Body'].read().decode('utf-8')
data = json.loads(json_content)  # Welcome home, data!
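If humans (or browsers) will ever look at that object, two optional tweaks help; a sketch reusing the same bucket and key:

s3.Object('my-bucket', 'data.json').put(
    Body=json.dumps(data, indent=2),  # indent is purely cosmetic, easier on human eyes
    ContentType='application/json',   # lets downstream tools treat the object as JSON
)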

AWS credentials deserve maximum security

Data security first! Avoid hardcoding AWS credentials. Instead, tuck them away safely in ~/.aws/credentials or in environment variables; boto3 picks them up automatically.
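A minimal sketch of that lookup in action, assuming a profile named 'default' in ~/.aws/credentials; no keys ever appear in the code:

import boto3

# boto3 searches environment variables, ~/.aws/credentials, and IAM roles in turn
session = boto3.Session(profile_name='default')
s3 = session.client('s3')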

Binary data? No problem!

Got binary data to store, like images or videos? Make sure you read the file in 'rb' mode:

import boto3

s3 = boto3.client('s3')
with open('path/to/me_smiling_with_bucket.jpg', 'rb') as f:  # 'rb' keeps the bytes intact
    s3.put_object(Bucket='my-bucket', Key='image.jpg', Body=f)  # Possibly the first image of you in S3!
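When the data is already in memory or arriving as a stream, upload_fileobj() accepts any file-like object; a minimal sketch assuming the same bucket:

import io
import boto3

s3 = boto3.client('s3')
buffer = io.BytesIO(b'binary data here')  # stand-in for real image/video bytes
s3.upload_fileobj(buffer, 'my-bucket', 'blob.bin')  # args: file-like object, bucket, key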

Lambda functions for JSON read/write operations

Want to simplify repetitive JSON read/write calls? Wrap them in short Python lambdas (no relation to AWS Lambda):

import json
import boto3

s3 = boto3.resource('s3')

# Define short lambda functions for JSON operations
# because why write more when you can write less?
json_dump_s3 = lambda obj, bucket, key: s3.Object(bucket, key).put(Body=json.dumps(obj))
json_load_s3 = lambda bucket, key: json.loads(s3.Object(bucket, key).get()['Body'].read().decode('utf-8'))

# Use the lambdas for a smoother operation
json_dump_s3({'hello': 'world'}, 'my-bucket', 'data.json')
data = json_load_s3('my-bucket', 'data.json')  # It's like data never left us!
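One design note: PEP 8 prefers named def functions over lambdas assigned to variables, so these one-liners trade a style guideline for brevity; both forms behave identically at runtime.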

Error handling

Bad situations give you the best life experiences! Equip your code to handle the exceptions that uploads actually raise:

import boto3
from boto3.exceptions import S3UploadFailedError
from botocore.exceptions import NoCredentialsError

s3 = boto3.client('s3')

try:
    s3.upload_file('file.txt', 'my-bucket', 'file.txt')
except FileNotFoundError:
    print('Local file not found, check the path')  # Knocking on a door that isn't there
except NoCredentialsError:
    print('Credentials not available, please configure them and try again')  # Oops, forgot my keys!
except S3UploadFailedError as e:
    print(f'Upload failed: {e}')  # Missing bucket or permissions; S3 is Simple, not psychic!
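Reads deserve the same care: ClientError carries the S3 error code, so you can tell "not found" apart from "not allowed". A minimal sketch using head_object as a cheap existence check on the example object from earlier:

from botocore.exceptions import ClientError

try:
    s3.head_object(Bucket='my-bucket', Key='example.txt')  # metadata only, no body download
except ClientError as e:
    if e.response['Error']['Code'] == '404':
        print('Object not found')  # The locker is empty
    else:
        raise  # Anything else (403, throttling, ...) is a different problem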