Hey there, fellow developer! Ready to dive into the world of Amazon S3 and Python? You're in for a treat. Amazon S3 (Simple Storage Service) is a game-changer when it comes to object storage, and with boto3, Python's AWS SDK, we're going to make it sing.
Before we jump in, make sure you've got boto3 installed:

pip install boto3

Easy peasy. Let's get this party started:
import boto3

# Configure your credentials (if not using IAM roles)
boto3.setup_default_session(
    aws_access_key_id='YOUR_KEY',
    aws_secret_access_key='YOUR_SECRET'
)
Pro tip: Using IAM roles? Skip the credential setup - AWS will handle it for you.
Time to make that connection:
s3 = boto3.client('s3')
Boom! You're connected. If something goes wrong, boto3 will raise an exception, so keep an eye out.
response = s3.list_buckets()
for bucket in response['Buckets']:
    print(f"Bucket Name: {bucket['Name']}")
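list_buckets returns everything in one call, but listing *objects* inside a bucket is paginated at 1,000 keys per response, so reaching for a paginator is the safer habit. A minimal sketch (the bucket name and the iter_keys helper are placeholders, not part of boto3):

```python
def iter_keys(pages):
    """Yield object keys from list_objects_v2 response pages.

    Pages without a 'Contents' key (an empty bucket) are skipped.
    """
    for page in pages:
        for obj in page.get('Contents', []):
            yield obj['Key']

# Hypothetical usage against a live bucket:
# s3 = boto3.client('s3')
# paginator = s3.get_paginator('list_objects_v2')
# for key in iter_keys(paginator.paginate(Bucket='my-bucket')):
#     print(key)
```

The helper works on any iterable of response-shaped dicts, which also makes it easy to unit test without touching AWS.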
s3.create_bucket(Bucket='my-awesome-new-bucket')
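One gotcha: that call only works as-is in us-east-1. In any other region, create_bucket requires a LocationConstraint, and sending one for us-east-1 fails. A small hedged helper (the function name and region are illustrative, not a boto3 API):

```python
def bucket_create_kwargs(name, region):
    """Build create_bucket arguments.

    us-east-1 must NOT send a LocationConstraint; every other
    region must.
    """
    kwargs = {'Bucket': name}
    if region != 'us-east-1':
        kwargs['CreateBucketConfiguration'] = {'LocationConstraint': region}
    return kwargs

# Hypothetical usage:
# s3 = boto3.client('s3', region_name='eu-west-1')
# s3.create_bucket(**bucket_create_kwargs('my-awesome-new-bucket', 'eu-west-1'))
```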
s3.upload_file('local_file.txt', 'my-bucket', 'remote_file.txt')
s3.download_file('my-bucket', 'remote_file.txt', 'downloaded_file.txt')
s3.delete_object(Bucket='my-bucket', Key='remote_file.txt')
s3.delete_bucket(Bucket='my-bucket')
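Heads up: delete_bucket only succeeds on an empty bucket, so in practice you empty it first. delete_objects caps out at 1,000 keys per call, hence the batching. A hedged sketch (the chunked helper and bucket name are placeholders):

```python
def chunked(items, size=1000):
    """Split keys into batches; delete_objects accepts at most 1,000 per call."""
    items = list(items)
    return [items[i:i + size] for i in range(0, len(items), size)]

# Hypothetical sketch for emptying a bucket before deleting it:
# paginator = s3.get_paginator('list_objects_v2')
# keys = [o['Key'] for page in paginator.paginate(Bucket='my-bucket')
#         for o in page.get('Contents', [])]
# for batch in chunked(keys):
#     s3.delete_objects(Bucket='my-bucket',
#                       Delete={'Objects': [{'Key': k} for k in batch]})
# s3.delete_bucket(Bucket='my-bucket')
```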
s3.put_object(Bucket='my-bucket', Key='file.txt', Body='Hello, World!', Metadata={'mykey': 'myvalue'})
import json

policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'AddPerm',
        'Effect': 'Allow',
        'Principal': '*',
        'Action': ['s3:GetObject'],
        'Resource': 'arn:aws:s3:::my-bucket/*'
    }]
}
s3.put_bucket_policy(Bucket='my-bucket', Policy=json.dumps(policy))
s3.put_bucket_versioning(Bucket='my-bucket', VersioningConfiguration={'Status': 'Enabled'})
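Once versioning is on, every overwrite keeps the old copy, and list_object_versions shows them all. A small sketch that parses the response shape; the versions_for_key helper is illustrative (not part of boto3), and the live call is commented out:

```python
def versions_for_key(response, key):
    """From a list_object_versions response, return (VersionId, IsLatest)
    pairs for a single key, newest first as S3 returns them."""
    return [(v['VersionId'], v['IsLatest'])
            for v in response.get('Versions', [])
            if v['Key'] == key]

# Hypothetical usage once versioning is enabled:
# resp = s3.list_object_versions(Bucket='my-bucket', Prefix='file.txt')
# for version_id, is_latest in versions_for_key(resp, 'file.txt'):
#     print(version_id, 'latest' if is_latest else '')
```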
notification = {
    'QueueConfigurations': [{
        'QueueArn': 'arn:aws:sqs:us-west-2:123456789012:my-queue',
        'Events': ['s3:ObjectCreated:*']
    }]
}
s3.put_bucket_notification_configuration(
    Bucket='my-bucket',
    NotificationConfiguration=notification
)
Always wrap your S3 operations in try-except blocks:
from botocore.exceptions import ClientError

try:
    s3.upload_file('local_file.txt', 'my-bucket', 'remote_file.txt')
except ClientError as e:
    print(f"An error occurred: {e}")
For retries, consider creating your boto3 client with a custom retry configuration:
from botocore.config import Config

config = Config(retries={'max_attempts': 10, 'mode': 'adaptive'})
s3 = boto3.client('s3', config=config)
s3.put_object(Bucket='my-bucket', Key='secret.txt', Body='Top Secret', ServerSideEncryption='AES256')
Unit test your S3 operations using moto, a fantastic mocking library for AWS services:
import boto3
from moto import mock_s3  # note: in moto 5+, this decorator is replaced by @mock_aws

@mock_s3
def test_upload_file():
    s3 = boto3.client('s3', region_name='us-east-1')
    s3.create_bucket(Bucket='test-bucket')
    s3.upload_file('test.txt', 'test-bucket', 'test.txt')  # assumes test.txt exists locally
    # Add assertions here, e.g. check the object exists with head_object
And don't forget, the AWS CLI is your friend for quick verifications:
aws s3 ls s3://my-bucket
There you have it! You're now equipped to tackle Amazon S3 with Python like a pro. Remember, practice makes perfect, so get out there and start building. The sky's the limit with what you can create using S3 and boto3.
Happy coding, and may your buckets always be organized and your uploads swift!