Hey there, fellow JavaScript devs! Ready to dive into the world of Amazon S3? Let's talk about syncing data for user-facing integrations. Buckle up, because we're about to make your life a whole lot easier.
First things first, let's get that S3 client up and running. You'll need the AWS SDK (these examples use the v2 aws-sdk package), so go ahead and install it:
npm install aws-sdk
Now, let's initialize that client:
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  accessKeyId: 'YOUR_ACCESS_KEY',
  secretAccessKey: 'YOUR_SECRET_KEY',
  region: 'us-west-2'
});
Pro tip: Keep those credentials safe! Use environment variables or AWS's recommended practices for handling secrets.
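For instance, a minimal setup that keeps credentials out of the code entirely might look like this (the AWS_REGION variable and the fallback region are just illustrative):

const AWS = require('aws-sdk');

// The SDK picks up AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the
// environment, an IAM role, or a shared credentials file on its own,
// so nothing secret lives in source control.
const s3 = new AWS.S3({
  region: process.env.AWS_REGION || 'us-west-2'
});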
Reading data is a breeze. Check this out:
async function readObject(bucket, key) {
  const params = { Bucket: bucket, Key: key };
  const { Body } = await s3.getObject(params).promise();
  return Body.toString('utf-8');
}
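Calling it from an async context might look like this (the bucket and key names are placeholders):

(async () => {
  // Read a JSON document back out and turn it into an object again
  const raw = await readObject('my-app-data', 'users/42/profile.json');
  const profile = JSON.parse(raw);
  console.log(profile);
})();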
Need to handle larger files? Streams are your friend:
function streamObject(bucket, key) {
  return s3.getObject({ Bucket: bucket, Key: key }).createReadStream();
}
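As a quick sketch of how you might use it, here's the stream piped straight to a local file (bucket, key, and file paths are placeholders):

const fs = require('fs');

// Stream a large export straight to disk without holding it all in memory
const readStream = streamObject('my-app-data', 'exports/large-report.csv');
const writeStream = fs.createWriteStream('/tmp/large-report.csv');

readStream.on('error', (err) => console.error('Read failed', err));
writeStream.on('error', (err) => console.error('Write failed', err));
writeStream.on('finish', () => console.log('Download complete'));

readStream.pipe(writeStream);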
Writing is just as easy. Here's how you upload an object with some custom metadata:
async function writeObject(bucket, key, body, metadata = {}) {
  const params = {
    Bucket: bucket,
    Key: key,
    Body: body,
    Metadata: metadata // custom metadata: keys are stored lowercase, values must be strings
  };
  return s3.putObject(params).promise();
}
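Here's what a call might look like, with placeholder bucket, key, and metadata values:

(async () => {
  // Store a JSON document along with custom metadata
  await writeObject(
    'my-app-data',
    'users/42/profile.json',
    JSON.stringify({ name: 'Ada', plan: 'pro' }),
    { 'synced-by': 'integration-service', 'schema-version': '2' }
  );
})();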
Now, let's get to the good stuff - syncing data. Here's a basic sync function:
async function syncData(localData, bucket, key) {
  try {
    const s3Data = await readObject(bucket, key);
    if (JSON.stringify(localData) !== s3Data) {
      await writeObject(bucket, key, JSON.stringify(localData));
      console.log('Data synced successfully');
    } else {
      console.log('Data already in sync');
    }
  } catch (error) {
    if (error.code === 'NoSuchKey') {
      await writeObject(bucket, key, JSON.stringify(localData));
      console.log('Initial data upload complete');
    } else {
      throw error;
    }
  }
}
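Using it is a one-liner from an async context; the bucket and key below are placeholders:

(async () => {
  // Only writes to S3 when the local copy actually differs
  const settings = { theme: 'dark', notifications: true };
  await syncData(settings, 'my-app-data', 'users/42/settings.json');
})();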
For those hefty files, multipart uploads are your best bet:
const fs = require('fs');

async function multipartUpload(bucket, key, filePath) {
  const fileStream = fs.createReadStream(filePath);
  const multipartParams = { Bucket: bucket, Key: key, Body: fileStream };
  try {
    // s3.upload() automatically switches to a multipart upload for large bodies
    const data = await s3.upload(multipartParams).promise();
    console.log('Upload Success', data.Location);
  } catch (error) {
    console.error('Error', error);
  }
}
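If you need more control, s3.upload() also accepts a second options argument for part size and concurrency. Here's a sketch with illustrative numbers (the multipartUploadTuned name is mine, not part of the SDK):

// Illustrative tuning: 10 MB parts, up to 4 parts in flight at once
async function multipartUploadTuned(bucket, key, filePath) {
  const fileStream = fs.createReadStream(filePath);
  const data = await s3
    .upload(
      { Bucket: bucket, Key: key, Body: fileStream },
      { partSize: 10 * 1024 * 1024, queueSize: 4 }
    )
    .promise();
  return data.Location;
}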
Want to give users temporary access? Pre-signed URLs are the way to go:
function getSignedUrl(bucket, key, expirationInSeconds = 60) {
  const params = { Bucket: bucket, Key: key, Expires: expirationInSeconds };
  return s3.getSignedUrlPromise('getObject', params);
}
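Handing one to a user might look like this (bucket and key are placeholders):

(async () => {
  // Generate a link that works for five minutes, then expires
  const url = await getSignedUrl('my-app-data', 'reports/monthly.pdf', 300);
  console.log('Share this link:', url);
})();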
Always be prepared for the unexpected:
try {
  await s3.headBucket({ Bucket: 'my-bucket' }).promise();
} catch (error) {
  if (error.code === 'NotFound') {
    console.error('Bucket not found');
  } else if (error.code === 'Forbidden') {
    console.error('Insufficient permissions');
  } else {
    console.error('Unexpected error', error);
  }
}
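You can also let the SDK absorb transient failures by configuring retries on the client itself. A sketch with illustrative values (the resilientS3 name is mine):

// Retry throttling and transient server errors a few extra times,
// with a 200 ms base delay for the SDK's exponential backoff
const resilientS3 = new AWS.S3({
  maxRetries: 5,
  retryDelayOptions: { base: 200 }
});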
Testing is crucial. Here's a quick example using Jest:
jest.mock('aws-sdk');
const AWS = require('aws-sdk');

test('readObject returns correct data', async () => {
  const mockGetObject = jest.fn().mockReturnValue({
    promise: () => Promise.resolve({ Body: Buffer.from('test data') })
  });
  AWS.S3.mockImplementation(() => ({ getObject: mockGetObject }));

  // Require the module under test after the mock is in place so it builds
  // its S3 client from the mocked SDK (the path here is illustrative)
  const { readObject } = require('./s3-helpers');

  const result = await readObject('test-bucket', 'test-key');
  expect(result).toBe('test data');
});
And there you have it! You're now equipped to tackle reading and writing data with the Amazon S3 API like a pro. Remember, practice makes perfect, so get out there and start building. Happy coding!