During re:Invent 2014, Amazon introduced two new features for its popular storage products, DynamoDB and S3.
The new Event Notifications feature for Amazon S3 enables developers to create notifications that are sent to SQS or SNS when a new object is added to a bucket or an existing object is overwritten. Notifications can also be delivered to AWS Lambda for processing by a Lambda function. Additionally, publishing to Kinesis is planned for a future release.
This feature is explained in greater detail in an Amazon Web Services blog post, New Event Notifications for Amazon S3:
Notifications are configured at the bucket level and apply to all of the objects in the bucket (we plan to provide control at a finer level at some point). You can elect to receive notification for any or all of the following events:
- s3:ObjectCreated:Put - An object was created by an HTTP PUT operation.
- s3:ObjectCreated:Post - An object was created by an HTTP POST operation.
- s3:ObjectCreated:Copy - An object was created by an S3 copy operation.
- s3:ObjectCreated:CompleteMultipartUpload - An object was created by the completion of an S3 multi-part upload.
- s3:ObjectCreated:* - An object was created by one of the event types listed above or by a similar object creation event added in the future.
- s3:ReducedRedundancyObjectLost - An S3 object stored with Reduced Redundancy has been lost.
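To illustrate how such a notification might be wired up, here is a minimal sketch using the boto3 Python SDK; the bucket name and queue ARN are hypothetical, and the queue's access policy must already allow S3 to send messages to it:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and queue; the queue's policy must grant
# s3.amazonaws.com permission to send messages before this call succeeds.
s3.put_bucket_notification_configuration(
    Bucket="example-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:example-queue",
                # Subscribe to every object-creation event type listed above.
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```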
Notifications are delivered as JSON objects containing all the information about the created/updated object and the event that caused the notification.
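A consumer of these messages typically only needs a few of those fields. As a hedged sketch, assuming the event structure documented by AWS (the handler name below is hypothetical):

```python
import json

def handle_s3_event(message_body):
    """Extract the bucket name and object key from an S3 event notification."""
    event = json.loads(message_body)
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        event_name = record["eventName"]  # e.g. "ObjectCreated:Put"
        print(f"{event_name}: s3://{bucket}/{key}")
```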
This feature is available now and is free of charge (the usual messaging and execution charges for SQS, SNS, and Lambda apply). S3 notifications are delivered reliably and typically in under a second. The bucket and the target (SQS, SNS, or Lambda) must reside in the same AWS Region for notifications to work, and one notification per event type can be configured per bucket.
DynamoDB Streams, another feature announced during the conference, provides access to DynamoDB’s update log. An AWS blog post, Sneak Preview - DynamoDB Streams, explains:
Once you enable [streams] for a DynamoDB table, all changes (puts, updates, and deletes) made to the table are tracked on a rolling 24-hour basis. You can retrieve this stream of update records with a single API call.
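In practice, enabling the stream amounts to a table-level setting. A minimal sketch with boto3, assuming a hypothetical existing table (the StreamSpecification parameters reflect the API as later released, not necessarily the preview):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Turn on the stream for an existing (hypothetical) table and record
# both the old and new item images for each change.
dynamodb.update_table(
    TableName="example-table",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
```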
DynamoDB streams are defined at the table level, contain every update made to the table, and allow readers to retrieve updates at roughly twice the rate of the table's provisioned write capacity. If multiple updates are made to the same item in a table, the stream guarantees that these updates will appear in the same order in which they were made. Updates are stored for 24 hours, even if the table has been deleted.
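Reading the stream follows a Kinesis-like shard model. A minimal sketch with boto3's dynamodbstreams client, assuming a hypothetical stream ARN; a production reader would iterate over every shard and keep following NextShardIterator:

```python
import boto3

streams = boto3.client("dynamodbstreams")

# Hypothetical stream ARN for the table being tracked.
stream_arn = "arn:aws:dynamodb:us-east-1:123456789012:table/example-table/stream/label"

# Discover the stream's shards, then read the first shard from its
# oldest available record (TRIM_HORIZON).
description = streams.describe_stream(StreamArn=stream_arn)
shard_id = description["StreamDescription"]["Shards"][0]["ShardId"]

iterator = streams.get_shard_iterator(
    StreamArn=stream_arn,
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

# Each record describes one put, update, or delete made to the table.
for record in streams.get_records(ShardIterator=iterator)["Records"]:
    print(record["eventName"], record["dynamodb"].get("Keys"))
```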
The DynamoDB Streams feature is currently available as a preview. At the same time, Amazon released the DynamoDB Streams Kinesis Adapter, which can be used to process DynamoDB stream records through the Kinesis Client Library. Amazon also plans to provide DynamoDB stream connectors for Amazon CloudSearch, Amazon Redshift, and other relevant AWS services.
These new features simplify building many innovative applications, including replica creation, real-time data analytics, cache management, and business process execution. They also make both DynamoDB and S3 a natural fit for Amazon's Lambda architecture.