Thanks @joanbonilla for opening the RFC. I am wondering at this point whether it would be worth combining efforts and making this part of the new batch utility, which we are starting to work on as part of the corresponding RFC for Java as well: aws-powertools/powertools-lambda#64
We are rewriting the batch module (see #797). We could consider an additional module to handle Database Activity Streams.
Thanks for your code sample. I am also adding this link as a reference implementation. Note that it is available for both Aurora PostgreSQL and Aurora MySQL, and it looks like there is something for Oracle and SQL Server as well: doc.
Similar to the large message handling (#1259), we could think of an annotation to put on the processRecord method:
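No snippet accompanies the comment, so as a purely illustrative sketch, such an annotation could look like the code below. The @AuroraActivityStream name and the behaviour described in the comments are hypothetical, inspired by the large message utility, and not an agreed design:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KinesisEvent;

// Hypothetical marker annotation: an aspect provided by the module would decrypt
// and deserialize the record payload before the annotated method body runs.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface AuroraActivityStream { }

public class ActivityStreamFunction implements RequestHandler<KinesisEvent, String> {

    @Override
    public String handleRequest(KinesisEvent event, Context context) {
        event.getRecords().forEach(this::processRecord);
        return "OK";
    }

    // With the (hypothetical) annotation, processRecord would only see records whose
    // activity events have already been decrypted and deserialized by the utility.
    @AuroraActivityStream
    private void processRecord(KinesisEvent.KinesisEventRecord record) {
        // business logic on the decrypted activity events
    }
}
```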
Key information
Summary
Processing database activity streams from Amazon Aurora in AWS Lambda requires heavy lifting to handle the events coming from Kinesis (decryption, deserialization, ...).
Motivation
Improve the developer experience by reducing the heavy lifting required to process Aurora activity streams.
Proposal
With this new module and a simple functional interface, customers can easily process these events without having to decrypt them or use the SDK themselves to obtain the Java object carrying the stream information.
A functional interface (like the one in the SQS module) to process every single Aurora stream event, allowing developers to get the stream information and act on it.
This functional interface comes with the SDK utilities needed to decrypt the records in the KinesisEvent object and deserialize them into Java objects.
More details here: https://github.com/joanbonilla/aws-lambda-powertools-java/tree/aurora-das-handler
Example of use: https://github.com/joanbonilla/sample-db-stream-pw
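To make the proposal more concrete, here is a minimal sketch of what the module could expose and how a handler could use it. The names (AuroraDasHandler, processEvents, ActivityEvent) are illustrative placeholders rather than the actual API, and the decryption/deserialization step is intentionally elided:

```java
import java.util.Collections;
import java.util.List;
import java.util.function.Consumer;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KinesisEvent;

/** Illustrative POJO for a single decrypted database activity event. */
class ActivityEvent {
    private String databaseName;
    private String commandText;
    public String getDatabaseName() { return databaseName; }
    public String getCommandText() { return commandText; }
}

/** Sketch of the utility the module could provide. */
class AuroraDasHandler {

    /** Decrypts and deserializes every record, then hands each event to the callback. */
    static void processEvents(KinesisEvent input, Consumer<ActivityEvent> processor) {
        input.getRecords().forEach(record -> {
            byte[] payload = record.getKinesis().getData().array();
            // This is where the module would decrypt the payload with the KMS data key,
            // gunzip it and deserialize the JSON into ActivityEvent objects.
            decryptAndDeserialize(payload).forEach(processor);
        });
    }

    private static List<ActivityEvent> decryptAndDeserialize(byte[] payload) {
        return Collections.emptyList(); // decryption/deserialization elided in this sketch
    }
}

/** What a customer handler could then look like: business logic only. */
public class ActivityStreamHandler implements RequestHandler<KinesisEvent, String> {

    @Override
    public String handleRequest(KinesisEvent input, Context context) {
        AuroraDasHandler.processEvents(input,
                event -> context.getLogger().log(event.getDatabaseName() + ": " + event.getCommandText()));
        return "OK";
    }
}
```

With a utility along these lines, the Lambda function is reduced to business logic, which is the developer-experience improvement this RFC is after.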
Drawbacks
At the moment it has been tested with PostgreSQL streams only.
Unresolved questions
MySQL implementation