RFC: Handling Large messages for SQS and SNS #1259
Comments
Hey @jeromevdl this looks good! Some thoughts below.

Modules

I think the separate module makes sense.

Serialization

```java
public void processMessage(SQSMessage message, Context context) {
}
```

In the other RFC, we also want to allow the user to provide a handler that takes the deserialized payload, e.g. here with the user-defined type:

```java
@Override
public void processItem(Basket basket, Context context) {
    // do some stuff with the item
    System.out.println(basket.toString());
}
```

In this RFC, if the user wants to use large message handling, they will have to deserialize the payload themselves (a sketch of this follows below). Is there any way we can structure the implementation so that we could annotate a handler like this, but have the weaving reach in and 1/ expand the large message, and then 2/ deserialize the payload to the user type? Intuitively I think this would be at least hard, and would involve coupling the new module to the batch module.

Should we target the lambda or the user function consuming the record itself?

IMHO the latter. This 1/ gives the user more flexibility with use of the feature (you see this already with the way you can use the existing SQS large message annotation on any function) and 2/ works better with #797 in batch mode, as you observe. I think in general we should avoid making new features couple against the lambda handler, because of this loss of flexibility.

SNS Single Message in a batch Issue

We could make the batch handler handle SNS "batches" too, and then you could process a large message batch simply, even though it would only ever be a single record. This is however poor from a 1/ complexity and 2/ performance perspective. I don't see another option here yet.
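For illustration, a minimal sketch of the manual deserialization mentioned above. The Basket type, the Jackson ObjectMapper usage, and the import path of the proposed @LargeMessage annotation are all assumptions for the sake of the example, not part of the RFC:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
import com.fasterxml.jackson.databind.ObjectMapper;
// Assumed package for the proposed annotation:
import software.amazon.lambda.powertools.largemessages.LargeMessage;

public class BasketMessageHandler {

    // Hypothetical user-defined type, matching the Basket example above.
    public static class Basket {
        public String id;
        @Override
        public String toString() { return "Basket{id=" + id + "}"; }
    }

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // The aspect would expand the S3-backed body, but mapping it onto the
    // user-defined Basket type remains the developer's responsibility.
    @LargeMessage
    public void processMessage(SQSEvent.SQSMessage message, Context context) {
        try {
            Basket basket = MAPPER.readValue(message.getBody(), Basket.class);
            System.out.println(basket.toString());
        } catch (Exception e) {
            throw new RuntimeException("Could not deserialize basket payload", e);
        }
    }
}
```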
LGTM. It would be nice if there was some way to use this and still have the inner message deserialized to a user-defined type, but I don't see an obvious way to do this.
Looks good to me as well. Let's ensure we check that the …
Done, will be released in 1.17.0 soon. |
Key information
Summary
We should handle large messages (with S3) for SQS and SNS and provide a better user experience (error handling, batch processing...) than today with the @SqsLargeMessage annotation.
Motivation
Proposal
SQS
This feature should be considered in conjunction with #797 but should live in its own module (e.g. powertools-large-messages?). The aim of #797 is to provide a simple interface so that developers can focus on handling each message of the batch while the library takes care of failures. This module should make it possible to handle large messages individually.
If #797 provides the following interface:
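(The exact interface from #797 is not reproduced in this issue; the sketch below is an assumption based on the processMessage shape discussed in the comments above, and the interface and package names are illustrative.)

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;

// Illustrative only: the developer implements one method per SQS record,
// while the #797 batch module iterates over the batch and reports partial failures.
public interface SqsMessageHandler {
    void processMessage(SQSEvent.SQSMessage message, Context context);
}
```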
This new module should provide the annotation @LargeMessage to handle the message from S3 (the implementation would be quite similar to today's, replacing the body with the content of the S3 object). We can also keep the option to not delete the object in S3 (@LargeMessage(deletePayload = false)), as sketched below.
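A hedged sketch of what an annotated method could look like, reusing the illustrative SqsMessageHandler interface from the previous sketch. Only the annotation name and the deletePayload attribute come from this RFC; the class name and the annotation's import path are assumptions:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
// Assumed package for the proposed annotation:
import software.amazon.lambda.powertools.largemessages.LargeMessage;

public class OrderMessageHandler implements SqsMessageHandler {

    // The aspect woven around this method would detect an S3 pointer in the body,
    // download the object, and substitute the real payload before the method runs.
    // deletePayload = false keeps the object in S3 after processing.
    @LargeMessage(deletePayload = false)
    @Override
    public void processMessage(SQSEvent.SQSMessage message, Context context) {
        // message.getBody() now contains the full payload fetched from S3
        System.out.println(message.getBody());
    }
}
```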
).SNS
On SNS, although SNSEvent has a list of records, there will always be only one message (see doc). The developer would have to create a method to handle a single SNSRecord (as first parameter):
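A sketch of such a method, again assuming the proposed @LargeMessage annotation; class, method, and import names are illustrative:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;
// Assumed package for the proposed annotation:
import software.amazon.lambda.powertools.largemessages.LargeMessage;

public class SnsMessageHandler {

    // The annotation targets the method receiving the single SNSRecord,
    // not the Lambda handler itself.
    @LargeMessage
    public void processSnsRecord(SNSEvent.SNSRecord record, Context context) {
        // record.getSNS().getMessage() now contains the payload fetched from S3
        System.out.println(record.getSNS().getMessage());
    }
}
```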
(as first parameter):Drawbacks
For SNS, it forces the developer to have the following code in their handler:
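A sketch of that boilerplate, assuming the illustrative SnsMessageHandler class from the previous example:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;

public class SnsLambdaHandler implements RequestHandler<SNSEvent, String> {

    private final SnsMessageHandler messageHandler = new SnsMessageHandler();

    @Override
    public String handleRequest(SNSEvent event, Context context) {
        // SNS only ever delivers a single record, but the developer still has to
        // unwrap the list manually and delegate to the annotated method.
        messageHandler.processSnsRecord(event.getRecords().get(0), context);
        return "ok";
    }
}
```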
Rationale and alternatives
Unresolved questions