
Commit ded3d75

docs: user specific documentation
1 parent edcc14a commit ded3d75

File tree

3 files changed: +102 -0 lines changed


docs/content/index.mdx

+6
@@ -24,6 +24,12 @@ Powertools is available in PyPi. You can use your favourite dependency managemen

```bash:title=hello_world.sh
sam init --location https://github.com/aws-samples/cookiecutter-aws-sam-python
```

* [Tracing](./core/tracer) - Decorators and utilities to trace Lambda function handlers, and both synchronous and asynchronous functions
* [Logging](./core/logger) - Structured logging made easier, and a decorator to enrich structured logging with key Lambda context details
* [Metrics](./core/metrics) - Custom metrics created asynchronously via CloudWatch Embedded Metric Format (EMF)
* [Bring your own middleware](./utilities/middleware_factory) - Decorator factory to create your own middleware to run logic before and after each Lambda invocation
* [Parameters utility](./utilities/parameters) - Retrieve parameter values from AWS Systems Manager Parameter Store, AWS Secrets Manager, or Amazon DynamoDB, and cache them for a specific amount of time
* [Batch utility](./utilities/batch) - Batch processing for AWS SQS, with a middleware allowing custom record handling

### Lambda Layer
docs/content/utilities/batch.mdx

+95
@@ -0,0 +1,95 @@

---
title: Batch
description: Utility
---

import Note from "../../src/components/Note"

The batch utility provides an abstraction to process a batch event. It is useful for Lambda integrations with [AWS SQS](https://aws.amazon.com/sqs/), [AWS Kinesis](https://aws.amazon.com/kinesis/) and [AWS DynamoDB Streams](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html).
It also provides base classes (`BaseProcessor`, `BasePartialProcessor`) that allow you to create your **own** batch processor.

**Key Features**

* Run batch processing logic with a clean interface
* Middleware and context to handle a batch event
* Removal of successful messages for [AWS SQS](https://aws.amazon.com/sqs/) batches, in case of partial failure

**IAM Permissions**

This utility requires additional permissions to work as expected. See the following table:

| Processor            | Function/Method | IAM Permission           |
|----------------------|-----------------|--------------------------|
| PartialSQSProcessor  | `_clean`        | `sqs:DeleteMessageBatch` |
### PartialSQSProcessor

A special batch processor that aims to `clean` your SQS queue when one or more (but not all) records of the batch fail.
A partial batch failure sends all of the records back to the queue, and the batch is reprocessed until every record succeeds.
This processor improves performance in such cases by deleting the successful messages of a partially failed batch.
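To make the `sqs:DeleteMessageBatch` permission above concrete, here is a rough sketch of what that cleanup step could look like with boto3. It is an illustration of the idea only, not the library's actual implementation; the function name and record list are made up:

```python:title=cleanup_sketch.py
import boto3

sqs = boto3.client("sqs")

def delete_successful(queue_url, successful_records):
    """Illustrative only: delete already-processed messages so a retried
    batch does not run them again. Requires sqs:DeleteMessageBatch."""
    entries = [
        {"Id": record["messageId"], "ReceiptHandle": record["receiptHandle"]}
        for record in successful_records
    ]
    if entries:
        sqs.delete_message_batch(QueueUrl=queue_url, Entries=entries)
```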
### Middleware
```python:title=app.py
from aws_lambda_powertools.utilities.batch import batch_processor, PartialSQSProcessor

def record_handler(record):
    # Handles a single SQS record; raising an exception marks it as failed
    return record["body"]

# highlight-start
@batch_processor(record_handler=record_handler, processor=PartialSQSProcessor())
# highlight-end
def lambda_handler(event, context):
    return {"statusCode": 200}
```
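For reference, `record_handler` above receives each entry of the event's `Records` list. A trimmed SQS event is sketched below; the field names follow the standard SQS-to-Lambda event shape, while the values are made up:

```python:title=sample_sqs_event.py
sample_event = {
    "Records": [
        {
            "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d",
            "receiptHandle": "AQEBwJnKyrHigUMZj6rYig...",  # truncated, made up
            "body": '{"order_id": 1}',  # record_handler returns this field
            "eventSource": "aws:sqs",
        }
    ]
}
```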
## Create your own processor

You can create your own batch processor by inheriting the `BaseProcessor` class and implementing `_prepare()`, `_clean()` and `_process_record()`.
It's also possible to inherit `BasePartialProcessor`, which contains additional logic to handle partial failures and keep track of record status.

Here is an example implementation of a custom processor for DynamoDB Streams:
```python:title=custom_processor.py
import json

import boto3

from aws_lambda_powertools.utilities.batch import BaseProcessor, batch_processor


class DynamoDBProcessor(BaseProcessor):

    def __init__(self, queue_url: str):
        self.queue_url = queue_url
        self.client = boto3.client("sqs")

    def _prepare(self):
        pass

    def _clean(self):
        pass

    def _process_record(self, record):
        """
        Process a record and send the result to SQS
        """
        result = self.handler(record)
        body = json.dumps(result)
        self.client.send_message(QueueUrl=self.queue_url, MessageBody=body)
        return result

def record_handler(record):
    return record["Keys"]

# As context
processor = DynamoDBProcessor("dummy")
event = {"Records": []}

with processor(records=event["Records"], handler=record_handler) as ctx:
    result = ctx.process()

# As middleware
@batch_processor(record_handler=record_handler, processor=DynamoDBProcessor("dummy"))
def lambda_handler(event, context):
    return {"statusCode": 200}
```
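As a follow-up to the example above, the snippet below runs the custom processor over a single made-up record shaped the way `record_handler` expects. The record and queue URL are placeholders; note that real DynamoDB Streams records carry their keys nested under `record["dynamodb"]["Keys"]` rather than at the top level:

```python:title=custom_processor_usage.py
# Made-up record matching what record_handler above expects;
# the key follows DynamoDB's attribute-value format.
fake_record = {"Keys": {"id": {"S": "123"}}}

processor = DynamoDBProcessor("https://sqs.us-east-1.amazonaws.com/123456789012/results")

with processor(records=[fake_record], handler=record_handler) as ctx:
    results = ctx.process()  # sends '{"id": {"S": "123"}}' to the queue
```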

docs/gatsby-config.js

+1
@@ -32,6 +32,7 @@ module.exports = {
    'Utilities': [
      'utilities/middleware_factory',
      'utilities/parameters',
      'utilities/batch',
    ],
  },
  navConfig: {
