docs(batch): review API docs & README #2562

Merged · 5 commits · May 23, 2024
34 changes: 17 additions & 17 deletions docs/utilities/batch.md
@@ -113,11 +113,11 @@ Processing batches from SQS works in three stages:

=== "index.ts"

```typescript hl_lines="1-5 14 17 29-31"
```typescript hl_lines="1-5 9 12 21-23"
--8<--
examples/snippets/batch/gettingStartedSQS.ts::16
examples/snippets/batch/gettingStartedSQS.ts:18:29
examples/snippets/batch/gettingStartedSQS.ts:31:34
examples/snippets/batch/gettingStartedSQS.ts::11
examples/snippets/batch/gettingStartedSQS.ts:13:21
examples/snippets/batch/gettingStartedSQS.ts:23:25
--8<--
```
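
For quick reference while reviewing: after this change the included SQS snippet reduces to the shape below — an `SQSHandler`-typed arrow function delegating to `processPartialResponse`. This is a condensed sketch rather than the snippet verbatim (the full file is in the `gettingStartedSQS.ts` diff further down), and the body-parsing inside `recordHandler` is illustrative.

```typescript
import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type { SQSHandler, SQSRecord } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);
const logger = new Logger();

// Illustrative record handler: parse the body and log it; any thrown error
// marks only this record as failed, not the whole batch.
const recordHandler = async (record: SQSRecord): Promise<void> => {
  if (record.body) {
    logger.info('Processed item', { item: JSON.parse(record.body) });
  }
};

export const handler: SQSHandler = async (event, context) =>
  processPartialResponse(event, recordHandler, processor, { context });
```

Typing the export as `SQSHandler` lets the `event` and `context` parameter types be inferred, which is what makes the explicit `SQSEvent`/`Context`/`SQSBatchResponse` imports removable throughout this PR.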

@@ -144,7 +144,7 @@ Processing batches from SQS works in three stages:
When using [SQS FIFO queues](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html){target="_blank"}, we will stop processing messages after the first failure, and return all failed and unprocessed messages in `batchItemFailures`.
This helps preserve the ordering of messages in your queue.

```typescript hl_lines="1-4 13 28-30"
```typescript hl_lines="1-4 8 20-22"
--8<-- "examples/snippets/batch/gettingStartedSQSFifo.ts"
```
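
The full snippet is in the `gettingStartedSQSFifo.ts` diff below; as a quick reference, the FIFO variant differs from the standard one only in the processor class and the synchronous entry point. A condensed sketch (the record handler body is illustrative):

```typescript
import {
  SqsFifoPartialProcessor,
  processPartialResponseSync,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type { SQSHandler, SQSRecord } from 'aws-lambda';

// The FIFO processor stops at the first failure, so the remaining
// (unprocessed) messages keep their ordering in the queue.
const processor = new SqsFifoPartialProcessor();
const logger = new Logger();

const recordHandler = (record: SQSRecord): void => {
  if (record.body) {
    logger.info('Processed item', { item: JSON.parse(record.body) });
  }
};

export const handler: SQSHandler = async (event, context) =>
  processPartialResponseSync(event, recordHandler, processor, { context });
```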

@@ -167,11 +167,11 @@ Processing batches from Kinesis works in three stages:

=== "index.ts"

```typescript hl_lines="1-5 14 17 27-29"
```typescript hl_lines="1-5 9 12 19-21"
--8<-- "examples/snippets/batch/gettingStartedKinesis.ts"
```

1. **Step 1**. Creates a partial failure batch processor for Kinesis Data Streams. See [partial failure mechanics for details](#partial-failure-mechanics)
1. Creates a partial failure batch processor for Kinesis Data Streams. See [partial failure mechanics for details](#partial-failure-mechanics)
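
The Kinesis variant follows the same pattern; the only source-specific detail is that the payload arrives base64-encoded in `record.kinesis.data`. A condensed sketch (the decode-and-log body is illustrative; the exact snippet is in the `gettingStartedKinesis.ts` diff below):

```typescript
import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type { KinesisStreamHandler, KinesisStreamRecord } from 'aws-lambda';

const processor = new BatchProcessor(EventType.KinesisDataStreams);
const logger = new Logger();

const recordHandler = async (record: KinesisStreamRecord): Promise<void> => {
  // Kinesis delivers the record payload base64-encoded.
  const payload = Buffer.from(record.kinesis.data, 'base64').toString();
  logger.info('Processed item', { item: JSON.parse(payload) });
};

export const handler: KinesisStreamHandler = async (event, context) =>
  processPartialResponse(event, recordHandler, processor, { context });
```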

=== "Sample response"

@@ -200,11 +200,11 @@ Processing batches from DynamoDB Streams works in three stages:

=== "index.ts"

```typescript hl_lines="1-5 14 17 32-34"
```typescript hl_lines="1-5 9 12 24-26"
--8<-- "examples/snippets/batch/gettingStartedDynamoDBStreams.ts"
```

1. **Step 1**. Creates a partial failure batch processor for DynamoDB Streams. See [partial failure mechanics for details](#partial-failure-mechanics)
1. Creates a partial failure batch processor for DynamoDB Streams. See [partial failure mechanics for details](#partial-failure-mechanics)
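
The DynamoDB Streams variant has the same shape; the record handler typically reads the new item image from `record.dynamodb?.NewImage`. A condensed sketch (the `Message` attribute name is an assumption for illustration; the exact snippet is in the `gettingStartedDynamoDBStreams.ts` diff below):

```typescript
import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type { DynamoDBRecord, DynamoDBStreamHandler } from 'aws-lambda';

const processor = new BatchProcessor(EventType.DynamoDBStreams);
const logger = new Logger();

const recordHandler = async (record: DynamoDBRecord): Promise<void> => {
  // Stream records expose the new item image as DynamoDB-typed attributes;
  // `Message` is an assumed attribute name used for illustration.
  const message = record.dynamodb?.NewImage?.Message?.S;
  if (message) {
    logger.info('Processed item', { item: JSON.parse(message) });
  }
};

export const handler: DynamoDBStreamHandler = async (event, context) =>
  processPartialResponse(event, recordHandler, processor, { context });
```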

=== "Sample response"

@@ -226,17 +226,17 @@ By default, we catch any exception raised by your record handler function. This

=== "Sample error handling with custom exception"

```typescript hl_lines="30"
```typescript hl_lines="25"
--8<--
examples/snippets/batch/gettingStartedErrorHandling.ts::29
examples/snippets/batch/gettingStartedErrorHandling.ts:31:38
examples/snippets/batch/gettingStartedErrorHandling.ts:40:43
examples/snippets/batch/gettingStartedErrorHandling.ts::24
examples/snippets/batch/gettingStartedErrorHandling.ts:26:30
examples/snippets/batch/gettingStartedErrorHandling.ts:32:
--8<--
```

1. Any exception works here. See [extending BatchProcessorSync section, if you want to override this behavior.](#extending-batchprocessor)

2. Exceptions raised in `record_handler` will propagate to `process_partial_response`. <br/><br/> We catch them and include each failed batch item identifier in the response dictionary (see `Sample response` tab).
2. Exceptions raised in `recordHandler` will propagate to `process_partial_response`. <br/><br/> We catch them and include each failed batch item identifier in the response dictionary (see `Sample response` tab).
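
For context: any error thrown inside `recordHandler` is caught by the processor and surfaced as a batch item failure instead of failing the whole invocation. A minimal sketch with a custom error class (the error name and validation check are illustrative):

```typescript
import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import type { SQSHandler, SQSRecord } from 'aws-lambda';

class InvalidPayloadError extends Error {
  public constructor(messageId: string) {
    super(`Payload of message ${messageId} is invalid`);
    this.name = 'InvalidPayloadError';
  }
}

const processor = new BatchProcessor(EventType.SQS);

const recordHandler = async (record: SQSRecord): Promise<void> => {
  if (!record.body) {
    // Thrown errors are caught by the processor and reported in
    // batchItemFailures (see the "Sample response" tab).
    throw new InvalidPayloadError(record.messageId);
  }
};

export const handler: SQSHandler = async (event, context) =>
  processPartialResponse(event, recordHandler, processor, { context });
```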

=== "Sample response"

@@ -397,7 +397,7 @@ Use the `BatchProcessor` directly in your function to access a list of all retur
* **When successful**. We will include a tuple with `success`, the result of `recordHandler`, and the batch record
* **When failed**. We will include a tuple with `fail`, exception as a string, and the batch record

```typescript hl_lines="25 27-28 30-33 38" title="Accessing processed messages"
```typescript hl_lines="17 19-20 23 28" title="Accessing processed messages"
--8<-- "examples/snippets/batch/accessProcessedMessages.ts"
```
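
The `accessProcessedMessages.ts` diff below collapses the three positional reads into a single destructuring; for reference, the overall flow looks roughly like this sketch. It assumes the processor's `response()` helper to build the partial-failure payload returned to Lambda, and the record handler body is illustrative:

```typescript
import { BatchProcessor, EventType } from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type { SQSHandler, SQSRecord } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);
const logger = new Logger();

const recordHandler = (record: SQSRecord): void => {
  logger.info('Processing record', { messageId: record.messageId });
};

export const handler: SQSHandler = async (event, context) => {
  processor.register(event.Records, recordHandler, { context });
  const processedMessages = await processor.process();

  for (const message of processedMessages) {
    // Each entry is a tuple: 'success' | 'fail', the handler result or the
    // error as a string, and the original batch record.
    const [status, error, record] = message;
    logger.info('Processed record', { status, error, record });
  }

  // Assumed helper: builds the { batchItemFailures } response for Lambda.
  return processor.response();
};
```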

@@ -410,7 +410,7 @@ Within your `recordHandler` function, you might need access to the Lambda contex

We can automatically inject the [Lambda context](https://docs.aws.amazon.com/lambda/latest/dg/typescript-context.html){target="_blank"} into your `recordHandler` as optional second argument if you register it when using `BatchProcessorSync` or the `processPartialResponseSync` function.

```typescript hl_lines="17 35"
```typescript hl_lines="12 27"
--8<-- "examples/snippets/batch/accessLambdaContext.ts"
```
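
A condensed sketch of the context-injection pattern shown in the `accessLambdaContext.ts` diff below — passing `context` to `processPartialResponse` makes it available as the optional second argument of `recordHandler` (the remaining-time logging is illustrative):

```typescript
import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type { Context, SQSHandler, SQSRecord } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);
const logger = new Logger();

const recordHandler = (record: SQSRecord, lambdaContext?: Context): void => {
  logger.info('Processing record', { messageId: record.messageId });
  if (lambdaContext) {
    // The injected context gives access to runtime details such as the
    // remaining execution time.
    logger.info('Time remaining', {
      ms: lambdaContext.getRemainingTimeInMillis(),
    });
  }
};

export const handler: SQSHandler = async (event, context) =>
  processPartialResponse(event, recordHandler, processor, {
    context, // registering the context here enables the injection above
  });
```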

@@ -425,7 +425,7 @@ For these scenarios, you can subclass `BatchProcessor` and quickly override `suc

Let's suppose you'd like to add a metric named `BatchRecordFailures` for each batch record that failed processing

```typescript hl_lines="3 20 24 31 37" title="Extending failure handling mechanism in BatchProcessor"
```typescript hl_lines="3 15 19 26 32" title="Extending failure handling mechanism in BatchProcessor"
--8<-- "examples/snippets/batch/extendingFailure.ts"
```
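
A condensed sketch of the subclassing pattern in the `extendingFailure.ts` diff below. The `Metrics` import, the `MetricUnit` name, and the exact `failureHandler` signature are assumptions based on the types imported in that diff, not verbatim from this PR:

```typescript
import { BatchProcessor, EventType } from '@aws-lambda-powertools/batch';
import type {
  EventSourceDataClassTypes,
  FailureResponse,
} from '@aws-lambda-powertools/batch/types';
// Assumed v2 Metrics API; adjust the import/unit names to your version.
import { MetricUnit, Metrics } from '@aws-lambda-powertools/metrics';

class MyProcessor extends BatchProcessor {
  #metrics: Metrics;

  public constructor(eventType: keyof typeof EventType) {
    super(eventType);
    this.#metrics = new Metrics({ namespace: 'test' });
  }

  public failureHandler(
    record: EventSourceDataClassTypes,
    error: Error
  ): FailureResponse {
    // Emit a metric for every record that failed processing, then defer to
    // the default bookkeeping so batchItemFailures is still populated.
    this.#metrics.addMetric('BatchRecordFailures', MetricUnit.Count, 1);

    return super.failureHandler(record, error);
  }
}

const processor = new MyProcessor(EventType.SQS);
```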

15 changes: 3 additions & 12 deletions examples/snippets/batch/accessLambdaContext.ts
@@ -4,12 +4,7 @@ import {
processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
SQSEvent,
SQSRecord,
Context,
SQSBatchResponse,
} from 'aws-lambda';
import type { SQSRecord, Context, SQSHandler } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);
const logger = new Logger();
@@ -27,11 +22,7 @@ const recordHandler = (record: SQSRecord, lambdaContext?: Context): void => {
}
};

export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
return processPartialResponse(event, recordHandler, processor, {
export const handler: SQSHandler = async (event, context) =>
processPartialResponse(event, recordHandler, processor, {
context,
});
};
16 changes: 3 additions & 13 deletions examples/snippets/batch/accessProcessedMessages.ts
@@ -1,11 +1,6 @@
import { BatchProcessor, EventType } from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
SQSEvent,
SQSRecord,
Context,
SQSBatchResponse,
} from 'aws-lambda';
import type { SQSRecord, SQSHandler } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);
const logger = new Logger();
@@ -18,19 +13,14 @@ const recordHandler = (record: SQSRecord): void => {
}
};

export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
export const handler: SQSHandler = async (event, context) => {
const batch = event.Records; // (1)!

processor.register(batch, recordHandler, { context }); // (2)!
const processedMessages = await processor.process();

for (const message of processedMessages) {
const status: 'success' | 'fail' = message[0];
const error = message[1];
const record = message[2];
const [status, error, record] = message;

logger.info('Processed record', { status, record, error });
}
17 changes: 5 additions & 12 deletions examples/snippets/batch/advancedTracingRecordHandler.ts
@@ -6,12 +6,7 @@ import {
import { Tracer } from '@aws-lambda-powertools/tracer';
import { captureLambdaHandler } from '@aws-lambda-powertools/tracer/middleware';
import middy from '@middy/core';
import type {
SQSEvent,
SQSRecord,
Context,
SQSBatchResponse,
} from 'aws-lambda';
import type { SQSRecord, SQSHandler, SQSEvent } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);
const tracer = new Tracer({ serviceName: 'serverlessAirline' });
@@ -35,10 +30,8 @@ const recordHandler = async (record: SQSRecord): Promise<void> => {
subsegment?.close(); // (3)!
};

export const handler = middy(
async (event: SQSEvent, context: Context): Promise<SQSBatchResponse> => {
return processPartialResponse(event, recordHandler, processor, {
context,
});
}
export const handler: SQSHandler = middy(async (event: SQSEvent, context) =>
processPartialResponse(event, recordHandler, processor, {
context,
})
).use(captureLambdaHandler(tracer));
10 changes: 3 additions & 7 deletions examples/snippets/batch/customPartialProcessor.ts
@@ -14,7 +14,7 @@ import type {
FailureResponse,
BaseRecord,
} from '@aws-lambda-powertools/batch/types';
import type { SQSEvent, Context, SQSBatchResponse } from 'aws-lambda';
import type { SQSHandler } from 'aws-lambda';

const tableName = process.env.TABLE_NAME || 'table-not-found';

@@ -89,11 +89,7 @@ const recordHandler = (): number => {
return Math.floor(randomInt(1, 10));
};

export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
return processPartialResponse(event, recordHandler, processor, {
export const handler: SQSHandler = async (event, context) =>
processPartialResponse(event, recordHandler, processor, {
context,
});
};
15 changes: 3 additions & 12 deletions examples/snippets/batch/extendingFailure.ts
@@ -9,12 +9,7 @@ import type {
EventSourceDataClassTypes,
} from '@aws-lambda-powertools/batch/types';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
SQSEvent,
SQSRecord,
Context,
SQSBatchResponse,
} from 'aws-lambda';
import type { SQSRecord, SQSHandler } from 'aws-lambda';

class MyProcessor extends BatchProcessor {
#metrics: Metrics;
@@ -45,11 +40,7 @@ const recordHandler = (record: SQSRecord): void => {
}
};

export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
return processPartialResponse(event, recordHandler, processor, {
export const handler: SQSHandler = async (event, context) =>
processPartialResponse(event, recordHandler, processor, {
context,
});
};
15 changes: 3 additions & 12 deletions examples/snippets/batch/gettingStartedDynamoDBStreams.ts
@@ -4,12 +4,7 @@ import {
processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
DynamoDBStreamEvent,
DynamoDBRecord,
Context,
DynamoDBBatchResponse,
} from 'aws-lambda';
import type { DynamoDBRecord, DynamoDBStreamHandler } from 'aws-lambda';

const processor = new BatchProcessor(EventType.DynamoDBStreams); // (1)!
const logger = new Logger();
@@ -25,11 +20,7 @@ const recordHandler = async (record: DynamoDBRecord): Promise<void> => {
}
};

export const handler = async (
event: DynamoDBStreamEvent,
context: Context
): Promise<DynamoDBBatchResponse> => {
return processPartialResponse(event, recordHandler, processor, {
export const handler: DynamoDBStreamHandler = async (event, context) =>
processPartialResponse(event, recordHandler, processor, {
context,
});
};
15 changes: 3 additions & 12 deletions examples/snippets/batch/gettingStartedErrorHandling.ts
@@ -4,12 +4,7 @@ import {
processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
SQSEvent,
SQSRecord,
Context,
SQSBatchResponse,
} from 'aws-lambda';
import type { SQSRecord, SQSHandler } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);
const logger = new Logger();
@@ -32,12 +27,8 @@ const recordHandler = async (record: SQSRecord): Promise<void> => {
}
};

export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
export const handler: SQSHandler = async (event, context) =>
// prettier-ignore
return processPartialResponse(event, recordHandler, processor, { // (2)!
processPartialResponse(event, recordHandler, processor, { // (2)!
context,
});
};
15 changes: 3 additions & 12 deletions examples/snippets/batch/gettingStartedKinesis.ts
@@ -4,12 +4,7 @@ import {
processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
KinesisStreamEvent,
KinesisStreamRecord,
Context,
KinesisStreamBatchResponse,
} from 'aws-lambda';
import type { KinesisStreamHandler, KinesisStreamRecord } from 'aws-lambda';

const processor = new BatchProcessor(EventType.KinesisDataStreams); // (1)!
const logger = new Logger();
@@ -20,11 +15,7 @@ const recordHandler = async (record: KinesisStreamRecord): Promise<void> => {
logger.info('Processed item', { item: payload });
};

export const handler = async (
event: KinesisStreamEvent,
context: Context
): Promise<KinesisStreamBatchResponse> => {
return processPartialResponse(event, recordHandler, processor, {
export const handler: KinesisStreamHandler = async (event, context) =>
processPartialResponse(event, recordHandler, processor, {
context,
});
};
16 changes: 4 additions & 12 deletions examples/snippets/batch/gettingStartedSQS.ts
@@ -4,12 +4,7 @@ import {
processPartialResponse,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
SQSEvent,
SQSRecord,
Context,
SQSBatchResponse,
} from 'aws-lambda';
import type { SQSRecord, SQSHandler } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS); // (1)!
const logger = new Logger();
@@ -23,13 +18,10 @@ const recordHandler = async (record: SQSRecord): Promise<void> => { // (2)!
}
};

export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
export const handler: SQSHandler = async (event, context) =>
// prettier-ignore
return processPartialResponse(event, recordHandler, processor, { // (3)!
processPartialResponse(event, recordHandler, processor, { // (3)!
context,
});
};

export { processor };
15 changes: 3 additions & 12 deletions examples/snippets/batch/gettingStartedSQSFifo.ts
@@ -3,12 +3,7 @@ import {
processPartialResponseSync,
} from '@aws-lambda-powertools/batch';
import { Logger } from '@aws-lambda-powertools/logger';
import type {
SQSEvent,
SQSRecord,
Context,
SQSBatchResponse,
} from 'aws-lambda';
import type { SQSHandler, SQSRecord } from 'aws-lambda';

const processor = new SqsFifoPartialProcessor(); // (1)!
const logger = new Logger();
@@ -21,11 +16,7 @@ const recordHandler = (record: SQSRecord): void => {
}
};

export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
return processPartialResponseSync(event, recordHandler, processor, {
export const handler: SQSHandler = async (event, context) =>
processPartialResponseSync(event, recordHandler, processor, {
context,
});
};
2 changes: 1 addition & 1 deletion examples/snippets/batch/testingYourCode.ts
@@ -36,7 +36,7 @@ describe('Function tests', () => {
};

// Act
const response = await handler(sqsEvent, context);
const response = await handler(sqsEvent, context, () => {});

// Assess
expect(response).toEqual(expectedResponse);
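
The no-op arrow function added as the third argument is the (legacy) callback parameter that the `SQSHandler` type expects once the handler export is typed that way. A minimal sketch of such a test — the test-runner imports, the import path, and the event fixture are illustrative assumptions, not this repo's actual test setup:

```typescript
import type { Context, SQSEvent } from 'aws-lambda';
import { describe, expect, it } from 'vitest'; // or your project's test runner
import { handler } from './gettingStartedSQS.js'; // path is illustrative

describe('Function tests', () => {
  it('reports no failures when every record succeeds', async () => {
    // Arrange: minimal fixtures; both are partial objects cast for the test.
    const sqsEvent = {
      Records: [
        {
          messageId: '059f36b4-87a3-44ab-83d2-661975830a7d',
          body: JSON.stringify({ hello: 'world' }),
          // ...remaining SQSRecord fields omitted; the cast keeps the test terse.
        },
      ],
    } as unknown as SQSEvent;
    const context = { functionName: 'test-fn' } as unknown as Context;

    // Act: SQSHandler is (event, context, callback), so pass a no-op callback.
    const response = await handler(sqsEvent, context, () => {});

    // Assess: a fully successful batch yields an empty failure list.
    expect(response).toEqual({ batchItemFailures: [] });
  });
});
```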