Commit 49eb589

Merge branch 'develop' into feat/validator-utility
* develop: (57 commits)
  chore: bump version to 1.5.0 (#158)
  chore(batch): Housekeeping for recent changes (#157)
  docs: address readability feedbacks
  chore: add debug logging for sqs batch processing
  chore: remove middlewares module, moving decorator functionality to base and sqs
  docs: add detail to batch processing
  fix: throw exception by default if messages processing fails
  chore: add test for sqs_batch_processor interface
  fix: add sqs_batch_processor as its own method
  docs: simplify documentation more SQS specific focus
  Update for sqs_batch_processor interface
  chore: add sqs_batch_processor decorator to simplify interface
  feat(parameters): transform = "auto" (#133)
  chore: fix typos, docstrings and type hints (#154)
  chore: tiny changes for readability
  fix: ensure debug log event has latest ctx
  docs: rephrase the wording to make it more clear
  docs: refactor example; improve docs about creating your own processor
  refactor: remove references to BaseProcessor. Left BasePartialProcessor
  docs: add newly created Slack Channel
  fix: update image with correct sample
  ...
2 parents 349e88d + 063d8cd commit 49eb589

31 files changed: +1512 -24 lines

.pre-commit-config.yaml (+1 -1)

@@ -1,5 +1,5 @@
 # We use poetry to run formatting and linting before commit/push
-# Longers checks such as tests, security and complexity baseline
+# Longer checks such as tests, security and complexity baseline
 # are run as part of CI to prevent slower feedback loop
 # All checks can be run locally via `make pr`

CHANGELOG.md (+12)

@@ -6,6 +6,18 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+## [1.5.0] - 2020-09-04
+
+### Added
+- **Logger**: Add `xray_trace_id` to log output to improve integration with CloudWatch Service Lens
+- **Logger**: Allow reordering of logged output
+- **Utilities**: Add new `SQS batch processing` utility to handle partial failures in processing message batches
+- **Utilities**: Add typing utility providing static type for lambda context object
+- **Utilities**: Add `transform=auto` in parameters utility to deserialize parameter values based on the key name
+
+### Fixed
+- **Logger**: The value of `json_default` formatter is no longer written to logs
+
 ## [1.4.0] - 2020-08-25
 
 ### Added

CONTRIBUTING.md (+13)

@@ -69,6 +69,19 @@ [email protected] with any additional questions or comments.
 ## Security issue notifications
 If you discover a potential security issue in this project we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public github issue.
 
+## Troubleshooting
+
+### API reference documentation
+
+When you are working on the codebase and you use the local API reference documentation to preview your changes, you might see the following message: `Module aws_lambda_powertools not found`.
+
+This happens when:
+
+* You did not install the local dev environment yet
+  - You can install dev deps with the `make dev` command
+* The code in the repository is raising an exception while `pdoc` is scanning the codebase
+  - Unfortunately, this exception is not shown to you, but if you run `poetry run pdoc --pdf aws_lambda_powertools`, the exception is shown and you can prevent it from being raised
+  - Once resolved, the documentation should load correctly again
+
 ## Licensing
 

README.md (+2)

@@ -7,6 +7,8 @@ A suite of utilities for AWS Lambda functions that makes tracing with AWS X-Ray,
 
 **[📜Documentation](https://awslabs.github.io/aws-lambda-powertools-python/)** | **[API Docs](https://awslabs.github.io/aws-lambda-powertools-python/api/)** | **[🐍PyPi](https://pypi.org/project/aws-lambda-powertools/)** | **[Feature request](https://github.com/awslabs/aws-lambda-powertools-python/issues/new?assignees=&labels=feature-request%2C+triage&template=feature_request.md&title=)** | **[🐛Bug Report](https://github.com/awslabs/aws-lambda-powertools-python/issues/new?assignees=&labels=bug%2C+triage&template=bug_report.md&title=)** | **[Kitchen sink example](https://github.com/awslabs/aws-lambda-powertools-python/tree/develop/example)** | **[Detailed blog post](https://aws.amazon.com/blogs/opensource/simplifying-serverless-best-practices-with-lambda-powertools/)**
 
+> **Join us on the AWS Developers Slack at `#lambda-powertools`** - **[Invite, if you don't have an account](https://join.slack.com/t/awsdevelopers/shared_invite/zt-gu30gquv-EhwIYq3kHhhysaZ2aIX7ew)**
+
 ## Features
 
 * **[Tracing](https://awslabs.github.io/aws-lambda-powertools-python/core/tracer/)** - Decorators and utilities to trace Lambda function handlers, and both synchronous and asynchronous functions

aws_lambda_powertools/logging/formatter.py (+22 -4)

@@ -1,5 +1,6 @@
 import json
 import logging
+import os
 
 
 class JsonFormatter(logging.Formatter):
@@ -30,13 +31,27 @@ def __init__(self, **kwargs):
         self.default_json_formatter = kwargs.pop("json_default", str)
         # Set the insertion order for the log messages
         self.format_dict = dict.fromkeys(kwargs.pop("log_record_order", ["level", "location", "message", "timestamp"]))
+        self.reserved_keys = ["timestamp", "level", "location"]
         # Set the date format used by `asctime`
         super(JsonFormatter, self).__init__(datefmt=kwargs.pop("datefmt", None))
 
-        self.reserved_keys = ["timestamp", "level", "location"]
-        self.format_dict.update(
-            {"level": "%(levelname)s", "location": "%(funcName)s:%(lineno)d", "timestamp": "%(asctime)s", **kwargs}
-        )
+        self.format_dict.update(self._build_root_keys(**kwargs))
+
+    @staticmethod
+    def _build_root_keys(**kwargs):
+        return {
+            "level": "%(levelname)s",
+            "location": "%(funcName)s:%(lineno)d",
+            "timestamp": "%(asctime)s",
+            **kwargs,
+        }
+
+    @staticmethod
+    def _get_latest_trace_id():
+        xray_trace_id = os.getenv("_X_AMZN_TRACE_ID")
+        trace_id = xray_trace_id.split(";")[0].replace("Root=", "") if xray_trace_id else None
+
+        return trace_id
 
     def update_formatter(self, **kwargs):
         self.format_dict.update(kwargs)
@@ -76,6 +91,9 @@ def format(self, record):  # noqa: A003
         if record.exc_text:
             log_dict["exception"] = record.exc_text
 
+        # fetch latest X-Ray Trace ID, if any
+        log_dict.update({"xray_trace_id": self._get_latest_trace_id()})
+
         # Filter out top level key with values that are None
         log_dict = {k: v for k, v in log_dict.items() if v is not None}
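The new `_get_latest_trace_id` helper reads the `_X_AMZN_TRACE_ID` environment variable that the Lambda runtime sets when X-Ray is enabled, and keeps only the `Root=` segment. A standalone sketch of that parsing (the trace header value below is a made-up sample for illustration):

```python
import os

def get_latest_trace_id():
    # _X_AMZN_TRACE_ID carries the full trace header, e.g.
    # "Root=<trace-id>;Parent=<segment-id>;Sampled=1"
    xray_trace_id = os.getenv("_X_AMZN_TRACE_ID")
    # keep only the Root segment's ID, or None when running outside Lambda/X-Ray
    return xray_trace_id.split(";")[0].replace("Root=", "") if xray_trace_id else None

# simulate what the Lambda runtime would set (sample value)
os.environ["_X_AMZN_TRACE_ID"] = "Root=1-5e272390-8c398be037738dc042009320;Parent=53995c3f42cd8ad8;Sampled=1"
print(get_latest_trace_id())  # 1-5e272390-8c398be037738dc042009320
```

Because `format` filters out keys whose value is `None`, the `xray_trace_id` key simply disappears from the log output when no trace header is present.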

aws_lambda_powertools/logging/logger.py (+4 -4)

@@ -235,14 +235,14 @@ def handler(event, context):
 
         @functools.wraps(lambda_handler)
         def decorate(event, context):
+            lambda_context = build_lambda_context_model(context)
+            cold_start = _is_cold_start()
+            self.structure_logs(append=True, cold_start=cold_start, **lambda_context.__dict__)
+
             if log_event:
                 logger.debug("Event received")
                 self.info(event)
 
-            lambda_context = build_lambda_context_model(context)
-            cold_start = _is_cold_start()
-
-            self.structure_logs(append=True, cold_start=cold_start, **lambda_context.__dict__)
             return lambda_handler(event, context)
 
         return decorate
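This reorder is the "fix: ensure debug log event has latest ctx" commit: the "Event received" line is itself emitted through the structured logger, so enriching the keys first means that early record already carries `cold_start` and the Lambda context. A minimal sketch of the idea; `structure_logs`, `emit`, and the dict-based context are simplified stand-ins, not the library's actual API:

```python
import functools
import json

structured_keys = {}

def structure_logs(**kwargs):
    # stand-in for Logger.structure_logs(append=True, ...)
    structured_keys.update(kwargs)

def emit(message):
    # stand-in for the JSON formatter: every record carries the structured keys
    return json.dumps({"message": message, **structured_keys})

def inject_lambda_context(lambda_handler):
    @functools.wraps(lambda_handler)
    def decorate(event, context):
        # fixed order: enrich the structured keys first...
        structure_logs(cold_start=True, function_name=context["function_name"])
        # ...so this early debug record already has the latest context
        print(emit("Event received"))
        return lambda_handler(event, context)
    return decorate

@inject_lambda_context
def handler(event, context):
    return {"statusCode": 200}

handler({}, {"function_name": "example"})
```

With the old order, the "Event received" record would have been emitted before `structure_logs` ran, so it lacked the context keys.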

aws_lambda_powertools/metrics/base.py (+5 -3)

@@ -112,7 +112,7 @@ def add_metric(self, name: str, unit: MetricUnit, value: Union[float, int]):
         Metric name
     unit : MetricUnit
         `aws_lambda_powertools.helper.models.MetricUnit`
-    value : float
+    value : Union[float, int]
         Metric value
 
     Raises
@@ -146,6 +146,8 @@ def serialize_metric_set(self, metrics: Dict = None, dimensions: Dict = None, me
         Dictionary of metrics to serialize, by default None
     dimensions : Dict, optional
         Dictionary of dimensions to serialize, by default None
+    metadata : Dict, optional
+        Dictionary of metadata to serialize, by default None
 
     Example
     -------
@@ -183,7 +185,7 @@ def serialize_metric_set(self, metrics: Dict = None, dimensions: Dict = None, me
     metric_names_and_values: Dict[str, str] = {}  # { "metric_name": 1.0 }
 
     for metric_name in metrics:
-        metric: str = metrics[metric_name]
+        metric: dict = metrics[metric_name]
         metric_value: int = metric.get("Value", 0)
         metric_unit: str = metric.get("Unit", "")
 
@@ -257,7 +259,7 @@ def add_metadata(self, key: str, value: Any):
 
     Parameters
     ----------
-    name : str
+    key : str
        Metadata key
    value : any
        Metadata value
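For context on what `serialize_metric_set` produces: metrics are emitted in the CloudWatch Embedded Metric Format, where metadata keys land as plain top-level JSON keys next to the metric values, searchable in CloudWatch Logs Insights but not charged as metrics. A simplified, hypothetical sketch of that shape; it is not the library's actual implementation, which also validates units, dimensions, and the 100-metric limit:

```python
import json
import time

def serialize_metric_set(namespace, metrics, dimensions, metadata):
    # hypothetical, condensed take on CloudWatch EMF serialization
    return {
        "_aws": {
            "Timestamp": int(time.time() * 1000),
            "CloudWatchMetrics": [
                {
                    "Namespace": namespace,
                    "Dimensions": [list(dimensions.keys())],
                    "Metrics": [{"Name": name, "Unit": m["Unit"]} for name, m in metrics.items()],
                }
            ],
        },
        **dimensions,                                         # dimension values
        **metadata,                                           # metadata keys, not metrics
        **{name: m["Value"] for name, m in metrics.items()},  # metric values
    }

blob = serialize_metric_set(
    namespace="ExampleApp",
    metrics={"SuccessfulBooking": {"Value": 1, "Unit": "Count"}},
    dimensions={"service": "booking"},
    metadata={"booking_id": "abc-123"},
)
print(json.dumps(blob))
```

This also shows why the `metric: dict` type-hint fix above is correct: each entry in `metrics` is a dict with `Value` and `Unit` keys, not a string.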
New file (+10)

@@ -0,0 +1,10 @@
+# -*- coding: utf-8 -*-
+
+"""
+Batch processing utility
+"""
+
+from .base import BasePartialProcessor, batch_processor
+from .sqs import PartialSQSProcessor, sqs_batch_processor
+
+__all__ = ("BasePartialProcessor", "PartialSQSProcessor", "batch_processor", "sqs_batch_processor")
New file (+146)

@@ -0,0 +1,146 @@
+# -*- coding: utf-8 -*-
+
+"""
+Batch processing utilities
+"""
+
+import logging
+from abc import ABC, abstractmethod
+from typing import Any, Callable, Dict, List, Tuple
+
+from aws_lambda_powertools.middleware_factory import lambda_handler_decorator
+
+logger = logging.getLogger(__name__)
+
+
+class BasePartialProcessor(ABC):
+    """
+    Abstract class for batch processors.
+    """
+
+    def __init__(self):
+        self.success_messages: List = []
+        self.fail_messages: List = []
+        self.exceptions: List = []
+
+    @abstractmethod
+    def _prepare(self):
+        """
+        Prepare context manager.
+        """
+        raise NotImplementedError()
+
+    @abstractmethod
+    def _clean(self):
+        """
+        Clear context manager.
+        """
+        raise NotImplementedError()
+
+    @abstractmethod
+    def _process_record(self, record: Any):
+        """
+        Process record with handler.
+        """
+        raise NotImplementedError()
+
+    def process(self) -> List[Tuple]:
+        """
+        Call instance's handler for each record.
+        """
+        return [self._process_record(record) for record in self.records]
+
+    def __enter__(self):
+        self._prepare()
+        return self
+
+    def __exit__(self, exception_type, exception_value, traceback):
+        self._clean()
+
+    def __call__(self, records: List[Any], handler: Callable):
+        """
+        Set instance attributes before execution
+
+        Parameters
+        ----------
+        records: List[Any]
+            List with objects to be processed.
+        handler: Callable
+            Callable to process "records" entries.
+        """
+        self.records = records
+        self.handler = handler
+        return self
+
+    def success_handler(self, record: Any, result: Any):
+        """
+        Success callback
+
+        Returns
+        -------
+        tuple
+            "success", result, original record
+        """
+        entry = ("success", result, record)
+        self.success_messages.append(record)
+        return entry
+
+    def failure_handler(self, record: Any, exception: Exception):
+        """
+        Failure callback
+
+        Returns
+        -------
+        tuple
+            "fail", exception's args, original record
+        """
+        entry = ("fail", exception.args, record)
+        logger.debug(f"Record processing exception: {exception}")
+        self.exceptions.append(exception)
+        self.fail_messages.append(record)
+        return entry
+
+
+@lambda_handler_decorator
+def batch_processor(
+    handler: Callable, event: Dict, context: Dict, record_handler: Callable, processor: BasePartialProcessor = None
+):
+    """
+    Middleware to handle batch event processing
+
+    Parameters
+    ----------
+    handler: Callable
+        Lambda's handler
+    event: Dict
+        Lambda's Event
+    context: Dict
+        Lambda's Context
+    record_handler: Callable
+        Callable to process each record from the batch
+    processor: PartialSQSProcessor
+        Batch Processor to handle partial failure cases
+
+    Examples
+    --------
+    **Processes Lambda's event with PartialSQSProcessor**
+
+        >>> from aws_lambda_powertools.utilities.batch import batch_processor, PartialSQSProcessor
+        >>>
+        >>> def record_handler(record):
+        >>>     return record["body"]
+        >>>
+        >>> @batch_processor(record_handler=record_handler, processor=PartialSQSProcessor())
+        >>> def handler(event, context):
+        >>>     return {"StatusCode": 200}
+
+    Limitations
+    -----------
+    * Async batch processors
+
+    """
+    records = event["Records"]
+
+    with processor(records, record_handler):
+        processor.process()
+
+    return handler(event, context)
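To make the `BasePartialProcessor` contract concrete, here is a self-contained sketch of a subclass implementing the three abstract hooks. `InMemoryProcessor` is hypothetical (the commit ships `PartialSQSProcessor` for real SQS use); the base class is condensed from the diff above, with the `lambda_handler_decorator` dependency dropped so the sketch runs standalone:

```python
from abc import ABC, abstractmethod
from typing import Any, Callable, List, Tuple


class BasePartialProcessor(ABC):
    # condensed from the new base module above
    def __init__(self):
        self.success_messages: List = []
        self.fail_messages: List = []
        self.exceptions: List = []

    @abstractmethod
    def _prepare(self): ...

    @abstractmethod
    def _clean(self): ...

    @abstractmethod
    def _process_record(self, record: Any): ...

    def process(self) -> List[Tuple]:
        return [self._process_record(record) for record in self.records]

    def __enter__(self):
        self._prepare()
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self._clean()

    def __call__(self, records: List[Any], handler: Callable):
        self.records = records
        self.handler = handler
        return self

    def success_handler(self, record: Any, result: Any):
        self.success_messages.append(record)
        return ("success", result, record)

    def failure_handler(self, record: Any, exception: Exception):
        self.exceptions.append(exception)
        self.fail_messages.append(record)
        return ("fail", exception.args, record)


class InMemoryProcessor(BasePartialProcessor):
    """Hypothetical processor: tracks outcomes in memory, no queue side effects."""

    def _prepare(self):
        # reset state on context-manager entry
        self.success_messages.clear()
        self.fail_messages.clear()

    def _clean(self):
        # a real processor (e.g. PartialSQSProcessor) would delete
        # successfully processed messages from the queue here
        pass

    def _process_record(self, record: Any):
        try:
            return self.success_handler(record, self.handler(record))
        except Exception as exc:
            return self.failure_handler(record, exc)


def record_handler(record):
    if record["body"] == "boom":
        raise ValueError("cannot process")
    return record["body"].upper()


records = [{"body": "ok"}, {"body": "boom"}]

# same shape the batch_processor middleware uses internally
with InMemoryProcessor()(records, record_handler) as proc:
    results = proc.process()

print(results[0])     # ('success', 'OK', {'body': 'ok'})
print(results[1][0])  # fail
```

The context-manager shape is the key design choice: `_prepare` and `_clean` bracket the per-record loop, so partial-failure bookkeeping (like deleting only the successful messages) happens exactly once per batch.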
New file (+7)

@@ -0,0 +1,7 @@
+"""
+Batch processing exceptions
+"""
+
+
+class SQSBatchProcessingError(Exception):
+    """When at least one message within a batch could not be processed"""
