feat: toggle to disable log deduplication locally for pytest live log #262 #268

Merged
20 changes: 13 additions & 7 deletions aws_lambda_powertools/logging/logger.py
@@ -129,12 +129,14 @@ def __init__(
self.sampling_rate = resolve_env_var_choice(
choice=sampling_rate, env=os.getenv(constants.LOGGER_LOG_SAMPLING_RATE, 0.0)
)
self._is_deduplication_disabled = resolve_truthy_env_var_choice(
Contributor
NIT: We store whether the deduplication is disabled rather than enabled, then (line 175) we check against this flag not being true. It makes this flag a bit hard to read IMHO. Should we invert this?
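Sketched, the suggested inversion could look like this (`is_truthy` is a hypothetical stand-in for `resolve_truthy_env_var_choice`, not the actual helper):

```python
def is_truthy(value: str) -> bool:
    # Hypothetical stand-in for resolve_truthy_env_var_choice:
    # treat common truthy strings as True, everything else as False
    return value.strip().lower() in ("1", "true", "y", "yes")


def deduplication_enabled(env_value: str) -> bool:
    # Store whether deduplication is ENABLED, so the call site reads
    # `if self._is_deduplication_enabled:` instead of the double negative
    # `if not self._is_deduplication_disabled:`
    return not is_truthy(env_value)
```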

env=os.getenv(constants.LOGGER_LOG_DEDUPLICATION_ENV, "false")
)
self.log_level = self._get_log_level(level)
self.child = child
self._handler = logging.StreamHandler(stream) if stream is not None else logging.StreamHandler(sys.stdout)
self._default_log_keys = {"service": self.service, "sampling_rate": self.sampling_rate}
self._logger = self._get_logger()

self._init_logger(**kwargs)

def __getattr__(self, name):
@@ -167,12 +169,16 @@ def _init_logger(self, **kwargs):
self._logger.addHandler(self._handler)
self.structure_logs(**kwargs)

logger.debug("Adding filter in root logger to suppress child logger records to bubble up")
for handler in logging.root.handlers:
# It'll add a filter to suppress any child logger from self.service
# Where service is Order, it'll reject parent logger Order,
# and child loggers such as Order.checkout, Order.shared
handler.addFilter(SuppressFilter(self.service))
# Pytest Live Log feature duplicates log records for colored output
# but we explicitly add a filter for log deduplication.
# This flag disables that protection when you explicitly want logs to be duplicated (#262)
if not self._is_deduplication_disabled:
logger.debug("Adding filter in root logger to suppress child logger records to bubble up")
for handler in logging.root.handlers:
# It'll add a filter to suppress any child logger of self.service
# Example: with `Logger(service="order")`, it'll reject all loggers
# starting with `order`, e.g. order.checkout, order.shared
handler.addFilter(SuppressFilter(self.service))

# as per bug in #249, we should not be pre-configuring an existing logger
# therefore we set a custom attribute in the Logger that will be returned
1 change: 1 addition & 0 deletions aws_lambda_powertools/shared/constants.py
@@ -4,6 +4,7 @@

LOGGER_LOG_SAMPLING_RATE: str = "POWERTOOLS_LOGGER_SAMPLE_RATE"
LOGGER_LOG_EVENT_ENV: str = "POWERTOOLS_LOGGER_LOG_EVENT"
LOGGER_LOG_DEDUPLICATION_ENV: str = "POWERTOOLS_LOG_DEDUPLICATION_DISABLED"

MIDDLEWARE_FACTORY_TRACE_ENV: str = "POWERTOOLS_TRACE_MIDDLEWARES"

12 changes: 12 additions & 0 deletions docs/content/core/logger.mdx
@@ -439,6 +439,18 @@ def test_lambda_handler(lambda_handler, lambda_context):
lambda_handler(test_event, lambda_context) # this will now have a Context object populated
```

### pytest live log feature

Pytest's Live Log feature duplicates emitted log messages in order to style log statements according to their levels. For this to work, set the `POWERTOOLS_LOG_DEDUPLICATION_DISABLED` env var.

```bash:title=pytest_live_log.sh
POWERTOOLS_LOG_DEDUPLICATION_DISABLED="1" pytest -o log_cli=1
```

<Note type="warning">
Use this feature with care, as it explicitly disables our ability to filter records propagated to the root logger (if one is configured).
</Note><br/>
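For reference, the protection being disabled works roughly like this: a `logging.Filter` on each root handler rejects records whose logger name belongs to the service. A minimal sketch (`ServiceSuppressFilter` is a hypothetical name, not the actual Powertools `SuppressFilter` implementation):

```python
import io
import logging


class ServiceSuppressFilter(logging.Filter):
    """Reject records from a service's own loggers so the root handler
    doesn't write them a second time after propagation."""

    def __init__(self, service: str):
        super().__init__()
        self.service = service

    def filter(self, record: logging.LogRecord) -> bool:
        # "order" rejects "order", "order.checkout", "order.shared", ...
        return not (
            record.name == self.service
            or record.name.startswith(self.service + ".")
        )


stream = io.StringIO()
root_handler = logging.StreamHandler(stream)
root_handler.addFilter(ServiceSuppressFilter("order"))
logging.getLogger().addHandler(root_handler)

# Propagates to root, but the filter rejects it
logging.getLogger("order.checkout").warning("suppressed")
# Unrelated logger, written by the root handler as usual
logging.getLogger("payments").warning("emitted")
```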

## FAQ

**How can I enable boto3 and botocore library logging?**
1 change: 1 addition & 0 deletions docs/content/index.mdx
@@ -138,6 +138,7 @@ Environment variable | Description | Utility | Default
**POWERTOOLS_TRACE_MIDDLEWARES** | Creates sub-segment for each custom middleware | [Middleware factory](./utilities/middleware_factory) | `false`
**POWERTOOLS_LOGGER_LOG_EVENT** | Logs incoming event | [Logging](./core/logger) | `false`
**POWERTOOLS_LOGGER_SAMPLE_RATE** | Debug log sampling | [Logging](./core/logger) | `0`
**POWERTOOLS_LOG_DEDUPLICATION_DISABLED** | Disables log deduplication filter protection to use Pytest Live Log feature | [Logging](./core/logger) | `false`
**LOG_LEVEL** | Sets logging level | [Logging](./core/logger) | `INFO`

## Debug mode
22 changes: 22 additions & 0 deletions tests/functional/test_logger.py
@@ -11,6 +11,7 @@
from aws_lambda_powertools import Logger, Tracer
from aws_lambda_powertools.logging.exceptions import InvalidLoggerSamplingRateError
from aws_lambda_powertools.logging.logger import set_package_logger
from aws_lambda_powertools.shared import constants


@pytest.fixture
@@ -376,6 +377,7 @@ def test_logger_do_not_log_twice_when_root_logger_is_setup(stdout, service_name)
child_logger = Logger(service=service_name, child=True, stream=stdout)
logger.info("PARENT")
child_logger.info("CHILD")
root_logger.info("ROOT")

# THEN it should contain only two log entries
# since child's log records propagated to root logger should be rejected
@@ -400,3 +402,23 @@ def test_logger_extra_kwargs(stdout, service_name):

# THEN second log should not have request_id in the root structure
assert "request_id" not in no_extra_fields_log


def test_logger_log_twice_when_log_filter_isnt_present_and_root_logger_is_setup(monkeypatch, stdout, service_name):
# GIVEN Lambda configures the root logger with a handler
root_logger = logging.getLogger()
root_logger.addHandler(logging.StreamHandler(stream=stdout))

# WHEN we create a new Logger and child Logger
# and the log deduplication filter for child messages is disabled
# see #262 for more details on why this is needed for Pytest Live Log feature
monkeypatch.setenv(constants.LOGGER_LOG_DEDUPLICATION_ENV, "true")
logger = Logger(service=service_name, stream=stdout)
child_logger = Logger(service=service_name, child=True, stream=stdout)
logger.info("PARENT")
child_logger.info("CHILD")

# THEN it should contain four log entries
# since deduplication is disabled and child's records propagated
# to the root logger are no longer rejected
logs = list(stdout.getvalue().strip().split("\n"))
assert len(logs) == 4
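The four entries the new test expects can be reproduced with plain stdlib logging. A minimal sketch (hypothetical logger names) of what happens when no suppression filter guards a pre-configured root handler:

```python
import io
import logging

stream = io.StringIO()

# Lambda-style pre-configured root logger with its own handler
root = logging.getLogger()
root.setLevel(logging.INFO)
root.addHandler(logging.StreamHandler(stream))

# Service logger with its own handler, akin to Logger(service="order")
service = logging.getLogger("order")
service.setLevel(logging.INFO)
service.addHandler(logging.StreamHandler(stream))

service.info("PARENT")
logging.getLogger("order.checkout").info("CHILD")

# Each record is written twice: once by the service handler and once by
# the root handler after propagation, so four lines end up in the stream
logs = stream.getvalue().strip().split("\n")
```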