
Commit

Merge branch 'develop' into chore/support-cdk-environment-e2e
* develop:
  chore(deps-dev): bump aws-cdk-lib from 2.40.0 to 2.41.0 (#1507)
  update changelog with latest changes
  feat(tracer): support methods with the same name (ABCs) by including fully qualified name in v2 (#1486)
  chore(deps-dev): bump aws-cdk-aws-apigatewayv2-integrations-alpha from 2.39.1a0 to 2.40.0a0 (#1496)
  chore(deps-dev): bump mkdocs-material from 8.4.2 to 8.4.3 (#1504)
  chore(deps): bump pydantic from 1.10.1 to 1.10.2 (#1502)
  update changelog with latest changes
  feat(data-classes): add KafkaEvent and KafkaEventRecord (#1485)
  chore(deps-dev): bump pytest from 7.1.2 to 7.1.3 (#1497)
  update changelog with latest changes
  feat(event_handler): add cookies as 1st class citizen in v2 (#1487)
  chore(deps-dev): bump black from 22.6.0 to 22.8.0 (#1494)
  chore(deps-dev): bump aws-cdk-lib from 2.39.1 to 2.40.0 (#1495)
  chore(maintenance): add discord link to first PR and first issue (#1493)
  update changelog with latest changes
  refactor(batch): remove legacy sqs_batch_processor (#1492)
  chore(deps): bump pydantic from 1.10.0 to 1.10.1 (#1491)
  chore(deps-dev): bump flake8-variables-names from 0.0.4 to 0.0.5 (#1490)
heitorlessa committed Sep 9, 2022
2 parents af90f3f + 217bd6c commit ceca92e
Showing 37 changed files with 1,641 additions and 1,243 deletions.
4 changes: 4 additions & 0 deletions .github/boring-cyborg.yml
@@ -95,6 +95,8 @@ labelPRBasedOnFilePath:
firstPRWelcomeComment: >
Thanks a lot for your first contribution! Please check out our contributing guidelines and don't hesitate to ask whatever you need.
In the meantime, check out the #python channel on our AWS Lambda Powertools Discord: [Invite link](https://discord.gg/B8zZKbbyET)
# Comment to be posted to congratulate user on their first merged PR
firstPRMergeComment: >
Awesome work, congrats on your first merged pull request and thank you for helping improve everyone's experience!
@@ -103,6 +105,8 @@ firstPRMergeComment: >
firstIssueWelcomeComment: >
Thanks for opening your first issue here! We'll come back to you as soon as we can.
In the meantime, check out the #python channel on our AWS Lambda Powertools Discord: [Invite link](https://discord.gg/B8zZKbbyET)
###### IssueLink Adder #################################################################################################
# Insert Issue (Jira/Github etc) link in PR description based on the Issue ID in PR title.
#insertIssueLinkInPrDescription:
36 changes: 26 additions & 10 deletions CHANGELOG.md
@@ -13,6 +13,10 @@
* **ci:** event resolution for on_label_added workflow
* **event_handler:** fix bug with previous array implementation

## Code Refactoring

* **batch:** remove legacy sqs_batch_processor ([#1492](https://github.com/awslabs/aws-lambda-powertools-python/issues/1492))

## Documentation

* **homepage:** note about v2 version
@@ -21,33 +25,45 @@
## Features

* **ci:** add actionlint in pre-commit hook
* **data-classes:** add KafkaEvent and KafkaEventRecord ([#1485](https://github.com/awslabs/aws-lambda-powertools-python/issues/1485))
* **event_handler:** add cookies as 1st class citizen in v2 ([#1487](https://github.com/awslabs/aws-lambda-powertools-python/issues/1487))
* **event_handler:** improved support for headers and cookies in v2 ([#1455](https://github.com/awslabs/aws-lambda-powertools-python/issues/1455))
* **event_sources:** add CloudWatch dashboard custom widget event ([#1474](https://github.com/awslabs/aws-lambda-powertools-python/issues/1474))
* **tracer:** support methods with the same name (ABCs) by including fully qualified name in v2 ([#1486](https://github.com/awslabs/aws-lambda-powertools-python/issues/1486))

## Maintenance

* **bandit:** update baseline
* **ci:** enable ci checks for v2
* **ci:** destructure assignment on comment_large_pr
* **ci:** add missing description fields
* **ci:** fix invalid dependency leftover
* **ci:** remove dangling debug step
* **ci:** limit E2E workflow run for source code change
* **ci:** add note for state persistence on comment_large_pr
* **ci:** add linter for GitHub Actions as pre-commit hook ([#1479](https://github.com/awslabs/aws-lambda-powertools-python/issues/1479))
* **ci:** remove unused and undeclared OS matrix env
* **ci:** sync package version with pypi
* **ci:** add workflow to suggest splitting large PRs ([#1480](https://github.com/awslabs/aws-lambda-powertools-python/issues/1480))
* **ci:** create adhoc docs workflow for v2
* **ci:** create adhoc docs workflow for v2
* **ci:** create docs workflow for v2
* **ci:** create reusable docs publishing workflow ([#1482](https://github.com/awslabs/aws-lambda-powertools-python/issues/1482))
* **ci:** format comment on comment_large_pr script
* **ci:** add note for state persistence on comment_large_pr
* **ci:** destructure assignment on comment_large_pr
* **ci:** record pr details upon labeling
* **ci:** sync package version with pypi
* **ci:** remove unused and undeclared OS matrix env
* **ci:** enable ci checks for v2
* **ci:** add workflow to suggest splitting large PRs ([#1480](https://github.com/awslabs/aws-lambda-powertools-python/issues/1480))
* **ci:** add linter for GitHub Actions as pre-commit hook ([#1479](https://github.com/awslabs/aws-lambda-powertools-python/issues/1479))
* **ci:** remove dangling debug step
* **ci:** fix invalid dependency leftover
* **deps-dev:** bump mypy-boto3-dynamodb from 1.24.55.post1 to 1.24.60 ([#306](https://github.com/awslabs/aws-lambda-powertools-python/issues/306))
* **deps:** bump pydantic from 1.10.0 to 1.10.1 ([#1491](https://github.com/awslabs/aws-lambda-powertools-python/issues/1491))
* **deps:** bump pydantic from 1.10.1 to 1.10.2 ([#1502](https://github.com/awslabs/aws-lambda-powertools-python/issues/1502))
* **deps-dev:** bump aws-cdk-aws-apigatewayv2-integrations-alpha from 2.39.1a0 to 2.40.0a0 ([#1496](https://github.com/awslabs/aws-lambda-powertools-python/issues/1496))
* **deps-dev:** bump mypy-boto3-dynamodb from 1.24.55.post1 to 1.24.60 ([#1481](https://github.com/awslabs/aws-lambda-powertools-python/issues/1481))
* **deps-dev:** bump mypy-boto3-dynamodb from 1.24.55.post1 to 1.24.60 ([#306](https://github.com/awslabs/aws-lambda-powertools-python/issues/306))
* **deps-dev:** bump mkdocs-material from 8.4.1 to 8.4.2 ([#1483](https://github.com/awslabs/aws-lambda-powertools-python/issues/1483))
* **deps-dev:** bump flake8-variables-names from 0.0.4 to 0.0.5 ([#1490](https://github.com/awslabs/aws-lambda-powertools-python/issues/1490))
* **deps-dev:** bump aws-cdk-lib from 2.39.1 to 2.40.0 ([#1495](https://github.com/awslabs/aws-lambda-powertools-python/issues/1495))
* **deps-dev:** bump black from 22.6.0 to 22.8.0 ([#1494](https://github.com/awslabs/aws-lambda-powertools-python/issues/1494))
* **deps-dev:** bump pytest from 7.1.2 to 7.1.3 ([#1497](https://github.com/awslabs/aws-lambda-powertools-python/issues/1497))
* **deps-dev:** bump mkdocs-material from 8.4.2 to 8.4.3 ([#1504](https://github.com/awslabs/aws-lambda-powertools-python/issues/1504))
* **maintainers:** update release workflow link
* **maintenance:** add discord link to first PR and first issue ([#1493](https://github.com/awslabs/aws-lambda-powertools-python/issues/1493))


<a name="v1.28.0"></a>
5 changes: 3 additions & 2 deletions aws_lambda_powertools/event_handler/api_gateway.py
@@ -15,6 +15,7 @@
from aws_lambda_powertools.event_handler import content_types
from aws_lambda_powertools.event_handler.exceptions import NotFoundError, ServiceError
from aws_lambda_powertools.shared import constants
from aws_lambda_powertools.shared.cookies import Cookie
from aws_lambda_powertools.shared.functions import resolve_truthy_env_var_choice
from aws_lambda_powertools.shared.json_encoder import Encoder
from aws_lambda_powertools.utilities.data_classes import (
@@ -147,7 +148,7 @@ def __init__(
content_type: Optional[str],
body: Union[str, bytes, None],
headers: Optional[Dict[str, Union[str, List[str]]]] = None,
cookies: Optional[List[str]] = None,
cookies: Optional[List[Cookie]] = None,
):
"""
@@ -162,7 +163,7 @@ def __init__(
Optionally set the response body. Note: bytes body will be automatically base64 encoded
headers: dict[str, Union[str, List[str]]]
Optionally set specific http headers. Setting "Content-Type" here would override the `content_type` value.
cookies: list[str]
cookies: list[Cookie]
Optionally set cookies.
"""
self.status_code = status_code
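For context, a minimal sketch of how the updated `cookies` parameter might be used from an event handler in v2. The resolver choice, route, and values are illustrative only; the commit itself only changes the `Response` type hints:

```python
from aws_lambda_powertools.event_handler import APIGatewayHttpResolver, Response, content_types
from aws_lambda_powertools.shared.cookies import Cookie

app = APIGatewayHttpResolver()


@app.get("/login")
def login():
    # Cookies are now passed as typed Cookie objects instead of raw strings
    session = Cookie(name="session_id", value="a1b2c3", secure=True, http_only=True)
    return Response(
        status_code=200,
        content_type=content_types.APPLICATION_JSON,
        body='{"ok": true}',
        cookies=[session],
    )
```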
118 changes: 118 additions & 0 deletions aws_lambda_powertools/shared/cookies.py
@@ -0,0 +1,118 @@
from datetime import datetime
from enum import Enum
from io import StringIO
from typing import List, Optional


class SameSite(Enum):
"""
SameSite allows a server to define a cookie attribute making it impossible for
the browser to send this cookie along with cross-site requests. The main
goal is to mitigate the risk of cross-origin information leakage, and provide
some protection against cross-site request forgery attacks.
See https://tools.ietf.org/html/draft-ietf-httpbis-cookie-same-site-00 for details.
"""

DEFAULT_MODE = ""
LAX_MODE = "Lax"
STRICT_MODE = "Strict"
NONE_MODE = "None"


def _format_date(timestamp: datetime) -> str:
# Specification example: Wed, 21 Oct 2015 07:28:00 GMT
return timestamp.strftime("%a, %d %b %Y %H:%M:%S GMT")


class Cookie:
"""
A Cookie represents an HTTP cookie as sent in the Set-Cookie header of an
HTTP response or the Cookie header of an HTTP request.
See https://tools.ietf.org/html/rfc6265 for details.
"""

def __init__(
self,
name: str,
value: str,
path: str = "",
domain: str = "",
secure: bool = True,
http_only: bool = False,
max_age: Optional[int] = None,
expires: Optional[datetime] = None,
same_site: Optional[SameSite] = None,
custom_attributes: Optional[List[str]] = None,
):
"""
Parameters
----------
name: str
The name of this cookie, for example session_id
value: str
The cookie value, for instance an uuid
path: str
The path for which this cookie is valid. Optional
domain: str
The domain for which this cookie is valid. Optional
secure: bool
Marks the cookie as secure, only sendable to the server with an encrypted request over the HTTPS protocol
http_only: bool
Enabling this attribute makes the cookie inaccessible to the JavaScript `Document.cookie` API
max_age: Optional[int]
Defines the period of time after which the cookie is invalid. Use negative values to force cookie deletion.
expires: Optional[datetime]
Defines a date where the permanent cookie expires.
same_site: Optional[SameSite]
Determines if the cookie should be sent to third party websites
custom_attributes: Optional[List[str]]
List of additional custom attributes to set on the cookie
"""
self.name = name
self.value = value
self.path = path
self.domain = domain
self.secure = secure
self.expires = expires
self.max_age = max_age
self.http_only = http_only
self.same_site = same_site
self.custom_attributes = custom_attributes

def __str__(self) -> str:
payload = StringIO()
payload.write(f"{self.name}={self.value}")

if self.path:
payload.write(f"; Path={self.path}")

if self.domain:
payload.write(f"; Domain={self.domain}")

if self.expires:
payload.write(f"; Expires={_format_date(self.expires)}")

if self.max_age:
if self.max_age > 0:
payload.write(f"; MaxAge={self.max_age}")
else:
# negative or zero max-age should be set to 0
payload.write("; MaxAge=0")

if self.http_only:
payload.write("; HttpOnly")

if self.secure:
payload.write("; Secure")

if self.same_site:
payload.write(f"; SameSite={self.same_site.value}")

if self.custom_attributes:
for attr in self.custom_attributes:
payload.write(f"; {attr}")

return payload.getvalue()
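A quick usage sketch of the `Cookie` class defined above; the values are illustrative, and the expected output follows the `__str__` implementation shown:

```python
from aws_lambda_powertools.shared.cookies import Cookie, SameSite

cookie = Cookie(
    name="session_id",
    value="a1b2c3",
    path="/",
    max_age=3600,
    http_only=True,
    same_site=SameSite.LAX_MODE,
)

# Expected: session_id=a1b2c3; Path=/; MaxAge=3600; HttpOnly; Secure; SameSite=Lax
print(str(cookie))
```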
16 changes: 9 additions & 7 deletions aws_lambda_powertools/shared/headers_serializer.py
@@ -2,14 +2,16 @@
from collections import defaultdict
from typing import Any, Dict, List, Union

from aws_lambda_powertools.shared.cookies import Cookie


class BaseHeadersSerializer:
"""
Helper class to correctly serialize headers and cookies for Amazon API Gateway,
ALB and Lambda Function URL response payload.
"""

def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str]) -> Dict[str, Any]:
def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[Cookie]) -> Dict[str, Any]:
"""
Serializes headers and cookies according to the request type.
Returns a dict that can be merged with the response payload.
@@ -25,7 +27,7 @@ def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str


class HttpApiHeadersSerializer(BaseHeadersSerializer):
def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str]) -> Dict[str, Any]:
def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[Cookie]) -> Dict[str, Any]:
"""
When using HTTP APIs or LambdaFunctionURLs, everything is taken care automatically for us.
We can directly assign a list of cookies and a dict of headers to the response payload, and the
@@ -44,11 +46,11 @@ def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str
else:
combined_headers[key] = ", ".join(values)

return {"headers": combined_headers, "cookies": cookies}
return {"headers": combined_headers, "cookies": list(map(str, cookies))}


class MultiValueHeadersSerializer(BaseHeadersSerializer):
def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str]) -> Dict[str, Any]:
def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[Cookie]) -> Dict[str, Any]:
"""
When using REST APIs, headers can be encoded using the `multiValueHeaders` key on the response.
This is also the case when using an ALB integration with the `multiValueHeaders` option enabled.
@@ -69,13 +71,13 @@ def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str
if cookies:
payload.setdefault("Set-Cookie", [])
for cookie in cookies:
payload["Set-Cookie"].append(cookie)
payload["Set-Cookie"].append(str(cookie))

return {"multiValueHeaders": payload}


class SingleValueHeadersSerializer(BaseHeadersSerializer):
def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str]) -> Dict[str, Any]:
def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[Cookie]) -> Dict[str, Any]:
"""
The ALB integration has `multiValueHeaders` disabled by default.
If we try to set multiple headers with the same key, or more than one cookie, print a warning.
@@ -93,7 +95,7 @@ def serialize(self, headers: Dict[str, Union[str, List[str]]], cookies: List[str
)

# We can only send one cookie, send the last one
payload["headers"]["Set-Cookie"] = cookies[-1]
payload["headers"]["Set-Cookie"] = str(cookies[-1])

for key, values in headers.items():
if isinstance(values, str):
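To show how the typed cookies flow through the serializers, a small sketch using the `HttpApiHeadersSerializer` above; the header names and values are made up:

```python
from aws_lambda_powertools.shared.cookies import Cookie
from aws_lambda_powertools.shared.headers_serializer import HttpApiHeadersSerializer

serializer = HttpApiHeadersSerializer()
fragment = serializer.serialize(
    headers={"Content-Type": "application/json", "X-Request-Id": ["abc", "def"]},
    cookies=[Cookie(name="session_id", value="a1b2c3")],
)

# Multi-value headers are comma-joined and cookies are rendered via str():
# {"headers": {"Content-Type": "application/json", "X-Request-Id": "abc, def"},
#  "cookies": ["session_id=a1b2c3; Secure"]}
print(fragment)
```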
6 changes: 4 additions & 2 deletions aws_lambda_powertools/tracing/tracer.py
@@ -355,7 +355,8 @@ def capture_method(
"""Decorator to create subsegment for arbitrary functions
It also captures both response and exceptions as metadata
and creates a subsegment named `## <method_name>`
and creates a subsegment named `## <method_module.method_qualifiedname>`
# see here: [Qualified name for classes and functions](https://peps.python.org/pep-3155/)
When running [async functions concurrently](https://docs.python.org/3/library/asyncio-task.html#id6),
methods may impact each others subsegment, and can trigger
@@ -509,7 +510,8 @@ async def async_tasks():
functools.partial(self.capture_method, capture_response=capture_response, capture_error=capture_error),
)

method_name = f"{method.__name__}"
# Example: app.ClassA.get_all # noqa E800
method_name = f"{method.__module__}.{method.__qualname__}"

capture_response = resolve_truthy_env_var_choice(
env=os.getenv(constants.TRACER_CAPTURE_RESPONSE_ENV, "true"), choice=capture_response
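To illustrate why the fully qualified name matters for ABC implementations, a hypothetical example (class and module names are made up; `__module__` resolves to the defining module, e.g. `app` for code living in `app.py`):

```python
import abc


class PaymentProcessor(abc.ABC):
    @abc.abstractmethod
    def process(self): ...


class StripeProcessor(PaymentProcessor):
    def process(self):
        return "stripe"


class PayPalProcessor(PaymentProcessor):
    def process(self):
        return "paypal"


# Previously both implementations produced a subsegment named "## process".
# With module + qualified name they become distinguishable, e.g.
# "## app.StripeProcessor.process" vs "## app.PayPalProcessor.process".
for cls in (StripeProcessor, PayPalProcessor):
    print(f"{cls.process.__module__}.{cls.process.__qualname__}")
```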
3 changes: 0 additions & 3 deletions aws_lambda_powertools/utilities/batch/__init__.py
@@ -13,16 +13,13 @@
batch_processor,
)
from aws_lambda_powertools.utilities.batch.exceptions import ExceptionInfo
from aws_lambda_powertools.utilities.batch.sqs import PartialSQSProcessor, sqs_batch_processor

__all__ = (
"BatchProcessor",
"BasePartialProcessor",
"ExceptionInfo",
"EventType",
"FailureResponse",
"PartialSQSProcessor",
"SuccessResponse",
"batch_processor",
"sqs_batch_processor",
)
8 changes: 4 additions & 4 deletions aws_lambda_powertools/utilities/batch/base.py
@@ -170,19 +170,19 @@ def batch_processor(
Lambda's Context
record_handler: Callable
Callable to process each record from the batch
processor: PartialSQSProcessor
processor: BasePartialProcessor
Batch Processor to handle partial failure cases
Examples
--------
**Processes Lambda's event with PartialSQSProcessor**
**Processes Lambda's event with a BasePartialProcessor**
>>> from aws_lambda_powertools.utilities.batch import batch_processor, PartialSQSProcessor
>>> from aws_lambda_powertools.utilities.batch import batch_processor, BatchProcessor
>>>
>>> def record_handler(record):
>>> return record["body"]
>>>
>>> @batch_processor(record_handler=record_handler, processor=PartialSQSProcessor())
>>> @batch_processor(record_handler=record_handler, processor=BatchProcessor())
>>> def handler(event, context):
>>> return {"StatusCode": 200}
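For reference, a sketch of the decorator-based flow with the non-legacy `BatchProcessor`, expanding on the updated docstring above; the record handler body is illustrative:

```python
from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType, batch_processor

processor = BatchProcessor(event_type=EventType.SQS)


def record_handler(record):
    # record is parsed into an SQS record data class by the processor
    return record.body


@batch_processor(record_handler=record_handler, processor=processor)
def handler(event, context):
    # Reports partial failures back to the event source
    return processor.response()
```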
13 changes: 0 additions & 13 deletions aws_lambda_powertools/utilities/batch/exceptions.py
@@ -24,19 +24,6 @@ def format_exceptions(self, parent_exception_str):
return "\n".join(exception_list)


class SQSBatchProcessingError(BaseBatchProcessingError):
"""When at least one message within a batch could not be processed"""

def __init__(self, msg="", child_exceptions: Optional[List[ExceptionInfo]] = None):
super().__init__(msg, child_exceptions)

# Overriding this method so we can output all child exception tracebacks when we raise this exception to prevent
# errors being lost. See https://github.com/awslabs/aws-lambda-powertools-python/issues/275
def __str__(self):
parent_exception_str = super(SQSBatchProcessingError, self).__str__()
return self.format_exceptions(parent_exception_str)


class BatchProcessingError(BaseBatchProcessingError):
"""When all batch records failed to be processed"""

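With `SQSBatchProcessingError` removed, code that previously caught it would now handle `BatchProcessingError`, which the remaining processor raises when every record in the batch fails. A hedged sketch of the context-manager flow under that assumption:

```python
from aws_lambda_powertools.utilities.batch import BatchProcessor, EventType
from aws_lambda_powertools.utilities.batch.exceptions import BatchProcessingError

processor = BatchProcessor(event_type=EventType.SQS)


def record_handler(record):
    return record.body


def handler(event, context):
    try:
        with processor(records=event["Records"], handler=record_handler):
            processor.process()
    except BatchProcessingError:
        # Raised when all records failed; aggregated child tracebacks are in the message
        raise
    return processor.response()
```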
(Diffs for the remaining changed files are not shown here.)
