
docs: Data Classes Utility #171

Merged
merged 10 commits into from
Sep 22, 2020
2 changes: 2 additions & 0 deletions docs/content/index.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -30,6 +30,7 @@ sam init --location https://github.com/aws-samples/cookiecutter-aws-sam-python
* [Bring your own middleware](./utilities/middleware_factory) - Decorator factory to create your own middleware to run logic before, and after each Lambda invocation
* [Parameters utility](./utilities/parameters) - Retrieve parameter values from AWS Systems Manager Parameter Store, AWS Secrets Manager, or Amazon DynamoDB, and cache them for a specific amount of time
* [Batch utility](./utilities/batch) - Batch processing for AWS SQS; handles partial failures.
* [Event source data classes utility](./utilities/data_classes) - Data classes describing the schema of common Lambda event triggers.

### Lambda Layer

Expand Down Expand Up @@ -73,6 +74,7 @@ Utility | Description
[Bring your own middleware](.//utilities/middleware_factory) | Decorator factory to create your own middleware to run logic before, and after each Lambda invocation
[Parameters utility](./utilities/parameters) | Retrieve parameter values from AWS Systems Manager Parameter Store, AWS Secrets Manager, or Amazon DynamoDB, and cache them for a specific amount of time
[Typing utility](./utilities/typing) | Static typing classes to speed up development in your IDE
[Event source data classes](./utilities/data_classes) | Data classes describing the schema of common Lambda event triggers.

## Environment variables

Expand Down
Binary file added docs/content/media/utilities_data_classes.png
210 changes: 210 additions & 0 deletions docs/content/utilities/data_classes.mdx
Original file line number Diff line number Diff line change
@@ -0,0 +1,210 @@
---
title: Event Source Data Classes
description: Utility
---

import Note from "../../src/components/Note"

The event source data classes utility provides classes describing the schema of common Lambda event triggers.

**Key Features**

* Type hinting and code completion for common event types
* Helper functions for decoding/deserializing nested fields
* Docstrings for fields contained in event schemas

**Background**

When authoring Lambda functions, you often need to understand the schema of the event dictionary which is passed to the
handler. There are several common event types which follow a specific schema, depending on the service triggering the
Lambda function.


## Utilizing the data classes

The classes are initialized by passing the Lambda event object into the constructor of the appropriate data class.
For example, if your Lambda function is triggered by an API Gateway proxy integration, you can use the
`APIGatewayProxyEvent` class.

![Utilities Data Classes](../media/utilities_data_classes.png)
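Conceptually, each data class is a thin wrapper around the raw event dict that exposes fields as typed properties. The sketch below illustrates the pattern only; the `MiniAPIGatewayEvent` class is hypothetical and not part of the library, whose real classes add many more properties and docstrings:

```python
# Illustrative sketch of the wrapper pattern the data classes follow.
# "MiniAPIGatewayEvent" is a hypothetical stand-in, not the library's API.
class MiniAPIGatewayEvent:
    def __init__(self, data: dict):
        self._data = data  # keep a reference to the raw event dict

    @property
    def path(self) -> str:
        # typed property over a raw dict key
        return self._data["path"]

    @property
    def http_method(self) -> str:
        return self._data["httpMethod"]


event = MiniAPIGatewayEvent({"path": "/helloworld", "httpMethod": "GET"})
print(event.path, event.http_method)
```

Because fields are surfaced as properties, your IDE can autocomplete them and show their docstrings, which is the main ergonomic win over indexing into the raw dict.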


## Supported event sources
<Note type="info">
The examples below are far from exhaustive; the data classes themselves are designed to be self-documenting
(via autocompletion, types, and docstrings).
</Note>


### API Gateway Proxy V1 (REST API)
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEvent

def lambda_handler(event, context):
    event = APIGatewayProxyEvent(event)
    request_context = event.request_context
    identity = request_context.identity

    if 'helloworld' in event.path and event.http_method == 'GET':
        user = identity.user
        do_something_with(event.body, user)
```

### API Gateway Proxy V2 (HTTP API)
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import APIGatewayProxyEventV2

def lambda_handler(event, context):
    event = APIGatewayProxyEventV2(event)
    request_context = event.request_context
    query_string_parameters = event.query_string_parameters

    if 'helloworld' in event.raw_path and request_context.http.method == 'POST':
        do_something_with(event.body, query_string_parameters)
```

### CloudWatch logs
CloudWatch Logs events are compressed and base64-encoded by default. You can use the provided helper function to
decode, decompress, and parse the JSON data in the event.
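As a reference for what the helper does under the hood, the raw payload arrives base64-encoded and gzip-compressed under `event["awslogs"]["data"]`. A stdlib-only sketch of the decode/decompress/parse steps (the `parse_logs_data` function name here is illustrative, not the library's implementation):

```python
import base64
import gzip
import json


def parse_logs_data(event: dict) -> dict:
    # CloudWatch Logs delivers the payload base64-encoded and gzip-compressed
    compressed = base64.b64decode(event["awslogs"]["data"])
    return json.loads(gzip.decompress(compressed))


# Round-trip demonstration with a synthetic payload
payload = {"logGroup": "demo", "logEvents": [{"message": "hello"}]}
encoded = base64.b64encode(gzip.compress(json.dumps(payload).encode())).decode()
print(parse_logs_data({"awslogs": {"data": encoded}}))
```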

```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import CloudWatchLogsEvent

def lambda_handler(event, context):
    event = CloudWatchLogsEvent(event)

    decompressed_log = event.parse_logs_data
    log_events = decompressed_log.log_events
    for log_event in log_events:
        do_something_with(log_event.timestamp, log_event.message)
```

### Cognito user pool triggers
Cognito User Pools have several [different Lambda trigger sources](https://docs.aws.amazon.com/cognito/latest/developerguide/cognito-user-identity-pools-working-with-aws-lambda-triggers.html#cognito-user-identity-pools-working-with-aws-lambda-trigger-sources), each of which maps to a different data class that
can be imported from `aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event`:

Trigger/Event Source | Data Class
------------------------------------------------- | -------------------------------------------------
Custom message event | `data_classes.cognito_user_pool_event.CustomMessageTriggerEvent`
Post authentication | `data_classes.cognito_user_pool_event.PostAuthenticationTriggerEvent`
Post confirmation | `data_classes.cognito_user_pool_event.PostConfirmationTriggerEvent`
Pre authentication | `data_classes.cognito_user_pool_event.PreAuthenticationTriggerEvent`
Pre sign-up | `data_classes.cognito_user_pool_event.PreSignUpTriggerEvent`
Pre token generation | `data_classes.cognito_user_pool_event.PreTokenGenerationTriggerEvent`
User migration | `data_classes.cognito_user_pool_event.UserMigrationTriggerEvent`

```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes.cognito_user_pool_event import PostConfirmationTriggerEvent

def lambda_handler(event, context):
    event = PostConfirmationTriggerEvent(event)

    user_attributes = event.request.user_attributes
    do_something_with(user_attributes)
```

### DynamoDB streams
The DynamoDB data class utility provides the base class `DynamoDBStreamEvent`, a typed class for
attribute values (`AttributeValue`), as well as enums for the stream view type (`StreamViewType`) and event type
(`DynamoDBRecordEventName`).

```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import DynamoDBStreamEvent, DynamoDBRecordEventName

def lambda_handler(event, context):
    event = DynamoDBStreamEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        if record.event_name == DynamoDBRecordEventName.MODIFY:
            do_something_with(record.dynamodb.new_image)
            do_something_with(record.dynamodb.old_image)
```

### EventBridge
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import EventBridgeEvent

def lambda_handler(event, context):
    event = EventBridgeEvent(event)
    do_something_with(event.detail)

```

### Kinesis streams
Kinesis events contain base64-encoded data by default. You can use the helper functions to access the data as either
JSON or plain text, depending on the original payload.
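For reference, the record payload sits base64-encoded under `record["kinesis"]["data"]`. A stdlib-only sketch of what the two helpers amount to (the function bodies here are illustrative, not the library's implementation):

```python
import base64
import json


def data_as_text(record: dict) -> str:
    # Kinesis record payloads arrive base64-encoded
    return base64.b64decode(record["kinesis"]["data"]).decode("utf-8")


def data_as_json(record: dict) -> dict:
    # convenience wrapper for JSON payloads
    return json.loads(data_as_text(record))


# Demonstration with a synthetic record
record = {"kinesis": {"data": base64.b64encode(b'{"id": 1}').decode()}}
print(data_as_json(record))
```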
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import KinesisStreamEvent

def lambda_handler(event, context):
    event = KinesisStreamEvent(event)

    # if data was delivered as plain text
    data = event.data_as_text()

    # if data was delivered as json
    data = event.data_as_json()

    do_something_with(data)

```

### S3 events
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import S3Event

def lambda_handler(event, context):
    event = S3Event(event)
    bucket_name = event.bucket_name

    # Multiple records can be delivered in a single event
    for record in event.records:
        object_key = record.s3.get_object.key

        do_something_with(f'{bucket_name}/{object_key}')

```

### SES events
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import SESEvent

def lambda_handler(event, context):
    event = SESEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        mail = record.ses.mail
        common_headers = mail.common_headers

        do_something_with(common_headers.to, common_headers.subject)

```

### SNS
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import SNSEvent

def lambda_handler(event, context):
    event = SNSEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        message = record.sns.message
        subject = record.sns.subject

        do_something_with(subject, message)
```

### SQS
```python:title=lambda_app.py
from aws_lambda_powertools.utilities.data_classes import SQSEvent

def lambda_handler(event, context):
    event = SQSEvent(event)

    # Multiple records can be delivered in a single event
    for record in event.records:
        do_something_with(record.body)
```
3 changes: 2 additions & 1 deletion docs/gatsby-config.js
Original file line number Diff line number Diff line change
Expand Up @@ -34,7 +34,8 @@ module.exports = {
'utilities/parameters',
'utilities/batch',
'utilities/typing',
'utilities/validation'
'utilities/validation',
'utilities/data_classes'
],
},
navConfig: {
Expand Down