New serverless pattern - Amazon SNS to Amazon Kinesis Data Firehose (Terraform) #1892

Merged
86 changes: 86 additions & 0 deletions sns-firehose-tf/README.md
@@ -0,0 +1,86 @@
# Amazon SNS to Amazon Kinesis Data Firehose

This pattern publishes SNS messages to a Kinesis Firehose Delivery Stream so that they can be forwarded to archival or analytics destinations.

Learn more about this pattern at Serverless Land Patterns: https://serverlessland.com/patterns/sns-firehose-tf

Important: this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the [AWS Pricing page](https://aws.amazon.com/pricing/) for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.

## Requirements

* [Create an AWS account](https://portal.aws.amazon.com/gp/aws/developer/registration/index.html) if you do not already have one and log in. The IAM user that you use must have sufficient permissions to make necessary AWS service calls and manage AWS resources.
* [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) installed and configured
* [Git Installed](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
* [Terraform](https://learn.hashicorp.com/tutorials/terraform/install-cli?in=terraform/aws-get-started) installed
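
A quick way to confirm the tooling is in place is to check that each CLI responds; the exact versions are not important for this pattern:

```bash
# Confirm the required tools are installed and on the PATH
aws --version
git --version
terraform -version
```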

## Deployment Instructions

1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:
```
git clone https://github.com/aws-samples/serverless-patterns
```

1. Change the working directory to this pattern's directory

```sh
cd serverless-patterns/sns-firehose-tf
```

1. From the command line, initialize Terraform to download and install the providers defined in the configuration:
```
terraform init
```
1. From the command line, apply the configuration in the main.tf file:
```
terraform apply
```
1. During the prompts:
* Enter a bucket name
* Enter yes

1. Note the outputs from the deployment process. These contain the resource names and/or ARNs which are used for testing.
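
If you prefer not to answer the prompts, or need the deployment outputs again later, the standard Terraform CLI options below work as well; the bucket name shown is only a placeholder, and the output names match those defined in `main.tf`:

```bash
# Optional: supply the bucket name up front instead of answering the prompt
terraform apply -var="destination_bucket_name=my-sns-firehose-demo"

# Re-print the deployment outputs at any time, e.g. the SNS topic ARN used in the Testing section
terraform output
terraform output sns_topic_arn
```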

## How it works

This template creates an SNS Topic, a Kinesis Firehose Delivery Stream, and an S3 bucket, and subscribes the Kinesis Firehose Delivery Stream to the SNS Topic. As messages are published to the topic, they are streamed to the Firehose Delivery Stream and then delivered to the Delivery Stream's destination, which in this case is an S3 bucket.
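
To see this wiring after deployment, you can list the topic's subscriptions from the CLI; the Firehose subscription is reported with protocol `firehose` (replace the account number placeholder as in the Testing section):

```bash
# Show the Firehose subscription attached to the SNS topic
aws sns list-subscriptions-by-topic \
    --topic-arn arn:aws:sns:us-east-1:{AWS ACCOUNT NUMBER}:SourceSNSTopic-TF
```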

## Testing

1. Publish a message to the SNS topic by running the CLI command:
```
aws sns publish --topic-arn arn:aws:sns:us-east-1:{AWS ACCOUNT NUMBER}:SourceSNSTopic-TF --message "Hello world"
```

2. Check that test messages are being sent to the destination S3 bucket (it will take a few minutes for events to begin streaming):

```
aws s3 ls s3://{destination bucket name} --recursive --human-readable --summarize
```
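
Because the delivery stream buffers for up to 60 seconds and writes GZIP-compressed objects (see `main.tf`), the delivered files appear after a short delay and are not plain text. To inspect one, stream it to stdout and decompress it; the object key below is a placeholder taken from the `aws s3 ls` listing above:

```bash
# Download a delivered object and decompress it to view the delivered records
aws s3 cp s3://{destination bucket name}/{object key} - | gunzip
```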

## Cleanup

1. Change directory to the pattern directory:
```
cd sns-firehose-tf
```
1. If you published test messages, delete the delivered objects first; the bucket cannot be deleted while it still contains objects:
```
aws s3 rm s3://{destination bucket name} --recursive
```
1. Delete all resources created by Terraform:
```bash
terraform destroy
```
1. During the prompts:
* Enter a bucket name
* Enter yes
1. Confirm that all created resources have been deleted:
```bash
terraform show
```
----
Copyright 2021 Amazon.com, Inc. or its affiliates. All Rights Reserved.

SPDX-License-Identifier: MIT-0
60 changes: 60 additions & 0 deletions sns-firehose-tf/example-pattern.json
@@ -0,0 +1,60 @@
{
"title": "Amazon SNS to Amazon Kinesis Data Firehose",
"description": "Publishes SNS messages to a Kinesis Firehose Delivery Stream so that they can be forwarded to archival or analytics destinations.",
"language": "YAML",
"level": "200",
"framework": "Terraform",
"introBox": {
"headline": "How it works",
"text": [
"This template creates an SNS Topic, Kinesis Firehose Delivery Stream, S3 bucket, and subscribed the Kinesis Firehose Delivery Stream to the SNS Topic.",
"As messages are published to the topic, they are streamed to the Firehose Delivery Stream, and then delivered the the Firehose Delivery Stream's destinations, which in this case is an S3 bucket."
]
},
"gitHub": {
"template": {
"repoURL": "https://github.com/aws-samples/serverless-patterns/tree/main/sns-firehose-tf",
"templateURL": "serverless-patterns/sns-firehose-tf",
"projectFolder": "sns-firehose-tf",
"templateFile": "main.tf"
}
},
"resources": {
"bullets": [
{
"text": "Fanout to Kinesis Data Firehose delivery streams",
"link": "https://docs.amazonaws.cn/en_us/sns/latest/dg/sns-firehose-as-subscriber.html"
},
{
"text": "Introducing message archiving and analytics for Amazon SNS",
"link": "https://aws.amazon.com/blogs/compute/introducing-message-archiving-and-analytics-for-amazon-sns/"
}
]
},
"deploy": {
"text": [
"terraform init",
"terraform apply"
]
},
"testing": {
"text": [
"See the Github repo for detailed testing instructions."
]
},
"cleanup": {
"text": [
"terraform destroy",
"terraform show"
]
},
"authors": [
{
"name": "Makendran G",
"image": "https://drive.google.com/file/d/1mUObnbmn52UWL-Zn39EpgpneiBNv3LCN/view?usp=sharing",
"bio": "Cloud Support Engineer @ AWS",
"linkedin": "makendran",
"twitter": "@MakendranG"
}
]
}
187 changes: 187 additions & 0 deletions sns-firehose-tf/main.tf
@@ -0,0 +1,187 @@
provider "aws" {
region = "us-east-1"
}

variable "destination_bucket_name" {
description = "Enter a bucket name"
type = string
}

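# Random suffix appended to the bucket name to keep it globally unique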
resource "random_id" "unique_suffix" {
byte_length = 4
}

resource "aws_s3_bucket" "destination_bucket" {
bucket = "${var.destination_bucket_name}-${random_id.unique_suffix.hex}"

# Additional S3 bucket configurations go here
}

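# Role assumed by SNS (sns.amazonaws.com) when putting records onto the Firehose delivery stream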
resource "aws_iam_role" "sns_subscription_role" {
name = "SNSSubscriptionRole-TF"

assume_role_policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": "sns.amazonaws.com"
},
"Action": "sts:AssumeRole"
}
]
}
EOF
}

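# Role assumed by Kinesis Data Firehose (firehose.amazonaws.com) when writing delivered objects to S3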
resource "aws_iam_role" "delivery_stream_role" {
name = "DeliveryStreamRole-TF"

assume_role_policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": "firehose.amazonaws.com"
},
"Action": "sts:AssumeRole"
}
]
}
EOF
}

resource "aws_iam_policy" "sns_firehose_access_policy" {
name = "SNS_Firehose_access_policy_tf"
description = "IAM policy for SNS to access Kinesis Firehose"

policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"firehose:DescribeDeliveryStream",
"firehose:ListDeliveryStreams",
"firehose:ListTagsForDeliveryStream",
"firehose:PutRecord",
"firehose:PutRecordBatch"
],
"Resource": [
"${aws_kinesis_firehose_delivery_stream.extended_s3_stream.arn}"
]
}
]
}
EOF
}

resource "aws_iam_policy" "delivery_stream_policy" {
name = "firehose_delivery_policy_tf"
description = "IAM policy for Kinesis Firehose to access S3 bucket"

policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:AbortMultipartUpload",
"s3:GetBucketLocation",
"s3:GetObject",
"s3:ListBucket",
"s3:ListBucketMultipartUploads",
"s3:PutObject"
],
"Resource": [
"${aws_s3_bucket.destination_bucket.arn}",
"${aws_s3_bucket.destination_bucket.arn}/*"
]
}
]
}
EOF
}

resource "aws_iam_role_policy_attachment" "sns_firehose_access_attachment" {
policy_arn = aws_iam_policy.sns_firehose_access_policy.arn
role = aws_iam_role.sns_subscription_role.name
}

resource "aws_iam_role_policy_attachment" "delivery_stream_access_attachment" {
policy_arn = aws_iam_policy.delivery_stream_policy.arn
role = aws_iam_role.delivery_stream_role.name
}

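# Standard (non-FIFO) topic that receives the published messages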
resource "aws_sns_topic" "sns_topic" {
name = "SourceSNSTopic-TF"
fifo_topic = false
}

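# Subscribes the delivery stream to the topic; the "firehose" protocol requires a subscription role that SNS assumes to put records on the stream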
resource "aws_sns_topic_subscription" "sns_subscription" {
protocol = "firehose"
topic_arn = aws_sns_topic.sns_topic.arn
endpoint = aws_kinesis_firehose_delivery_stream.extended_s3_stream.arn
depends_on = [aws_sns_topic.sns_topic, aws_kinesis_firehose_delivery_stream.extended_s3_stream]
subscription_role_arn = aws_iam_role.sns_subscription_role.arn
}

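# Delivery stream that buffers incoming records (1 MB or 60 seconds, whichever comes first) and writes them to the destination bucket as GZIP-compressed objects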
resource "aws_kinesis_firehose_delivery_stream" "extended_s3_stream" {
name = "Firehose-stream-TF"
destination = "extended_s3"

extended_s3_configuration {
bucket_arn = aws_s3_bucket.destination_bucket.arn
role_arn = aws_iam_role.delivery_stream_role.arn
buffering_size = 1
buffering_interval = 60
compression_format = "GZIP"

cloudwatch_logging_options {
enabled = true
log_group_name = "/aws/kinesisfirehose/ibcd"
log_stream_name = "S3Delivery"
}

}
}

output "sns_subscription_role_arn" {
value = aws_iam_role.sns_subscription_role.arn
}

output "delivery_stream_role_arn" {
value = aws_iam_role.delivery_stream_role.arn
}

output "sns_firehose_access_policy_arn" {
value = aws_iam_policy.sns_firehose_access_policy.arn
}

output "delivery_stream_policy_arn" {
value = aws_iam_policy.delivery_stream_policy.arn
}

output "sns_topic_arn" {
value = aws_sns_topic.sns_topic.arn
}

output "sns_subscription_arn" {
value = aws_sns_topic_subscription.sns_subscription.arn
}

output "kinesis_firehose_stream_arn" {
value = aws_kinesis_firehose_delivery_stream.extended_s3_stream.arn
}


output "destination_bucket_arn" {
value = aws_s3_bucket.destination_bucket.arn
}