A Terraform module that sets up a collector which receives Audit Trail events sent by Spacelift and stores them in AWS S3.

Example usage:

```hcl
provider "aws" {
  region = "us-east-1" # Change if you want to use a different region
}

module "collector" {
  source = "github.com/spacelift-io-examples/terraform-aws-spacelift-events-collector"

  # Add inputs described below as needed
}
```

The main resources for this module are:
- Courier: A Lambda function that exposes a URL (see the `courier_url` output and the sketch after this list) and forwards incoming events to a Kinesis Firehose Delivery Stream.
- Stream: A Kinesis Firehose Delivery Stream that buffers events forwarded by the Courier and eventually sends them in batches to the Storage.
- Storage: An S3 bucket that stores the events (see the `storage_bucket_name` output).
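As a minimal sketch, the courier endpoint can be re-exported from the root module so it shows up in `terraform output` and can be registered as the Audit Trail webhook endpoint in Spacelift (the `audit_trail_endpoint` name is just illustrative):

```hcl
# Surface the courier URL produced by the module so it can be copied into
# Spacelift's Audit Trail settings.
output "audit_trail_endpoint" {
  value = module.collector.courier_url
}
```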
Provider version requirements (a root-module pin is sketched after the table):

| Name | Version |
|---|---|
| archive | ~> 2.2 |
| aws | >= 5.51.1 |
| random | ~> 3.1 |
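The table above reflects the providers the module itself requires. If you also want to pin matching constraints in the root module that calls it, an optional sketch (assuming the standard `hashicorp` provider sources) could look like this:

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = ">= 5.51.1"
    }
    archive = {
      source  = "hashicorp/archive"
      version = "~> 2.2"
    }
    random = {
      source  = "hashicorp/random"
      version = "~> 3.1"
    }
  }
}
```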
Module inputs (an example module block follows the table):

| Name | Description | Type | Default | Required |
|---|---|---|---|---|
| buffer_interval | Buffer incoming data for the specified period of time, in seconds, before delivering it to the destination | number | `300` | no |
| buffer_size | Buffer incoming events to the specified size, in MBs, before delivering it to the destination | number | `5` | no |
| delete_events_when_destroying_stack | Whether to delete stored events when destroying the stack | bool | `false` | no |
| events_expiration_days | Keep the events for this number of days | number | `365` | no |
| logs_retention_days | Keep the logs for this number of days | number | `14` | no |
| logs_verbose | Include debug information in the logs | bool | `false` | no |
| python_version | AWS Lambda Python runtime version | string | `"3.9"` | no |
| secret | Secret to be expected by the collector | string | `""` | no |
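Putting the inputs together, a sketch of a module block that overrides several of the defaults above (all values are illustrative, and `var.audit_trail_secret` is a hypothetical variable you would define yourself):

```hcl
module "collector" {
  source = "github.com/spacelift-io-examples/terraform-aws-spacelift-events-collector"

  # Deliver smaller batches more often than the 300-second / 5 MB defaults.
  buffer_interval = 60
  buffer_size     = 1

  # Retention for stored events and for logs.
  events_expiration_days = 90
  logs_retention_days    = 30

  # Secret the collector expects on incoming requests (hypothetical variable).
  secret = var.audit_trail_secret

  # Remove stored events when the stack is destroyed.
  delete_events_when_destroying_stack = true
}
```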
Module outputs (a usage sketch follows the table):

| Name | Description |
|---|---|
| courier_function_arn | The ARN for the Lambda function for the courier |
| courier_url | The HTTP URL endpoint for the courier |
| storage_bucket_name | The name for the S3 bucket that stores the events |
| stream_name | The name for the Kinesis Firehose Delivery Stream |
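As a sketch, the remaining outputs can be re-exported from the root configuration for monitoring or downstream wiring (the root-level output names here are illustrative):

```hcl
# Illustrative root-level outputs re-exporting the module's values.
output "events_bucket" {
  value = module.collector.storage_bucket_name
}

output "firehose_stream" {
  value = module.collector.stream_name
}

output "courier_lambda_arn" {
  value = module.collector.courier_function_arn
}
```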