AWS VPC Flow Logs

Ian edited this page Nov 8, 2021 · 10 revisions

This setup will send flow logs from AWS to New Relic.

It uses the ktranslate docker image as a Lambda function to do so.

Steps:

  1. Enable delivery of VPC Flow Logs into an S3 bucket. Follow these instructions.

Note: If using a custom log format, ensure the version field is included.

Note: For better graphs, ensure that the vpc-id field is selected.
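The step above can also be done from the AWS CLI. A minimal sketch, assuming a hypothetical VPC ID (vpc-0example0) and bucket name (my-flow-log-bucket); the custom format below includes the version and vpc-id fields called out in the notes:

```shell
# Hypothetical IDs/names: vpc-0example0 and my-flow-log-bucket. Adjust for your account.
# Publishes flow logs for one VPC to S3 with a custom format that keeps
# the version and vpc-id fields required by the notes above.
aws ec2 create-flow-logs \
  --resource-type VPC \
  --resource-ids vpc-0example0 \
  --traffic-type ALL \
  --log-destination-type s3 \
  --log-destination arn:aws:s3:::my-flow-log-bucket \
  --log-format '${version} ${vpc-id} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${protocol} ${packets} ${bytes} ${start} ${end} ${action} ${log-status}'
```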

  2. Create a private ECR registry and upload the kentik/ktranslate:v2 image to it. For example, assuming your account ID is 451031991406 and you wish to operate in the us-east-1 region:
docker pull kentik/ktranslate:v2
docker tag kentik/ktranslate:v2 451031991406.dkr.ecr.us-east-1.amazonaws.com/ktranslate:v2
docker push 451031991406.dkr.ecr.us-east-1.amazonaws.com/ktranslate:v2

Note: This image contains both arm64 and x86 compiled code, so it should "just work" on either architecture.

Further instructions are here.
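Before the pull/tag/push commands above will succeed, the repository must exist and your local Docker client must be authenticated against the registry. A sketch of that one-time setup, using the same account ID and region as above:

```shell
# Create the private repository; the name "ktranslate" matches the tag used above.
aws ecr create-repository --repository-name ktranslate --region us-east-1

# Authenticate the local Docker client against the private registry.
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 451031991406.dkr.ecr.us-east-1.amazonaws.com
```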

  3. Create a Lambda function from the uploaded ECR image. You only need to set the following environment variables:
| Variable Name | Value |
| --- | --- |
| AWS_IS_LAMBDA | TRUE |
| KENTIK_MODE | nr1.vpc.lambda |
| KENTIK_VPC | aws |
| NEW_RELIC_API_KEY | $NR_INSIGHTS_INSERT_KEY |
| NR_ACCOUNT_ID | $NR_ACCOUNT_ID |
| KENTIK_SAMPLE_RATE | 1 |
| KENTIK_LOG_LEVEL | info |

Also set the Timeout to 20 seconds.
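The function-creation step can be scripted as well. A sketch, assuming a hypothetical function name (ktranslate-vpc-flow) and a pre-existing execution role (ktranslate-lambda-role), with the environment variables and 20-second timeout from above:

```shell
# Hypothetical names: ktranslate-vpc-flow and ktranslate-lambda-role.
# $NR_INSIGHTS_INSERT_KEY and $NR_ACCOUNT_ID are your New Relic values.
aws lambda create-function \
  --function-name ktranslate-vpc-flow \
  --package-type Image \
  --code ImageUri=451031991406.dkr.ecr.us-east-1.amazonaws.com/ktranslate:v2 \
  --role arn:aws:iam::451031991406:role/ktranslate-lambda-role \
  --timeout 20 \
  --environment "Variables={AWS_IS_LAMBDA=TRUE,KENTIK_MODE=nr1.vpc.lambda,KENTIK_VPC=aws,NEW_RELIC_API_KEY=$NR_INSIGHTS_INSERT_KEY,NR_ACCOUNT_ID=$NR_ACCOUNT_ID,KENTIK_SAMPLE_RATE=1,KENTIK_LOG_LEVEL=info}"
```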

  4. Grant permissions to this Lambda. Ensure that the execution role it runs as has AmazonS3ReadOnlyAccess for the S3 bucket you are targeting.
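One way to grant this is to attach the AWS-managed policy to the function's execution role. A sketch, assuming a hypothetical role name (ktranslate-lambda-role):

```shell
# "ktranslate-lambda-role" is a hypothetical execution role name.
# Attaches the AWS-managed read-only S3 policy to it.
aws iam attach-role-policy \
  --role-name ktranslate-lambda-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess
```

Note that AmazonS3ReadOnlyAccess grants read access to all buckets in the account; a scoped-down inline policy limited to the target bucket is tighter if you need least privilege.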

  5. Create a trigger so that the function runs on every PUT and POST of an object into your S3 bucket. Instructions are here.
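The trigger can also be wired up from the CLI. A sketch with the same hypothetical names as above (ktranslate-vpc-flow, my-flow-log-bucket): first allow S3 to invoke the function, then subscribe the function to the bucket's object-created PUT and POST events.

```shell
# Allow S3 to invoke the function (names here are hypothetical).
aws lambda add-permission \
  --function-name ktranslate-vpc-flow \
  --statement-id s3-invoke-ktranslate \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::my-flow-log-bucket

# Fire the function on PUT and POST object-created events.
aws s3api put-bucket-notification-configuration \
  --bucket my-flow-log-bucket \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:451031991406:function:ktranslate-vpc-flow",
      "Events": ["s3:ObjectCreated:Put", "s3:ObjectCreated:Post"]
    }]
  }'
```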

  6. Watch events arrive in New Relic in the KFlow table. A query to see this data is:

SELECT * FROM KFlow WHERE provider = 'kentik-vpc' SINCE 30 MINUTES AGO