
New Relic VPC Flow Logs Configuration [Not currently supported]


Pre-reqs

Validate that Docker is installed and running (docs):

docker version --format '{{.Server.Version}}'

Validate that a non-root user is available in the docker group (docs):

grep -e "docker" /etc/group

Get your New Relic Account ID and Insights Insert Key; they replace $NR_ACCOUNT_ID and $NR_INSIGHTS_INSERT_KEY, respectively, in the commands below.

Pull data directly from AWS VPC Flow Logs and send it to New Relic:

Step 1

Create a VPC Flow Log that exports logs to an S3 bucket.

https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs-s3.html

Note: the log format must include the version field. All other fields are optional.
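
As a sketch, a flow log covering all traffic for a VPC can be created with the AWS CLI; the VPC ID and bucket name below are placeholders, and the log format shown is just one example that includes the required version field:

aws ec2 create-flow-logs \
  --resource-type VPC \
  --resource-ids vpc-0123456789abcdef0 \
  --traffic-type ALL \
  --log-destination-type s3 \
  --log-destination arn:aws:s3:::my-flow-log-bucket \
  --log-format '${version} ${srcaddr} ${dstaddr} ${srcport} ${dstport} ${protocol} ${bytes} ${packets} ${start} ${end} ${action}'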

Step 2

Enable event notifications on this bucket so that new objects are announced to an SQS queue.

https://docs.aws.amazon.com/AmazonS3/latest/userguide/NotificationHowTo.html

Replace $AWS_QUEUE with the created queue's URL, for example https://sqs.us-east-1.amazonaws.com/ACCOUNT_ID/Q_NAME.
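
As a sketch, the notification can be set up with the AWS CLI; the bucket name and queue ARN are placeholders, and the queue's access policy must already allow the s3.amazonaws.com service to send messages to it:

aws s3api put-bucket-notification-configuration \
  --bucket my-flow-log-bucket \
  --notification-configuration '{
    "QueueConfigurations": [
      {
        "QueueArn": "arn:aws:sqs:us-east-1:ACCOUNT_ID:Q_NAME",
        "Events": ["s3:ObjectCreated:*"]
      }
    ]
  }'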

Step 3

Create an IAM role that can read objects in this bucket and describe EC2 instances.

Replace $AWS_ROLE with this role's ARN.

Replace $REGIONS with the set of AWS regions to search for EC2 instances in; this defaults to us-east-1. Comma-separate multiple regions (for example, us-east-1,us-east-2). The created role must have read-only EC2 permissions for each of these regions.
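
A minimal policy sketch for this role follows; the bucket name and queue ARN are placeholders, and because ktranslate also consumes objects announced on the SQS queue, read permissions on the queue are typically needed as well:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-flow-log-bucket",
        "arn:aws:s3:::my-flow-log-bucket/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["ec2:DescribeInstances"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": ["sqs:ReceiveMessage", "sqs:DeleteMessage", "sqs:GetQueueAttributes"],
      "Resource": "arn:aws:sqs:us-east-1:ACCOUNT_ID:Q_NAME"
    }
  ]
}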

Step 4

Run ktranslate inside AWS:

docker run --name ktranslate -d \
  -e NEW_RELIC_API_KEY=$NR_INSIGHTS_INSERT_KEY \
  -p 8083:8083 \
  kentik/ktranslate:v2 \
    -nr_account_id=$NR_ACCOUNT_ID \
    -vpc=aws \
    -sqs_name=$AWS_QUEUE \
    -iam_role=$AWS_ROLE \
    -aws_regions=$REGIONS \
    nr1.vpc

To scale to high volumes, run multiple instances of ktranslate, all reading from the same queue.
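
SQS makes each message visible to only one consumer at a time, so additional containers naturally split the work. For example, a second instance could run with a different container name and host port (the container port stays 8083):

docker run --name ktranslate-2 -d \
  -e NEW_RELIC_API_KEY=$NR_INSIGHTS_INSERT_KEY \
  -p 8084:8083 \
  kentik/ktranslate:v2 \
    -nr_account_id=$NR_ACCOUNT_ID \
    -vpc=aws \
    -sqs_name=$AWS_QUEUE \
    -iam_role=$AWS_ROLE \
    -aws_regions=$REGIONS \
    nr1.vpc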