Blog

Creating Notifications for Critical Vulnerabilities in AWS Inspector with SNS topic, Lambda function and EventBridge

Daniel Dimitrov
DevOps & Cloud Engineer
05.10.2023
Reading time: 5 mins.
Last Updated: 12.02.2024

When dealing with complex cloud infrastructures and services across many different fields, important security checks and audits come up regularly. Thankfully, the public cloud offers services for monitoring, security, vulnerability scanning, and more.

Working with AWS, we have services like GuardDuty, CloudWatch, CloudTrail, Inspector, and so on. All of them can generate output files, so producing a report when needed is not a hard task. But why should we wait for an audit before we start looking at the security vulnerabilities in our infrastructure?

Speaking of vulnerabilities, AWS Inspector comes to mind. This service automatically assesses applications for vulnerabilities and deviations from best practices, providing a detailed list of security findings and prioritized recommendations. Sounds great, right? But there are no notifications when a finding for a critical issue occurs. This means infrastructure engineers need to check Inspector regularly to monitor the infrastructure. It's not a hard task, but it takes time and can be really annoying, especially when there are more important and crucial things to do. So, since there are no pre-built notifications by default, let's create them using a Lambda function, an SNS topic, an S3 bucket, and an EventBridge schedule.

Creating an S3 bucket

First, you need to create an S3 bucket to store Inspector's findings. To do that, simply go to Amazon S3 -> Create bucket. Give it a name; in our case we chose aws-inspector-findings-01. Keep in mind that S3 bucket names must be globally unique.

Create bucket list

Keep the setting for blocking all public access, because you don’t want your vulnerabilities to be public, right?

Block Public Access Settings for This Bucket
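
If you prefer scripting this setup, here is a minimal boto3 sketch of the same two console steps, assuming the bucket name from this walkthrough and an example region (adjust both to your account):

import boto3

s3 = boto3.client('s3')
bucket = 'aws-inspector-findings-01'  # bucket names are globally unique

# Create the bucket (omit CreateBucketConfiguration when using us-east-1)
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={'LocationConstraint': 'eu-central-1'},
)

# Keep all public access blocked, matching the console defaults
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        'BlockPublicAcls': True,
        'IgnorePublicAcls': True,
        'BlockPublicPolicy': True,
        'RestrictPublicBuckets': True,
    },
)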

After finishing the setup and creating the bucket, go inside it, then to Permissions -> Bucket policy -> Edit. You'll need to give AWS Inspector permission to store findings in the bucket. The policy should look something like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Allow inspector to perform Put and Delete actions on s3",
            "Effect": "Allow",
            "Principal": {
                "Service": "inspector2.amazonaws.com"
            },
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:AbortMultipartUpload"
            ],
            "Resource": "arn:aws:s3:::BUCKET_NAME/*",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "AWS_ACCOUNT_ID"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:inspector2:REGION:WS_ACCOUNT_ID:report/*"
                }
            }
        }
    ]
}

Make sure you replace the uppercase placeholders with your own values: BUCKET_NAME, AWS_ACCOUNT_ID, REGION.
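
If you are scripting this step as well, the same policy can be attached with put_bucket_policy; here, inspector-bucket-policy.json is a hypothetical local file containing the document above with the placeholders filled in:

import json
import boto3

s3 = boto3.client('s3')

# Load the bucket policy shown above (placeholders already replaced)
with open('inspector-bucket-policy.json') as f:
    policy = json.load(f)

s3.put_bucket_policy(
    Bucket='aws-inspector-findings-01',
    Policy=json.dumps(policy),
)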

Generating a KMS key

The next step before creating the Lambda function is to generate a KMS key with some permissions, because we don't want to transfer and store our sensitive vulnerability information from Inspector without any encryption. To generate your key, go to Key Management Service -> Create a key and follow the steps. After creating the key, you'll need to edit the key policy. Below is an example of what it should look like:

{
    "Version": "2012-10-17",
    "Id": "ID",
    "Statement": [
        {
            "Sid": "Enable IAM User Permissions",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::AWS_ACCOUNT_ID:root"
            },
            "Action": "kms:*",
            "Resource": "*"
        },
        {
"Sid": "Allow inspector to perform kms actions",
            "Effect": "Allow",
            "Principal": {
                "Service": "inspector2.amazonaws.com"
            },
            "Action": [
                "kms:Decrypt",
                "kms:GenerateDataKey*"
            ],
            "Resource": "*",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "AWS_ACCOUNT_ID"
                },
                "ArnLike": {
                    "aws:SourceArn": "arn:aws:inspector2:REGION:AWS_ACCOUNT_ID:report/*"
                }
            }
        }
    ]
}
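
As a console-free alternative, a key with this policy can also be created through boto3; inspector-key-policy.json and the description below are just assumptions for this sketch:

import json
import boto3

kms = boto3.client('kms')

# Load the key policy shown above (AWS_ACCOUNT_ID and REGION filled in)
with open('inspector-key-policy.json') as f:
    key_policy = json.load(f)

response = kms.create_key(
    Policy=json.dumps(key_policy),
    Description='Encrypts AWS Inspector SBOM exports',
)

# Note this ARN; it is the KMS_KEY_ARN the Lambda function will use later
print(response['KeyMetadata']['Arn'])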

Setting up the Lambda function

Now you can start creating your Lambda function. The idea of the function is to generate a file when a critical vulnerability finding occurs and store it in the S3 bucket automatically, without any manual steps.

In this example we are going to use Python, but you can choose any language you want. To create a function, go to Lambda -> Create function. Give it a name (we chose aws-inspector-notifications-01), change the runtime to Python 3.10, and keep the x86_64 architecture. Then click Create function.

Here is a code snippet if you want to use our example:

import json
import boto3

# Client for the AWS Inspector (inspector2) API
client = boto3.client('inspector2')

def lambda_handler(event, context):
    # Export an SBOM report for EC2 instances whose 'Criticality' tag
    # matches the value passed in the invocation event
    response = client.create_sbom_export(
        reportFormat='SPDX_2_3',
        resourceFilterCriteria={
            'ec2InstanceTags': [
                {
                    'comparison': 'EQUALS',
                    'key': 'Criticality',
                    'value': event['Criticality']
                },
            ]
        },
        s3Destination={
            'bucketName': 'BUCKET_NAME',
            'kmsKeyArn': 'KMS_KEY_ARN'
        }
    )
    
    return {
        'statusCode': 200,
        'body': json.dumps(response)
    }

In this example we use the json and boto3 modules. The json module handles JSON serialization and deserialization, while boto3 is imported to interact with AWS services. The client variable initializes an AWS Inspector (inspector2) client via Boto3, which allows us to call the Inspector API.

lambda_handler is the main handler that gets called when the Lambda function is invoked. It calls the create_sbom_export method of the AWS Inspector client, which exports a Software Bill of Materials (SBOM) report; the report format (reportFormat) in our example is SPDX_2_3.

The filter criteria are based on the value of the Criticality key, which we will set to critical to filter only critical vulnerabilities.
The SBOM report will be exported to the S3 bucket specified by BUCKET_NAME, and the KMS key used for encryption is specified by KMS_KEY_ARN, so make sure you replace them with your own values. You can test the function by going to Test. Give the test event a name and don't forget to set the filter to critical:

{
    "Criticality": "critical"
}
Test Event List

Then simply click Test. If everything is configured correctly, you should see a green box with Executing function: succeeded.

Executing function Success
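
You can also run the same test from outside the console; this is a small sketch using boto3's Lambda client and the function name from this example:

import json
import boto3

lam = boto3.client('lambda')

# Invoke the function synchronously with the same test payload
resp = lam.invoke(
    FunctionName='aws-inspector-notifications-01',
    Payload=json.dumps({'Criticality': 'critical'}),
)
print(json.loads(resp['Payload'].read()))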

EventBridge Scheduler

Now that the Lambda function is ready, you should create a schedule with AWS EventBridge. There are many different ways of triggering a Lambda function, but for this example we are going to do it on a daily basis. To configure the schedule, go to EventBridge -> Scheduler -> Schedules -> Create schedule. Give it a name and a description of your choice. Occurrence should be Recurring schedule, so it continuously triggers the Lambda function on a cron basis. You can change the cron expression to trigger the function whenever you want, but we prefer every day at 1:00:

Then, for Target, choose AWS Lambda Invoke, select your Lambda function, and for Payload paste the JSON we used for testing:
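
For reference, the same schedule can be created with the boto3 scheduler client; cron(0 1 * * ? *) fires every day at 1:00, and the schedule name, function ARN, and role ARN below are placeholders (the role must be allowed to invoke the function):

import json
import boto3

scheduler = boto3.client('scheduler')

scheduler.create_schedule(
    Name='aws-inspector-daily-export',       # hypothetical schedule name
    ScheduleExpression='cron(0 1 * * ? *)',  # every day at 1:00
    FlexibleTimeWindow={'Mode': 'OFF'},
    Target={
        'Arn': 'LAMBDA_FUNCTION_ARN',
        'RoleArn': 'SCHEDULER_ROLE_ARN',  # role allowed to invoke the function
        'Input': json.dumps({'Criticality': 'critical'}),
    },
)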

SNS topic for notifications

So far so good: we have a schedule and a function that sends findings to S3. Now we need an SNS topic and an S3 event notification so we can be notified when a new finding appears in our bucket. Go to SNS -> Topics -> Create topic and follow the process. We'll need to edit the access policy of the SNS topic; ours looks like this:

{
  "Version": "2012-10-17",
  "Id": "__default_policy_ID",
  "Statement": [
    {
      "Sid": "__default_statement_ID",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": [
        "SNS:GetTopicAttributes",
        "SNS:SetTopicAttributes",
        "SNS:AddPermission",
        "SNS:RemovePermission",
        "SNS:DeleteTopic",
        "SNS:Subscribe",
        "SNS:ListSubscriptionsByTopic",
        "SNS:Publish"
      ],
      "Resource": "SNS_TOPIC_ARN",
      "Condition": {
        "StringEquals": {
          "AWS:SourceOwner": "ACOUNT_ID"
        }
      }
    },
    {
      "Sid": "AllowS3ToPublishToSNSTopic",
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": "sns:Publish",
      "Resource": "SNS_TOPIC_ARN"
    }
  ]
}

Then you need an SNS subscription for your email address, or for whichever address you want notifications sent to. You can follow the official AWS documentation.
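
If you want to script the topic and subscription too, a minimal boto3 sketch looks like this; the topic name and email address are placeholders:

import boto3

sns = boto3.client('sns')

# Create the topic and subscribe an email endpoint to it
topic = sns.create_topic(Name='aws-inspector-findings-topic')
sns.subscribe(
    TopicArn=topic['TopicArn'],
    Protocol='email',
    Endpoint='you@example.com',  # AWS sends a confirmation email here
)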

After the email is confirmed, go back to the S3 bucket we created in the first step. Then go to Properties and scroll down to Event notifications. Now we should create a notification for the findings, with a path, prefix, and suffix, so the SNS topic will notify you when a new object is uploaded there.

General Configuration

Set the event types to Put and Post, then scroll down and select your SNS topic.
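
The same event notification can be configured with put_bucket_notification_configuration; SNS_TOPIC_ARN is a placeholder for the ARN of the topic created above:

import boto3

s3 = boto3.client('s3')

# Notify the SNS topic whenever a new report object lands in the bucket
s3.put_bucket_notification_configuration(
    Bucket='aws-inspector-findings-01',
    NotificationConfiguration={
        'TopicConfigurations': [
            {
                'TopicArn': 'SNS_TOPIC_ARN',
                'Events': ['s3:ObjectCreated:Put', 's3:ObjectCreated:Post'],
            }
        ]
    },
)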

To Wrap It Up

Now we have automation that delivers the findings for critical vulnerabilities on a daily basis, so we can fix them on time. You can always change the Lambda function to receive full findings every day. Or maybe you don't need them every day, but once a week or once a month. That's up to you.
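
For example, to receive full findings instead of only the critical ones, the resourceFilterCriteria argument can simply be dropped, since it is optional in create_sbom_export:

import boto3

client = boto3.client('inspector2')

# Same export call as in the Lambda function, but without resource filtering
response = client.create_sbom_export(
    reportFormat='SPDX_2_3',
    s3Destination={
        'bucketName': 'BUCKET_NAME',
        'kmsKeyArn': 'KMS_KEY_ARN'
    }
)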
