A Python script to acquire multiple AWS EC2 instances in a forensically sound-ish way

Overview

acquire_ec2.py

The script acquire_ec2.py automatically acquires AWS EC2 instances. It needs to be run on an EC2 instance in the same region as the instances that should be acquired. It was developed for cases in which an analyst wants to apply traditional forensic analysis to AWS EC2 instances, and was written for internal use by a former DT-Sec employee. No guarantees are made.

Acquisition Process

Since neither the AWS console nor the AWS API offers an export functionality, the acquisition of EC2 instances, or more precisely of their attached EBS volumes, has to be done manually; this is why the script was developed. The script automatically discovers and images the EBS volumes attached to a list of EC2 instances identified by their instance IDs. The raw image files are written to an S3 bucket, from where they can be downloaded for further analysis.

For each of the discovered EBS volumes, the following process is executed (see the sketch after the list):

  1. Create snapshot of EBS volume
  2. Create temporary EBS volume based on created snapshot
  3. Attach temporary EBS volume to acquisition host
  4. Image attached EBS volume using dd and write image to S3 bucket
  5. Hash volume image
  6. Detach temporary EBS volume
  7. Delete temporary EBS volume
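
A minimal sketch of this per-volume process, using boto3 and a dd pipeline, is shown below. The availability zone, device name, and S3 key layout are illustrative assumptions and not necessarily what acquire_ec2.py does internally.

import subprocess

import boto3

ec2 = boto3.client("ec2")

def acquire_volume(volume_id, case, bucket):
    # 1. Create a snapshot of the EBS volume
    snap = ec2.create_snapshot(VolumeId=volume_id, Description=f"{case} {volume_id}")
    ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snap["SnapshotId"]])

    # 2. Create a temporary volume from the snapshot in the acquisition host's AZ
    #    (optionally Encrypted=True / KmsKeyId=<kms-key-id> for a customer managed key)
    vol = ec2.create_volume(SnapshotId=snap["SnapshotId"], AvailabilityZone="eu-central-1a")
    ec2.get_waiter("volume_available").wait(VolumeIds=[vol["VolumeId"]])

    # 3. Attach the temporary volume to the acquisition host
    #    (device name is an assumption; Nitro instances expose it as /dev/nvme*)
    ec2.attach_volume(VolumeId=vol["VolumeId"], InstanceId="<acquisition-host-id>", Device="/dev/sdf")
    ec2.get_waiter("volume_in_use").wait(VolumeIds=[vol["VolumeId"]])

    # 4. + 5. Image the block device with dd, stream it to S3 and hash it on the fly
    #         (for very large volumes, aws s3 cp may need --expected-size)
    subprocess.run(
        f"dd if=/dev/sdf bs=1M | tee >(sha256sum > /tmp/{volume_id}.sha256) "
        f"| aws s3 cp - s3://{bucket}/{case}/{volume_id}.raw",
        shell=True, executable="/bin/bash", check=True)

    # 6. + 7. Detach and delete the temporary volume
    ec2.detach_volume(VolumeId=vol["VolumeId"])
    ec2.get_waiter("volume_available").wait(VolumeIds=[vol["VolumeId"]])
    ec2.delete_volume(VolumeId=vol["VolumeId"])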

The following diagram shows an overview of the acquisition process.

[Diagram: acquire_ec2_architecture]

Prerequisites

Acquisition Host and Dependencies

The script needs to be run as root on an EC2 instance in the same region as the EC2 instances that should be acquired. The EC2 instance used for acquisition (acquisition host) should be sized as one of the following:

  • r6g.xlarge (Preferred, ARM-based, 10 Gbit network) or
  • r5n.xlarge (x64-based, 25 Gbit network).

Since the script uses multiple processes to acquire EC2 instances, more CPU cores allow more acquisition processes to run concurrently. By default, cpu_count - 1 processes are spawned. Hence, for large environments a larger instance with at least 8 CPU cores is beneficial.
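
As an illustration (not the script's actual code), the relationship between the CPU count and the worker processes looks roughly like this:

from multiprocessing import Pool, cpu_count

def acquire(volume_id):
    # placeholder for the per-volume acquisition described above
    print(f"acquiring {volume_id}")

if __name__ == "__main__":
    volumes = ["vol-0123456789abcdef0", "vol-0fedcba9876543210"]  # example volume IDs
    # spawn one worker process fewer than the number of CPU cores
    with Pool(processes=max(cpu_count() - 1, 1)) as pool:
        pool.map(acquire, volumes)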

The script needs to be run on a Linux-based AMI (e.g. Amazon Linux or Ubuntu). The following software / commands need to be present on the acquisition host:

  • python3
  • python3-pip
  • aws cli
  • dd
  • blockdev
  • sha256sum

On Amazon Linux, this means the following packages need to be installed additionally:

  • python3
  • python3-pip

The packages can be installed by running the following command.

$ sudo yum install python3 python3-pip

On Ubuntu, the following packages need to be installed:

  • awscli
  • python3-pip

Note that awscli should be installed using pip3, since the package in the Ubuntu repository is broken. The installation commands are as follows:

$ sudo apt install python3-pip
$ sudo pip3 install awscli

To fulfill the Python dependencies of acquire_ec2.py, the following pip packages need to be installed:

  • boto3==1.17.5
  • botocore==1.20.5
  • certifi==2020.12.5
  • chardet==4.0.0
  • idna==2.10
  • jmespath==0.10.0
  • multiprocessing-logging==0.3.1
  • python-dateutil==2.8.1
  • requests==2.25.1
  • s3transfer==0.3.4
  • six==1.15.0
  • urllib3==1.26.3
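
The list above corresponds to a requirements.txt with the following content:

boto3==1.17.5
botocore==1.20.5
certifi==2020.12.5
chardet==4.0.0
idna==2.10
jmespath==0.10.0
multiprocessing-logging==0.3.1
python-dateutil==2.8.1
requests==2.25.1
s3transfer==0.3.4
six==1.15.0
urllib3==1.26.3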

To install the Python dependencies, run the following command.

$ sudo pip3 install -r requirements.txt

S3 Bucket

An S3 bucket is needed to store the acquired volume images. This destination bucket needs to be created before running the script and must be accessible from the internet using a proper access key and secret key protected by multi-factor authentication. Furthermore, the bucket needs to be encrypted using a customer managed KMS key. Typically, this S3 bucket should be created by the personnel operating the AWS environment.
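
For illustration, such a bucket could be created and configured for default KMS encryption with the AWS CLI as follows (bucket name, region and key ID are placeholders; additional hardening such as an IP-based bucket policy is typically applied by the operations team):

$ aws s3api create-bucket --bucket <bucket-name> --region eu-central-1 \
      --create-bucket-configuration LocationConstraint=eu-central-1
$ aws s3api put-bucket-encryption --bucket <bucket-name> \
      --server-side-encryption-configuration \
      '{"Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms", "KMSMasterKeyID": "<kms-key-id>"}}]}'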

Policy

To allow access to the relevant resources the following policy should be implemented.

{
    "Version": "2021-02-26",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectTagging",
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::<bucket-name>/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "ec2:CreateVolume",
                "ec2:AttachVolume",
                "ec2:DetachVolume",
                "ec2:DeleteVolume",
                "ec2:CreateSnapshot",
                "ec2:DescribeSnapshots",
                "ec2:DescribeVolumes",
                "ec2:DescribeInstanceAttribute",
                "ec2:DescribeInstances",
                "ec2:CreateTags",
                "ec2:DescribeTags",
                "ec2:DeleteTags"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}

The policy can either be attached to the acquisition host EC2 instance as an instance role (tested) or to a specific IAM user (currently not tested).
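
For reference, setting the policy up as an instance role with the AWS CLI could look roughly as follows (role, profile and file names are placeholders; ec2-trust-policy.json is assumed to allow ec2.amazonaws.com to assume the role):

$ aws iam create-role --role-name acquisition-role \
      --assume-role-policy-document file://ec2-trust-policy.json
$ aws iam put-role-policy --role-name acquisition-role \
      --policy-name acquisition-policy --policy-document file://acquisition-policy.json
$ aws iam create-instance-profile --instance-profile-name acquisition-profile
$ aws iam add-role-to-instance-profile --instance-profile-name acquisition-profile \
      --role-name acquisition-role
$ aws ec2 associate-iam-instance-profile --instance-id <acquisition-host-id> \
      --iam-instance-profile Name=acquisition-profile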

KMS Key Access

Since customer managed KMS keys should be used for the creation of volumes, access to the relevant KMS key needs to be granted to the instance IAM role or to the IAM user.
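
A key policy statement granting this access could, for example, look like the following (the exact action set depends on the environment; kms:CreateGrant is typically required so that EC2 can attach encrypted volumes):

{
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::<account-id>:role/<acquisition-role>"
    },
    "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:ReEncrypt*",
        "kms:GenerateDataKey*",
        "kms:DescribeKey",
        "kms:CreateGrant"
    ],
    "Resource": "*"
}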

Needed from AWS Operations Team

The following is needed from the AWS operations team in order to successfully acquire EBS volumes:

  • EC2 acquisition host which has the access rights described in the policy section. The policy can be set up as instance role.
  • S3 bucket to store the acquired images. The bucket must follow the currently valid security requirements. The bucket needs to be accessible from the environment you want to download the data to (IP-based).
  • Access to the KMS key used by the application to encrypt EBS volumes.
  • Access to the KMS key that should be used for the acquisition process.

Future Work Ideas

  • Set up a template (AMI) for the acquisition host
  • Evaluate the usage of an independent forensics VPC that does not have to be set up for each incident in the application VPC

Usage

The script can be used as follows.

[root@<acquisition-host> ec2-user]# python3 acquire_ec2.py -h


                           _                        ___
   ____ __________ ___  __(_)_______      ___  ____|__ \
  / __ `/ ___/ __ `/ / / / / ___/ _ \    / _ \/ ___/_/ /
 / /_/ / /__/ /_/ / /_/ / / /  /  __/   /  __/ /__/ __/
 \__,_/\___/\__, /\__,_/_/_/   \___/____\___/\___/____/
              /_/                 /_____/

 Ver. 1.0

usage: acquire_ec2.py [-h] --case CASE --instance-list INSTANCE_LIST
                      --s3-bucket S3_BUCKET [--akey AKEY] [--skey SKEY]
                      [--kms-key KMS_KEY]

Script for backing up network devices

optional arguments:
  -h, --help            show this help message and exit
  --case CASE           Case name (no whitespaces allowed)
  --instance-list INSTANCE_LIST
                        List of EC2 instance IDs to acquire
  --s3-bucket S3_BUCKET
                        S3 bucket used to store forensic images
  --akey AKEY           AWS access key ID
  --skey SKEY           AWS secret access key
  --kms-key KMS_KEY     AWS KMS key ID

The following mandatory parameters need to be passed to the script:

  • case (case name)
  • instance-list (text file that contains the EC2 instance IDs to acquire)
  • s3-bucket (S3 bucket name)

If no KMS key ID is passed, the default encryption key is used. For production use of the script, a customer managed KMS key should be passed.

For example, the script can be called as follows.

$ sudo python3 acquire_ec2.py --case "CASE-NAME-AND-ID" --instance-list ./ec2_list.txt --s3-bucket forensic-foo

In this example, the instance IDs that should be acquired are placed in the file ec2_list.txt. The volume images will be written to the S3 bucket forensic-foo under the case CASE-NAME-AND-ID.
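
Assuming the instance list is a plain text file with one EC2 instance ID per line, ec2_list.txt could look like this (example IDs):

i-0123456789abcdef0
i-0fedcba9876543210
i-0aabbccddeeff0011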

To use a customer managed KMS key, the script can be called as follows.

$ sudo python3 acquire_ec2.py --case "CASE-NAME-AND-ID" --instance-list ./ec2_list.txt --s3-bucket forensic-fu --kms-key eebbb888-eeee-4444-baba-085bbbbbbbbb

In this example, the KMS key with the ID eebbb888-eeee-4444-baba-085bbbbbbbbb is used for encrypting the temporary volumes created during the acquisition process.

Acquisition Performance

The following performance tests were done in the CDC AWS sandbox environment and compare the acquisition performance of different EC2 instance types used as the acquisition host. All tests are based on gp3 EBS volumes with 500 MB/s throughput and at least 4,000 IOPS.

Instance Type   AMI            Network   Write to S3 (MB/s)   Read from S3 (MB/s)
r5n.xlarge      Amazon Linux   25 Gbit   105                  162
r6g.xlarge      Ubuntu         10 Gbit   165                  170
r5b.xlarge      Amazon Linux   10 Gbit   150                  133
Owner

Deutsche Telekom Security GmbH