Using CloudFormation to set up Splunk and AWS ElasticSearch

rav3n
4 min read · Mar 25, 2019

In one of my recent posts, I put together a CloudFormation template to deploy Rapid7 Nexpose and Tenable Nessus vulnerability scanners for easier evaluation and learning. For this post, I wanted to do something similar with two of the top log aggregation tools: Splunk and Elasticsearch.

The template will spin up two t2.small instances, so this will cost a little bit of money while the instances are running.

To be transparent, I started using Splunk several years ago and am just now exploring Elasticsearch. Just like in the previous post, I’ll let you decide which one you like best, since both are powerful in the right hands.

Getting started

If you want to follow along, you’ll need the following:

  • Template is here.
  • Some data to upload to Splunk. I decided to use this dataset from Kaggle.

For Elasticsearch, I used the sample data built into Kibana; I’ll explain further down. If you’re wondering what makes a good dataset for Splunk, I’d check Kaggle.

Before deploying, I want to encourage you to read and walk through the template to really understand what’s going on behind the scenes. Don’t fear it!

Deploying

I headed over to the console and switched to the CloudFormation service. From there, I selected Create Stack and uploaded the template from this project’s GitHub repo.

I clicked Next and filled out the following parameters:

  • Stack Name — this can be whatever name you want.
  • Elastic Domain Name — this can also be whatever name you want, so have fun with it!
  • IP Address — your IP address, used to lock down access to the frontends of Splunk and Kibana (the visualization layer of Elasticsearch). It needs to be in /32 notation.
  • Key Pair — an existing key pair in the console. The drop-down will populate if you’re in the N. Virginia (us-east-1) or Oregon (us-west-2) region. If you don’t have one, you’ll need to create one under the EC2 service > Key Pairs, or upload one of your own.

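If you prefer to see those inputs programmatically, here’s a quick sketch in Python that assembles the stack parameters, using the standard library’s `ipaddress` module to build the /32 CIDR. The parameter keys below are hypothetical — match them to the logical IDs in the template’s Parameters section.

```python
import ipaddress


def build_stack_parameters(domain_name, my_ip, key_pair):
    """Assemble parameters in the shape CloudFormation's create_stack
    expects. The ParameterKey values here are illustrative only; check
    the template's Parameters section for the real logical IDs."""
    # Normalize the address into the /32 notation the template expects.
    cidr = f"{ipaddress.ip_address(my_ip)}/32"
    return [
        {"ParameterKey": "ElasticDomainName", "ParameterValue": domain_name},
        {"ParameterKey": "IPAddress", "ParameterValue": cidr},
        {"ParameterKey": "KeyPair", "ParameterValue": key_pair},
    ]


params = build_stack_parameters("my-logs-demo", "203.0.113.7", "my-keypair")
```

Passing an invalid address to `ipaddress.ip_address` raises a `ValueError`, which is a nice sanity check before you ever touch the console.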
It will take a little while for the Elasticsearch domain and Splunk instance to be configured (Splunk takes longer, but overall it’s less than 15 minutes). I tried to make it as easy as possible, so the URLs to access both will appear in the Outputs tab.
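If you’d rather grab those URLs from the command line than the console, the Outputs list in a describe-stacks response flattens nicely into a dict. The sketch below operates on a sample response shaped like boto3’s `describe_stacks` output; the output keys (`SplunkURL`, `KibanaURL`) and URLs are made up for illustration — use whatever names the template actually exports.

```python
def outputs_to_dict(stack):
    """Flatten a CloudFormation stack's Outputs list into a dict of
    OutputKey -> OutputValue."""
    return {o["OutputKey"]: o["OutputValue"] for o in stack.get("Outputs", [])}


# Sample stack description shaped like boto3's describe_stacks response;
# keys and URLs here are placeholders, not real endpoints.
stack = {
    "StackName": "splunk-elastic-demo",
    "Outputs": [
        {"OutputKey": "SplunkURL", "OutputValue": "http://198.51.100.10:8000"},
        {"OutputKey": "KibanaURL", "OutputValue": "https://my-domain/_plugin/kibana/"},
    ],
}

urls = outputs_to_dict(stack)
```

From there, `urls["SplunkURL"]` and `urls["KibanaURL"]` are ready to paste into a browser.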

Elasticsearch

This should be the page you see once you load up the Kibana URL from the Outputs tab.

I clicked Add log data and added the Sample flight data. It’ll take a few seconds to populate. After that, if you click on Dashboard, you should see the following screen:

Pretty awesome looking huh? Onward to Splunk!

Splunk

If you open the Splunk URL from the Outputs tab, you should see the login page.

  • Username: admin
  • Password: Check the seed-password on line 75 of the template.

After logging in, I selected Add Data and Upload on the next screen.

I left the default settings for the sake of keeping things simple, so the source type was set to JSON (it was auto-detected) and the index was default.

Then I just kept clicking Next….

….until I could Start Searching the data.

Done! Happy data exploring!

I hope this quick setup helps anyone curious about these tools. For more information, check out the awesome Splunk and Elastic communities.

When you’re done playing around with the tools, don’t forget to delete the stack.
