Saturday, May 31, 2014

Vulnerability Data into Elasticsearch

My day job has me focusing on Elasticsearch more these days. A while back I did a post on getting vulnerability data into ELSA. As a follow-up, I have been meaning to write a brief post on how to do the same with Elasticsearch. If you are not familiar with Elasticsearch, go check it out here. Its website classifies it as "distributed RESTful search and analytics". It is often combined with Logstash and Kibana to form the "ELK" stack. The same reasons that made having vulnerability data alongside your event logs a good idea in ELSA also apply if you are using the ELK stack. I modified my existing script to take an input file from one of several vulnerability scanners and index the results in Elasticsearch.

Before we begin, note that my Python script makes use of the Elasticsearch Python client. I installed it via pip:

# pip install elasticsearch

I assume an index called vulns exists. You can create it by hitting the Elasticsearch API like this:
$ curl -XPUT http://localhost:9200/vulns
Different vulnerability scanners present time formats slightly differently, so it is a good idea to normalize timestamps before indexing them. For more information, check the date mapping section of the Elasticsearch docs here.
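As a rough sketch, here is one way to do that normalization in Python. The input format shown matches the Nessus HOST_START style and is an assumption; other scanners will need their own format strings. The ISO 8601 output is accepted by Elasticsearch's default date mapping.

```python
from datetime import datetime

def to_iso8601(raw):
    # Nessus-style timestamp, e.g. "Tue May 27 19:56:01 2014" (format assumed);
    # adjust the strptime format string for other scanners.
    return datetime.strptime(raw, "%a %b %d %H:%M:%S %Y").isoformat()

print(to_iso8601("Tue May 27 19:56:01 2014"))
```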

After the index is created you can run the script with XML output from a vulnerability scanner as input.
python -i nessus_report_test_home.nessus -e -r nessus
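To give a feel for what the script does with that input, here is a minimal sketch of pulling findings out of a .nessus (v2) style report with the standard library. The sample XML is made up, and the fields extracted are illustrative, not the script's exact document format.

```python
import xml.etree.ElementTree as ET

sample = """<NessusClientData_v2>
  <Report>
    <ReportHost name="192.168.1.10">
      <ReportItem port="445" severity="3" pluginID="12345"
                  pluginName="Example vulnerability"/>
    </ReportHost>
  </Report>
</NessusClientData_v2>"""

def parse_report(xml_text):
    # Walk each host in the report and collect one document per finding,
    # ready to be handed to the Elasticsearch client for indexing.
    findings = []
    root = ET.fromstring(xml_text)
    for host in root.iter("ReportHost"):
        for item in host.iter("ReportItem"):
            findings.append({
                "host": host.get("name"),
                "port": item.get("port"),
                "severity": item.get("severity"),
                "plugin_id": item.get("pluginID"),
                "plugin_name": item.get("pluginName"),
            })
    return findings

print(parse_report(sample))
```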

I have created a very simple dashboard in Kibana to visualize some of the vulnerabilities.

The script and dashboard can be found at my Github page.

Thursday, November 28, 2013

Vulnerability Data into ELSA

At Security BSides Augusta I released a script that takes a variety of vulnerability scanner data and imports it into ELSA. I have been meaning to write a blog post about its usage but just haven't gotten around to it. With a couple of days off for the holiday, here it is.

You can find the script at my Github account. I have created Nessus and OpenVAS to ELSA scripts in the past. This script combines all of the above and adds support for Nmap and Nikto, all in one place.

The script is very straightforward to use. Simply give it a Nessus, OpenVAS, Nmap, or Nikto output report in XML format and an ELSA IP address, and you should be off to the races.

$ python -i report.nessus -r nessus -e elsa_ip

Before running the script for the first time you will want to create the XML and SQL files ELSA needs to recognize the syslog output the script provides. The -x and -s options will create them for you automatically and write them out to files.

"Usage: [-i input_file | --input_file=input_file] [-e elsa_ip | --elsa_ip=elsa_ip_address] [-r report_type | --report_type=type] [-s | --create-sql-file] [-x | --create-xml-file] [-h | --help]"
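A minimal sketch of how that usage maps onto Python's getopt; the long option names here are inferred from the usage string, not taken from the script itself.

```python
import getopt

def parse_args(argv):
    # Short and long option names mirror the usage string above (assumed).
    opts, _ = getopt.getopt(
        argv, "i:e:r:sxh",
        ["input_file=", "elsa_ip=", "report_type=",
         "create-sql-file", "create-xml-file", "help"])
    return dict(opts)

print(parse_args(["-i", "report.nessus", "-r", "nessus", "-e", "10.1.1.5"]))
```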

As always I welcome feedback and would be happy to add any more vulnerability assessment tools to it if you have recommendations. I would ask that you send me a sanitized output report file since I might have limited access to the tool.

Wednesday, October 23, 2013

ELSA Parsing Video

I have decided to do a video on creating parsers for ELSA. This one covers creating parsers for syslog-ng. Forgive the text size on my terminal; you will probably have to go full screen to see all the details.

Thursday, May 30, 2013

My Presentation on Risk Assessments

Recently at the Jack Henry Smokey Mountain User Group meeting I did a presentation on risk assessments. You can find a link to a video I created here or watch it below.


Here is the presentation.

Monday, May 20, 2013

OpenVAS 6 and CentOS 6.4

On April 17th OpenVAS 6 was released. The OpenVAS folks have provided instructions for installation on a variety of Red Hat flavors, which can be found here. Sometime soon they will be releasing a virtual appliance with everything up and running. In the meantime, however, I wanted to try out the new release.

The install instructions for CentOS did not quite get me there. After I followed the instructions on the OpenVAS site I hit the Greenbone UI at https://localhost:9392. When I tried to log in, a red error message came up saying the OMP service was down. OMP stands for OpenVAS Management Protocol; it is used for things like starting and stopping scans and creating and deleting users. To save some of you the trouble, I put together a complete list of commands to get OpenVAS 6 up and running on CentOS 6.4.

# wget -q -O - |sh
# yum install openvas
# openvas-setup
# openvas-certdata-sync
# openvasmd --rebuild
# openvasmd

Note that the 'yum install openvas' and 'openvasmd --rebuild' commands will take a few minutes, so be patient with them. I look forward to working with the new version and seeing what it can do.

Saturday, March 16, 2013

pfSense into ELSA

One good piece of open source software deserves another, right? It is time for a match made in heaven: pfSense and ELSA. pfSense is my favorite open source firewall. I run it in multiple places and it has always been rock solid. If you have read my blog at all you know my appreciation of ELSA. Therefore, I thought combining the two would make a good post.

Prepping pfSense
Before you start firing off syslogs to your ELSA server you will need to fix a small issue with pfSense. Apparently, pfSense logs to a binary data file which is then piped through tcpdump to parse. Something in the parsing causes issues with newlines: single log entries are sent as two syslogs, making them extremely difficult to parse in ELSA (or any other parser). Fortunately, this site has a simple script and fix for the issue. A PHP script is used to modify the filter file in pfSense. It uses sed in a clever way to weed out the newlines in the log entries, giving you one syslog per entry. You can check out the link above for more information, but I have included the script here for convenience. Before you run the script you will probably have to remount the file system as read/write. If you haven't already done so, enable SSH on your pfSense firewall to perform these actions.
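For the curious, the effect of that sed expression ('N;s/\n //;P;D') can be sketched in Python; the sample log text below is made up for illustration.

```python
# pfSense's tcpdump output wraps a single log entry across two lines, with the
# continuation line starting with a space. Like the sed expression, this splices
# the pieces back into one syslog entry by deleting the newline-plus-space.
raw = ("000000 rule 5/0(match): block in on em0: 1.2.3.4.1024 >\n"
       " 5.6.7.8.80: tcp 60")
joined = raw.replace("\n ", "")
print(joined)
```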

Mount the file system as read/write

# /etc/rc.conf_mount_rw

Use vi on the firewall to create this file. Make sure to enclose the script in PHP brackets (<?php ... ?>).

$filter = file_get_contents('/etc/inc/');
$filternew = str_replace(
        "-ttt -i pflog0 | logger -t pf -p",
        "-ttt -i pflog0 | /usr/bin/sed -e 'N;s/\\\\n //;P;D;' | logger -t pf -p",
        $filter);
if (strcmp($filter, $filternew) != 0) {
        // write $filternew back over the filter file (filename elided above)
}
Execute the script to create the new filter file and move it into place. I had to reboot my device for this to take effect.

# chmod +x chgfilter.php
# php chgfilter.php
# mv /etc/inc/

Now your pfSense firewall should be syslogging correctly. Just point it at your ELSA server and you should be done configuring it.

Creating the patterndb XML file
First, kudos to the InfoSec Matters blog for its post on configuring Vyatta for ELSA; it made my work a lot easier here. Essentially I used two preexisting ELSA classes, FIREWALL_ACCESS_DENY and FIREWALL_CONNECTION_END, for parsing the pfSense firewall logs. The FIREWALL_CONNECTION_END class is for passed traffic and the FIREWALL_ACCESS_DENY class is for blocked traffic. You might have to turn on logging in pfSense for whatever rules you want to send to syslog. Below is the XML you can add to the patterndb.xml file.
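As a rough illustration of the patterndb format (the pattern text here is illustrative, not the actual contents of my file, and the exact message layout depends on your pfSense log format), a rule for the deny class might look like:

```xml
<patterndb version="4" pub_date="2013-03-16">
  <ruleset name="pfsense" id="pfsense-1">
    <pattern>pf</pattern>
    <rules>
      <rule provider="local" id="pfsense-deny-1" class="FIREWALL_ACCESS_DENY">
        <patterns>
          <pattern>rule @NUMBER:rule_num@/0(match): block in on @ESTRING:interface::@ @IPv4:srcip@ > @IPv4:dstip@</pattern>
        </patterns>
      </rule>
    </rules>
  </ruleset>
</patterndb>
```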
After you have added that code, simply restart syslog-ng and all your pfSense firewall logs should parse correctly in ELSA.

# service syslog-ng restart
The code for the above examples is also on Github. I would like to get some dashboards up soon; hopefully that will be another blog post in the near future. As always, I welcome your comments and feedback.

Saturday, March 9, 2013

Process Data to ELSA

It seems the useful things ELSA can help a security professional with are endless. A couple of weeks ago I thought it would be interesting to include snapshots of running process data from Windows hosts in ELSA. Therefore, I wrote a simple Python script to take a list of hosts by IP address, capture all of their running processes via WMI, and send them to ELSA via syslog. If you want to use this script but do not have Python installed on a machine, I created a Windows executable. Check my GitHub account to download the Python script / Windows executable, the MySQL file to alter the ELSA database schema, and the XML file for appending to the patterndb.xml used with syslog-ng. First let's take a look at how to configure ELSA for use with the script, and then we will look at what you can do with it.


For this example I will assume you downloaded the Windows executable version of my script. You will also want to download patterndb_process.xml and PROCESS_db_setup.sql. First cut all of the text from patterndb_process.xml and add it to patterndb.xml as shown.

Next, update the MySQL database schema and restart syslog-ng since you modified patterndb.xml. 

# mysql < PROCESS_db_setup.sql
# service syslog-ng restart

Now your ELSA installation should be configured and ready to accept syslogs containing Windows process data. 

Go to the Windows host you want to run the script from. It might be helpful to create a separate folder. Copy the script into that folder and create a text file containing the list of hosts whose process information you plan on analyzing. The script reads one address per line. The script requires a hosts file to pull data from (-i), a Windows username (-u) and password (-p), and the ELSA server IP address (-l). Here is a screenshot showing how to run the script from Windows.

You can also include a file called scanID.txt. The script might throw an error the first time while looking for the file; however, if you run it again with the same parameters it should create the file automatically and work fine. The idea behind the scanID.txt file is to give each scan, or process data pull, a unique scan ID number to more easily track them in ELSA. Every subsequent time you run the script, this ID will increment. I designed the script to work well as a scheduled task in Windows.


I tried to capture interesting information about processes that would be helpful to an analyst or security professional. Data such as process name, operating system type, process ID, executable directory, parent process ID, and parent process name (if available) are all collected.
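To give a feel for the data, here is a sketch of how one process record could be flattened into a syslog-friendly key=value message; the field names are illustrative, not the script's exact output format.

```python
def format_process(rec):
    # Flatten one process snapshot into a single-line key=value message,
    # the kind of line patterndb can pick apart on the ELSA side.
    return ("scanid={scan_id} host={host} os={os} proc={name} pid={pid} "
            "path={path} ppid={ppid} pproc={parent}").format(**rec)

msg = format_process({
    "scan_id": 1, "host": "192.168.1.50", "os": "Windows 7",
    "name": "cmd.exe", "pid": 4242, "path": r"C:\Windows\System32",
    "ppid": 1234, "parent": "explorer.exe",
})
print(msg)
```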

I think the opportunities from a security perspective are pretty interesting. For example, suppose an analyst wants to see all the places where Java is currently running in the environment. You could simply run a query in ELSA such as 'class=PROCESS +java'.

Furthermore, suppose an attacker is hiding a backdoor in an unusual directory. You can search ELSA for process names and sort by directory. 

In the example I searched for cmd.exe and noticed it was running in three different places. However, in the last log entry it is not running in the standard directory. Hmm.....

There is a lot more interesting data that can be mined from taking periodic snapshots of processes. An example might be searching for anomalies in parent processes. Searching for a parent process of iexplore.exe might yield some interesting results. I would love to hear other cool ideas on how to use this data. As always, I welcome your feedback.