Nightmare on Incident Response Street

Steve and I will be presenting at next week’s ISACA 2012 Annual Conference Perth, held on the 31st of October. We’re pretty damn excited, and not just because of the presentation (we’re going for a fairly radical presentation format, so we apologise to all those people who are really looking forward to PPT slides), but also because it falls on one of our favourite days: Halloween! Steve and I will be dressing up as Gomez and Morticia from the Addams Family.

Our synopsis, just in case you wanted to learn a bit more:

Over the last several years we have seen the trend of large data breaches continue. Incident response is critical during these incidents and, done well, can protect the reputation and customer base of the organisation involved. In this presentation we will review select case studies of security incidents in 2012, and ask whether these ‘black swans’ are really black any more. We will then pose the concept that organisations should assume the nightmare has already occurred, and discuss the importance of planning your incident response from end to end, including data acquisition and handling, event detection and triage, containment and response, and of course your communication strategy.

Supporting this foray into incident response, we’ll also be covering available maturity models and incident frameworks. The combination of these will give you a good starting point for reviewing and growing the capability you have within your organisation.

So come on down and say ‘hi!’.

Introducing Prenus .. the Pretty Nessus .. thing.

One of my passions in information security is finding new ways to look at old problems. Big problems or small problems, it doesn’t really matter. I’ve found that a really useful way to look at these problems is to visualise them. Last year this led me to hack together Burpdot, which converts a Burp Suite log file into a Graphviz “DOT” language formatted file for transformation into a graphic. Over the past few months we’ve been spending quite a bit of time with Nessus, and when you’re dumped with tons of hosts and hundreds and hundreds of findings, what are you meant to do?

I don’t believe many would argue that the default Nessus web UI is ideal for analysing bulk data. And I’m not the only person to construct something to parse and process Nessus files into HTML/XLS files; you can see Jason Oliver’s work here. Hence, Prenus, the Pretty Nessus .. thing, combining my love of Nessus, visualisation, and hacking stuff together.

Following the same principles as Burpdot, Prenus simply consumes Nessus version 2 exported XML files and outputs the data in a few different formats. Initially, I was interested in finding a better way to analyse the results (personally, I find the Flash web interface frustrating), so the first output was a collection of static HTML files with Highcharts generated pie and bar graphs.

For example, the below creates a folder called ‘report’ with a bunch of HTML files:

$ prenus -t html -o report *.nessus

The top of the Prenus index, highlighting unique Nessus criticality ratings, and a split of the top 20 hosts

Vulnerability Overview page

Vulnerability Detail

Host Overview

Not wanting to end there, I thought it’d be useful to also generate a simple two-column CSV output that could be consumed and processed by Afterglow.

For example:

$ prenus -t glow *.nessus
46015 (4),
46015 (4),
46015 (4),
53532 (4),
53532 (4),
53532 (4),

But pipe that output through Afterglow (with a colour properties file) and then through Graphviz (in this instance Neato), and you get something like this (I had to run this from within the afterglow/src/perl/graph folder):

$ prenus -t glow ~/prenus/*.nessus | ./ -t -c ~/prenus/ | neato -v -Tpng -Gnormalize=true -Goutputorder=edgesfirst -o prenus.png

Prenus Afterglow Example

If you prefer a Circos-style graph you can do that too. The circos output mode generates a tab-formatted table which can be consumed by the Circos “TableViewer” tool. Wrapping that together is relatively simple (I use the word ‘simple’ loosely; getting Circos working on OS X was a bit of a pain due to GD deps, but it’s super simple on Linux). The following two commands assume the following directory layout:


Executed from within the “tableviewer” folder, the following should create a file called “prenus.png” in the “img” folder.

$ prenus -t circos ~/prenus/*.nessus | bin/parse-table -conf samples/parse-table-01.conf | bin/make-conf -dir data
$ ../../../circos-0.62-1/bin/circos -conf etc/circos.conf -outputfile prenus

Prenus Circos Graph Example

Potentially useful for your analysis, or maybe just some prettiness to add to your reports. Any methods or tools that can help dig through stacks of data are pretty useful to us. The above diagram has a few layout issues, so if you want to analyse just your critical severity issues you can pass prenus the “-s 4” flag.

I’ve got an ad-hoc list of enhancements which include:

  1. Construct an EC2 bootstrap so you can deploy this in a throw-away environment and not have to fight with the damn dependencies like I did
  2. Look at using d3.js for ALL chart generation instead
  3. Perhaps just build a standalone Rails app for your own deployment (either local, Heroku or whatever)

Let me know what you think, or just grab the code and have a play yourself:

Symantec Endpoint Protection: Setup.exe extruder

What do you do when you need to create around 40 Symantec Endpoint Protection packages?!

I’m way too impatient to do it manually, and after automating the sylink creation (see previous post), I got the idea of automating the setup.exe creation.

Ok, the first thing you will need to do is set up your sylink files: instructions here

You will need 7-Zip on your SEPM; this allows us to update the contents of the zip archive.

Export a setup MSI directory from your SEPM; do not create a single EXE file.
Once you have done this, zip the output up into a regular zip file; this will be your $setup_src
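
If you want to script that zip step too, something along these lines works (the export and archive paths here are just examples, adjust them to suit your environment):

# "a" tells 7-Zip to add the exported package directory's contents to a new archive
& "C:\Program Files\7-Zip\7z.exe" a "D:\temp\setup.zip" "D:\temp\SEP_export\*"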

Running the Script:
1) put all of your sylink files into a directory structure like this:


2) create a domains.txt file in the Sylink/ directory:


3) create a groups.txt (or use your previous groups.txt from the sylink creation) and put one in each domain directory (i.e. Sylink/domain1/groups.txt). The groups.txt lists one group per line:


4) find the makesfx.exe: it is on the SEPM, in your SEPM install path: /Symantec Endpoint Protection Manager/tomcat/bin, and copy it to a convenient location. You will point the script variable $MakeSFX to it.

5) edit the script variables and make sure the paths point to the right places. Note that for $bits and $type, I manually update these depending on whether I’m exporting 32 or 64 bit packages, and Server or Desktop packages (depending on the SEP features in the package)

# 7-Zip commands used to update and delete files inside the setup zip
$update = @"
C:\"Program Files"\7-Zip\7z.exe u
"@
$delete = @"
C:\"Program Files"\7-Zip\7z.exe d
"@
$MakeSFX = "D:\temp\MakeSFX.exe"
$setup_src = "D:\temp\"
$setup_dst = "D:\Program Files\Symantec\SEP Agents\bulk\"
$sylink_dir = "D:\Program Files\Symantec\SEP Agents\Sylink\"
$domains_txt = $sylink_dir + "domains.txt"
# update these manually: 32/64 bit and Server/Desktop package flavours
$bits = "_x32"
$type = "_Desktop"

6) run the script and marvel at how much faster you can extrude out (think sausage factory 🙂) setup files!

You can pull the scripts from Asterisk Labs’ GitHub repository

I’m pretty sure I have hit the ‘win’ inflection point on this chart:
Geeks and repetitive tasks

Actually, this script generated about 40 setup exes for me in 20 minutes. If it takes about 5 minutes to export a setup.exe from the SEP console, that’s over three hours of clicking done the manual way, so I’m certain I’m in front, even with script setup time, and definitely with the reduction in Repetitive Click Boredom.

Symantec Endpoint Protection: Sylink.xml hacking to automate SEPM migration

I have completed a couple of projects recently migrating customers from Symantec Endpoint Protection v11.0 to v12.1, including moving to a new SEP Manager. In these projects, the decision was made to do a fresh install of the SEPM, and move the clients into the new manager, without using replication between the old and new SEPM.

The file on the SEP v11 client called Sylink.xml tells the client which server to connect to, what the server certificate is and which group the client should join on the SEPM (among other things).

There is a tool on SEP media part 2 (Tools\SylinkDrop.exe) which can be used to swap the Sylink.xml file on the SEP client.

During these projects I was looking for a way to simplify the creation of the sylink.xml files; both projects involved a large number of client groups, and I didn’t really want to manually export the communication settings for over 30 groups!

This led me on a PowerShell path of discovery, and the realisation that PowerShell could load an XML file as a data object, manipulate it, and then write it out! This was exactly what I needed.
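
As a trivial illustration of that idea (the file names here are just placeholders), loading, editing and re-saving an XML document from PowerShell looks like this:

# Load an XML file as an object, change a value, write it back out
$xml = New-Object XML
$xml.Load("C:\temp\sylink.xml")
$xml.ServerSettings.CommConf.RegisterClient.PreferredGroup = "My Company\Desktops"
$xml.Save("C:\temp\sylink_new.xml")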

The process I used to generate the sylink files was:
– Create the group structure in the SEPM
– Create a groups.txt file: this file lists one SEP client group per line
– Export a sylink.xml file from the destination SEPM: for example, download the communication settings file from the “My Company” top level group
– Run the update_sylink.ps1 PowerShell script
– Deploy the sylink files to the SEP agents with SylinkDrop.exe

The groups.txt file contained lines like:

My Company\Desktops
My Company\Desktops\WA
My Company\Desktops\NSW
My Company\Desktops\NT
My Company\Desktops\QLD
My Company\Desktops\SA
My Company\Desktops\TAS
My Company\Desktops\VIC

Note that SEP is case sensitive – the groups.txt file must match the group names in the SEPM.

When you export the sylink.xml, you will end up with a file that looks like:

<?xml version="1.0" encoding="UTF-8"?>
<ServerSettings DomainId="BC7791940DEADBEEF3A86829">
  <CommConf>
    <AgentCommunicationSetting AlwaysConnect="1" CommunicationMode="PULL" DisableDownloadProfile="0" Kcs="18F84DEADBEEF2CF46D" PullHeartbeatSeconds="1800" PushHeartbeatSeconds="300" UploadCmdStateHeartbeatSeconds="300" UploadLearnedApp="0" UploadLogHeartbeatSeconds="300" UploadOpStateHeartbeatSeconds="300"/>
    <LogSetting MaxLogRecords="100" SendingLogAllowed="1" UploadProcessLog="1" UploadRawLog="1" UploadSecurityLog="1" UploadSystemLog="1" UploadTrafficLog="1"/>
    <RegisterClient PreferredGroup="My Company\Workstations (location based)" PreferredMode="1"/>
    <ServerList FreezeSmsList="0" Name="Default Management Server List">
      <ServerPriorityBlock Name="List0">
        <Server Address="" HttpPort="8014" VerifySignatures="1"/>
        <Server Address="SEPM" HttpPort="8014" VerifySignatures="1"/>
        <Server Address="SEPM" HttpPort="8014" VerifySignatures="1"/>
      </ServerPriorityBlock>
    </ServerList>
    <ServerCertList>
      <Certificate Name="SEPM">MIICujCCAiOgAwIBAgIQhjuQQqXvBWWzipD7elI3oTANBgkqhkiG9w0BAQUFADB3MXUwCQYDVQQI&#xd;
      ...
      vUN+ypsPydoiLKd7uMsNWaFGzP4JKJjiJsrhGi3l1pLlR553GxZz2UZ1zbX7knjjiReVLrniIyYd&#xd;
      CPFkI/DEADBEEF+fnUbxr259h</Certificate>

The PowerShell script can actually update any token in the XML file; we are just using it to update the PreferredGroup item:

# Load the exported sylink.xml (it needs to be in the current directory)
$xml_orig = New-Object XML
$xml_orig.Load("$pwd\sylink.xml")

# One SEP client group per line
$grps = Get-Content groups.txt
ForEach ($i in $grps) {
    $PreferredGroup = $i.toString()
    # Note: this is a reference to $xml_orig rather than a copy, which is fine
    # here because PreferredGroup is overwritten on every pass
    $xml_new = New-Object XML
    $xml_new = $xml_orig

    $xml_new.ServerSettings.CommConf.RegisterClient.PreferredGroup = $PreferredGroup

    # Build the output filename from the group name
    $j = $i.Replace("\", "_")
    $filename = $j.Replace(" ", "_")
    $xml_new.Save("$pwd\" + $filename + "_Sylink.xml")
    Remove-Variable xml_new
}

The script needs the groups.txt and sylink.xml files to be in its current directory.
The sylink files will be output to the current directory, prefixed with the group name, e.g. My_Company_Desktops_WA_Sylink.xml.
All that is left is to run SylinkDrop -s “target sylink.xml” on the agents to repoint them to the new SEPM. I have used both Symantec Management Platform (Altiris) and AD group policy to do this.

You can pull the scripts from Asterisk Labs’ GitHub repository

I have used this script to save a lot of time generating sylink files for migrations. It also gave me an idea for automating the creation of a large number of setup.exe files for SEP deployments: stay tuned for more on that!

Devise Google Authenticator 0.3.3

A couple of weeks back, whilst building some internal management apps, I finally got around to implementing the Devise Google Authenticator gem in a Rails app outside of its own testing app. During this process I realised that I hadn’t correctly updated some of the extension’s code to properly work with the Devise 2.0 release, in particular the changes to the migration schema. A few amendments and a push or two later, and version 0.3.3 was available.

Looking back over the process, I’ve certainly learned a lot about Ruby, Rails and Devise, plus the whole Ruby Gems eco-system. What’s surprising, though, is the number of people out there who appear to be using the gem. At a high level the breakdown is as follows:

So far though, we’ve only had a few queries come in. But, to try and capture them in a more appropriate place, I’ve started a Google Group which, if you wish, you can sign up to and post queries to. Or, if it’s easier, just hit us up on Twitter: @xntrik or @asteriskinfosec.