The road to CREST

Hey, Dave here. I’ve recently sat the CREST Australia exams, which in turn resulted in Asterisk becoming one of the first Australian CREST member organisations. This has been a long (and difficult) journey and I wanted to share a few of my experiences, thoughts and comments.

First, a bit of background. CREST is the Council of Registered Ethical Security Testers. The organisation was formed in the UK in 2007 / 2008 [1,2] with the aim of standardising ethical penetration testing and providing professional qualifications for testers. By most accounts CREST has been a big success in the UK; the accreditation was adopted by the UK government, which now requires that all penetration testing be performed by CREST certified testers from CREST approved organisations.

In 2011, the Australian Attorney-General’s Department provided one-off seed funding to establish CREST in Australia, and in 2012 CREST Australia was created as a non-profit organisation [3]. Like the UK, the Australian government’s goal was to provide Australian businesses and government agencies with a means of assuring that security testing work is performed… “with integrity, accountability and to agreed standards.”

In July this year I was invited to become part of the technical establishment team for CREST Australia. This was a real honour for me, but at the same time a bit daunting when I considered the calibre of the other individuals and organisations involved. When I first started hearing about CREST Australia, I suspected that it might end up being made up of organisations at the big end of town. I like to think that Asterisk were invited as a representative presence for the many excellent niche information security providers in the market.

The next few months involved a lot of preparation and planning; licensing for the exam IP was obtained from the UK, and hardware for the testing rig was procured, configured and shipped to Australia. Also, July to September involved a lot of study and preparation for me personally. Although I have been doing pen-testing in one form or another since 1997, the CREST syllabus covers a lot of ground, and unless you’re testing regularly on a wide variety of platforms, these exams are no walk in the park.

At the end of September the technical establishment team descended on the bustling metropolis that is Canberra to sit the three exams that CREST Australia offer: CREST Registered Tester (CRT), CREST Certified Tester – Applications (CCT App) and CREST Certified Tester – Infrastructure (CCT Inf). We all sat these three exams over a period of three days; to be brutally honest, this was a horrendous experience – 15+ hours of the hardest exams that I’ve ever experienced in the space of 3 days. I can’t remember being so stressed in my entire life. Pro tip: don’t try to do all the exams back-to-back.

In the end somehow I pulled the rabbit from the hat and achieved the CCT certification necessary to go on to become an assessor.  We spent the next few days learning the ins and outs of exam invigilation (yup, this is a real word), then closed out the week by running the very first CREST Australia exams for a packed house of candidates.

Who knows what the future holds for CREST Australia. Asterisk are hoping that the various arms of government, regulators and corporations will recognise the value of CREST certification and will incorporate it into their evaluation processes for pen-testing providers. While there is absolutely no assertion that a pen-testing company needs to be CREST certified in order to deliver quality results, we believe that CREST certification provides clients with a degree of confidence that pen-testing will be performed to a high, repeatable standard.

Personally, I’d really like to see more niche providers in the game. There are a few of us and I think we do great work. I’d like to break down some of the ingrained corporate mentality that for security testing to be done well, it needs to be done by a Big 4 company / IBM.  Maybe CREST is a way for us to start competing on more of a level footing.

TL;DR: CREST exams are really hard; don’t think that you can pass without extensive prep and/or experience. If you’re from a client organisation, CREST certified testers and organisations know what they are on about.

Nightmare on Incident Response Street

Steve and I will be presenting at next week’s ISACA 2012 Annual Conference Perth, held on the 31st of October. We’re pretty damn excited (we’re going for a fairly radical presentation format, so apologies to everyone who was really looking forward to PPT slides), not just because of the presentation itself, but also because it’s one of our favourite days: Halloween! Steve and I will be dressing up as Gomez and Morticia from the Addams Family.

Our synopsis, just in case you wanted to learn a bit more:

Over the last several years we have seen the trend of large data breaches continue. Incident response is critical during these incidents and done well, can protect the reputation and customer base of the organisation involved. In this presentation we will review select case studies of security incidents in 2012, and ask if these ‘black swans’ are really black any more. We will then pose the concept that organisations should assume that the nightmare has already occurred, and discuss the importance of planning your incident response from end to end, including data acquisition and handling, event detection and triage, containment and response, and of course your communication strategy.

Supporting this foray into incident response we’ll also be covering available maturity models, and incident frameworks. The combination of these will give you a good starting point for reviewing and growing the capability you have within your organisation.

So come on down and say ‘hi!’.

Introducing Prenus .. the Pretty Nessus .. thing.

One of my passions in information security is finding new ways to look at old problems. Big problems and small problems, it doesn’t really matter. I’ve found a really useful way to look at these problems is to visualise them. Last year this led me to hack together Burpdot, a tool that converts Burp Suite log files into Graphviz “DOT” language files for transformation into a graphic. Over the past few months we’ve been spending quite a bit of time with Nessus, and when you’re dumped with tons of hosts and hundreds and hundreds of findings, what are you meant to do?

I don’t believe many would argue that the default Nessus web UI is ideal for analysing bulk data. And I’m not the only person to construct something to parse and process Nessus files into HTML/XLS files; you can see Jason Oliver’s work here. Hence, Prenus, the Pretty Nessus .. thing, combining my love of Nessus, visualisation, and hacking stuff together.

Following the same principles as Burpdot, Prenus simply consumes Nessus version 2 exported XML files and outputs the data in a few different formats. Initially, I was interested in finding a better way to analyse the results (personally, I find the Flash web interface frustrating), so the first output was a collection of static HTML files with Highcharts-generated pie and bar graphs.

For example, the below creates a folder called ‘report’ with a bunch of HTML files:

$ prenus -t html -o report *.nessus
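Under the hood this is mostly a matter of walking the Nessus v2 XML tree (`ReportHost` elements containing `ReportItem` findings). As a rough illustration only (this is a Python sketch, not the actual Prenus code), tallying findings per severity looks something like:

```python
import xml.etree.ElementTree as ET
from collections import Counter

def severity_counts(nessus_path):
    """Tally ReportItem findings per Nessus severity
    (0 = info ... 4 = critical) in a version-2 .nessus export."""
    counts = Counter()
    for host in ET.parse(nessus_path).getroot().iter("ReportHost"):
        for item in host.iter("ReportItem"):
            counts[int(item.get("severity", "0"))] += 1
    return counts
```

The same walk, keyed by plugin ID or by host instead of severity, gives you the data behind the per-vulnerability and per-host pages.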

The top of the Prenus index, highlighting unique Nessus criticality ratings, and a split of the top 20 hosts

Vulnerability Overview page

Vulnerability Detail

Host Overview

Not wanting to end there, I thought it’d be useful to also generate a simple two-column CSV output that could be consumed and processed by Afterglow.

For example:

$ prenus -t glow *.nessus
46015 (4),10.0.0.5
46015 (4),10.0.0.20
46015 (4),10.0.0.15
53532 (4),10.0.0.5
53532 (4),10.0.0.20
53532 (4),10.0.0.15
..
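The transformation behind that output is simple: one “pluginID (severity),host” edge per finding. As a hedged sketch (a hypothetical helper, not the actual Prenus code), given (host, plugin ID, severity) tuples:

```python
def glow_edges(findings, min_severity=0):
    """Emit Afterglow-style two-column CSV edges, one
    'pluginID (severity),host' line per finding at or above
    the chosen severity."""
    lines = []
    for host, plugin_id, severity in findings:
        if severity >= min_severity:
            lines.append(f"{plugin_id} ({severity}),{host}")
    return "\n".join(lines)
```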

But if piped through Afterglow with our prenus.properties file, and then through Graphviz (in this instance neato), you get something like this (I had to run this from within the afterglow/src/perl/graph folder):

$ prenus -t glow ~/prenus/*.nessus | ./afterglow.pl -t -c ~/prenus/prenus.properties | neato -v -Tpng -Gnormalize=true -Goutputorder=edgesfirst -o prenus.png

Prenus Afterglow Example

If you prefer a Circos-style graph you can do that too. The circos output mode generates a tab-formatted table which can be consumed by the Circos “TableViewer” tool. Wrapping that together is relatively simple (I use the word “simple” lightly; getting Circos working on OS X was a bit of a pain due to GD dependencies, but it’s super simple on Linux). The following two commands assume the following directory layout:


~/
  circos/
    circos-0.62-1/
    circos-tools-0.16/
      tools/
        tableviewer/
          img/

Executed from within the “tableviewer” folder, the following should create a file called “prenus.png” in the “img” folder.

$ prenus -t circos ~/prenus/*.nessus | bin/parse-table -conf samples/parse-table-01.conf | bin/make-conf -dir data
$ ../../../circos-0.62-1/bin/circos -conf etc/circos.conf -outputfile prenus
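If you want a feel for the tab-formatted table that parse-table consumes, here’s a rough Python sketch. The hosts-by-severity count matrix is purely an assumption for illustration; the columns the real Prenus circos mode emits may well differ:

```python
def tableviewer_matrix(findings, severities=(1, 2, 3, 4)):
    """Build a tab-separated table of the rough shape Circos's
    parse-table expects: a header row of column labels, then one
    row per host with a count per severity column."""
    hosts = sorted({host for host, _sev in findings})
    rows = ["host\t" + "\t".join(f"sev{s}" for s in severities)]
    for h in hosts:
        counts = [sum(1 for fh, fs in findings if fh == h and fs == s)
                  for s in severities]
        rows.append(h + "\t" + "\t".join(str(c) for c in counts))
    return "\n".join(rows)
```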

Prenus Circos Graph Example

Potentially useful for your analysis, or maybe just some prettiness to add to your reports. Any methods or tools that can help dig through stacks of data are pretty useful to us. The above diagram has a few layout issues, so if you want to analyse just your critical severity issues you can include the “-s 4” flag to prenus.

I’ve got an ad-hoc list of enhancements which include:

  1. Construct an EC2 bootstrap to let you deploy this in a throw-away environment, so you don’t have to fight with the damn dependencies like I did
  2. Look at using d3.js for ALL chart generation instead
  3. Perhaps build a standalone Rails app for your own deployment (either local, Heroku or whatever)

Let me know what you think, or, just grab the code and have a play yourself: https://github.com/AsteriskLabs/prenus

Symantec Endpoint Protection: Setup.exe extruder

What do you do when you need to create around 40 Symantec Endpoint Protection packages?!

I’m way too impatient to do it manually, and after automating the sylink creation (see previous post), I got the idea of automating the setup.exe creation.

Prerequisites:
Ok, first thing you will need to do is set up your sylink files: instructions here

You will need 7-Zip on your SEPM; this allows us to update the contents of the zip archive.

Export a setup MSI directory from your SEPM; do not create a single exe file.
Once you have done this, zip up the output into a regular zip file called input.zip; this will be your $setup_src

Running the Script:
1) put all of your sylink files into a directory structure like this:

Sylink/
  domain1/
    group1_sylink.xml
    group2_sylink.xml
  domain2/
    group3_sylink.xml
    group4_sylink.xml

2) create a domains.txt file in the Sylink/ directory:

domain1
domain2

3) create a groups.txt (or use your previous groups.txt from the sylink creation) and put one in each domain directory (ie: Sylink/domain1/groups.txt). The groups.txt has a list of each group:

group1
group2

4) find the makesfx.exe: it is on the SEPM, in your SEPM install path: /Symantec Endpoint Protection Manager/tomcat/bin, and copy it to a convenient location. You will point the script variable $MakeSFX to it.

5) edit the script variables and make sure the paths point to the right places. Note that for $bits and $type, I manually update these depending on whether I’m exporting 32- or 64-bit packages and Server or Desktop packages (depending on the SEP features in the package)

$update = @"
C:\"Program Files"\7-Zip\7z.exe u
"@
$delete = @"
C:\"Program Files"\7-Zip\7z.exe d
"@
$MakeSFX = "D:\temp\MakeSFX.exe"
$setup_src = "D:\temp\input.zip"
$setup_dst = "D:\Program Files\Symantec\SEP Agents\bulk\"
$sylink_dir = "D:\Program Files\Symantec\SEP Agents\Sylink\"
$domains_txt = $sylink_dir + "domains.txt"
$bits = "_x32"
$type = "_Desktop"

6) run the script and marvel at how much faster you can extrude (think sausage factory 🙂 ) setup files!
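For the curious, the core of that extrusion loop can be sketched in Python as a dry-run planner (this is not the actual PowerShell script, the paths are illustrative, and MakeSFX’s exact argument order is an assumption — verify it against your copy before running anything for real):

```python
# Illustrative tool paths -- adjust to match your environment.
SEVEN_ZIP = r"C:\Program Files\7-Zip\7z.exe"
MAKE_SFX = r"D:\temp\MakeSFX.exe"

def plan_commands(setup_src, sylink_dir, out_dir, domains, groups,
                  suffix="_x32_Desktop"):
    """Return, per domain/group, the steps the batch run would take:
    copy the exported package zip, update the Sylink.xml inside it
    with 7-Zip, then wrap it into a self-extracting exe with MakeSFX.
    (MakeSFX's argument order here is an assumption.)"""
    plan = []
    for domain in domains:
        for group in groups[domain]:
            pkg = f"{out_dir}\\{domain}_{group}{suffix}"
            sylink = f"{sylink_dir}\\{domain}\\{group}_sylink.xml"
            plan.append([
                ["copy", setup_src, pkg + ".zip"],   # fresh copy per group
                [SEVEN_ZIP, "u", pkg + ".zip", sylink],
                [MAKE_SFX, pkg + ".zip", pkg + ".exe"],
            ])
    return plan
```

Feed the plan to subprocess.run (or just print it) and you get one package per group with its own Sylink.xml baked in.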

You can pull the scripts from the Asterisk Labs GitHub repository

I’m pretty sure I have hit the “Win” inflection point on this chart:
Geeks and repetitive tasks

Actually, this script generated about 40 setup exes for me in 20 minutes. If it takes about 5 minutes to export a setup.exe from the SEP console, I’m certain I’m in front, even with script setup time, and definitely with the reduction in Repetitive Click Boredom.

Symantec Endpoint Protection: Sylink.xml hacking to automate SEPM migration

I have completed a couple of projects recently migrating customers from Symantec Endpoint Protection v11.0 to v12.1, including moving to a new SEP Manager. In these projects, the decision was made to do a fresh install of the SEPM, and move the clients into the new manager, without using replication between the old and new SEPM.

The file on the SEP v11 client called Sylink.xml tells the client which server to connect to, what the server certificate is and which group the client should join on the SEPM (among other things).

There is a tool on the SEP media (part 2), at Tools\SylinkDrop.exe, which can be used to swap the Sylink.xml file on the SEP client.

During these projects I was looking for a way to simplify the creation of the sylink.xml files; the projects both involved a large number of client groups – I didn’t really want to manually export the communication settings for over 30 groups!

This led me on a PowerShell path of discovery, and the realisation that PowerShell could load an XML file as a data object, manipulate it, and then write it out. This was exactly what I needed!

The process I used to generate the sylink files was:
– Create the group structure in the SEPM
– Create a groups.txt file: this file has one SEP client group per line
– Export a sylink.xml file from the destination SEPM: for example, download the communication settings file from the “My Company” top-level group
– Run the update_sylink.ps1 PowerShell script
– Deploy the sylink files to the SEP agents with SylinkDrop.exe

The groups.txt file contained lines like:

My Company\Desktops
My Company\Desktops\WA
My Company\Desktops\NSW
My Company\Desktops\NT
My Company\Desktops\QLD
My Company\Desktops\SA
My Company\Desktops\TAS
My Company\Desktops\VIC

Note that SEP is case sensitive – the groups.txt file must match the group names in the SEPM.

When you export the sylink.xml, you will end up with a file that looks like:

<?xml version="1.0" encoding="UTF-8"?>
<ServerSettings DomainId="BC7791940DEADBEEF3A86829">
  <CommConf>
    <AgentCommunicationSetting AlwaysConnect="1" CommunicationMode="PULL"
        DisableDownloadProfile="0" Kcs="18F84DEADBEEF2CF46D"
        PullHeartbeatSeconds="1800" PushHeartbeatSeconds="300"
        UploadCmdStateHeartbeatSeconds="300" UploadLearnedApp="0"
        UploadLogHeartbeatSeconds="300" UploadOpStateHeartbeatSeconds="300"/>
    <LogSetting MaxLogRecords="100" SendingLogAllowed="1" UploadProcessLog="1"
        UploadRawLog="1" UploadSecurityLog="1" UploadSystemLog="1"
        UploadTrafficLog="1"/>
    <RegisterClient PreferredGroup="My Company\Workstations (location based)" PreferredMode="1"/>
    <ServerList FreezeSmsList="0" Name="Default Management Server List">
      <ServerPriorityBlock Name="List0">
        <Server Address="10.10.10.10" HttpPort="8014" VerifySignatures="1"/>
        <Server Address="SEPM" HttpPort="8014" VerifySignatures="1"/>
        <Server Address="SEPM" HttpPort="8014" VerifySignatures="1"/>
      </ServerPriorityBlock>
    </ServerList>
    <ServerCertList>
      <Certificate Name="SEPM">MIICujCCAiOgAwIBAgIQhjuQQqXvBWWzipD7elI3oTANBgkqhkiG9w0BAQUFADB3MXUwCQYDVQQI&#xd;
A4GBAI6RsCE0zyFwDY6rsKeOaGVtEtZNvz5Lbas2b0OYOX53GA7JbeJMWB5OqMZ5EM76PZx/toMZ&#xd;
vUN+ypsPydoiLKd7uMsNWaFGzP4JKJjiJsrhGi3l1pLlR553GxZz2UZ1zbX7knjjiReVLrniIyYd&#xd;
CPFkI/DEADBEEF+fnUbxr259h</Certificate>
    </ServerCertList>
  </CommConf>
</ServerSettings>

The PowerShell script can actually update any token in the XML file; we are just using it to update the PreferredGroup item:

# Load the exported Sylink.xml once
$xml_orig = New-Object XML
$xml_orig.Load("Sylink.xml")

# One SEP client group per line
$grps = Get-Content groups.txt
ForEach ($i in $grps) {
    $i  # echo the group being processed
    $PreferredGroup = $i.ToString()

    # Clone the original document so each group starts from a clean copy
    # (a plain assignment would only copy the reference)
    $xml_new = $xml_orig.Clone()
    $xml_new.ServerSettings.CommConf.RegisterClient.PreferredGroup = $PreferredGroup

    # Build a filesystem-safe filename from the group path
    $j = $i.Replace("\", "_")
    $filename = $j.Replace(" ", "_")
    $xml_new.Save($filename + "_Sylink.xml")
    Remove-Variable xml_new
}

The script needs the groups.txt and sylink.xml to be in its current directory.
The sylink files will be output to the current directory, with the group name preceding, e.g.: My_Company_Desktops_WA_Sylink.xml.
All that is left is to run SylinkDrop -s “target sylink.xml” on the agents to repoint them to the new SEPM. I have used both Symantec Management Platform (Altiris) and AD group policy to do this.
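As an aside, if you ever need to run the same transformation off a Windows box, it’s a few lines of Python with ElementTree. This is just a sketch (not part of the toolkit) that mirrors the PowerShell script’s filename scheme:

```python
import xml.etree.ElementTree as ET

def write_group_sylinks(sylink_path, groups):
    """For each SEP group, rewrite RegisterClient/@PreferredGroup in
    the exported Sylink.xml and save a per-group copy named the same
    way as the PowerShell script (backslashes and spaces become
    underscores)."""
    for group in groups:
        tree = ET.parse(sylink_path)          # fresh parse per group
        rc = tree.getroot().find(".//RegisterClient")
        rc.set("PreferredGroup", group)
        fname = group.replace("\\", "_").replace(" ", "_") + "_Sylink.xml"
        tree.write(fname, encoding="UTF-8", xml_declaration=True)
```

For example, write_group_sylinks("Sylink.xml", ["My Company\\Desktops\\WA"]) drops a My_Company_Desktops_WA_Sylink.xml in the current directory.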

You can pull the scripts from the Asterisk Labs GitHub repository

I have used this script to save a lot of time generating sylink files for migrations. This also gave me an idea for automating the creation of a large number of setup.exe files for SEP deployments: Stay tuned for more on that!