Tuesday, November 13, 2012

BSidesIowa 2013

BSidesIowa 2013 (2.0)
Let’s reboot this thing and do it again…
When I first set out to bring a solid security conference to Iowa, I had two main goals:

  • Bring in speakers from outside Iowa who would deliver knowledgeable, entertaining presentations, alongside local talent who were either new to presenting or lacked the funds to travel to a larger security conference. 
  • Showcase the technology offerings Iowa has for its information security programs, and highlight some of the students involved in those programs, giving them a chance to network with industry peers and potential employers. 

After last year I am proud to say we have made progress toward these goals. The caliber of speakers we had last year was incredible for a first-year event, and when you consider that the organizer had no clue what he was doing, I think we did well.
This year things are going to be stepped up again, and we are going to make this better than last year.
Here are the details that you have been waiting for:

  • April 6th and 7th
  • University of Dubuque, Dubuque Iowa
  • 3 Tracks
    • Management
    • Technical
    • Local
      • Geared toward new speakers in the Midwest
      • Mentors will be available to help with presentations
      • Talks can be as short as 15 minutes
  • Sessions are 55 minutes long and include a Q&A period
  • The CFP will open in the next few weeks
  • Sessions are held on Saturday
  • Sunday will have a 3-4 hour training workshop
    • Metasploit – Confirmed
    • Working on a few more
  • We will be working with sponsors on a job fair for prospective employees
  • Lockpicking Village

As always, we are looking for sponsors willing to make this a success. If you or your company is interested in sponsoring or volunteering, please let us know.
Bsidesiowa - @ - gmail.com

Friday, October 19, 2012

FedEx Malware Overnight

** Updated with new IP/URL at bottom **

This post started off as a look back at one of my first deep dives into understanding some malware that came across my desk at work. I had planned to do a brief overview of what I found and what I learned. Needless to say, it has been two months since this happened; I haven’t been able to dig in as quickly as I wanted, but I keep going back and learning more about it.

So this post will become multiple posts as I walk through the steps taken, the tools used, and what I learned. Hopefully it will help me make sure I understand what I have done, and help someone else learn the steps needed to resolve an incident like this. This first post provides the background of the infection and what was discovered. While I do not have the memory from the original machine and have not yet been able to reproduce everything, I still find this an interesting case.

So here we go.


One Monday in August I came into the office and noticed a few alerts that had fired for a virus called FakeAVLock. Looking at the first alert, I noticed it had fired on two different files on the same machine. The file names were:
1.     sxhhmslg.exe
2.     Label_Copy_FedEx.exe (was contained in Label_Copy_Fedex.zip)
The interesting thing about these alerts was that the first one came from a Real Time scan and the other from a Scheduled scan. After contacting the user I was able to get access to the machine, and the questions started coming as I looked at it.

My first step was to run a modified version of TriageIR. While that was running I navigated to the user's Temporary Internet Files, where both binaries were found, and noticed that Label_Copy_Fedex.zip had a created timestamp 4.5 hours BEFORE the initial alert on fqucxldq.exe. I also noticed a few files with 8-random-character names that had been created roughly every 4 hours. This was a remote machine and I was unable to pull a memory image from it, so analysis was painful.

At this point this infection started to get interesting.
1.     The AV vendor claimed it cleaned the infection in the Real Time scan
2.     The Real Time scan fired 4.5 hours AFTER the initial infection
3.     The Real Time scan alert came 4 hours AFTER the user had left the office
4.     No symptoms related to FakeAVLock were found on the machine
       a.     No registry settings
       b.     No scheduled tasks
5.     The machine was still calling home every 4 hours after being “cleaned”
Who Said Monday isn’t a fun day? 

Once I determined that the machine was infected, I submitted a query against our network logs to find out whether any other machines had downloaded the file, and another to see this machine's traffic over the weekend. The initial logs from the user query showed a very noisy call-home pattern. I opened them in Mandiant's Highlighter to do some analysis. In the initial pass I noticed that we had over 2,500 connections to the call-home destination, almost 800 of which were from the machine I had started to analyze.
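That kind of periodic call-home pattern can also be spotted by scripting over parsed log timestamps rather than eyeballing them in Highlighter. Here is a minimal sketch, assuming you have already extracted the connection times for one source/destination pair (the function name and sample data are my own, not from the original analysis):

```python
from datetime import datetime, timedelta

def beacon_interval(timestamps):
    """Return the median gap (in seconds) between successive connections.
    A tight, steady gap -- e.g. ~4 hours in this incident -- suggests an
    automated check-in rather than human browsing."""
    times = sorted(timestamps)
    gaps = sorted((b - a).total_seconds() for a, b in zip(times, times[1:]))
    return gaps[len(gaps) // 2]

# Hypothetical connections every 4 hours, as seen from the infected host:
start = datetime(2012, 8, 6, 8, 0)
hits = [start + timedelta(hours=4 * i) for i in range(6)]
print(beacon_interval(hits) / 3600)   # 4.0
```

The median (rather than the mean) keeps the estimate stable even when a few ad-hoc connections are mixed in with the beacon traffic.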

With this information I sent another query to find out what these other connecting machines had visited, whether we had any virus alerts for them, and what other information had been shared with these users. Looking at the pattern and the name of the file, I found the following email, which had been received by all of these users. We also received a UPS phish a week later with the same behavior.

Depending on the user who received this email, clicking the "Print a Shipping Label" link would redirect you to one of these domains:

  • hxxp://www.masrecargaenlineacolombia.com/Label_Copy_Fedex.zip 
  • hxxp://www.sentrysystems.co.uk/Label_Copy_Fedex.zip 
  • hxxp://treebuu.com/Label_Copy_Fedex.zip 
  • hxxp://petitemer.com/Label_Copy_Fedex.zip 
  • hxxp://www.babyboomerconnections.com.au/Label_Copy_Fedex.zip 
  • hxxp://2013kentuckyderbytickets.com/Label_Copy_Fedex.zip
  • hxxp://fitinyourjeanscuisine.com/Label_Copy_Fedex.zip 
  • hxxp://synergicconsulting.com/Label_Copy_Fedex.zip
  • hxxp://www.apolarisa.gr/Label_Copy_Fedex.zip
  • hxxp://cambal.net/Label_Copy_Fedex.zip
  • hxxp://www.hotelbergama.com/Label_Copy_Fedex.zip
  • hxxp://resetonline.com/Label_Copy_Fedex.zip 
  • hxxp://qt-research.com/Label_Copy_Fedex.zip
  • hxxp://www.nilosasia.com/Label_Copy_Fedex.zip
  • hxxp://www.trevorfire.org/Label_Copy_Fedex.zip 
  • hxxp://serainsaat.com.tr/Label_Copy_Fedex.zip 
  • hxxp://boxerdelnettuno.it/Label_Copy_Fedex.zip
From what we had seen, once you extracted Label_Copy_Fedex.exe and ran the binary, it would appear to do nothing while it reached out to one of the download sites and retrieved the FakeAV software.


After the machine became infected with the FakeAV software, it started calling back to these IPs. It also attempted to download two files, sb.dll.crp and p3.dll.crp.


Infection Fun Times

Based on what we knew at the time, we initiated our mitigation response. I continued to be puzzled by the fact that the initial machine kept calling home after it was supposedly clean, so I went back to analyzing and trying to understand what I was seeing.

By this time I had spent a couple of hours analyzing the registry with RegRipper, the scheduled tasks, and created files, trying to build out a timeline. I looked for the "known" indicators for this virus on the machine and came up empty; I still could not find what was calling home. I reached out to the Twitter community for assistance, and they pointed me to TCPView and Process Explorer.

With help from Twitter and those tools I was able to determine that my evil process was a hijacked SVCHOST.EXE (PID 3708) maintaining a connection to the proxies. All other traces of this malware had been “cleaned” except for the call-home mechanism. If the machine had been rebooted, I would never have seen this…
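That manual TCPView/Process Explorer correlation -- which process owns the connection to a known-bad address -- can be scripted once you have a process/connection listing. A sketch over mock data (the record layout, function name, and addresses are illustrative; on a live box you would feed it the output of a tool like TCPView):

```python
from collections import namedtuple

# One row of a process/connection listing (fields are illustrative).
Conn = namedtuple("Conn", "pid process raddr status")

def suspicious_processes(conns, bad_addrs):
    """Flag processes holding established connections to known-bad
    addresses -- the hijacked svchost.exe situation in this incident."""
    return sorted({(c.process, c.pid) for c in conns
                   if c.status == "ESTABLISHED" and c.raddr in bad_addrs})

conns = [
    Conn(3708, "svchost.exe",  "203.0.113.7:80",   "ESTABLISHED"),
    Conn(1200, "iexplore.exe", "198.51.100.4:443", "ESTABLISHED"),
]
print(suspicious_processes(conns, {"203.0.113.7:80"}))
# [('svchost.exe', 3708)]
```

The point of scripting it is repeatability: the same check can be run before and after a "cleanup" to confirm the call-home process is really gone.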

What's next?

In order to understand this a little better I have kept a copy of the malware, and I have been analyzing it as time allows. This is a nice piece of ransomware. In the next few posts I will show what I have done with the malware and the tools I have used.

I will also cover a few other tools I come across that look promising. By doing this I hope to increase my skill set by working on a binary whose behavior I am already "knowledgeable" about.

*** Updated Domain Lists ***
Came across the following IPs and Domains from a similar UPS Phish:  <- appears to be primary site

These drop a zip containing the following executable.
Virus Total Results  Anubis


Wednesday, July 4, 2012

The Trouble with TypedUrlsTime

Recently some information has been released on a new registry key found in Internet Explorer 10 on the Windows 8 operating system. Basically, this key keeps track of when a site is typed into the address bar in Internet Explorer 10. If you want to know more, there has been previous research on this key:

My initial research on Windows 8, in which I mentioned this key, is here.
Jason Hale did some research on it here.
Amanda Thomson also did some research here.

What am I showing you?
When I first looked into the keys I saw some interesting anti-forensic tactics that can be carried out by modifying the keys and their data. I will attempt to explain what I saw.
This is my default data set, which shows both the TypedUrls and TypedUrlsTime values from Regedit, as well as from a RegRipper plugin that Jason Hale produced. 

Original TypedUrls Dataset

Original TypedUrlsTime Dataset

Original RegRipper Values

First Test: 
The first test was to delete the url3 value from TypedUrls; I was interested to see how TypedUrlsTime behaved without the corresponding value in TypedUrls. 

URL3 Deleted in TypedUrls

Url3 remains in the TypedUrlsTime

After I deleted the value from the TypedUrls key, I opened Internet Explorer 10 and went to a new website. This site populated url1, shifting the previous url1 and url2 values down. Since the url3 value was missing, none of the other values shifted. 
Adding a new TypedUrls entry
Looking at TypedUrlsTime, we can see that we still have 10 entries. If we look at the data value in url4, we notice that it is the same as it was previously. We also notice that the data from the previous url3 is no longer retained. 

Values of TypedUrlsTime after addition of a new site. 

RegRipper Plugin Values

Second Test: 
Having identified the behavior of these two data sets when a value is deleted from TypedUrls, I wondered what would happen if we deleted a value from TypedUrlsTime while retaining the corresponding value in TypedUrls. I deleted the url3 entry that corresponded to my visit to Wired.com. 

Visiting Espn after deletion of TypedUrlsTime Key. 
Url4 corresponds to Wired.com, it is now 00. 

We can see that even though I deleted the value, when you type a new address that is added to the TypedUrls key, the missing TypedUrlsTime value is recreated. Since the OS does not have the actual visit time to pass to this value, the data is now 00 00 00 00 00 00 00 00. Looking at the RegRipper output, we can see that the new visit to Wired supposedly happened Thu Jan 1 1970.
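The TypedUrlsTime data is a 64-bit Windows FILETIME: 100-nanosecond ticks since January 1, 1601 UTC. Decoding it yourself makes the zeroed case obvious: a wiped value decodes to the 1601 epoch itself, and a tool that converts through the Unix epoch (as the RegRipper output above suggests) will render it as Jan 1 1970. A sketch (the function name is my own):

```python
import struct
from datetime import datetime, timedelta

def filetime_to_datetime(raw: bytes) -> datetime:
    """Interpret 8 bytes of REG_BINARY data as a Windows FILETIME:
    100-ns ticks since 1601-01-01 UTC, stored little-endian."""
    (ticks,) = struct.unpack("<Q", raw)
    return datetime(1601, 1, 1) + timedelta(microseconds=ticks // 10)

# A live value decodes to the moment the URL was typed...
print(filetime_to_datetime(b"\x00\x40\xe9\x9d\x3e\x5a\xcd\x01"))
# ...while the recreated, zero-filled value decodes to the epoch itself:
print(filetime_to_datetime(b"\x00" * 8))   # 1601-01-01 00:00:00
```

Any timestamp a parser reports as 1970 (or 1601) for these values is therefore a strong hint that the binary data was zeroed rather than genuinely recorded.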

RegRipper Data.. 1970!!
As we can see from this data, TypedUrls is the primary key: if a value is missing there, data is not parsed into the corresponding TypedUrlsTime value, but if a TypedUrlsTime value is deleted, it is recreated with default (zeroed) data. 

Third Test: 

Having seen how deletions of the values impact each other, I decided to see what would happen if I modified the names of the values. 

Baseline Dataset

Baseline DataSet

I decided to switch the names of my url1 and url8 values; you can see the new values listed below.

Dataset after Url1 and Url8 change
Here are the RegRipper values right after I made the change in the registry. According to the plugin, my ESPN visit happened on June 17th, while my SANS visit happened on July 4th, as the report below shows. 

url1 and url8 dates messed up. 

From there I visited a new website, which increments the values, so we can see the impact this has on the registry values themselves. I made NO changes to the values in TypedUrls; the only changes were to url1 and url8 in TypedUrlsTime, as shown in the previous example.

Visiting Mediacomcc.com

TypedUrlsTime after new site added. 

As we can see, the modified url1 and url8 values have incremented as they should, with url1 moving to url2 and url8 moving to url9. Looking at the RegRipper plugin, we can see evidence that something is incorrect: according to the TypedUrlsTime values, my ESPN and SANS visits are still out of order. 

RegRipper Plugin Values after new site

The next test was to modify the values found in TypedUrls. I modified the TypedUrls values at url10 (google) and url3 (principal), leaving the TypedUrlsTime values alone. 

After changing the Url3 and Url10 values. 

RegRipper after the change
After swapping the names between url3 and url10, the plugin now reports that I visited Principal in June instead of just a few days ago. Within the Internet Explorer History tab, however, the correct dates on which I originally viewed these sites still show up. 

Closing Thoughts: 

The addition of TypedUrlsTime adds another layer to timeline analysis of internet activity, showing when a user actually typed the URL they connected to. However, given how easily anti-forensic techniques can modify the corresponding values between TypedUrls and TypedUrlsTime, this registry data may not be forensically sound or useful for investigation or timeline analysis when attempting to determine what sites a user connected to by entering the URL into the address bar. 

Please let me know if I have overlooked another way to extract the correct times at which I typed the values into my web browser. 

**** Update 7/4/12 ****
My hives if you want to analyze them are here

I have used RegSlack in an attempt to pull back deleted keys and values. I was unable to extract anything from the slack space. This was the only deleted key that was found.

### Deleted Key  ###

Offset: 0x7abb8 [Wed Jul  4 15:04:53 2012]
Number of values: 0

Recovered 1 keys and 0 values: #0 keys from allocated space.

Rejected 0 keys and 0 values.

Monday, July 2, 2012

DFIR SUMMIT - Through the Eyes of a Summit Noob

What Did I Learn?
When asked about our experience of the SANS DFIR Summit (slides are here), we all have different opinions and views on what we learned and experienced. We digest the material differently and come to different conclusions about what was taught and what happened. For me this is no different; I walked away from this conference with a lot of knowledge from different aspects of the Summit itself. The Summit also showed that the community has a strong representation of women forensic specialists.

Technological Knowledge

This Summit focused heavily on forensics research, with an emphasis on Mac, cloud, and registry work. A few presentations stepped slightly outside that scope, tying into personal development and relationship building.

The first morning keynote was presented by Detective Cindy Murphy; it was a discussion of how the field is changing, the different perceptions within it, and how to bring them together to form a more complete vision of our future. She talked about having six “Monks” that guide her ability to understand different aspects of the field and keep key things in perspective. This led me to consider my own “Monks” and how they impact my development and interests as I become more specialized in my career focus.

I next attended Alissa Torres's presentation on reasons not to “Stay in Your Lane,” which argued that as forensic investigators we need to understand offensive and anti-forensic techniques so we can better understand what happened during a compromise. Alissa made good points about how different offensive techniques can be masked to appear as though they were performed by legitimate users, and how we need to be trained to understand how these tools can be used. From her session I realized that in order to analyze a breach or malware infection, it is beneficial to understand the tools and techniques used to cause the breach, or the behavior of the malware. She also showed how different incident response tools could be used maliciously and how their use might be dismissed as a false positive.

I sat in on the panel discussion on how to build and maintain a digital forensic lab. The panelists shared the experiences of professionals who have faced the challenges of building out a lab in different environments and how they proceeded, and discussed the capabilities and uniqueness each lab could have, as well as some concerns. Walking out of this presentation I had a few pages of notes on what it takes to get the conversation started with management, and on how the process of reaching the capabilities of a fully functional, high-end lab takes time. You need to understand the business need for it and show the company that the cost involved is a good investment. Until you can start showing ROI, the lab's resources will be limited and you will need to work diligently at building out your practice.

I also sat in on Christopher Pogue's Sniper Forensics v3: Hunt this year. I was lucky to catch v2 last year at GFIRST, and Chris did an excellent job building on it. The high-level takeaway: the amount of data we look at in an investigation can be overwhelming and time-consuming, so we need to learn to define a scope and focus on it. As we find other outlying information we can add it to the investigation and expand the scope. Once we have found the primary targets in our scope, we can then spread out and rule out other machines that show the same indicators. This allows us to find infections that are not traditionally picked up by known “malware detection” options.

The end of the first day brought the SANS 360 talks; there were a lot of good 6-minute presentations covering different tools and resources for DFIR analysis. Corey Harrell gave a brief overview of metadata behavior artifacts for finding fraudulent Word documents. Cindy Murphy showed how to understand and use child victim age estimation based on proven training. Harlan Carvey and Alissa Torres both talked about the numerous artifacts that can be found in Windows 7 and the value of the registry, UserAssist, VSCs, and shellbags.

Harlan Carvey started the second day of the event off with Windows 7 forensics, showing what has changed from previous versions and how this information can be used during investigations. The artifacts he brought up in his talk are the same artifacts that analysts and investigators should be looking at and understanding during investigations. These artifacts show what the user did and the impact on the system. Harlan also touched on some artifacts in Windows 8 that I have also been researching.

I also attended Nick Harbour’s presentation on anti-incident response techniques, which showed how different techniques can be used to appear like normal behavior on a machine. It covered some of the same ground Alissa described, but Nick's presentation went deeper into anti-incident response techniques he has encountered, including hiding from running process lists, hiding network connections, process injection, and thread hijacking. Nick did an outstanding job presenting actionable anti-incident response techniques we should be aware of.

The final presentation I sat in on was by Mike Viscuso, a discussion of how the current incident response model is quickly becoming unmaintainable. It is important to note that Mike is the CEO of Carbon Black, an application that enables a more focused approach to this problem; Mike did a very good job of keeping Carbon Black out of the presentation, and I think he took a beating in doing so. According to NetDiligence, the average cost of forensic analysis during a breach is around $200,000. One example he used was the Citadel package and its deployment cost: for 5 breaches it would cost an attacker $3,500, but it would cost the defenders over $1,000,000. An attacker could carry out almost 1,409 attacks with the Citadel package before reaching the $200,000 that our first breach cost to respond to. It becomes a question of whether we can continue to afford to spend so much on forensics when there are options available to reduce the cost of investigation. Mike talked about how the retail world uses security cameras to help isolate and detect points of interest to analyze; by adopting a similar approach with the correct tooling, we can decrease the forensic cost.

Personal Knowledge:

What I learned personally from the conference is that being with like-minded people can help foster personal growth and understanding on a topic.

As a presenter I learned that being asked to speak is a great opportunity to share your knowledge, and at that point in time your audience is there to hear you. As a presenter you need to remain calm and collected. You need to make sure you have enough slides in your presentation to go long, because until you get used to presenting you will speed up and end your talk early. I also realized that we are our own harshest critics, but the support of the staff and fellow presenters is incredible.

I learned that I could step outside my comfort bubble, meet people I admire and look up to as professionals, and carry on a conversation. I learned that my research is valuable, and that at times I shoot for the moon in what I offer; for example, when I presented a few weeks back on FileHistory, not only did I do a webcast, I also released a RegRipper tool to extract the data and published my research. I have learned that it is OK to have a core group of trusted individuals to share data with, and that open communication is important.

The personal growth I achieved at this conference will help me in both my professional and educational development, because it has strengthened my conviction that this is where I want to be. This is the career I want; this is the community I want to be active in. 

A special Thank You to Rob Lee and the entire SANS staff for putting this together. It was an incredible event and one that I plan on coming back to.

** This was originally written for school. I was going to do something else for the blog, but thought it expressed what I was thinking very well. **

Tuesday, June 26, 2012

Sans DFIR Summit 2012 - Slides

Today I was honored to present my first topic on Windows 8 Forensics at the SANS DFIR Summit in Austin. I want to thank Rob Lee and the entire SANS support staff for the encouragement for me to present and their dedication to putting on an incredible Speaker lineup.

Here are the slides and notes from my presentation, as well as a link to the research I previously did on the Windows 8 Refresh. The data from the slides and presentation is current as of 6/24/2012. The research paper may still need to be updated. 

What is coming next from me?

I am currently doing research on Storage Spaces that I hope to present to GFIRST in August. 

Windows 8 provides a new capability called Storage Spaces. In a nutshell, Storage Spaces allow:
  • Organization of physical disks into storage pools, which can be easily expanded by simply adding disks. These disks can be connected either through USB, SATA (Serial ATA), or SAS (Serial Attached SCSI). A storage pool can be composed of heterogeneous physical disks – different sized physical disks accessible via different storage interconnects.
  • Usage of virtual disks (also known as spaces), which behave just like physical disks for all purposes. However, spaces also have powerful new capabilities associated with them such as thin provisioning (more about that later), as well as resiliency to failures of underlying physical media.

Tuesday, June 12, 2012

Windows 8 Forensics - File History

This research ties in with the SANS webcast here.

With the release of Microsoft’s new operating system, Windows 8, a few new features have been introduced that increase the capabilities of the operating system's storage and backup offerings. In this article I will cover the File History Service and its capabilities.

According to Microsoft, the File History Service (fhsvc) is used to protect user files from accidental loss by copying them to a backup location [1]. File History is not enabled for any user by default, but upon connecting a removable media device, you will get an option to use that device as a backup location. Since File History is configurable by each user, it is enabled on a per-user basis. At the default level, File History automatically protects the default system libraries (Music, Documents, Videos and Pictures), files on the desktop, contacts, and the user's favorites. Users can also create new libraries to include in the backup, or exclude currently backed-up libraries from future backups.

When the File History Service is enabled numerous artifacts are created on both the local machine, and the target backup location. These artifacts include Event Logs, Registry settings, configuration files and incremental file backups in the target directory.

Some limitations of the File History Service are that the backups are not block-level and, due to the way it handles login credentials, it is unable to back up EFS files. The service runs in the background as a local service using the local user's credentials. [2] It is because the service runs under a local user account that each user must set up their own File History configuration.

The rest of the research can be found here
RegRipper Plugin for the HKU FileHistory Key is here

Monday, June 11, 2012

Let's Get This Party Started

About eight months ago I started a journey that has changed my skill set and led me to become more active in the DFIR community. It hasn’t been the easiest journey, but this roller coaster ride has been awesome.
On Tuesday, June 12th I will be participating in my first webcast as the primary presenter. "Nervous" does not go far enough to describe my current mental state. I am exhausted, running on fumes, and ready to crash hard, but all of that is offset by the excitement of where this wild ride is taking me.

Over the course of the next three months I will be presenting three different one-hour talks and a quick 6-minute talk. If I have not conquered my fear of public speaking by the end of August, something is wrong.

In this talk I will take a look at the new File History Service that Microsoft has released in Windows 8. I will briefly discuss what it is, how it is configured, and the artifacts it creates, and even release my first RegRipper parser.

Windows 8 Forensics (pt 1 – Recovery Artifacts) at DFIR SUMMIT – PrincipalGroup10 to save 10%
In this talk I will look at the recovery options included in Windows 8: Restore Points, Refresh Points, and System Reset. I will touch on how they differ, the configuration artifacts that are created, and the challenges they pose to forensic investigation.

In this talk I will look at the backup and storage solutions included in Windows 8 and how they will impact investigations, with the inclusion of Storage Spaces and Storage Pools, as well as more information on the File History Service.

While I know I have no one to compete against with the webcast on 6/12, I am up against some incredible information security professionals at both the Summit and GFIRST. The DFIR Summit is filled with some of the most talented researchers and professionals out there, presenting on a variety of topics.

The GFIRST conference actually has a couple of sessions at the same time as mine that I would love to attend. If you are in the Atlanta area and interested in a free, top-notch conference, I would highly recommend this one. 

Monday, May 7, 2012

Windows 8 - Refresh Excerpt

There have been some interesting things published recently about Windows 8 forensics and the research being done. I have cleaned up some of the research I have been doing for my upcoming talks and am publishing a short excerpt here. Since the information is longer than a normal blog post, I have uploaded it as a PDF here.

Feedback and questions are welcome. I will be updating this with my final research thesis and slides as new information and understanding comes about. 

Thank you.. 


Windows 8 introduces two new options for system recovery: Refresh Points and System Recovery. Within Refresh there are two options; you can utilize the default refresh point or a custom refresh point.

Both Refresh options can be used by Windows 8 to remove malicious files and corrupted entries from the operating system. When using Refresh, it is important to understand that the operating system creates a recovery image containing a backup of the Windows system files. For the default refresh, these system files are from when Windows 8 was first installed. When the Custom Refresh option is used, the system files are from the date the custom refresh image was created, and the custom refresh image will also contain the desktop applications you have installed. Refresh images DO NOT contain your Metro-style apps, documents, personal settings, or user profiles, because that information is preserved at the time you refresh your PC.

The System Recovery option in Windows 8 returns the operating system to its factory defaults. When using System Recovery there are options for recovery with multiple drives and for how personal files are removed. 

Tuesday, May 1, 2012

Tools in the Toolbox - Mandiant Red Curtain

** Somehow this missed its cycle date.. not sure how I confused a 2011 date with 2013. I will be doing some more analysis with Red Curtain this summer, running malware through its scoring, to see if I can better understand it. This is my initial review of the tool. ** 

I have decided that in order to understand the tools I plan to utilize for DFIR, I will need to research them and provide analysis of what I can conclude from them. First up is Mandiant’s Red Curtain. While I am aware that there are other reviews out there, I felt that with my background and career focus, some more light might be shed on it.

MANDIANT Red Curtain is free software for Incident Responders that assists with the analysis of malware. MRC examines executable files (e.g., .exe, .dll, and so on) to determine how suspicious they are based on a set of criteria. It examines multiple aspects of an executable, looking at things such as the entropy (in other words, randomness), indications of packing, compiler and packing signatures, the presence of digital signatures, and other characteristics to generate a threat "score." This score can be used to identify whether a set of files is worthy of further investigation.
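The entropy check described above can be approximated in a few lines. This is a hedged sketch of the general technique, not MRC's actual scoring formula: Shannon entropy over a file's bytes, where values approaching 8 bits/byte suggest packed or encrypted content.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (0.0-8.0). Packed or encrypted
    executables tend toward 8; plain code and text sit much lower."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

print(shannon_entropy(b"AAAA" * 64))        # 0.0  (no randomness)
print(shannon_entropy(bytes(range(256))))   # 8.0  (uniform bytes)
```

Entropy alone is a weak signal, which is why MRC combines it with packer signatures and digital-signature checks before producing a score.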
The first impression of Red Curtain is favorable. The user interface is clean and well organized; there is not much to this application. You have the option to scan either a single file or a folder and all subfolders.

Lab Setup:
HP EliteBook: Windows 7, 8 GB RAM, i5 processor
Malware Being Analyzed: (linked to VirusTotal results)
Symantec Analysis (Files Submitted / Signature Protection Name / RR Seq#): both submissions came back as a New Threat.

Developer Notes:
mdhcp32.exe is a non-repairable threat. This file is contained in bad stuffs.zip.
tlbinf3232.exe is a non-repairable threat. This file is contained in bad stuffs.zip.

Mandiant Red Curtain Scan:
According to MRC's scoring results, these items fall in the range the tool considers typically not suspicious.

What I found most interesting is that many of my incident response tools received a higher score than the malware tested. The applications ImgBurn and UNetbootin both scored a higher threat score (3.753+) than my malware did.

While I think there is potential here, based on my initial tests I would not advise relying on Red Curtain alone to alert you to a suspected piece of malware.


Wednesday, April 18, 2012

Tools in the Toolbox - Triage

Imagine that you are on an Incident Response team. Imagine that your local Incident Response capabilities have adequate staff and a slowly maturing process. Imagine that your company has multiple remote locations. Imagine that in those remote locations your Incident Response capabilities are not as mature and there is a resource shortage. Imagine that there is a language barrier between local and remote staff. Imagine the slow response time due to incorrect files sent for analysis. Imagine the headaches and frustration of trying to troubleshoot a security incident when you do not have physical access to the box and you have to rely on others to collect the data you need. 

This was not something I had to imagine; it is something I have been living with for a while now. It was a problem that needed to change, and this tool has allowed our investigations to become more effective.

This is my story: 

The initial process we used was ineffective and time consuming. Getting correct, usable data would usually require a full business day, unless multiple machines showed the same indicators. Even when multiple machines were impacted, we could still spend four hours waiting on data before analysis could begin, and then have only a small segment of data to work with.

The following questions were posed to the team:
  1. How can we rapidly get consistent data across every incident from remote locations?
  2. How can we standardize the collection process across all locations, so that primary incident handlers receive actionable data?
  3. Can we do this with a free toolset?
  4. Can we customize it based on our needs?
  5. Can we provide a simplified interface for data collection?

The team discussed some options for remote collection capabilities, but the way the internal network is structured limited their usability. During this time I stumbled across a post by Mike Ahrendt about his capstone project, Triage. According to Mike, the functionality of Triage is:
 The script is designed to perform basic triage commands, as well as acquire evidence automatically on the system.  I designed the script to be ran from a flash drive, but you can really run it from anywhere.  All reports and evidence will be collected in the script directory under an Incident folder with a time stamp ("mm-dd-yy Incident").  
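The core idea of the script can be sketched in a few lines. This is my own illustration, not Mike's code: create a time-stamped incident folder next to the script, run a set of collection commands, and write each command's output into the folder. The two commands shown are stand-ins; the real script runs a much larger set, including Sysinternals tools.

```python
import datetime
import pathlib
import subprocess

def collect_triage(output_root: str = ".") -> pathlib.Path:
    """Create a time-stamped 'mm-dd-yy Incident' folder and capture basic data."""
    stamp = datetime.datetime.now().strftime("%m-%d-%y")
    incident = pathlib.Path(output_root) / f"{stamp} Incident"
    incident.mkdir(parents=True, exist_ok=True)

    # Hypothetical command set for illustration; the real script runs many more.
    commands = {
        "hostname.txt": ["hostname"],
        "netstat.txt": ["netstat", "-an"],
    }
    for report, cmd in commands.items():
        try:
            out = subprocess.run(cmd, capture_output=True, text=True,
                                 timeout=60).stdout
        except (OSError, subprocess.TimeoutExpired):
            out = ""  # tool missing or hung; still leave an (empty) report
        (incident / report).write_text(out)
    return incident
```

Writing every report into one folder per run is what makes the evidence easy to hand off: the whole folder gets uploaded, rather than individual files chosen by the remote analyst.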

The capabilities of the script looked promising, so I downloaded it and started working with it to see if it could meet our needs. My initial opinion was that Triage had promise, but there were some issues with the application. My biggest concerns were the need to run with Administrator credentials, the inability to log actions taken, the multiple steps required to run the program from the GUI, being prompted with the EULA for a Sysinternals tool, and the folder naming convention for an incident.

Despite these concerns, knowing them allowed us to keep using the tool as long as we understood Triage's behavior. I submitted a feature enhancement request to Mike covering them, and he said he would look into them when he had time.

We started using Triage locally for our malware incident response, making sure the tool was useful in what we did. We expected that Triage would shave some time off our local response and give us some useful data. We were pleasantly surprised that it actually trimmed our response time by 80% and provided more usable data than we had previously acquired and analyzed. My first analysis step with Triage was to open Autorun.txt to find suspicious registry entries; previously my first step had been attempting to check all the known autorun locations in a registry dump file. On most incidents I could confirm an infection within 30 minutes, down from the two hours it usually took.

With Triage's capabilities proven by our decreased response time, we started analyzing whether remote analysts could use it. Deploying the tool to remote analysts required us to document the process for using it. In testing Triage at a few remote sites, we settled on a few code improvements that would need to be implemented for wide-scale usage.

The Triage code was then enhanced with the following changes:

  • Removed the GUI, since we always collected the same data and it added extra steps that were no longer needed.
  • Changed the naming standard for the incident folder to ComputerName_Date_Incident, which lets us run multiple Triage collections and store the output in a central place without renaming every incident file as we upload it.
  • Added a command-line modification to the registry to bypass the EULA prompt raised by one of the Sysinternals tools.
  • Alongside the registry change, implemented a custom cmd.exe as detailed in the Malware Analyst's Cookbook.
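The folder-naming convention can be sketched as a small helper. This is an illustration of the ComputerName_Date_Incident format, not the actual Triage code; the defaults shown (hostname, today's date) are my assumptions:

```python
import datetime
import socket

def incident_folder_name(computer_name=None, when=None):
    """Build the ComputerName_Date_Incident folder name used for central uploads."""
    computer_name = computer_name or socket.gethostname()
    when = when or datetime.date.today()
    # Sorting by this name groups all collections from one host together,
    # so multiple runs can land in one share without renaming.
    return f"{computer_name}_{when.strftime('%m-%d-%y')}_Incident"

# Note: on Windows, the Sysinternals EULA prompt can also be pre-accepted
# per tool via an EulaAccepted registry value (or the -accepteula switch);
# the exact key path varies by tool, so check the tool's documentation.
```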

As of this posting, our customization of the Triage application has allowed us to do the following:
  1. Increased response efficiency by standardizing how remote analysts collect files.
  2. Decreased the number of incorrect files sent to us for analysis.
  3. Decreased analysis time when looking for suspicious and unknown autoruns.
  4. Provided a training tool that shows new malware analysts what to look at.
  5. Kept historic snapshots of suspicious machines on which we were unable to find IOCs.
  6. Dropped response time for getting files from remote analysts to about one hour.

Right before I posted this, I gave Mike a heads up; he got busy cleaning up the code and making changes, so part of my wish list has already been implemented. If you haven't taken a look at Triage, I highly recommend it.

Next Steps
  • Getting Robocopy to work in XP. I have the exe, just need to test it. 
  • Have Triage execute on machines when our AntiVirus Solution fires off.
  • Compress and move the Incident file off the host machine and onto a network share.
  • Modify Triage to use an INI file for variables, so that customization can happen without recompiling.
  • Create an Error tracking log.
  • Integrate RegRipper to parse out registry settings.
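The compress-and-move step above is straightforward to prototype. A minimal sketch, assuming the share is reachable as an ordinary directory path (mapped drive or UNC path on Windows):

```python
import pathlib
import shutil

def archive_incident(incident_dir: str, share_dir: str) -> pathlib.Path:
    """Zip an incident folder and place the archive on a collection share."""
    incident = pathlib.Path(incident_dir)
    share = pathlib.Path(share_dir)
    share.mkdir(parents=True, exist_ok=True)
    # make_archive appends ".zip" to the base name and returns the full path.
    archive = shutil.make_archive(str(share / incident.name), "zip",
                                  root_dir=incident)
    return pathlib.Path(archive)
```

Keeping the incident folder's name as the archive's base name preserves the ComputerName_Date_Incident convention on the share, so archives from many hosts can coexist without collisions.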
Currently I am testing the next release of Triage, so expect it to be updated within the next week or so.