Monday, November 9, 2009

Why Fuzzy Hashing is Really Cool

For years, computer forensic investigators have put a great deal of stock in the effectiveness of MD5 hashing. To qualify that statement, I mean specifically using MD5 hashes to identify known malicious files. The key word in that sentence is known, but let's take that one step further and add the word "unmodified": known, unmodified files. One minor change to a file, and the MD5 hash is completely different, rendering the investigator's search totally ineffective. So, what's the answer? Easy, fuzzy hashing.

Hash comparisons are either a yes or a no – either the hash matches, or it doesn't. But a mismatch does not mean the files are not the same; it just means they are not exactly the same. I am going to use a simple example that will illustrate exactly what I am talking about.

The photograph of Oklahoma State University wide receiver Dez Bryant below was taken from, “http://media.photobucket.com/image/dez%20bryant/imandyduckworth/DezBryant.jpg” on November 09, 2009.


Using md5deep, I generated an MD5 hash for this picture:

b2cedc90072bacc43fdcc533ad4f24ad /home/cepogue/Pictures/DezBryant.jpg

Now, if you were an investigator searching for that image of Dez based on its MD5 hash, you would only find it if the image were completely identical to this original.

To show how easy it is to modify an image like this, I used Ghex to open the image and scrolled to the bottom of the content.


Note that at offset 5879 (the last line), there are only four bytes, which on the right translate to a blank space, a question mark, and two periods. Using Ghex, I am simply going to replace the blank space with a period.



Look at offset 5879 again in the figure above. I replaced the blank space with a period, changing that last line from "20 3F FF D9" to "2E 3F FF D9". A very minor change. As you can see from the modified image of Dez below, there is no visible change to the image.
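If you want to sanity check an edit like this from the command line (assuming you have the xxd utility installed), you can dump just the last line of hex directly:

xxd DezBryant2.jpg | tail -n 1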


Again using md5deep, I calculated the MD5 hash of the modified image, and it is totally different from the first.

Here is the unmodified image hash one more time:
b2cedc90072bacc43fdcc533ad4f24ad /home/cepogue/Pictures/DezBryant.jpg

Here is the modified image hash:
df3e3d942610781f9b9d0b41683c46db /home/cepogue/Pictures/DezBryant2.jpg

The hashes are not even close. So, if an investigator were performing a search for this image based on the MD5 hash, he would fail to find it.

So, if you are an investigator, you may be thinking, “Aw crap...now what?! So ALL of the hash comparisons I have been doing could have failed while the evidence was still present?”

The answer to that question is, "Yes...if the images were modified in any way...yes they did." But there is hope, and that hope is called fuzzy hashing.

Since a one-to-one comparison of hash sets is obviously antiquated and inadequate, Jesse Kornblum of ManTech came up with a fantastic solution called fuzzy hashing. Using a tool called SSDEEP, you can generate hash values that can then be compared to other files, producing a percentage indicating how closely one file matches another!

Using SSDEEP, I generated output for the first image of Dez that looks like this:

ssdeep -b DezBryant.jpg
ssdeep,1.0--blocksize:hash:hash,filename
384:HEOV6N0/xFXSw0x2K+PLfNDOPK2TYWImaMsYLB3q60tL5DwpXe9hZ4ksJWoTNpyY:HEI9Xg7+P9yImaNk3qrDwpXe9gf5xkIZ,"DezBryant.jpg"

I simply redirected the output to a file named dez.hash.
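In other words, the same command as above, just sent to a file:

ssdeep -b DezBryant.jpg > dez.hash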

Then, I used that file to compare against the second image of Dez:

root@Linux-Forensic1:/home/cepogue/Pictures# ssdeep -bm dez.hash DezBryant2.jpg
DezBryant2.jpg matches DezBryant.jpg (99)

As you can see from the output, these two images are 99% similar.

Fuzzy hashing can efficiently and effectively help investigators identify files that contain a high percentage of similarity. While a file may not be 100% identical, as proven by my example, that does not necessarily mean it is not the same image. The same approach can be used with virtually any type of file. An investigator can then take the files with the highest percentage of similarity and manually review those individual files.

SSDEEP is a free utility and can be downloaded from http://ssdeep.sourceforge.net/.

Tuesday, November 3, 2009

Mount_EWF and Ubuntu 9.04

***Props to Steven Venter of Trustwave UK for putting this together. I used this today, with some minor modifications.***


So, I was faced with the need to mount an EWF image on my Ubuntu box so that I could use some of the TSK utilities on the image. Below is how to get a tool called "mount_ewf" working with Ubuntu 9.04.

So here's a quick update on getting EWF mounting capabilities installed on a new Ubuntu install [in this case the 32-bit version of Jaunty Jackalope Ubuntu 9.04]

The libewf software is now available at:
http://sourceforge.net/projects/libewf/

The files I downloaded were:
steve@jj:~/software/EWF$ ls -1
disktype-libewf.patch
EWF_file_format.pdf
libewf-20080501.tar.gz
libewf-beta-20090506.tar.gz *** I changed this too...I did NOT grab this file***
mount_ewf-20080513.py


== Install the required build dependencies
The required Debian packages in Ubuntu are: zlib1g-dev, libssl-dev, and uuid-dev
$ sudo apt-get install zlib1g-dev libssl-dev uuid-dev

== Create Debian (.deb) packages to install
Since the downloads are now standard source code format, I tried to create Debian (.deb) packages using the guidance here: http://www.quietearth.us/articles/2006/08/16/Building-deb-package-from-source

***This took me a while to get working properly, as the "how to" is kind of vague.***

First off, let's install the necessary tools:
# apt-get install autotools-dev fakeroot dh-make build-essential

Next, take the tarball you downloaded (in this case libewf-20080501.tar.gz) and uncompress it:
tar -xzvf libewf-20080501.tar.gz

Then cd into the newly created directory:
cd libewf-20080501

Now, you are going to use the dh_make utility to make the Debian control files:
dh_make -f /path/to/tarball <-- this is important. You have got to tell the tool the location of the original tarball...presumably, just one directory down. In my case, I dropped my tarball into /usr/local/bin (which is where I drop all of my install files).

Then select "S" for single binary.

Then run the following: (this has to be done as root)
# dpkg-buildpackage -rfakeroot

Step 1: Install required dependency packages:
$ sudo apt-get install autotools-dev fakeroot dh-make build-essential

Step 2: Copy the source code tarball to /tmp and extract the contents there:
steve@jj:~/software/EWF$ cp libewf-beta-20090506.tar.gz /tmp/
steve@jj:~/software/EWF$ cd /tmp/
steve@jj:/tmp$ tar -zxf libewf-beta-20090506.tar.gz
steve@jj:/tmp$ cd libewf-20090506/
steve@jj:/tmp/libewf-20090506$

Step 3a: No need to make the debian control files, since they are already there [in the debian/ sub-folder]

Step 3b: Build the debian package:
steve@jj:/tmp/libewf-20090506$ sudo dpkg-buildpackage -rfakeroot
** this ended with the following output:
signfile libewf_20090506-1.dsc
gpg: WARNING: unsafe ownership on configuration file `/home/steve/.gnupg/gpg.conf'
gpg: skipped "Joachim Metz ": secret key not available
gpg: [stdin]: clearsign failed: secret key not available

dpkg-genchanges >../libewf_20090506-1_amd64.changes
dpkg-genchanges: including full source code in upload
dpkg-buildpackage: full upload (original source is included)
dpkg-buildpackage: warning: Failed to sign .dsc and .changes file
steve@jj:/tmp/libewf-20090506$

Step 3c: List the newly created files:
steve@jj:/tmp/libewf-20090506$ cd ..
steve@jj:/tmp$ ls -ld libewf*
drwxr-xr-x 15 steve steve 4096 2009-05-08 18:41 libewf-20090506
-rw-r--r-- 1 root root 2262 2009-05-08 18:42 libewf_20090506-1_amd64.changes
-rw-r--r-- 1 root root 177340 2009-05-08 18:42 libewf_20090506-1_amd64.deb
-rw-r--r-- 1 root root 511 2009-05-08 18:40 libewf_20090506-1.diff.gz
-rw-r--r-- 1 root root 826 2009-05-08 18:40 libewf_20090506-1.dsc
-rw-r--r-- 1 root root 810174 2009-05-08 18:40 libewf_20090506.orig.tar.gz
-rw-r--r-- 1 steve steve 809523 2009-05-08 18:22 libewf-beta-20090506.tar.gz
-rw-r--r-- 1 root root 222562 2009-05-08 18:42 libewf-dev_20090506-1_amd64.deb
-rw-r--r-- 1 root root 195290 2009-05-08 18:42 libewf-tools_20090506-1_amd64.deb

== Install the newly created .deb packages:
steve@jj:/tmp$ sudo dpkg -i libewf*.deb
Selecting previously deselected package libewf.
(Reading database ... 109479 files and directories currently installed.)
Unpacking libewf (from libewf_20090506-1_amd64.deb) ...
Selecting previously deselected package libewf-dev.
Unpacking libewf-dev (from libewf-dev_20090506-1_amd64.deb) ...
Selecting previously deselected package libewf-tools.
Unpacking libewf-tools (from libewf-tools_20090506-1_amd64.deb) ...
Setting up libewf (20090506-1) ...

Setting up libewf-dev (20090506-1) ...
Setting up libewf-tools (20090506-1) ...
Processing triggers for man-db ...
Processing triggers for libc6 ...
ldconfig deferred processing now taking place
steve@jj:/tmp$


== To use the mount_ewf script, you need to install python-fuse:
steve@jj:/tmp$ sudo apt-get install python-fuse


== Create a mount.ewf executable in the /sbin directory and grant it "execute" permissions:
steve@jj:/tmp$ cd
steve@jj:~$ cd software/EWF/
steve@jj:~/software/EWF$ cp mount_ewf-20080513.py /sbin/mount.ewf
cp: cannot create regular file `/sbin/mount.ewf': Permission denied
steve@jj:~/software/EWF$ sudo cp mount_ewf-20080513.py /sbin/mount.ewf
steve@jj:~/software/EWF$ sudo chmod +x /sbin/mount.ewf


== And that's it - ready to go:
steve@jj:~/software/EWF$ mount.ewf
Using libewf-20090506. Tested with libewf-20080501.
Usage:
mount.ewf <ewf files> <mount point> [options]

Note: This utility allows EWF files to be mounted as a filesystem containing a flat disk image. <ewf files> can be any segment of the EWF file. To be identified, all files need to be in the same directory, have the same root file name, and have the same first character of file extension. Alternatively, multiple filenames can be specified in different locations in the order to be reassembled.


ewf segment filename(s) required.
steve@jj:~/software/EWF$

Once you get the tool installed, you can mount EWF images like this:

create a mount point...mkdir /mnt/suspect
mount.ewf -o ro badguyimage.E* /mnt/suspect

The raw image will now be mounted under /mnt/suspect, and you can run your TSK tools against it. Nice! The kewl thing is that you can mount your external drive as RW, then mount the image as RO. This comes in handy if you are dumping unallocated space with blkls and you only have an 80 GB HDD in your Linux box (like me)...I use external drives for everything!
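For example, a typical TSK session against the mounted image might look something like this (a rough sketch; the name of the flat image file under the mount point and the partition offset will vary with your image set):

mmls /mnt/suspect/badguyimage
fls -r -o 63 /mnt/suspect/badguyimage
blkls -o 63 /mnt/suspect/badguyimage > /media/external/unalloc.blkls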

Friday, October 9, 2009

SecTor 2009 - A Great Success


I just returned home from SecTor in Toronto, Ontario, Canada, and hats off to Brian Bourne and crew for putting on a great con!

With this being only the third year for the con, I honestly didn't quite know what to expect. However, after seeing what the volunteers (no paid staff) had put together, it was every bit as professional and organized as any con I have ever been to.

There were some outstanding talks this year, all of which will be available with full audio on the SecTor website under "Presentations". Some of the highlights for me were Roy Firestein's talk on "Crimeware", Jibran Ilyas and Nick Percoco's presentation of the "Malware Freakshow" (which is the same presentation that was given at DEFCON 17 this past year in Las Vegas), and Adam Laurie's (aka Major Malfunction) lunch keynote, "A Day in the Life of a Hacker". This is not to say that the rest of the talks were not very, very good; these are simply the ones I enjoyed the most.

If you were not able to attend this year, I highly recommend it! Very very very good con!

Monday, September 14, 2009

Babel Fish

I read an article this morning on Forensic Focus from the UK based company CY4OR detailing the emerging trend of technical data in the courtroom. The author of the article posed the question of whether a higher level of technical expertise should be required of juries in cases involving computers. After reading the article twice and thinking about it, my answer is no (at least where the US court system is concerned), and here is why.

In the US, juries are supposed to be composed of "your peers". Now, while most folks in the US are technically aware, to think that they are "savvy" is a bit of a stretch. So finding a "peer" where most IT folks are concerned is going to be tough. Most people, even very intelligent ones, are your stereotypical "end users". A good example is my pastor, Alex Himaya from the Church at Battle Creek. Alex has an MS and a PhD - a very educated and intelligent man. Well spoken, well traveled, and well respected both inside the Christian community and out. However, you put a computer in front of the man, and...well...he becomes an "end user". He can get around in Windows XP, he can do his work, but that's about it. Now that is not a slam in any way on Alex; however, it shows that even people with PhDs are not any better with computers than your typical high school student (and in many cases the HS students are far better).

That puts folks like us, not just IT professionals, but computer forensic investigators, in the top 1% - 3% of computer users. We should be the upper tier of computer professionals; we should know both how the systems work and why they work that way. The technology should never be the limiting factor in our investigations. And if we run across something new or unfamiliar, we should be able to research it and figure it out in a very short period of time (like minutes to hours).

Being a corporate investigator, over the years my customers have ranged from CEOs of Fortune 500 companies to single location restaurant owners. I have delivered forensic reports to customers who have degrees in IT and a pretty good understanding of what I am saying, as well as to people who know as much about computer science as Dunder Mifflin's Michael Scott! So what's the key to delivering a comprehensive yet understandable report? Your mom!

You think I'm joking...I'm not...your mom is the key! When you write your reports, do it in a manner that your mother could understand (if your mom is not available, any non-technical person you trust will suffice - unless of course your mom is a computer expert of some sort...then my example is blown and you will have to pick somebody else to help you with your report writing). Explain something that is technically difficult plainly and without being condescending.

For example, I have recently written a white paper on the top 10 reasons level 4 merchants are compromised. My target audience is small business owners whose primary concern is not computer security or PCI compliance, but rather providing dry cleaning services, burrito plates, discount clothing, etc. In my white paper I break down technical concepts like egress filtering, secure data wiping, and port identification in a manner that my mother (no lie...I used her to help me write my paper) could easily understand. I used common terms and word pictures to illustrate technically advanced concepts clearly without making the reader feel st00p1t.

My forensic reports, like most, are broken down into sections - as I'm sure yours are (if they aren't, they should be). Doing this will enable you to address several different audiences in the same report. Your executives are likely just interested in the high level information - what happened, how, and how they can fix it. Technical or security staff members may be interested in the specifics of what happened...ports, malware, theft, exfiltration, etc. Make sure you address each different audience in a clear and concise manner. I know I have said this before, but it won't hurt to say it again...DON'T be verbose for the sake of being verbose. Clear, concise, to the point and move on.

Now...how does this all tie together in a courtroom? Well, you are the SME. You have the technical knowledge and the jury does not. The key is not to throw technical terms and abbreviations at them to the point where they just tune you out and start wondering what's for lunch. Use common terms and analogies that they can easily understand. Jesus was a master at this! You don't have to be a Christian to appreciate how Jesus used the everyday to explain the things of Heaven to his disciples. In much the same way, you are doing the same thing. If you have to get froggy and break it down with some techie love, then fine, but make that your fallback, not your first option. Remember, your job on the stand is to get the jury to understand why the evidence you are presenting is relevant to the case, and how it proves whether something happened or didn't happen, not to show them how smart you are.

Be the Babel Fish!

Wednesday, September 9, 2009

Autopilot?

A recent post on Forensic Focus got me thinking. Basically, someone asked if there was another tool like Harlan Carvey's RegRipper that could be used to validate their findings. After talking with Harlan at some length about this, we pretty much came to the same conclusion that there are a lot of folks out there who are stuck in the old school, running on auto pilot.

Let's get something straight from the get-go here: I am totally for output validation when and where necessary. Since certain tools do things in certain ways, it may be important to use another tool that arrives at the same result in a different way to validate that the first tool is not doing something jankity.

Case in point was the gig I had in which I was asked to determine if some office documents had been tampered with. Some tools use metadata to display chronological information while others use the OLE data. Some tools can extract chronological data without having to mount the image, others require the image to be mounted. The point here is that the tools do things in a slightly different way.

RegRipper parses registry hives. There was a funny post where a chap stated that RegRipper is not a registry viewer, so you can't mount the hives and have a "look around". While this is a true statement, I thought it was indicative of the "old school" of forensics. What are you going to look around for? Are you going to perform "Registry Analysis" with NO IDEA what you are looking for, why, or which keys do what? This is where the term "Auto Pilot" comes in. So many folks simply have blind reliance on their tools to do the work for them. They have no idea what the tool does, how it does it, or where the output is generated from. They just load, fire, and report...this tool did this...how many other tools can I get to do the same thing? Maybe by using 17 tools to take an MD5 hash, people will think I am really smart and KNOW that my MD5 hash is a good and proper MD5 hash!

What I am getting at here is that you should have a basic understanding of what the tools you are using actually do, and how they actually do it. I am no coder, so I could not pull apart RegRipper and tell you which lines do what, but I CAN read. Harlan has done a great job of documenting how RegRipper works, and he even allows you to write your own plugins! If you took about 30 minutes and reviewed the documentation, you would know that RegRipper simply parses the data from the registry hives into a readable format. It takes the more complex keys (like those that are ROT13 encoded) and translates them into plain English. That's it...no smoke, no mirrors, no voodoo magic. If you want to validate your findings, get a hex editor and do it by hand.
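The command line version makes this transparency easy to see for yourself (the hive and plugin names here are just examples; check the plugin list for your version):

rip.pl -l (lists the available plugins and what each one does)
rip.pl -r NTUSER.DAT -p userassist (runs a single plugin against a single hive)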

There are a couple of takeaways here. First, understand your tools. Have at least a basic understanding of what they do and how they do it. Then you can make an educated decision about whether you need another tool to validate your findings. Second, don't be on auto pilot. Don't simply run a tool and then state in your report that "Tool BLAH showed me BLAH." Instead, state what you are looking for, why you are looking for it, and THEN state what the findings were.

Remember YOU are the subject matter expert. Your case findings should be repeatable if another investigator took the same data and used the same tools. If you document your goals clearly, and the steps you took which brought you to your conclusions, you should never have a need to defend your tools.


Wednesday, September 2, 2009

SecTor 2009


I just found out last night that my paper on "Sniper Forensics" has been accepted to SecTor 2009! It will be a talk that shows the advantages of taking a focused approach to forensic investigations, including faster, more accurate results, which means happier customers.

Also, Jibran Ilyas and Nick Percoco (fellow Trustwave teammates from the SpiderLabs) will be giving their "Malware Freakshow" presentation from DEFCON. It's awesome to have THREE speakers from Trustwave at one event like this!

If you are in Canada, or just want to go to another security conference this year - I hope to see you there!

Tuesday, September 1, 2009

Plan the Work, Work the Plan

I have heard of investigators (and unfortunately, witnessed a few myself) who will simply go into a case without really knowing what they are looking for. They don't clarify expectations with the customer, don't think about what it is that they are trying to find, and end up just "looking for bad guy stuff". Can I just share with you what a monumentally horrible idea that is? If you don't know what it is that you are looking for, how will you ever know when you find it? This is why it is so critical to create an investigation plan BEFORE you start poking around in your data.

Creating an investigation plan is one of, if not the most important steps an investigator can take in preparation for a new case. It allows you to clearly outline what your objectives are and provides a framework for the direction of the entire case. All too often this critical step is skipped in the interest of time. What some folks don't realize is that by not having a comprehensive investigation plan, they are actually increasing the amount of time their case is likely to take.

The first question that needs to be asked at the outset of any case is, "What are my objectives?" What are the goals of the case? What information does the customer want? What questions do they want answered? Once you have the specific items the customer wants to have addressed, reiterate them to ensure that there has not been a breakdown in communication somewhere.

"I am hearing that you want me to try and determine, X, Y, and Z. Is that correct?"

I know it may sound a bit juvenile, but really, everything hinges off the customer's expectations. So at the risk of misinterpreting those expectations, and failing to deliver what the customer has paid for, it is a necessary step. Ensure that both parties are "on the same sheet of music", so that when you deliver your final report you can state, "Hey...you asked me to find A, B, and C....HERE is A, B, and C".

This is where corporate investigators differ from our brethren in the law enforcement community...to a certain extent. We have a clear set of goals that our customers have paid for. They have the expectation that they will get answers to those questions. The SOW is signed, and we get to work and get them their answers in the time allotted by the contract.
In the law enforcement world, there are no timeframes and often no clear direction of what the goals are. Recently, I learned that most local, state, and federal agencies that deal with cyber-crimes are pushing out cases in anywhere from six months to three years! In that time, they may stumble upon three or four criminal activities perpetrated by the owner of the suspect system. They look under every rock, they search every crevice. They have the luxury of time (for the most part)...we do not.

Once the goals for our investigations have been established, we can apply the Alexiou Principle to further clarify our actions.

The Alexiou Principle states:
1. What question are you trying to answer?
2. What data do you need to answer that question?
3. How do you extract that data?
4. What does that data tell you?

Your questions need to be as specific as possible. You cannot simply say things like, "I want to find all signs of bad guy stuff", or "I want to find everything that this guy did wrong." Some good examples of well worded questions are:

1. How did the intruder gain access to the customer's network?
2. What mechanism did the intruder use to gather customer data?
3. How did the intruder get the stolen data off the customer's network?

These can be answered clearly with one sentence each.

1. The intruder gained access to the customer system by using a weak pcAnywhere password.
2. The intruder used a packet sniffer to detect and compile track data in transit.
3. The intruder used FTP to send files containing the stolen track data to his server.

There will obviously be much greater detail surrounding each question; however, this is a good example of how you can be very precise in your answers. Don't take two paragraphs to say what you can say just as well in two sentences. Most customers are not interested in verbosity; they just want to know what happened, and how.

Once you have your questions outlined, you can begin to search for the data that will provide you the answers. For example, if one of your questions is, "How did the intruder gain access to the customer's network?", you are going to look in places that contain data about system access. You are NOT going to scan the machine for viruses, look for pornography, or check for rootkits. Why not? Because they have nothing to do with system access. You WOULD check event logs, application logs (like pcAnywhere or LogMeIn), firewall logs, ntuser.dat files, and the system and software registry hives.

With as much data as there is in volatile memory, RAM dumps, and system images, it's very easy to get overwhelmed - something referred to as "analysis paralysis". You have theories buzzing around in your head: "What if the attacker did this? What if he did that?" Don't fall victim to that kind of thinking. Keep your hypothesis tied to the data. Let the data guide the direction of your case. Don't try to force the data to fit your ideas about the case.

We only have a limited time to deliver final reports that clearly and concisely meet the customer's expectations. We do not have the luxury of time, and cannot possibly find everything that may be "wrong" with customer systems. We have been hired to answer questions...that's it. So answer them thoroughly, and in a manner that the customer can easily understand. If you stumble across something they have not asked about (or paid for), then bonus...include it in the report as an additional finding, but don't go looking for those things.

I have heard customers at the conclusion of a case state, "Why did you do X? I didn't ask you to do X. I asked you to do Y and Z! I want all of the money I spent on you finding X refunded to me. It was not in the contract, and I am not paying for it!" I have also been on the other side of that conversation, in which a customer told me, "Why didn't you find Z? I wanted you to figure out Z!" To which I replied, "Hey...remember the SOW conversation we had, where we outlined the goals of the investigation? You agreed to all of those items, and we put them in a contract...that you signed. You asked me to figure out A, B, and C...which I did...very clearly. If you want Z, that's fine...I will find Z, but we will need to add hours to the SOW." They didn't have any rebuttal because I MADE SURE to cover the expectations before sending over the SOW.

Develop your investigation plan based on what the customer wants. Restate their goals to them to ensure there have not been any miscommunications. Apply the Alexiou Principle to each of the goals, and get working - the clock is ticking.

Monday, August 31, 2009

Timewarp

OK...so it's been a long time since my last post...sorry...got really busy with cases.

Anyway...one of my recent cases prompted me to write this blog posting (along with some much needed prodding from Harlan). In this case, I was asked to determine the authenticity of a series of documents. Specifically, could I tell if the documents had been altered in any way so as to obfuscate their original chronological data. To determine this, I was provided an EnCase image of a removable storage device that contained the documents. Nothing more.

So instead of simply loading the image into EnCase, checking out the MAC times, and calling it a day, I decided to do a proof of concept. I needed to find out what abnormal looked like so that I would know it when I saw it.

I created an Excel spreadsheet using Microsoft Excel 2008 version 12.1.9 (090515) on an Apple MacBook Pro running Mac OS X 10.5.7, Darwin kernel 9.7.0. The Excel file was named "pogue-test.xls" and had a MAC time of 11:31 AM. I copied this file over to a Windows test box running Microsoft XP Professional v5.1 SP3 at 11:33 AM, and subsequently opened it at 11:39 AM on 08/07/2009.

Next, a new column was created in the spreadsheet and it was saved. As expected the Access and Write times changed. The new times showed 11:43 AM 08/07/2009.

I then used Timestomp (an anti-forensics tool commonly used to modify timestamp information) to modify just the Last Written time - pushing it back one week to Friday 07/31/09 at 5:05:05 AM. As expected, when I used the "dir /T:W" command, the new date appeared, NOT the original date.
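In case you have not used Timestomp before, the invocation looks something like this (a hedged example based on the copy I used; check its help output, since the expected date format is picky):

timestomp.exe pogue-test.xls -m "Friday 7/31/2009 5:05:05 AM"
dir /T:W pogue-test.xls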

I then used Harlan Carvey's "oledmp.pl" and "wmd.pl" to display the OLE times. The results displayed from the cmd line showed the time modification, but the Perl scripts did not. This is because the Perl scripts pull the OLE times, which are stored within the document itself and are separate from the file times displayed by the operating system.
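Both scripts simply take the document as their argument (shown here against my test file; I am assuming the versions from Harlan's book materials, so your copies may differ slightly):

oledmp.pl pogue-test.xls
wmd.pl pogue-test.xls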

Timestomp was used again, this time to modify all of the timestamps (via the -z flag) to "Thursday 7/30/09 6:06:06 AM". When the times were listed with "dir /T:" and each of the time variables (C, W, A), they all reflected the NEW time.

Now that the file appeared to have been modified as interpreted by the OS, I again used Harlan Carvey's Perl scripts to extract the OLE data. The scripts displayed the REAL times and not the MODIFIED times.

A second exercise was performed using TrueCrypt v6.1a to create a 10MB volume. This volume was mounted as a physical drive on a system running Windows XP v5.1 SP3, and subsequently imaged with FTK Imager Lite v2.6.1. The resulting raw image was copied to an Ubuntu Linux 9.04 host running kernel 2.6.28-14. The Sleuth Kit (TSK) v3.0.1 was used to create a body file from the image (using the "fls" utility). The body file was then parsed with "mactime".
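If you have not built a timeline this way before, the two commands look something like this (hypothetical image name; the -m flag sets the path prefix used in the body file):

fls -r -m / truecrypt-volume.dd > bodyfile
mactime -b bodyfile > timeline.txt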

The times on "pogue-test.xls" reflected the date/time when the image was created and the spoofed modification time of "Thu Jul 30 2009 06:06:06". The file was then extracted from the image using FTK Imager Lite v2.6.1 and saved to the local system. Again, using Harlan Carvey's Perl scripts "oledmp.pl" and "wmd.pl", the actual MAC times were extracted and displayed in the exact same manner as in the first test.

The conclusion derived from this proof of concept exercise was that the operating system and subsequently EnCase (if used) would display the spoofed file times. By using forensic specific utilities (like Harlan Carvey’s Perl scripts) that extract timestamp information from the OLE of the Excel document, I was able to display the actual times.

If the files provided to me had been modified in any way, there should have been a variance between the timestamp information displayed by the operating system and EnCase, and the OLE times extracted by the Perl scripts.

Then I thought, "What if the document creator modified his local system time with the date/time icon from the Windows control panel? What would that look like?" So, I conducted another proof of concept exercise.

I used the date/time utility from the control panel and set the year on my system to 2020 instead of 2009. I then opened up IE and typed in some URLs, opened a cmd prompt and ran some commands, and opened a couple of programs using their desktop icons. What I opened or typed is irrelevant; the important thing is that my actions would be tracked by the UserAssist key and the TypedURLs key in my ntuser.dat file. The last write times for each of these actions should reflect the year 2020 and not 2009.

I extracted my ntuser.dat file using FTK Imager Lite v2.6.1 and parsed it using Harlan Carvey's RegRipper. Lo and behold, I was correct. All of the dates recorded by the registry for my actions reflected the spoofed date. ALSO, when I changed the system date back to 2009, the timedate.cpl last write time also showed the spoofed year of 2020.

So, there are a few good takeaways from this case and proof of concept.

1. Don't rely on a single tool to give you your answers. Don't be lame and simply load your image into EnCase and call it a day. That's weak, and it's not forensics...it's called being lazy.

2. If you don't know what "abnormal" is going to look like, figure it out. Conduct some proof of concept exercises so that you at least have some idea of what you are looking for.

3. Make sure you inform the customer of what you were able to do, and what you were not able to do. If you were not able to perform certain actions, tell them why, and request that data. In my case, I requested an image of the original system so that I could look for additional data points in the registry that would support chronological modifications.

Good stuff! Sorry again for the break in writing...I will try not to lapse like that again!

Friday, July 10, 2009

SANS Forensic Summit 2009 - Report

Kudos to Rob Lee for putting on the best Forensic Summit I have ever attended or been a part of! Being able to hear speakers like Harlan Carvey, Ovie Carroll, Richard Bejtlich (pronounced BAIT-LICK), Jesse Kornblum, Jamie Butler, Troy Larson, and Eoghan (pronounced OWEN) Casey all in one event is pretty impressive. Now throw into the mix representatives from the FBI, Secret Service, DoD, Georgia Tech, and various local, state, and federal agencies, and you have something pretty special. The quality of the speakers at this year's summit made this THE conference to be at in 2009! Again, great work Rob!

So, with all of these great forensic minds in one place, what were the highlights? Was there a pervasive theme, or many scattered ones? Are we all, as forensic investigators and incident responders, seeing the same things, or does each agency face unique challenges? To answer those questions: "Yes, yes, and yes".

Obviously the various agencies represented face challenges that are unique to their organization. Most interesting to me were the challenges faced by the US Bureau of Prisons! Inmates are extremely clever in acquiring, hiding, and using cellular phones…much more so than I ever imagined. In some cases phones are being inserted into FROGS, which are subsequently launched over the prison walls/fences. While that seems funny, this is a huge problem faced by the prison systems. With cellular phones, inmates can still conduct much of their criminal activity from within the prison walls – kind of defeats the intent of putting them behind bars in the first place.

Most law enforcement agencies shared the common challenges of funding and personnel. Money is tight, which affects every aspect of their jobs. Let's face it, forensic hardware, software, training and education, and books are expensive…not to mention what you have to PAY someone who is experienced enough to perform comprehensive forensic investigations. The agencies represented indicated that they are making do with less and getting things done, albeit slowly. Many cases are pushing several months, and in the most extreme examples, several years! Compare this to my average case, which doesn't last much longer than 3 – 4 weeks (max), and you have a drastic disparity.

I had a good conversation with Ovie Carroll, Director of the Department of Justice Cyber-Crime lab, one afternoon, in which we talked about the merit of having Law Enforcement agencies outsource some of their casework to external organizations. I think this is a fantastic idea that would leverage the expertise in the private sector (a HUGE percentage of which are either prior service military, former Law Enforcement, or both, and have held or currently hold high level security clearances) to accomplish casework more quickly and efficiently. Additionally, these relationships could be used to provide low cost (and in some cases FREE) training and education to our cash strapped brethren. I cannot stress enough how much I feel this concept could provide much needed assistance in an area where it's desperately needed. If you have not already done so, reach out to your local Law Enforcement agency and find out if there is any way you can assist!

The Law Enforcement agencies also shared that they are all facing the same central issues of identity theft and carding (credit card theft). Among other crimes, these two are surfacing at the top of the list nationwide. Working for Trustwave, the majority of my cases involve carding, and I can assure you that this is a multi-billion dollar (that's right, I said BILLION…with a "B") business for hackers and is not going anywhere, anytime soon.

I also noticed a couple of central themes that emerged from the various forensic and incident response panels – getting back to basics, and information sharing.

By getting back to the basics, I mean approaching your casework with a solid foundation of forensic theory, methodology, and technical understanding. Planning your work – working your plan! Knowing what data you are going to look for, and then surgically going after and interpreting that data – allowing the DATA to develop your theory, not cramming the data into your preexisting theory. Knowing your tools…what do they do, why, and how. Then carefully, and methodically documenting what you did, how you did it, and what the results were.

The other theme that surfaced was the need to share information. It seems that both the private sector and the various Law Enforcement agencies are suffering from a "stove pipe" mentality, and intel is not being shared – which is a crime in and of itself! Now obviously I am not talking about information that would violate a Non-Disclosure Agreement (NDA) or compromise a case, BUT we can be sharing information like emerging threats, trends, malware data (hashes and/or artifacts), and the sources of certain attacks (at least IP addresses). Again, I spoke at length with Ovie Carroll and Harlan Carvey about this, but our conversation also included Special Agent Jennifer Kolde of the FBI San Diego office, Special Agent Andrew Bonillo of the USSS DC office, and Chris Kelly of the Office of the Attorney General for the Commonwealth of Massachusetts. The same feelings were shared by all parties…share what you can, when you can. Doing so will only help everyone! At the end of the day, aren't we all after the same thing – catching the bad guy? If that's the case, then as a body of professionals, let's really strive to cast aside the departmentalism that has prevented the flow of information to this point, and focus on frequent and directed intel sharing.

I could spend the next several pages going over the great talks and the takeaways from the conference, but that would probably make my fingers hurt, and you would probably get tired of reading. Suffice it to say that if you missed out on the conference this year, DON’T DO IT NEXT YEAR! I will echo the statement of Rick VanLuvender from First Data Corp who said, “If you can only make one conference this year, THIS is the one to make”! I could not agree more! This was a fantastic event!

In the next couple of weeks I will be writing about the lessons learned from the conference. If there is something you would like to see covered in more detail, or if you attended the conference, and I am not blogging about something you would like to see me cover, please let me know! My email address is in my profile, or if you were at the conference you likely have my business card. I hope you are looking forward to the next few posts as much as I am about writing them!

I will leave you with one final thought. In the handbook of the top 20 computer security jobs, Incident Responder/Forensic Investigator was #1! Yes, if you are reading this blog, you are either related to me (my wife and mother follow my blog for moral support…I love them, but they usually have NO idea what I am talking about) or you are in the same field I am. That being the case, you have the coolest, sexiest, most sought after job in the computer security world. If that doesn't excite you…well…then you are either brain dead, you have no pulse (which would make you physically dead), or you really just don't get it and you should probably find another line of work.

Monday, June 29, 2009

Y the 101?

As some of you have noticed, the first few posts on TheDigitalStandard are centered around foundational principles of investigation. I did this intentionally because I believe wholeheartedly that without integrating these critical building blocks of investigative theory into your personal methodology, you will not - and I really mean this - WILL NOT be anything more than an average investigator.

I started with "Semper Gumby", where I wrote about being flexible and allowing the evidence to guide you rather than letting your theories dictate what the evidence says. I followed up with "Trust", where I wrote about trusting your instincts, never being satisfied with simply doing the minimum, and allowing yourself to be woken up at zero-dark-thirty to work on a case. I finished up with "The Alexiou Principle", in which I wrote about clearly defining expectations, establishing an investigation plan, and following through with that plan by asking and answering quantifiable questions.

Now, I realize that this may sound rather harsh, so please let me assure you - I intended it to be. I assume that if you are taking the time to read this blog, you are trying to improve your forensic and incident response skills. It was not too long ago, that I was where you most likely are now...interested, overwhelmed, inundated. Interested in this line of work...how to break into the field, where to begin, who to contact, who will give you a chance? Overwhelmed with tools, techniques, terms like MD5, non-repudiation, analysis, and file carving. Inundated with tools, EnCase, FTK, TSK, Helix, SMART, F-Response, and MFL (just to name a very few).

So you being where you are, I would also assume that you want to get better. You want to HAVE these tools, and more importantly you want to know how to USE these tools! How do you parse a registry? What is file carving anyway? How do I use a regex to find credit card data? What does all of this look like? AHHHHHHH!!! There is TOO MUCH!

Let me assure you, you are correct - there is too much for any one person to know. Be very, very wary of anyone who tells you the contrary. I don't care if they've "been doing this for 20 years". They don't know everything. None of us do. So what? Are we doomed? Are we all bound for mediocrity? The answer is a resounding, "No"...no you are not...IF and only IF you have a solid foundation of understanding upon which to build.

Build your foundation on critical concepts like "Locard's Exchange Principle", "Occam's Razor", "The Alexiou Principle". Use critical tools like organization, clear concise questions, and sound logic. Develop useful habits like taking good notes, using a buddy to discuss your case work, and remaining mentally pliable.

Listen, I have heard it said that the Peace Corps is the toughest job you'll ever love. I think that is crap. Being a Computer Forensic Investigator is BY FAR the toughest job you'll ever love. So if you are like me, and you really love this line of work, then why not be the best at it? Be the best there is! To do that, you have to develop the core skills necessary for greatness. Look at some of the greats in the world of sports...Tiger Woods, Curt Schilling, Walter Payton, Michael Jordan...they were/are the best at their particular sport because they worked hard. Harder than anyone else. They were the first on the field, and the last to go home. They hit just one more putt, reached for one last yard, and threw to just one more batter. They were anything but average, and each of them shared one thing in common - mastery of the basics that made up the game.

The core message here is that if you are going to do this job, don't be average. Master the basics...get SO good at those that you could do them in your sleep. Then, you will be ready to truly be great! It's a long journey which I have yet to complete. I work hard everyday to get better, learn more...get that one last regex working before calling it quits.

Master the basics, and be great! I hope to be there someday too!

Saturday, June 27, 2009

The Alexiou Principle

One of the most common mistakes I see made by computer forensic investigators of all levels of experience, Law Enforcement and Corporate alike, is the lack of a clear direction. They know there is "bad guy" stuff out there somewhere, so they go attack it without first establishing an investigation plan. In my experience, this methodology NEVER yields solid results, and ALWAYS leads to confusion, wasted time, and questions going unanswered. As forensic investigators, using absolute terms like "never" and "always" is a NO-GO, but in this case I find that it's true. Please comment if you have experienced the contrary.

In the spirit of coming to you with solutions and not problems, I would like to share the investigative theory that I use and that has been proven to work EVERY time..."The Alexiou Principle". Named after its creator, Mike Alexiou, the Alexiou Principle states four questions for the investigator to answer:

1. What question are you trying to answer?
2. What data do you need to answer that question?
3. How do you extract that data?
4. What does that data tell you?

Every case has a question - in fact, most cases have multiple questions. Regardless of what the circumstances are, there is something you are trying to answer. I encourage my team to make sure these questions are clear with their customers to ensure that expectations have been set appropriately on both sides. Few things are worse than coming to what you believe to be the conclusion of an investigation and having the customer ask you, "What about X?" and you're all, "What? We never discussed X!" To avoid this, hammer out the questions the client wants to have answered and review them with the client! That way, they know what you are going to do, and more importantly, YOU know what the expectations on you are! I also encourage my team to log their questions in their Case Notes. Doing this not only helps them to keep track of what it is exactly that they are doing (preventing scope creep and analysis paralysis), but it also allows them to track what the answers to those questions are and how they came about finding those answers (making the writing of their final reports SOOOOOO much easier).

Once you have established what it is that you are trying to answer, it's time to figure out what data is going to give you the answers you are looking for. Depending on the question and the operating system, there are likely several places that you will need to look. The key here is being able to phrase precisely and clearly what it is that you are looking for. For example, I have worked several cases involving AS/400s. Now, although my first job out of the Army was as an AS/400 administrator, that was more than 10 years ago, and I was only in that role for about six months. So any knowledge I had about the AS/400 or z/OS is totally gone (probably replaced by important facts like which restaurants have the best bread pudding, and how to help my daughter hold a "proper" tea party)! But who cares!? Not me, IF I can either ask the customer's admin or I have Internet access. Say, for instance, I wanted to know about user activity on an AS/400. A quick search on the web showed me that I would need to run DSPSECAUD to see if security auditing was activated, and then I could display the audit journals by running DSPJRN JRN(QAUDJRN). I know that is kind of an obscure example, but you get the idea.

Once you have identified your data, you need to extract it. How you do this is really up to you and will depend greatly on your case, the evidence you have on hand, the data you have already collected, and the tools in your toolbox. I won't go much more into this question, because I think it's pretty self explanatory. The one thing I will say is that you should treat this question like a 5th grade math problem: SHOW YOUR WORK. State in your notes and in the final report that you extracted data X, Y, and Z for the purposes of BLAH.

Finally, and most importantly (in my opinion) is what the data tells you. Some common mistakes investigators (including myself!) make are jumping to conclusions about the data, misinterpreting the data, or forcing the data to fit into their theories. When you are looking at your data and think you have an answer, go back to your notes and outline the process you just took to get there. Seriously, this takes all of 3 minutes, but will prove to be invaluable to your case. I will state my question, indicate where the data exists, indicate how I extracted that data to include the tool(s) and version(s), and what I believe that data is telling me. Once I have all of that information written down, I will read it over a few times to make sure it says what I want it to say (as opposed to simply being the first thing I wrote down - which is often nothing more than a series of jumbled thoughts and comments). Then I show it to a buddy and ask him if it makes sense. A peer review of your conclusions is a great tool you can use to ensure you are not making incorrect assumptions, or misinterpreting data. Then you would repeat this process for every question you have been asked to answer by your customer.

If this seems really easy to you, that's because it is. Too many investigators introduce unneeded complexity into their investigations by not focusing and not having a clear investigation plan. Maybe these people think they need to make things seem complex so that they appear to be smart, or like "real" pros...seasoned and hard core? I don't know; in my opinion, having an attitude like that is counterproductive and a huge waste of time.

Being a good investigator has nothing to do with how many $5 words you can throw out, or how many years of experience you have! It's about figuring out what happened! I promise you, if you apply the Alexiou Principle and go into your cases with a PLAN, you will be a better investigator, regardless of whether this is your first case or your 500th.

Tuesday, June 23, 2009

Trust

So, as you are probably aware, I am working on a case right now that has an interesting pair of malicious binaries. In a previous post, I stated that I was not able to get one of the binaries working either in a VM or on my dirty box. Well, after re-reading that post, and sleeping on it (or at least trying to), I couldn't stop thinking about it...something was not sitting well.

My instincts told me that I was missing something. So rather than simply dismissing it, or chalking it up to indigestion (which I hear Peanut Butter Cup cereal will do to you), I got out of bed at 0400, started the coffee pot, and decided to trust my instincts.

After trying to run the malware again unsuccessfully, I realized that I had not "Googled" the error message. That is something I usually would not have forgotten, but for whatever reason, I had. So I did...and what came back had something to do with the Microsoft .Net Framework. I then found the download page from MSDN (which if you don't have bookmarked, you REALLY should) and grabbed "dotnetfx.exe" - let's remember that name for later (in a post I am working on for tomorrow)! So, I ran the executable, installed .Net v2, and kicked off the malware again. This time, NO ERROR! While I was not sure what it was doing at this point, I was confident it was doing something, as the error message that I had become so accustomed to was no longer plaguing me! The point here is that I trusted my gut...something was not sitting right, and I was determined to find out what...which I did! And THAT ended up being HUGE in this case!

Next, I fired up Process Explorer and restarted the malware. It appeared for like a second...maybe two, then disappeared. Well, I had already opened the binary in both a hex editor and PEDump without any interesting results. I had to get at the process as it ran! I tried to dump RAM, but I couldn't do it quickly enough to catch the process. After about 45 minutes of launching the binary (which I could do from the cmd line) and trying to click on the process when it popped up in Process Explorer, I got it! What I saw in the strings as the process ran in memory was that it was trying to open an FTP connection to a remote IP address...AND...it gave me something that looked a whole lot like a username and password. Again, by trusting my instincts, I not only got the malware working, but I got an IP, a username, and a password (later validated by the USSS...so I can state with confidence that I got it!)

So as I sat on my back porch celebrating, my neighbor (Glen Painter) came over. As I talked to him about my victory, I told him I still had one question. The malware definitely FTP'd the customer data off the network, but how did it get it in the first place? Well, my neighbor happens to be a very good .Net Developer. So we downloaded the free version of Red Gate's .Net Reflector and decompiled the binary into C#. By taking this last step, I was able to see that the malware was listening on the TCP port that the Point of Sale (POS) software used to transmit credit card numbers from the POS terminals to the back of house (BOH) server. As it listened, it created a file that contained the results of the traffic sniffing! This file format was the exact same as the format of the files on my forensic image that I suspected were being exported! By trusting my friend, I was able to learn something new, put a new tool in my arsenal, and really bust this case wide open. Thank you GLEN!

In this particular situation, the technology was not my obstacle...it rarely ever is. Truth be told, I hit a rut. I thought what I had done was good enough, and I was prepared to go on. So that made me wonder...how many other investigators do the same thing? Get stuck on something, but instead of waking up at 0Dark30 and knocking it out, they slap whatever they have into their report and call it a day.

I also have to give credit to Harlan. He was up around 0500, and I was able to bounce some ideas off of him. This brings me to my last point...trust your friends (#2). Look, we have a tough job. Every case I get is difficult in some way, but that's what makes it fun. It's a challenge! The key is to look that challenge straight in the eye, and kick it in the teeth...not fold at the first sign of adversity. Having a trusted friend that you can bounce your thoughts off of is a great tool. You need someone you can talk candidly to, who will tell you when you are way off, and who you won't lose any face with when you make mistakes.

Since Harlan and I don't work together anymore, we obviously don't share customer data, so we just stick to the forensics. After about 30 minutes of talking to him, feeling a bit sheepish a time or five, and gathering my thoughts, I was back on track, and able to find what I was looking for and then some.

Look, if you don't get woken up in the middle of the night thinking about case work, I will go out on a limb and say you may be in the wrong line of work. I have solved many many cases between the hours of midnight and 0600 when something was bothering me so much that I couldn't sleep...I HAD to figure it out. I'm sure there are investigators out there, who can be in the middle of difficult case, have some unresolved issues...throw some hash sets around hoping for a hit, and when they don't get anything they go to bed and sleep like babies. I am just not one of them.

Trust your instincts, Trust your friends, and Trust your abilities. Trust really is one of the best tools in your toolbox!

Monday, June 22, 2009

Semper Gumby

I am taking a brief detour from my Live Analysis series to post something that I feel is extremely important in CF and IR - the need to be flexible. I LOVE the line popularized by the character Gil Grissom on the TV show "CSI"..."Let the evidence guide you".

Having been an investigator for a number of years now, and approaching 100 cases (yes...I keep track), I have seen a very common and very frightening mistake made by both veteran and rookie investigators. They generate some kind of idea about the case early on, without really looking closely at the data, and then they try to force the evidence to fit their theory. Once this starts happening, it doesn't matter whether the data supports their theory or not...I think it becomes a pride issue...they simply cannot be wrong!

Having worked with some outstanding folks like Harlan Carvey, Don Weber, and Colin Sheppard, we have always said..."Leave your ego at the door". Seriously...an overinflated sense of self or a lack of humility will NOT serve you well in this business.

In the case I am working on right now, I had two thoughts early on...one was malware and the other was a packet sniffer. In the first case, I found a process through live analysis using Sysinternals' Process Explorer that made me suspicious. I noted it in my case notes and moved on. When I got back to the lab, I extracted the binary, took an MD5 checksum of it, and compared it against a known good (I downloaded the binary of the same name from Microsoft). Well, the MD5s matched. Theory blown.
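If it helps, md5deep will do that comparison for you in matching mode. A quick sketch with hypothetical file names: hash the known-good copy into a file, then ask md5deep whether the extracted binary matches. If it prints the file name, you have a match.

md5deep knowngood.exe > known.md5
md5deep -m known.md5 extracted.exe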

Next, I found a process that I thought for SURE was a packet sniffer (heh...it was even named packetsniffer.exe)! The problem was that when I ran PromiscDetect during my volatile collection process, the NIC was NOT in promiscuous mode. So regardless of what that process was supposed to do, it was not doing it at the time I collected the data. When I got back to the lab, I extracted the binary and ran it on a VM with the same OS as the customer's. It failed to run and gave me an error. Thinking maybe it had a red pill (VM detection), I put it on my dirty box and tried it again...same thing...it errored out. Just to make sure it was not trying to "trick" me into thinking it had errored when it was really working, I opened the binary with a hex editor and extracted the strings to see if the error message was present...it was not. Also, I ran PromiscDetect on my host to see if my NIC was in promiscuous mode...it was not. Theory blown. The best I could do in this instance was note that I found the process, but that I was unable to duplicate a scenario in which the binary was actually sniffing packets.
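By the way, you can do that last string check from the command line instead of a hex editor. A sketch, assuming Sysinternals strings is in your path and using a made-up error message; if findstr comes back empty, the error text did not come from the binary itself.

strings packetsniffer.exe | findstr /i /c:"could not open adapter"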

The real takeaway here is to not let your theory guide the evidence - like Forrest Gump trying to put one of those round pegs into a square hole. Rather, let the evidence guide your theory, and always be prepared for and comfortable with being wrong. You need to be able to switch directions, and quickly. You CAN be wrong, so leave your ego at the door.

Semper Gumby!

Saturday, June 20, 2009

Live Analysis Part I - Changing of the Guard

There has been a drastic paradigm shift in the forensic community, one that folks like myself, Harlan Carvey, Jibran Ilyas, Colin Sheppard, Cory Altheide, and Don Weber have been a major catalyst for. The days of "image everything" and letting tools like EnCase and Gargoyle sort it out are gone. If you expect to remain relevant in the forensic community and have an impact on the industry, you NEED to know how to perform live analysis.

The first thing we need to look at is the benefit of analyzing a live system: you are dealing with the compromised system itself! You may never get a chance to lay your hands on that keyboard again, and there is a wealth of information that can be learned from real-time analysis. I will cover some of the tools I use and why, but there is a major point which needs to be understood before we proceed. I know how all of the tools I use interact with the various Windows platforms (2000, XP, 2003, 2008, and Vista). I have tested them and know what kind of fingerprint they are going to leave. MAKE SURE you do the same for any tools you use, and document what you are doing, when, and why. This will allow you to identify any "tracks" you make on the system, should a forensic image later be required, for data reduction purposes.
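One simple habit that supports that kind of data reduction: hash your entire response toolkit before you walk in the door, so your own binaries can be excluded from later analysis. A sketch, assuming your tools live on a thumb drive mounted at E:\tools (the path is hypothetical):

md5deep -r E:\tools > toolkit.md5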

As you look at a live system, you need to have something in mind that you are looking for...i.e., go in with a plan. Simply looking at a system and "trying to find bad guy stuff" is both unrealistic and illogical. As malware (simply defined as any program used for malicious intent) has become more advanced, its presence has become less obvious. You WILL NOT see processes called "hacked.exe", "P0wn3ed", or "slkjdhfbc.exe" - you WILL see processes called "lsass.exe", "svchost.exe", and "winlogon.exe". The latter are process names you would normally see on a typical Windows system, and they would likely not draw any suspicion from an admin or your "average" investigator. This is why you need to go in with a plan.
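A quick sanity check when you see those familiar names is to verify where each one is running from and who its parent is. A hedged sketch using WMIC (built into XP and later); a legitimate svchost.exe should live in C:\Windows\System32 and have services.exe as its parent:

wmic process where name="svchost.exe" get ProcessId,ParentProcessId,ExecutablePath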

The most successful model I have used (and still use) is the Alexiou Principle (named after its creator, Mike Alexiou). It asks the following questions:

1. What question are you trying to answer?
2. What data do you need to answer that question?
3. How do you extract that data?
4. What does that data tell you?

By employing the Alexiou Principle, you can sit in front of a live system with a goal in mind instead of just "willy nilly" tromping all over your suspect system.

In the next post, we will cover some live response tools, what the output looks like, and what that output means.

Windows Forensic Analysis 2/e Review

I recently purchased and read (thanks to two back-to-back cases with a nice layover in Chicago due to weather) Harlan Carvey's new book, "Windows Forensic Analysis, Second Edition".

In my tenure as a security professional, I have read a LOT of books. I can honestly say that, without question, this is the best, most useful forensics book I have read. I am not going to go into a breakdown of chapters; however, I will say that the chapters on Live Response, Memory Analysis, Registry Analysis, File Analysis, and Executable File Analysis are fantastic. My copy is full of highlights and margin notes! I have also already started working some of the tools from the DVD into my volatile collection scripts.

What Harlan does really well is make very complex topics (like the Windows Registry) very understandable. He is also able to tie in why each topic is important to you as an investigator, and what kinds of things you need to be on the lookout for.

Quite frankly, if you don't have this book yet - you need it! If you are serious about your trade, and want to learn more about how to conduct a comprehensive analysis of Windows based systems - you need this book! Seriously, the chapter on Registry analysis ALONE is worth the price of the book.

Buy it now...seriously...I cannot emphasize that enough!

SANS Forensic Summit 2009

Don't forget that the SANS Forensic Summit is coming! There will be some fantastic speakers, including yours truly!

Also, I found out recently that Syngress will be sending multiple copies of my book, Harlan's new book, "Windows Forensic Analysis, Second Edition", and Eoghan Casey's "Malware Forensics"! You can come get your copies signed, meet the authors, and ask questions.

This is going to be a great event and I am really looking forward to it! Hope to see you there!

Welcome

Welcome to The Digital Standard! After much prodding from my good friend Harlan Carvey, I decided to create this blog.

A little bit of background on me, and why you should care that I have a blog in the first place. I am a Senior Security Analyst for the SpiderLabs at Trustwave (www.trustwave.com). I have been in the security industry for going on 10 years and have been working in the computer forensics field for the past four years. I have a BS in Applied Management, an MS in Information Security, and the CISSP, CEH (Certified Ethical Hacker), CREA (Certified Reverse Engineering Analyst), and QSA (Qualified Security Assessor) certifications; I am on the HTCIA Board of Governors and am a former US Army Signal Corps Warrant Officer...oh yeah, and I am the principal author of "Unix and Linux Forensic Analysis" by Syngress.

This blog will be dedicated to tips, tools, techniques, interesting case work, emerging threats and attack vectors, and anything else dealing with computer forensics which I think is applicable.

So why the name "The Digital Standard"? Well, I think the "Digital" piece is pretty self-explanatory, so I will skip over to the term "Standard". As a soldier, I learned at a very young age that "the standard is the standard". Being somewhat confused by such an underwhelming statement, I continued my search for truth. I quickly learned that the "standard" was a basis for measurement against which I was being assessed on any number of things, from my haircut to the score on my PT test. It was the "bar", and I was to meet or exceed it or expect consequences.

For the purposes of this blog, I think the name fits since I strive to be the best forensic investigator I can be. I read, study, and research all in an effort to get better with every case. I feel that THIS is the standard...constant improvement without a destination...just continually getting better.

Thank you for visiting and I hope you will return often!

Chris