Tuesday, March 30, 2010

Timeline Analysis Part 4 : Timescanner

In our last episode of Timeline Analysis, we covered the magic that is “log2timeline”…thank you Kristinn! The question I left y’all with (remember, I do live in Oklahoma) is, “How great would it be if we had a tool that would leverage the log2timeline-y goodness, but not require you to feed it the local system logs?” I told you such a tool existed, and that it was called “Timescanner”. In this post, we will cover this tool.

Timescanner is another tool written by Kristinn, and it does exactly what we talked about…it scans the target (either a live file system or a mounted forensic image) for the types of log files supported by log2timeline. It then parses those files and adds them into the body file, which we will ultimately use to generate our timeline. Let’s take a look at how the tool works.

Again, this is a Perl script written natively in Linux. Getting it working on my Windows XP box with Active Perl 5.10.1 was no proverbial walk in the park. It only took me a few hours of updating Perl modules (.pms) and various other dlls. One thing you will absolutely need to install is “WinPcap”. Before you even start trying to get all of the .pms for Timescanner, go grab it and install it. It will save you some headaches in the long run.

Recall from my last post that whenever you run a Perl script for the first time, it’s a good idea to determine which modules you may need to load. So to do this…you might try something like this…

C:\tools\log2timeline> more timescanner (If you prefer, you can just open the script with any text editor.)

This will give you a full text listing of the Perl script, but the section we are really interested in is this…


Now, this is not to say that these are ALL of the .pms you will need, but making sure these are installed properly will minimize the amount of tweaking you will have to do to get the script working.
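If you have the UnxUtils grep handy, you can also pull that module list out without scrolling through the whole script. Here is a self-contained sketch; the sample script below is just a stand-in so the example runs anywhere, and on a real system you would point grep at the timescanner script itself.

```shell
# A tiny stand-in Perl script so this example is self-contained;
# in practice, run grep against timescanner instead.
cat > sample.pl <<'EOF'
#!/usr/bin/perl
use strict;
use Time::localtime;
use Getopt::Long;
EOF

# List every module the script pulls in (the lines starting with "use")
grep '^use ' sample.pl
```

Each line that comes back is a module you may need to install before the script will run.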

Once you have your .pms loaded, you are ready to kick off Timescanner.

The syntax is pretty basic and will look like this…

C:\tools\log2timeline> perl timescanner -d C:\

I skipped the -z option for time zone, since I am in the Central time zone and I don’t need to make any modifications. However, if you are working a case in one time zone and your evidence is from a different time zone, this will be a useful option. To see which time zones are supported, simply use this command…

C:\tools\log2timeline> perl log2timeline -z list

OK…so really…all you have to do is run this command…

C:\tools\log2timeline> perl timescanner -d C:\ > bodyfile4

…and timescanner will do the rest! Then, you simply use mactime again to generate the timeline, and you will have added all supported log file types from the target drive/image to your timeline!

Here is a snippet of my timeline…

Here you can clearly see my actions for the day! From what news I was reading from the BBC to creating the file I am using to write this blog entry…all right there!

So, what’s even cooler is that I now have a file that is also fully searchable!

Let’s say I wanted to see the activity from May 10th or May 11th. Or what about hits for certain keywords like “pinball” or “DEFCON”? Well, just use grep and you can find them right away! Like so…
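Those lookups are one-liners with grep. Here is a self-contained sketch using a three-line mock timeline; the real file comes out of mactime, and the dates, file names, and field values below are invented purely for illustration.

```shell
# Mock timeline standing in for real mactime output; all entries invented.
cat > timeline.csv <<'EOF'
Mon May 10 2010 09:15:02,1024,m...,r/rrwxrwxrwx,0,0,51,C:/games/pinball.exe
Tue May 11 2010 14:03:44,2048,.a..,r/rrwxrwxrwx,0,0,72,C:/docs/DEFCON_notes.txt
Wed May 12 2010 08:00:00,512,..c.,r/rrwxrwxrwx,0,0,93,C:/temp/other.log
EOF

# Everything from May 11th...
grep 'May 11' timeline.csv

# ...and every hit on a keyword, ignoring case
grep -i 'pinball' timeline.csv
```

The same approach scales to a timeline with hundreds of thousands of lines; grep doesn’t care how big the file is.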

Now, the screenshot I provided above from MS Excel has been slightly modified to enable you to see the full paths of what Timescanner will report on. I simply highlighted the fields I did not want to see, right clicked on them, and selected “hide”. To get them back, just right click and select “unhide”.

So now, all you would have to do to create a Super Timeline is to simply use the same bodyfile as your output for each of the tools, then use that bodyfile as the input for mactime. It should be noted that this entire process, from the first time you run FLS, to the final compilation of the timeline with mactime, should take less than an hour. So as an investigator, what is that worth to you in terms of how quickly you can generate a full timeline of your suspect system and how much data is actually available to you? My answer is, “a whole freaking lot”!

If you are one of those investigators who are constantly on the search for a “silver bullet”, then I am sorry to disappoint you, but this is not it. There is no “Forensicator Pro” (hehe…thanks for the good joke though Ovie/Brett), there is no “Find All Evidence” button, and “X” never, ever marks the spot. What this WILL do for you is give you a clear snapshot of the status of the system at any given, and stored, point in time. It is a tool for generating data points by which you can build your theory of the incident based on the available evidence. Tools should never replace sound logic, forensic methodology, and due diligence.

In addition to the instructions provided on my blog, these tools (and many others) come precompiled and freely available in the SIFT Workstation v2.0. It is a FREE VM image that can be downloaded from the SANS website, and I’m certain you will find it very, very useful. A word of warning though, this image is Linux based, so if you are not comfortable using the command line…well…you need to get that way. Many of the more powerful tools out there do not have GUIs, and for good reason. There is no joy or glory in being a “push button monkey”…and frankly, you cannot be a very thorough investigator if all you know how to do is click on buttons predefined by somebody else. No offense to anyone, but if you truly want to get better at being a forensic investigator, you really need to become comfortable with the command line.

Hmmmm….sounds like another idea for a blog series?!?!?!

Tuesday, March 23, 2010

Timeline Analysis Part 3 : Log2timeline

In my last two posts we covered how to use FLS to create a bodyfile of the active file system, and how to use regtime.pl to integrate the last write times of the registry keys in the hive files into that bodyfile. Now, we are going to cover a great tool called “Log2timeline”, which is going to allow us to integrate local system logs such as the Windows event logs, browser history files, and anti-virus logs into our same bodyfile. Rob Lee recently posted about how to do this using the new SIFT toolkit v2.0 on the SANS Forensics blog.

I have not covered this before, so I will digress a bit into Perl and getting Perl scripts to run on your local machine. If you don’t have Active Perl installed, get it. It’s extremely useful and necessary if you want to run any sort of Perl scripts on your local machine.

Perl scripts have these things called Perl Modules, or .pm files. They are sort of like dlls in that they are “chunks” of code which can be used over and over by different scripts, instead of the developer having to rewrite that specific function. When you are going to run a new Perl script for the first time, you need to open the script in some sort of text editor and see which .pms the script is going to need. In the screenshot below, I just used the *nix command “more” to look at the first section of the script. Which reminds me, if you don’t have UnxUtils installed on your Windows box, you probably want to get that as well. It contains the Windows versions of a slew of *nix commands that are, again, extremely useful.
Here is the section of the code you want to look for…



If you don’t have these modules installed, there are a couple of ways to get them. First, you can use the command “ppm”, which stands for “Perl Package Manager”, followed by the module name. So if, say, you wanted to add the “Time::localtime” module, your command would look like this…

C:\tools> ppm install Time::localtime

There are other times when you will want to install a module from a specific URL. For example, the University of Winnipeg in Canada has a great site for installing different modules. In that case, your command would look like this…

C:\tools> ppm install http://cpan.uwinnipeg.ca/PPMPackages/10xx/.ppd

Now, be advised that this can be a total pain! Getting Perl scripts to work can involve a LOT of elbow grease…much like compiling source code on a Gentoo distro! Don’t fret though…all of the packages and dlls you need are available for free on the Internet. It just takes some time and patience. A word of warning…DO NOT try to cut and paste .pms into what you think is the right directory. It’s SO not that easy. Use the ppm command and find the right .ppds. Google is your friend. =)

Once you have all of the Perl modules and dlls you will need to run your script, you can proceed. Thus ends the mini Perl lesson.

Something else you will probably want to do (for convenience) is put your Perl distro into your path. This will prevent you from having to change directories to invoke Perl when running various scripts. This is pretty easy. Simply right click on “My Computer” and select “Properties”. From there, select the “Advanced” tab and click on the “Environment Variables” button. In the “System variables” window, scroll down until you see the word “Path”. Double click on the word “Path”, or simply highlight it and click on the “Edit” button. From here you can add any directory you want into your path, with each entry separated by a semi-colon. Again, having these directories in your path will prevent you from having to actually be in that directory to launch whatever it is you are trying to run.

Download log2timeline and uncompress it into its own directory…I put mine inside my C:\tools directory, creating a directory called “log2timeline”. From the command line, you can see which files are currently supported by log2timeline by issuing the following command, as seen in the screenshot below.



So, for my example, I am going to parse my local Mozilla Firefox history file into bodyfile format. To accomplish this and for ease of use, I copied the file into my current working directory. This is not necessary though. I could simply provide the full path to the file. So, my command would simply be…

C:\tools\log2timeline> perl log2timeline -f firefox3 places.sqlite

Here is a snippet of the output…



Now, to make this part of our timeline, all we would need to do is redirect the output to our bodyfile, and parse it with mactime…like so…

C:\tools\log2timeline> perl log2timeline -f firefox3 places.sqlite > bodyfile3

C:\tools\log2timeline> perl C:\tools\TSK\mactime.pl -d -b bodyfile3 > timeline3.csv

And…when I open up the timeline with MS Excel, it looks like this…




Here you can see that I likely installed Firefox on December 11, 2009, checked some stuff about Oklahoma State University (Go Pokes!), and Googled Boone Pickens Stadium (I was actually looking for a good seat for my season tickets for the 2010 football season). Anyway…using log2timeline can add an entirely new dimension to your investigations by giving you a one-stop view of all of the system (Windows evt logs) and application logs.

Now if you come across a type of log file that is not currently supported, don’t worry. Since this is an open source script, feel free to write your own module for that specific type of log file. Or, if your Perl skillz are a bit lacking, you could contact Kristinn, and I bet he would write it for you (no promises).
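If you’re curious what such a parser module boils down to, here is a language-neutral sketch. To be clear, this is awk rather than log2timeline’s actual Perl module API, and the input log format is invented for illustration; the point is only that every parser performs the same core transformation, mapping a timestamped record into a bodyfile line.

```shell
# Invented log format: "<epoch_timestamp> <message>"
cat > app.log <<'EOF'
1269900000 UpdateService started
1269900500 UpdateService stopped
EOF

# Emit one TSK-style bodyfile line per log record.
# Bodyfile fields: MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
awk '{ ts=$1; $1=""; sub(/^ /,"");
       printf "0|[app.log] %s|0|0|0|0|0|%s|%s|%s|%s\n", $0, ts, ts, ts, ts }' app.log
```

A real module also has to handle the quirks of its log format (binary structures, locale-dependent date strings, and so on), which is where the actual work lives.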

So step back for a second and think about what I have covered in these last three posts on creating timelines. Using FLS, you can create a timeline of the metadata timestamps of the active file system. Then, using regtime.pl (a la Harlan Carvey), you can extract the last write times from the registry hive files and include them in your timeline. Now add in log2timeline, and you can include all of the local system and application logs. What you will have at the end is what Rob Lee calls a “Super Timeline” that pretty much encompasses everything that has taken place on the machine and been recorded by these various mechanisms. You can now search by date, or time, or application, or keyword, or whatever, and get a quick snapshot of what was going on at that time, or with that application.

This is a HUGE development in our profession! Not only will this add speed and efficiency into your investigations, but it will provide multiple additional data points that will either help you solidify your working theory, or will provide the additional resources you will need to reshape your theory to fit the data. Pretty sweet huh!

So, now you may be saying to yourself…This is so great Chris! I want to buy you dinner! But determining which logs to include in my log2timeline commands is kind of a pain. And I have to hit the up arrow like 17 times and change the type of log and the path. What would be really kewl is if there were some sort of tool that would do the searching for me. It would take a path as input, then it would search for all of the log files supported by log2timeline, and parse those into my bodyfile. Well…I have some good news for you…there is just such a tool! It’s called “Timescanner”, and I will cover it in my next post!

Saturday, March 20, 2010

Timeline Analysis Part 2 : The Registry

OK...so in my last post on Timeline analysis, I showed you how to create a Bodyfile using The Sleuth Kit tool FLS. Then we used a second tool called mactime, to create the timeline in both csv and txt format.

Now, we are going to add the last write times from the registry keys from the NTUSER.dat, system, security, software, and SAM hives. These hives are located in:

C:\Windows\System32\config <-- All registry hives
C:\Documents and Settings\ <-- NTUSER.dat file

First, I used FTK Lite v2.6.1 to export my local hives and dropped them into my regtime directory. Regtime is a tool written by Harlan Carvey, and is now part of the SANS SIFT Toolkit v2.0.



Now, in my example all of the hives are in my current working directory, but this does not have to be the case. If you are working from an image file, you can use the -r option in regtime to specify the full path to the hive file...like this...

perl regtime.pl -m -r /Windows/System32/config/software

NOTE...the mount point would be something like /mnt/badguy_image if you are using a Linux system or the SIFT workstation, or Z:\Windows...blah blah blah...if you are using a Windows system.

So in the screenshot below, I am simply parsing the system hive for you to see what it would look like.



Pretty cool, huh. NOW, what you can do is append the output from this command to the original bodyfile that you created using FLS. Now your timeline will include both the active file system (at least the metadata entries) and the last write times for all of the registry keys.
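One detail worth calling out: use the append operator (>>) rather than a plain > here, or you will clobber the FLS entries already sitting in your bodyfile. A self-contained sketch below; the bodyfile entries are invented, since real ones come from FLS and regtime.pl.

```shell
# Invented FLS-style entry; fields are
# MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
cat > bodyfile <<'EOF'
0|C:/WINDOWS/system32/drivers/etc/hosts|51|r/rrwxrwxrwx|0|0|734|1269900000|1269900000|1269900000|1269900000
EOF

# Simulate appending a regtime-style registry entry; ">>" preserves
# the FLS line, where a single ">" would have erased it
echo '0|[SOFTWARE] Microsoft/Windows/CurrentVersion/Run|0|0|0|0|0|1269900100|1269900100|1269900100|1269900100' >> bodyfile

# The bodyfile now holds both sources
wc -l < bodyfile
```

From here, one pass through mactime turns the combined bodyfile into a single sorted timeline.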



So, now if you recall from my previous post, I used mactime to generate the timeline. In this example, I would use this command...

mactime.pl -d -b bodyfile > timeline.csv <-- This will generate an outfile that I can open with MS Excel or OpenOffice Calc

mactime.pl -b bodyfile > timeline.txt <-- This will generate an outfile that is a flat text file.

So now you can see from the screenshot below that I have file system metadata right next to last write times from the registry.



Just think about the impact on your cases! You have the ability to take a quick look at when things took place, all at once in one clean csv file. So if the customer gives you a rough timeline regarding when the "incident" may have taken place, you can easily grep through the timeline for JUST that day, or a few days prior to the incident. If something or someone had done something nefarious, you will be able to see what files were created and which registry keys were affected. Then, you should have a clear path to direct your investigation and gather more data to build your case.

Now you may be wondering, what about local log files? Stuff like the Windows event logs, or IE history? Fear not! In my next post I will cover how to incorporate those files into your super timeline as well! Nice!