
Lecture 4: Evaluating Tools



Presentation Transcript


  1. Lecture 4: Evaluating Tools 6/5/2003 CSCE 590 Summer 2003

  2. Tools vs. Knowledge and Methodologies • Knowledge of the underlying technology and sound methodologies win • Technology continues to evolve • Tools become outdated and change too • If a person could not explain something without the use of a specific tool, their testimony would be severely compromised in court • "I don't need EnCase. I can recover that file with just a hex editor and a toothpick!"

  3. Structure of a Forensic Investigation • Evidence preservation • Lead formulation • Focused searches • Temporal analysis • Evidence recovery

  4. Collection and Preservation • Must preserve integrity • All conclusions rest on the integrity of the copy process • This stage is the first time the evidence comes into logical danger, even • when memory-resident, software-based Interrupt 13 hard drive blockers are installed • when mounting the evidence drive under a non-DOS operating system

  5. Testing Initial Tools • The first tool touching an evidence drive • Must not change it, and • Must preserve the evidence • If an evidence drive is to be mounted under a non-DOS operating system: • Boot and initialization procedures must be reviewed to ensure the drive is not written to • Even if it is to be mounted read-only

  6. Testing an Imaging Tool • Materials: • Several blank hard drives • A hexadecimal editor • The tool you are testing • Wipe hard drives • Mark drives with a specific recognizable pattern • Take hashes of the drives

  7. Testing an Imaging Tool • Image them with the tool • Take hashes again and check for differences • Compare original and restored image to make sure all data is there and identical • For specialized device drivers, perform the tests with and without them • Check the ability to image ‘nothing’ • Image one completely wiped drive onto another completely wiped drive
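The wipe, mark, hash, image, re-hash cycle described in these steps can be sketched in Python. Everything here is a stand-in: `evidence.raw` and `copy.raw` are hypothetical files playing the role of the source and destination drives, and `shutil.copyfile` plays the role of the imaging tool under test.

```python
import hashlib
import shutil

def md5_of(path, chunk_size=1024 * 1024):
    """Hash a file (or raw device node) in fixed-size chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for a wiped drive marked with a recognizable pattern.
with open("evidence.raw", "wb") as f:
    f.write(b"\xAB" * 4096)

before = md5_of("evidence.raw")
shutil.copyfile("evidence.raw", "copy.raw")  # stand-in for the imaging tool
after = md5_of("evidence.raw")

assert before == after, "imaging step modified the source"
assert md5_of("copy.raw") == before, "image differs from source"
```

On a real drive the same two checks apply: the source hash must be unchanged after imaging, and the image hash must equal the source hash.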

  8. Testing an Imaging Tool • For gaps between and around partitions: • Use a hex editor to compare • Logical hashes won't include these areas, and physical hashes may not be possible (bad blocks, bad clusters) • Document BIOS access modes and supported buses • Document all drive and partition specifications • Sizes, hardware, allocation, architectural data • Use two different operating systems' partitioning and diagnostic utilities to collect this data

  9. Testing an Imaging Tool • Computer Forensics Tool Testing http://www.cftt.nist.gov/ • Don’t forget, as technologies change and tools are updated, frequent re-validation and re-testing is necessary

  10. A Note on Hex Editors • Even baseline tools used to test other tools need testing • Example: Norton's Diskedit • Can no longer access a drive location by cylinder, head, sector • Can't handle extended partitions on large drives • Targeted at a single-partition, LBA-mode, Windows-only environment and at customers who did not want to use a hex editor • Most hex editors do not understand drive architectural structures such as partition tables and boot records • Use several tools to validate one another's results

  11. Formulating Leads • Two requirements • An acceptable level of generated 'hits', both true and false • False positives (not really a lead) can be expensive and time-consuming • False negatives (missed leads) can halt an investigation • Output must be meaningful and useful to all parties, including non-technical ones • Lawyers may want it in an Access database

  12. Focused Searches • Searching the medium for specific information instead of general leads • e.g., child pornography • General requirements for focused searching: • Ability to delimit searches • e.g., we want Micros, not Microsoft • Scalable • Don't want limitations on the number of search terms

  13. Focused Searching Requirements • Shell pattern searches • Regular expression searches • Hexadecimal searches • Unicode searches • Common Windows character representation • Unattended batched or scripted mode

  14. Shell Pattern Matching • * – match 0 or more instances of any character • ? – match a single instance of any character • [abcde] – match a single instance of any character in the brackets • [0-9] – match a single instance of any character in the range 0-9 • Use [a-zA-Z], not [a-Z] • [^0-9] – match a single instance of any character except those in the brackets • {pattern1,pattern2,…} – match any pattern in the list • Example (directory containing hello.txt and hh): # ls h* → hello.txt hh • # ls h? → hh • # ls *[ej]* → hello.txt
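Python's `fnmatch` module implements the same shell-style wildcards and can be used to check your understanding of a pattern before searching with it. Note two differences from an interactive shell: there is no `{a,b}` brace-list extension, and set negation is written `[!...]` rather than `[^...]`. The file names here are made up for illustration.

```python
from fnmatch import fnmatchcase

names = ["hello.txt", "hh", "jam", "note"]

# * matches any run of characters, ? exactly one character.
assert [n for n in names if fnmatchcase(n, "h*")] == ["hello.txt", "hh"]
assert [n for n in names if fnmatchcase(n, "h?")] == ["hh"]

# [ej] matches a single character from the set.
assert [n for n in names if fnmatchcase(n, "*[ej]*")] == ["hello.txt", "jam", "note"]

# [!h]* excludes names starting with h (fnmatch uses ! for negation).
assert [n for n in names if fnmatchcase(n, "[!h]*")] == ["jam", "note"]
```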

  15. Regular Expression Pattern Matching • [1234], [0-9], [^0-9] – same as shell patterns • . (period) – match a single character • pattern1|pattern2|… – match any of the listed patterns • ^ – match at the beginning of a line • $ – match at the end of a line • * – match 0 or more repetitions of the previous regular expression • + – match 1 or more repetitions of the previous regular expression • ? – match 0 or 1 repetitions of the previous regular expression • \ – remove the special meaning of the following character
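Each of these metacharacters can be exercised with Python's `re` module; the patterns and sample strings below are illustrative only. The last line also shows how anchoring delimits a search in the Micros-not-Microsoft sense.

```python
import re

assert re.search(r"^From:", "From: alice")       # ^ anchors to the start
assert re.search(r"\.img$", "d:\\floppy.img")    # $ anchors to the end
assert re.fullmatch(r"[0-9]+", "2003")           # + means one or more digits
assert re.search(r"colou?r", "color")            # ? makes the 'u' optional
assert re.search(r"cat|dog", "hotdog stand")     # | matches either alternative
assert not re.search(r"^Micros$", "Microsoft")   # anchored: no partial hit
```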

  16. Temporal Analysis • Date and time information on files, data chunks, etc. • MAC times: file Modification, Access, and Creation times • Some tools do UTC conversion on NTFS partitions – watch out
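A quick way to inspect timestamps is `os.stat`, sketched below on a throwaway file. The platform caveat in the comment matters for forensics, and reporting in UTC sidesteps the time-zone conversion surprises mentioned above.

```python
import os
from datetime import datetime, timezone

with open("sample.bin", "wb") as f:
    f.write(b"evidence")

st = os.stat("sample.bin")
# st_mtime = last modification, st_atime = last access;
# st_ctime is metadata-change time on Unix but creation time on Windows.
for name in ("st_mtime", "st_atime", "st_ctime"):
    ts = getattr(st, name)
    print(name, datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
```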

  17. Deleted File Times • Hard to conclusively prove deletion times of a file or directory in Microsoft file systems • On Unix, easier, as long as inodes, blocks, etc. have not already been overwritten • Journaling file systems provide a wealth of data • If journal entries have not been overwritten or replaced • FAT and VFAT undelete tools need to be thoroughly tested: • Sometimes tools pick out newly re-allocated subdirectories of a deleted directory in place of the deleted subdirectories

  18. Recovery Tools • Recovering files on media and deleted files on media • Recovering file slack space • File slack space example: • Have a file 2200 bytes long • The operating system allocates disk space in 1024-byte blocks only • It takes 3 blocks (3072 bytes) to store our file • There are 872 bytes of slack left over
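The slack arithmetic in this example generalizes to a one-line formula, with the file size and block size as parameters:

```python
import math

def slack_bytes(file_size, block_size=1024):
    """Return (blocks allocated, slack left in the final block)."""
    blocks = math.ceil(file_size / block_size)
    return blocks, blocks * block_size - file_size

blocks, slack = slack_bytes(2200)
assert blocks == 3 and blocks * 1024 == 3072
assert slack == 872                   # matches the example above
assert slack_bytes(1024) == (1, 0)    # an exact fit leaves no slack
```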

  19. File Slack Space • Entire shadow file systems have been built in slack space • This is hard to do, because the slack must stay stable: deleting a regular file returns the file's space (including its slack) for reuse • Slack can contain information from a background program that was running when the file was created

  20. Deleted File Recovery Tools • Some tools include slack space when recovering files • Some stop at the proper offset, i.e., the real length of the file • Must know how your tool works • If a tool recovers the slack too, you sometimes have to trick applications by correcting the file's recorded size to get the file to open properly without the slack

  21. Evaluating Performance • dd and netcat • http://users.erols.com/gmgarner/forensics/ • With standard dd, netcat, and md5sum on Linux, imaging a 40 GB drive takes more than 6 hours • If you use gzip to compress the stream before netcat, even longer • Imaging is I/O intensive • Need to maximize performance

  22. Buffers • Most time spent copying between buffers • When imaging a logical volume, data is copied between: • Drive’s internal buffers -> file system’s in-memory cache • File system’s in-memory cache -> application’s read buffer • Application’s read buffer -> application’s write buffer • Application’s write buffer -> system output buffer (or write cache depending on hardware) • System output buffer -> output device’s internal buffer • Output device’s internal buffer -> output device

  23. More Buffers • Adding compression (gzip/zlib): • Copy to zlib input buffer • Copy from zlib output buffer • Piping to md5sum: • Also two more copies • md5sum must be run on both the source side and the destination side • Piping to netcat: • Two more copies

  24. Other Factors • The increase in time is linear (a constant rate) with each additional copy • No multi-threading in typical forensic imaging tools • I/O threads spend a lot of time blocked, waiting for slower I/O operations to complete

  25. Improvements? • Incorporate cryptographic checksumming (MD5) into the imaging application, dd • Both recorded in an MD5 log file (MD5 digest) • Input path of the original data in [..] • Output path of the image listed second • The recorded checksum is actually computed against the input to dd • Example output: \f6d426a3a8fcf8e365606b6eec5f2c40 [\\\\.\\A:] *d:\\floppy.img (checksum, input path, output path)
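A minimal sketch of the improvement: compute the MD5 of the input stream during the copy itself, so no separate md5sum pass (and its extra buffer copies) is needed. The file names are hypothetical, and the printed line merely imitates the log format shown above.

```python
import hashlib

def image_with_md5(src, dst, chunk_size=1024 * 1024):
    """Copy src to dst, hashing the *input* stream in the same pass."""
    h = hashlib.md5()
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while chunk := fin.read(chunk_size):
            h.update(chunk)   # checksum is computed against the input
            fout.write(chunk)
    return h.hexdigest()

with open("disk.raw", "wb") as f:
    f.write(b"\x00" * 8192)

digest = image_with_md5("disk.raw", "disk.img")
print(f"\\{digest} [disk.raw] *disk.img")  # dd-style MD5 log line
```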

  26. More Improvements • netcat (nc) is also modified to do MD5 checksums like dd • md5sum is modified to verify against dd's MD5 output log • Which file does md5sum verify against: the file listed in the input path to dd or the output path? • By default, it verifies against the output path

  27. Example – dd with MD5

  28. Output File floppy.md5 • Contains: \f6d426a3a8fcf8e365606b6eec5f2c40 [\\\\.\\A:] *d:\\floppy.img • md5sum is also modified to work with the modified dd's MD5 output files: • Check the output image against floppy.md5: • md5sum -c floppy.md5 • Recompute the MD5 checksum: • md5sum -o floppy.md5 \\.\A:
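Verification against such a log could be sketched as below. The line format is inferred from the floppy.md5 example above and is an assumption, as is the default of checking the output path; the real modified md5sum may parse differently.

```python
import hashlib
import re

def verify_md5_log(line, use_input_path=False):
    """Check a file against a '\\<md5> [<input>] *<output>' log line.
    Format inferred from the floppy.md5 example; real tools may differ."""
    m = re.match(r"\\([0-9a-f]{32}) \[(.+)\] \*(.+)", line)
    digest, in_path, out_path = m.groups()
    path = in_path if use_input_path else out_path  # default: output path
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest() == digest

# Hypothetical round trip with a small stand-in image file.
with open("floppy.img", "wb") as f:
    f.write(b"\x00" * 512)
md5 = hashlib.md5(open("floppy.img", "rb").read()).hexdigest()
line = "\\" + md5 + " [floppy.raw] *floppy.img"
assert verify_md5_log(line)
```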

  29. Windows Device Names • For the purposes of dd: • \\.\PhysicalDrive0 • The physical device • \\?\Volume{87c34910-d826-11d4-987c-00a0b6741049} • The volume name • \\.\C: • The mount point of the volume in Windows • \\.\PhysicalMemory • The contents of memory • Use volume_dump to get all of this information, including the unwieldy volume names

  30. Example – volume_dump
