

A correlation method for establishing provenance of timestamps in digital evidence By: Bradley Schatz*, George Mohay, Andrew Clark From: Information Security Institute, Queensland University of Technology, GPO Box 2434, Brisbane 4001, Australia.


Presentation Transcript


  1. A correlation method for establishing provenance of timestamps in digital evidence By: Bradley Schatz*, George Mohay, Andrew Clark From: Information Security Institute, Queensland University of Technology, GPO Box 2434, Brisbane 4001, Australia

  2. ABSTRACT “Establishing the time at which a particular event happened is a fundamental concern when relating cause and effect in any forensic investigation. Reliance on computer generated timestamps for correlating events is complicated by uncertainty as to clock skew and drift, environmental factors such as location and local time zone offsets, as well as human factors such as clock tampering. Establishing that a particular computer’s temporal behaviour was consistent during its operation remains a challenge. The contributions of this paper are both a description of assumptions commonly made regarding the behaviour of clocks in computers, and empirical results demonstrating that real world behaviour diverges from the idealised or assumed behaviour. We present an approach for inferring the temporal behaviour of a particular computer over a range of time by correlating commonly available local machine timestamps with another source of timestamps. We show that a general characterisation of the passage of time may be inferred from an analysis of commonly available browser records.”

  3. Introduction • In digital forensics, timestamps are used to relate events that happen digitally to events that happen in the physical world. • Current methods for doing so are imprecise. • Two main ideas: • Determine whether computer clocks behave uniformly over time • Determine whether timestamps can be reliably tied to reference time sources.

  4. Clocks vs. Time • Real Time Clock (RTC) – maintains the PC's time while the PC is off • Windows interprets the RTC as local ("civil") time • Linux can interpret the RTC as either civil time or UTC (Coordinated Universal Time) • PC clocks are known for not keeping accurate time, exhibiting both skew (a fixed offset) and drift (a changing offset)
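Skew and drift can be separated by fitting a line to the offset between a machine's clock and a reference clock over time. The sketch below is illustrative only (the sample data is made up, not from the paper's experiments); it assumes paired (reference, local) readings in seconds.

```python
# Hedged sketch: estimating clock skew and drift from paired timestamps.
# The sample data below is illustrative, not from the paper's experiments.

def fit_drift(samples):
    """Least-squares fit of offset = skew + drift * t.

    samples: list of (reference_time, local_time) pairs in seconds.
    Returns (skew_seconds, drift_rate), where drift_rate is s/s.
    """
    n = len(samples)
    ts = [ref for ref, _ in samples]
    offs = [loc - ref for ref, loc in samples]  # local clock's error at each t
    mean_t = sum(ts) / n
    mean_o = sum(offs) / n
    var_t = sum((t - mean_t) ** 2 for t in ts)
    cov = sum((t - mean_t) * (o - mean_o) for t, o in zip(ts, offs))
    drift = cov / var_t          # slope: seconds gained per second
    skew = mean_o - drift * mean_t  # intercept: fixed offset at t = 0
    return skew, drift

# A clock 2 s ahead that gains 1 s every 1000 s of reference time:
data = [(t, t + 2.0 + 0.001 * t) for t in range(0, 10000, 500)]
skew, drift = fit_drift(data)
print(round(skew, 3), round(drift, 6))  # → 2.0 0.001
```

A near-uniform drift, as in the paper's plots, shows up as a straight offset line with a constant slope.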

  5. Synchronization • UNIX uses the Network Time Protocol (NTP) • Windows (2000 and higher) uses the Simple Network Time Protocol (SNTP) • Windows 2000 has it disabled by default • Windows XP syncs once a week • Machines in a domain sync to the domain controller
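An SNTP exchange is a single 48-byte UDP request and reply. The sketch below shows the idea under stated assumptions: the server name `pool.ntp.org` is a placeholder (not the paper's setup), and only the reply's Transmit Timestamp seconds field is decoded.

```python
import socket
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01).
NTP_DELTA = 2208988800


def build_request():
    """48-byte SNTP client request: first byte packs LI=0, VN=3, Mode=3."""
    return b"\x1b" + 47 * b"\0"


def sntp_time(server="pool.ntp.org", timeout=5.0):
    """Minimal SNTP (RFC 4330) query; returns the server's Unix time."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(build_request(), (server, 123))
        data, _ = s.recvfrom(48)
    # Transmit Timestamp (seconds field) sits at bytes 40-43, big-endian.
    secs = struct.unpack("!I", data[40:44])[0]
    return secs - NTP_DELTA
```

SNTP ignores the filtering and clock-discipline machinery of full NTP, which is one reason the slides note Windows only syncs to within a couple of seconds.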

  6. Inaccuracy Causes • Clock configuration – in Windows, the time zone is left at its default • Tampering – a user deliberately changes the time • Synchronization protocol – syncs only to within 2 seconds (20 s in some cases); also enables attacks • Misinterpretation – the investigator needs to be knowledgeable • Bugs – software with faulty algorithms can affect recorded times
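The misinterpretation risk is easy to see with time zones: the same instant reads differently depending on the zone assumed. A small example (the date and the Brisbane zone are chosen here purely for illustration):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One instant, two readings: assuming the wrong zone shifts the apparent
# local time even though the underlying moment is identical.
instant = datetime(2006, 3, 21, 12, 0, tzinfo=timezone.utc)
brisbane = instant.astimezone(ZoneInfo("Australia/Brisbane"))

print(instant.isoformat())   # 2006-03-21T12:00:00+00:00
print(brisbane.isoformat())  # 2006-03-21T22:00:00+10:00
```

An investigator who reads a civil-time RTC value as UTC (or vice versa) introduces exactly this kind of fixed-offset error.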

  7. Timestamps in DF • Timestamps are very important in digital forensics investigations • Several approaches exist, including "corroborating sources of time…" proposed by Gladyshev and Patel (2005) • Weil (2002) suggests relating timestamps in cached web pages to the times the cached files were accessed

  8. How They Tested • Set up a Windows 2000 domain controller • A Linux box acted as both a proxy server and an NTP server (which synced to an outside NTP server) • PCs on the network used the proxy

  9. Problems with the Test • The experiment ran from February 2006 to March 2006 • A bug in the Python monitoring program caused it to crash on log files larger than 4K • A second experiment ran for 20 days from the 21st of March 2006

  10. The solid line shows near-uniform drift

  11. A Windows 2000 machine shows drift

  12. A Windows XP machine shows drift • Vertical lines show syncs with the domain controller • Rebooting the PC caused the two peaks

  13. On Windows XP, the PC did not update the RTC

  14. Milan lost 1 second every 14 minutes before syncs
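A loss of 1 second every 14 minutes is a substantial drift rate. A back-of-envelope check (illustrative arithmetic only):

```python
# A clock losing 1 second every 14 minutes drifts by 1/840 s per second,
# which compounds to roughly 103 seconds per day.
loss_per_second = 1 / (14 * 60)          # ≈ 0.00119 s/s
loss_per_day = loss_per_second * 86400   # seconds lost over 24 hours
print(round(loss_per_day, 1))  # → 102.9
```

At that rate, even a weekly XP-style sync would leave the clock minutes out of step between corrections.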

  15. Timescales vs. Sources • Timestamp sources included: • Pages visited by typing a URL • Pages visited by clicking a hypertext link • Documents opened by clicking within Windows Explorer (e.g. .xls, .doc) • The index.dat binary file • They tried other tools to view index.dat, but ended up writing their own (pasco2, http://www.bschatz.org/2006/pasco2/)

  16. This figure shows records for two days • As part of the experiment they related browser timestamps to the ISP's logs • Internet Explorer's limited history retention caused difficulties • They used a clickstream correlation algorithm – "time periods between hits (visits to a particular URL) would form a unique signature." • There were false positives and false negatives
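The clickstream idea can be sketched simply: hits to a URL recorded by both the local machine and the reference log should line up once the right clock offset is applied. This is a simplified illustration of the signature-matching idea, not the paper's exact algorithm; all data below is invented.

```python
# Hedged sketch of clickstream correlation: try candidate offsets implied by
# pairing a local hit with a reference hit, and keep the offset under which
# the most shifted local hits land near reference hits.

def estimate_offset(local_hits, reference_hits, tolerance=2.0):
    """Return (best_offset, match_count) aligning local times to reference."""
    best_offset, best_score = None, -1
    refs = sorted(reference_hits)
    for lh in local_hits:
        for rh in refs:
            offset = rh - lh                      # candidate clock correction
            shifted = [t + offset for t in local_hits]
            score = sum(
                1 for t in shifted
                if any(abs(t - r) <= tolerance for r in refs)
            )
            if score > best_score:
                best_score, best_offset = score, offset
    return best_offset, best_score

# Local clock runs 300 s fast relative to the reference (ISP-style) log:
reference = [100.0, 160.0, 400.0, 430.0]
local = [t + 300.0 for t in reference]
off, score = estimate_offset(local, reference)
print(off, score)  # → -300.0 4
```

With noisy or incomplete records, near-matches can also occur at wrong offsets, which is consistent with the false positives and false negatives the slide reports.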

  17. Algorithms Used • History correlation algorithm • Non-cached records correlation algorithm

  18. Results • History correlation algorithm had better results

  19. Conclusion • Windows PCs' timekeeping behavior requires synchronization to stay accurate • Timestamp correlation is useful in digital forensics • Using and interpreting timestamps can be complicated, but with the right algorithm the results can be useful

  20. WORKS CITED • http://dfrws.org/2006/proceedings/13-%20Schatz.pdf • http://www.wikipedia.org/
