
Presentation Transcript


1. Web Tap: Detecting Covert Web Traffic
Kevin Borders, Atul Prakash
University of Michigan, Department of Electrical Engineering and Computer Science, 2004
Presented by Nate Salemme (nate.salemme@hp.com)

2. Disclaimer
• Content taken from the Proceedings of the 11th ACM Conference on Computer and Communications Security
• Presented by Kevin Borders & Atul Prakash
• Images and graphs also borrowed from http://www.cisa.umbc.edu/courses/cmsc/444/fall05/spyware/webtap.pdf
• Presentation template borrowed from Mike Putnam. Thanks Mike.

3. About the Authors
• Atul Prakash: Professor in the Department of EECS at the University of Michigan. He is also currently serving as the Director of the Software Systems Laboratory.
• Kevin Borders: Graduate student at the University of Michigan. Involved in Eta Kappa Nu.

4. Outline
• Introduction
• Threat Model
• Web Tap Filters
• System Evaluation
• Vulnerabilities
• Related & Future Work
• Conclusion
• Thoughts

5. Introduction
• A hacker's life used to be easy
  • Direct connection to the Internet
  • No protection
  • Backdoors and Trojans easily spawned
  • Programs like AOL made this easy
• Security became a BIG concern
  • Firewalls
  • Proxy servers
  • Mail servers

6. Introduction
• The Firewall

7. Introduction
• Hackers get creative
  • Firewalls leave port 80 (HTTP) open
  • Outgoing HTTP is used as an attack vehicle
• Examples
  • Spyware, adware
  • User information can be hidden within legitimate outgoing HTTP traffic
  • System resources can be severely hindered by malicious spyware

8. Introduction
• Web Tap
  • Definition: "A network-level anomaly detection system that takes advantage of legitimate web request patterns to detect covert communication, backdoors, and spyware activity that is tunneled through outbound HTTP connections" – the Web Tap authors
  • Deployed at an organization's proxy server or router
  • Acts as an extension to the proxy/firewall through which all outgoing traffic is passed
  • A 'training period' is used to calibrate Web Tap

9. Threat Model
• HTTP Tunnels
• Backdoor Programs
• Spyware

10. Threat Model
• HTTP Tunnels
  • Allow non-HTTP services to be accessed through an outgoing HTTP session
  • Wsh (Microsoft Script Host) allows file transfer and remote shell access over HTTP
  • Firepass creates a tunnel between a client process and a remote service

11. Threat Model
• Backdoor Programs
  • Usually spawned by a user opening a Trojan from an email attachment or the Internet
  • The Trojan runs on the computer as a client and makes 'calls' to a server hosting a certain script
  • These calls are hidden within outgoing HTTP (HTTP headers or POST data)

12. Threat Model
• Spyware
  • Installed by piggybacking on legitimate software (WeatherBug, Kazaa)
  • Uses the same methods as described for backdoors

13. Web Tap Filters
• Web Tap was written in Python
  • Easy to code
  • Type safe
  • Platform independent
• Web Tap resides in a module through which all outgoing HTTP traffic is funneled; traffic is either analyzed in real time or logged and analyzed offline (a minimal sketch of such a module follows this slide)
• Web Tap was calibrated on 30 users over a 1-week training period
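The paper and slides do not include Web Tap's source code; the following is a minimal Python sketch of how such a filter module could be organized. The class names, the Request fields, and the process/process_request methods are illustrative assumptions, not the authors' actual API.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Request:
    """One outbound HTTP request as seen at the proxy (fields are assumed)."""
    client: str          # internal client identifier
    site: str            # destination host
    headers: dict        # request headers as sent
    body_bytes: int      # size of the request body in bytes
    timestamp: float = field(default_factory=time.time)

class WebTapFilter:
    """Base class: each anomaly filter inspects one request and may return alerts."""
    def process(self, req: Request) -> list[str]:
        raise NotImplementedError

class WebTapEngine:
    """Funnels every outbound request through all registered filters."""
    def __init__(self, filters: list[WebTapFilter]):
        self.filters = filters

    def process_request(self, req: Request) -> list[str]:
        alerts = []
        for f in self.filters:
            alerts.extend(f.process(req))
        return alerts
```

This structure also fits the dual modes described on the slide: the same engine can be called inline at the proxy (real-time) or replayed over a request log (offline analysis).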

14. Web Tap Filters
• Some "hope to's…"
  • Hope to keep additional state in the header of outgoing requests to verify integrity (right now it just calculates the number of bytes in the header)
  • Hope to measure other statistics
    • Request type (image, HTML, CGI, etc.)
    • Request content
    • Inbound bandwidth
    • Inbound content

15. Web Tap Filters
• Web Tap deploys the following filters
  • Header Formatting
  • Delay Times
  • Individual Request Size
  • Outbound Bandwidth Usage
  • Request Regularity
  • Request Time of Day

16. Web Tap Filters
• Header Formatting Filter (see the sketch after this slide)
  • Parses each header
  • If a header is indicative of a non-browser request, sound the alarm
  • Example: IE sends out a header with an XP signature when all of the organization's computers are running Windows 98
  • Good at detecting unwanted clients
    • AIM Express
    • iTunes
    • Gator
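A minimal sketch of the header-formatting idea, assuming the organization has a known baseline browser. The expected User-Agent substring and the required header set are illustrative values, not the paper's actual rules.

```python
# Hypothetical header-formatting check: flag requests whose headers do not
# match the browser build used across the organization.
EXPECTED_UA_SUBSTRING = "MSIE 6.0"                    # assumed site-wide baseline
REQUIRED_HEADERS = {"Host", "User-Agent", "Accept"}   # illustrative minimum set

def check_header_format(headers: dict) -> list[str]:
    alerts = []
    missing = REQUIRED_HEADERS - headers.keys()
    if missing:
        alerts.append(f"missing standard browser headers: {sorted(missing)}")
    if EXPECTED_UA_SUBSTRING not in headers.get("User-Agent", ""):
        alerts.append(f"unexpected User-Agent: {headers.get('User-Agent', '')!r}")
    return alerts
```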

17. Web Tap Filters
• Delay Times Filter (a simplified sketch follows this slide)
  • Measures inter-request arrival time for specific clients
  • Goal is to detect programs that make requests on set timers
  • "Jumps" in the CDF of delays indicate areas of concern (30 seconds, 4 minutes, 5 minutes)
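The paper examines the distribution (CDF) of inter-request delays; the sketch below uses a cruder stand-in, flagging a site once nearly all observed delays cluster around a single value, as a fixed timer would produce. The per-(client, site) state, the sample count, and the 5% tolerance are assumptions.

```python
from collections import defaultdict

# Hypothetical delay-time tracker: records inter-request delays per (client, site)
# and flags sites whose delays cluster tightly around one value (a timer).
last_seen = {}                  # (client, site) -> timestamp of last request
delays = defaultdict(list)      # (client, site) -> observed inter-request delays

def record_delay(client, site, now, min_samples=20, tolerance=0.05):
    key = (client, site)
    alerts = []
    if key in last_seen:
        delays[key].append(now - last_seen[key])
    last_seen[key] = now
    d = delays[key]
    if len(d) >= min_samples:
        mean = sum(d) / len(d)
        # If nearly all delays sit within ±5% of the mean, suspect a fixed timer.
        if mean > 0 and all(abs(x - mean) <= tolerance * mean for x in d):
            alerts.append(f"timer-like polling of {site} every ~{mean:.0f}s")
    return alerts
```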

18. Web Tap Filters
• Individual Request Size Filter (sketch below)
  • Requests to most sites contain little information
  • Hackers need to send out large amounts of data to transfer files off a remote host
  • Out of 1600 sites
    • 11 sites > 3 KB
    • 4 sites > 10 KB
  • The most effective setting is 3 KB (99.28%)
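A minimal sketch of the size check, using the 3 KB threshold cited on the slide; the function signature is an assumption.

```python
# Hypothetical single-request size check using the 3 KB threshold from the talk.
REQUEST_SIZE_THRESHOLD = 3 * 1024   # bytes

def check_request_size(site: str, header_bytes: int, body_bytes: int) -> list[str]:
    total = header_bytes + body_bytes
    if total > REQUEST_SIZE_THRESHOLD:
        return [f"oversized request to {site}: {total} bytes"]
    return []
```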

19. Web Tap Filters
• Outbound Bandwidth Usage Filter (sketch below)
  • Outbound bandwidth is expected to be LOW for normal web browsing
  • Outbound bandwidth usage will increase when hackers use HTTP for covert communication
  • Both aggregate and per-site bandwidth are measured; the per-site measure is used
  • Lower bound set at 20 KB/day per site per user
  • Upper bound set at 60 KB/day per site per user; a threshold anywhere in this range works well
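A minimal sketch of a per-site daily byte counter using the 60 KB/day upper bound from the slide; the data structures and the one-alert-per-day behavior are assumptions.

```python
from collections import defaultdict
from datetime import date

# Hypothetical per-site daily outbound bandwidth counter (60 KB/day upper bound).
DAILY_LIMIT = 60 * 1024          # bytes/day per site per user
usage = defaultdict(int)         # (user, site, day) -> bytes sent so far
alerted = set()                  # keys that already raised an alert today

def record_outbound(user, site, nbytes, today=None):
    today = today or date.today()
    key = (user, site, today)
    usage[key] += nbytes
    if usage[key] > DAILY_LIMIT and key not in alerted:
        alerted.add(key)
        return [f"{user} exceeded {DAILY_LIMIT} outbound bytes to {site} today"]
    return []
```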

20. Web Tap Filters
• Request Regularity Filter (sketch below)
  • Because the previous filters constrain bandwidth, hackers must spread requests over a long time period
  • Legitimate web traffic is bursty
  • Too many requests over a window indicate the website is being accessed by an automated program
  • A 16% threshold was chosen for the 8-hour plot
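A rough sketch of the regularity idea: split an 8-hour window into fixed buckets and flag a site that is active in more than 16% of them. The paper also uses count and variance measurements; the 60-second bucket size here is an assumption.

```python
# Hypothetical regularity check over an 8-hour window of request timestamps.
WINDOW_SECONDS = 8 * 3600
BUCKET_SECONDS = 60              # assumed bucket size, not from the paper
ACTIVITY_THRESHOLD = 0.16        # 16% threshold cited on the slide

def check_regularity(request_times: list[float], window_end: float) -> list[str]:
    window_start = window_end - WINDOW_SECONDS
    active_buckets = set()
    for t in request_times:
        if window_start <= t <= window_end:
            active_buckets.add(int((t - window_start) // BUCKET_SECONDS))
    fraction_active = len(active_buckets) / (WINDOW_SECONDS // BUCKET_SECONDS)
    if fraction_active > ACTIVITY_THRESHOLD:
        return [f"site active in {fraction_active:.0%} of the last 8 hours"]
    return []
```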

21. Web Tap Filters
• Request Time of Day Filter (sketch below)
  • People tend to follow a set schedule of browsing times
  • When requests are made outside of a user's normal browsing period, alerts can be raised
  • Very effective in corporate environments (set schedules)
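A minimal sketch, assuming the "normal browsing period" is learned as the set of hours with any activity during training; the paper's actual model may be finer-grained.

```python
from datetime import datetime

# Hypothetical time-of-day check: hours with browsing activity during training
# form the allowed set; requests outside those hours raise an alert.
def build_active_hours(training_timestamps: list[float]) -> set[int]:
    return {datetime.fromtimestamp(t).hour for t in training_timestamps}

def check_time_of_day(user: str, now: float, active_hours: set[int]) -> list[str]:
    hour = datetime.fromtimestamp(now).hour
    if hour not in active_hours:
        return [f"{user} made a request at hour {hour:02d}, outside learned hours"]
    return []
```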

22. System Evaluation
• The TEST
  • 40 days, 30 clients at the University of Michigan
  • 1-week training period
  • ALL FILTERS were active
  • 428,608 requests logged
  • 6,441 unique websites

23. System Evaluation
• Header Format Filter
  • Detected 5 out of 30 clients that had some form of adware
  • Other non-desirable clients detected (AIM Express, iTunes)
  • NO FALSE ALARMS

24. System Evaluation
• Delay Time Filter
  • Low false alarm rate (1 every 6 days)
  • Some legitimate sites that used timers were flagged (espn.com, nytimes.com)
  • Recommended that system admins maintain a list of "allowable sites"

25. System Evaluation
• Request Size Filter
  • High false alarm rate (34%)
  • Mostly ASP and shopping cart scripts
  • Again, create a database of trusted sites

26. System Evaluation
• Request Regularity Filter
  • Using both count and variance measurements
  • Approximately 1 false alarm every 3 days
  • Found adware such as browser search bars that other filters did not pick up

27. System Evaluation
• Daily Bandwidth Filter
  • As the threshold decreases, false positives increase
  • 60 KB is reasonable for small group sizes
  • 20 KB gives roughly 1 false alarm per day

28. System Evaluation
• Time of Day Filter
  • The training period was lengthened to the first TWO weeks
  • During the training period, spyware and adware programs were active!
  • This made the Time of Day filter pretty much useless

29. System Evaluation
• Web Tap vs. third-party HTTP tunnel programs
  • Wsh, Hopster, Firepass
  • These programs help people inside a network bypass firewall restrictions
  • All detected by Web Tap, sweet
• Web Tap vs. a backdoor program (Tunl)
  • Tunl was written for Windows (since it's vulnerable)
  • With no workload, it set off 3 filters
  • With a minimal workload, it set off more filters
  • With a moderate workload, even more filters
  • Running such a backdoor is pointless; it gets detected either way

30. Vulnerabilities
• Single Request Size Filter
  • Large data transfers can be broken into multiple smaller transfers
• Delay Time Filter
  • Delays could be randomized to prevent detection
• Time of Day Filter
  • Schedule requests for when users are active
• Request Regularity Filter
  • Keep a running count of activity and stay below the threshold
  • If the threshold is not known, the filter can be avoided by emulating the regularity of a common site
• Bandwidth Limit Filter
  • Keep a running count of the total bytes sent that day and don't exceed the threshold

31. Related and Future Work
Related work
• Signature analysis [Ad-Aware, Snort, Spybot]
  • Signature rules are used to detect attacks
  • Web Tap relies on anomalies rather than signatures
  • Signature analysis is limited because new attacks keep being developed
• Human browsing patterns [A. Bestavros, D. Marwood, T. Kelly]
  • Relies on human browsing patterns
  • Web Tap uses some of the same browsing patterns (delay time, request size, bandwidth usage)
  • Web Tap uses this information to determine whether traffic is legitimate; previous research used it for performance reasons
• Content-filtering proxies [MIMEsweeper, Websense]
  • Block certain websites through a proxy server
  • Hackers can still get around this by using other web proxies, e.g. http://www.freeproxy.ru/en/free_proxy/cgi-proxy.htm

32. Related and Future Work
Future work
• Create a database of hosts that tend to set off alarms
  • Reduces false positives
• Proxy caching
  • Place a proxy before Web Tap
  • This would help isolate legitimate web requests from the anomalous ones
• Compress large transactions (see the sketch below)
  • Reduces false positives for the bandwidth filter
  • Example: a 3.87 KB POST request can be compressed to 2.07 KB
  • Good hackers are likely to have already compressed their requests, which would prevent further compression
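A small illustration of the compression idea using Python's standard zlib module; the sample form body and compression level are arbitrary, and actual ratios (such as the 3.87 KB to 2.07 KB example above) depend entirely on the content.

```python
import zlib

# Illustrative only: measure how much a request body shrinks under compression,
# which is the quantity the bandwidth filter could count instead of raw bytes.
def compressed_size(body: bytes, level: int = 6) -> int:
    return len(zlib.compress(body, level))

post_body = b"field1=value1&field2=" + b"some repetitive form data " * 150
print(f"{len(post_body)} bytes raw -> {compressed_size(post_body)} bytes compressed")
```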

33. Conclusion
• Web Tap monitors outgoing HTTP traffic as opposed to the actual attack on a server
• The filters were designed to cover a wide range of hacker tactics
• Only concerned with the detection process
• 30 users, 40 days, 1-week training period
• Successful at detecting spyware, adware, HTTP tunneling programs, and backdoors
• Vulnerabilities explained
• Manageable number of false alarms

34. Thoughts
• Good paper, easy to read and well explained
• Interesting approach
• Problems
  • User groups will differ depending on size, characteristics, etc. Each deployment of Web Tap would need to be customized
  • Sites with refresh counters would trigger alerts (espn.com GameCast). Not good.
  • They don't mention flash crowds
  • Spyware/adware screws up the Time of Day filter
  • Tunl
  • ...
• Applicable for schools and companies. Home?
