
The Ghost In The Browser Analysis of Web-based Malware



Presentation Transcript


  1. The Ghost In The Browser: Analysis of Web-based Malware Niels Provos, Dean McNamee, Panayiotis Mavrommatis, Ke Wang and Nagendra Modadugu Google, Inc. Presented by 鲍由之, November 23, 2010

  2. Overview • Aim: present the state of malware on the Web and emphasize the importance of this rising threat. • Phase 1: observe malware behavior when visiting malicious URLs and determine whether malware binaries are downloaded as a result of visiting a URL. • Phase 2: keep detailed statistics about detected web pages and track identified malware binaries for later analysis. • Also examine how malware takes advantage of browser vulnerabilities to install itself on users’ computers.

  3. Phase 1: Detecting • Definition of a malicious web page • One that causes the automatic installation of software without the user’s knowledge or consent.

  4. Method • MapReduce-based heuristic URL extraction • Rating of the extracted URLs • No further details given
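Slide 4 only names the pipeline, so here is a minimal Python sketch of what a MapReduce-style heuristic pre-filter over crawled pages might look like. The marker list, the scoring, and the URLs are invented for illustration; they are not from the paper.

```python
from collections import defaultdict

# Page features often associated with exploit code (illustrative, not the
# paper's actual heuristics).
SUSPICIOUS_MARKERS = ("<iframe", "unescape(", "eval(", "document.write(")

def map_phase(url, html):
    """Emit (url, hit_count) for pages matching any heuristic marker."""
    hits = sum(html.lower().count(m) for m in SUSPICIOUS_MARKERS)
    if hits:
        yield url, hits

def reduce_phase(mapped):
    """Aggregate scores per URL; high scorers go on to in-depth analysis."""
    scores = defaultdict(int)
    for url, hits in mapped:
        scores[url] += hits
    return dict(scores)

crawl = [
    ("http://example.com/a", "<html><iframe src=x></iframe>eval(payload)</html>"),
    ("http://example.com/b", "<html>plain text</html>"),
]
mapped = [kv for url, html in crawl for kv in map_phase(url, html)]
print(reduce_phase(mapped))  # {'http://example.com/a': 2}
```

In the paper the surviving candidates are then verified by loading them in an instrumented browser; the sketch above covers only the cheap pre-filter step.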

  5. Results of Detecting • The number of URLs analyzed per day increases over time, thanks to optimizations

  6. Categories • Four different aspects of content control responsible for enabling browser exploitation: • advertising • third-party widgets • user contributed content • web server security

  7. Web Server Security • The adversary gains control of a server • The compromised content may change over time

  8. User Contributed Content • Blogs, comments, etc. • Example: a poll

  9. Advertising Trust

  10. Third-Party Widgets • Counter

  11. Exploitation • Traditional approach: find vulnerable network services and exploit them remotely (e.g., worms) • NAT and firewalls have made this harder, pushing attackers toward the Web

  12. Drive-by-download • Delivered via JavaScript, VBScript, Flash, PDF • Injected through &lt;script&gt; and &lt;iframe&gt; tags • Exploit code systematically catalogs the computing environment instead of blindly trying exploits • Tricking the user • Social engineering, e.g. adult-movie sites
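The injected-iframe pattern mentioned on slide 12 can be illustrated with a small detection sketch: drive-by pages often embed a zero- or one-pixel iframe pointing at an exploit server. The regex heuristic and all URLs below are hypothetical, not the paper's actual detector.

```python
import re

# Near-invisible iframes (width/height of 0 or 1) are a classic
# drive-by-download injection signature. Sizes and URLs are made up.
HIDDEN_IFRAME = re.compile(
    r'<iframe[^>]*(?:width|height)\s*=\s*["\']?0*[01]["\']?[^>]*>',
    re.IGNORECASE,
)

def looks_like_drive_by(html):
    """Flag pages embedding a zero- or one-pixel iframe."""
    return bool(HIDDEN_IFRAME.search(html))

injected = '<iframe src="http://badsite.example/exploit" width="1" height="1"></iframe>'
print(looks_like_drive_by(injected))  # True
print(looks_like_drive_by('<iframe src="/video" width="640" height="360">'))  # False
```

A real scanner cannot rely on static patterns alone, which is exactly why the paper executes pages in an instrumented browser and watches for unexpected installations.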

  13. Phase 2: Trends and Statistics • Capture the evolution of all these characteristics over a twelve-month period and present an estimate of the current status of malware on the Web. • Obfuscation • Classification • Remotely Linked Exploits • Distribution of Binaries Across Domains • Malware Evolution

  14. Obfuscation • Decoding obfuscated pages • Classification of binaries: • Trojan • Adware • Unknown/Obfuscated • Two binaries are assumed to be different if their cryptographic digests are different.
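Slide 14's counting rule, that two binaries are treated as distinct exactly when their cryptographic digests differ, can be sketched in a few lines. The choice of SHA-256 and the sample payloads are assumptions for illustration; the paper does not mandate a particular hash here.

```python
import hashlib

def digest(binary):
    """Identify a malware sample by its cryptographic digest; two samples
    count as distinct binaries iff their digests differ."""
    return hashlib.sha256(binary).hexdigest()

# Hypothetical captured payloads: the first two are byte-identical.
samples = [b"payload-v1", b"payload-v1", b"payload-v2"]
unique = {digest(s) for s in samples}
print(len(unique))  # 2
```

Note the caveat this rule implies, which the deck returns to on slide 17: a polymorphic binary that changes even one byte per download is counted as a new binary each time.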

  15. Remotely Linked Exploits • Easy for the adversary to change • May create a bottleneck or a single point of failure • The top graph shows the number of unique web pages pointing to each malicious URL, aggregated over all such URLs.
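The statistic behind the slide's top graph, how many unique landing pages point at each remotely linked exploit URL, can be sketched as a simple aggregation. The observation data below is invented for illustration.

```python
from collections import defaultdict

# (landing_page, remotely_linked_exploit_url) pairs observed during
# analysis; all URLs are hypothetical.
observations = [
    ("http://siteA.example/page1", "http://exploit.example/x.js"),
    ("http://siteA.example/page2", "http://exploit.example/x.js"),
    ("http://siteB.example/home",  "http://exploit.example/x.js"),
    ("http://siteC.example/blog",  "http://other.example/y.js"),
]

pages_per_url = defaultdict(set)
for page, mal_url in observations:
    pages_per_url[mal_url].add(page)

# Many pages funneling into one remote URL => a single point of failure.
for mal_url, pages in sorted(pages_per_url.items()):
    print(mal_url, len(pages))
```

A URL referenced by many pages is both a chokepoint for the attacker and an attractive takedown target for defenders, which motivates the distribution strategy on the next slide.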

  16. Distribution of Binaries Across Domains • Spreading binaries across domains avoids the bottleneck and single point of failure mentioned above

  17. Malware Evolution • The majority of malicious URLs change their binaries infrequently. • However, a small percentage of URLs change their binaries almost every hour.

  18. Expectation • The majority of malware no longer spreads via remote exploitation but rather, as indicated in this paper, via web-based infection.

  19. Conclusion • An introduction to the malware currently prevalent on the Internet • No specific details about any technique • Reads largely as a statistical report • Concerned not with specific vulnerabilities but with how many URLs on the Internet are capable of compromising users • Data base: a huge amount of data, thanks to Google’s privileged view of the Web

  20. Quotes from the paper • “We assumed that two binaries are different if their cryptographic digests are different.” • “If the majority of URLs on a site are malicious, the whole site, or a path component of the site, might be labeled as harmful when shown as a search result.” • “For each URL, we score the analysis run by assigning individual scores to each recorded component.”

  21. Q&A • Thank you
