
Optimal Data Structures for Efficient Crawling and Virus Detection in Massive Datasets

This document explores advanced algorithms for handling massive datasets, focusing on optimal data structures for web crawling and virus detection. It discusses the efficiency of different strategies, including Bloom Filters for tracking a crawler's visited URLs and for checking files against a predefined dictionary of virus checksums. It compares the Bloom-filter approach with brute-force and trie-based methods, proposing an optimal solution that minimizes storage and enhances performance. Insights into the explicit formula for the optimal filter parameters further enrich the discussion.


Presentation Transcript


  1. Advanced Algorithms for Massive Datasets: The power of “failing”

  2. [Figure slide; content not recoverable from the transcript]

  3. Not perfectly true but...

  4. We do have an explicit formula for the optimal number of hash functions: k_opt = (m/n) · ln 2, so for m/n = 8 bits per key, k_opt ≈ 5.545.
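The optimum above comes from the standard Bloom-filter analysis (choosing k to minimize the false-positive probability). A minimal sketch of the two formulas, with function names of my own:

```python
import math

def optimal_k(m: int, n: int) -> float:
    """Optimal number of hash functions for a Bloom filter
    with m bits and n keys: k = (m/n) * ln 2."""
    return (m / n) * math.log(2)

def false_positive_rate(m: int, n: int, k: int) -> float:
    """Standard approximation of the false-positive probability."""
    return (1.0 - math.exp(-k * n / m)) ** k

# With m/n = 8 bits per key, as on the slide:
k = optimal_k(m=8_000, n=1_000)
print(round(k, 3))                               # 5.545
print(false_positive_rate(8_000, 1_000, round(k)))
```

At the optimum each bit is set with probability 1/2, so the false-positive rate is roughly 2^(-k), a bit over 2% for k = 6.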

  5. Other advantage: no key storage

  6. Crawling
  What data structure should we use to keep track of the URLs visited by a crawler?
  • URLs are long
  • The membership check must be very fast
  • Small errors are acceptable (a false positive just means a page is not crawled)
  Answer: a Bloom Filter over the crawled URLs.
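The slide's choice can be sketched as a tiny Bloom filter keyed on URLs. The class and parameters below are mine, assuming SHA-256 with Kirsch-Mitzenmacher double hashing; note that the filter stores no keys, so long URLs cost nothing beyond the k bits:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch for tracking visited URLs.
    Stores no keys: only m bits, regardless of URL length."""

    def __init__(self, m: int, k: int):
        self.m, self.k = m, k
        self.bits = bytearray(m // 8 + 1)

    def _positions(self, url: str):
        # Derive k bit positions from one digest via double hashing.
        h = hashlib.sha256(url.encode()).digest()
        h1 = int.from_bytes(h[:8], "big")
        h2 = int.from_bytes(h[8:16], "big") | 1
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, url: str):
        for p in self._positions(url):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, url: str) -> bool:
        return all(self.bits[p // 8] >> (p % 8) & 1
                   for p in self._positions(url))

# Usage: the crawler marks each URL before fetching it.
seen = BloomFilter(m=8 * 10_000, k=6)   # ~8 bits per expected URL
seen.add("http://example.com/page")
```

A false positive makes the crawler skip a page it never fetched, which is exactly the "small error" the slide declares tolerable.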

  7. Anti-virus detection
  D is a dictionary of virus checksums, each of some given length z. For each position i of the file F, check whether T[i, i+z-1] ∈ D:
  • Brute-force check: O(|D| · |F|) time
  • Trie check: O(z · |F|) time
  • Better solution? Build a Bloom Filter on D; if the BF answers YES for T[i, i+z-1], “warn the user” or explicitly scan D.
  This takes O(k · |F|) time, or even better...
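The prefilter idea can be sketched as below: a Bloom filter built on D answers each window query, and only YES answers trigger the exact check against D, which removes the filter's false positives. All names, signatures, and parameters here are illustrative:

```python
import hashlib

def _bf_positions(data: bytes, m: int, k: int):
    # k bit positions via double hashing on one SHA-256 digest.
    h = hashlib.sha256(data).digest()
    h1 = int.from_bytes(h[:8], "big")
    h2 = int.from_bytes(h[8:16], "big") | 1
    return [(h1 + i * h2) % m for i in range(k)]

def build_bf(D, m=1 << 16, k=6):
    """Build a Bloom filter over the checksum dictionary D."""
    bits = bytearray(m // 8)
    for sig in D:
        for p in _bf_positions(sig, m, k):
            bits[p // 8] |= 1 << (p % 8)
    return bits, m, k

def scan(F: bytes, D: set, z: int, bf):
    """Return the positions i where F[i:i+z] is a known checksum."""
    bits, m, k = bf
    hits = []
    for i in range(len(F) - z + 1):
        window = F[i:i + z]
        if all(bits[p // 8] >> (p % 8) & 1
               for p in _bf_positions(window, m, k)):
            # BF said YES: confirm against D to drop false positives.
            if window in D:
                hits.append(i)
    return hits

D = {b"EVILSIG!", b"BADBYTES"}   # hypothetical signatures, z = 8
bf = build_bf(D)
print(scan(b"xxEVILSIG!yy", D, 8, bf))   # [2]
```

The slide's O(k · |F|) bound assumes the k hash values can be updated in O(1) per window shift (e.g. with rolling hashes); this sketch recomputes them from scratch for clarity.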

  8. Upper bounds

  9. Upper bounds

  10. Use the recurring-minimum heuristic, plus a second SBF (Spectral Bloom Filter), to improve the frequency estimate.
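The recurring-minimum heuristic comes from Cohen and Matias's Spectral Bloom Filters: each key hashes to k counters, the frequency estimate is the minimum counter, and when that minimum is *not* repeated among the k counters the key likely suffered collisions, so it is tracked in a secondary filter too. A loose sketch under that reading (class names are mine; the authors' full algorithm has more cases):

```python
import hashlib

class SpectralBF:
    """Counting Bloom filter: each key maps to k counters and the
    frequency estimate is their minimum (never an underestimate)."""

    def __init__(self, m: int, k: int):
        self.m, self.k = m, k
        self.counts = [0] * m

    def _cells(self, key: str):
        h = hashlib.sha256(key.encode()).digest()
        h1 = int.from_bytes(h[:8], "big")
        h2 = int.from_bytes(h[8:16], "big") | 1
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, key: str):
        for c in self._cells(key):
            self.counts[c] += 1

    def estimate(self, key: str) -> int:
        return min(self.counts[c] for c in self._cells(key))

class RecurringMinimum:
    """Recurring-minimum heuristic: if the minimum counter value
    repeats among the k cells, trust it; otherwise fall back on a
    secondary SBF that tracks the (rarer) colliding keys."""

    def __init__(self, m: int, k: int):
        self.primary = SpectralBF(m, k)
        self.secondary = SpectralBF(m, k)

    def add(self, key: str):
        self.primary.add(key)
        vals = [self.primary.counts[c] for c in self.primary._cells(key)]
        if vals.count(min(vals)) == 1:        # minimum not recurring
            self.secondary.add(key)

    def estimate(self, key: str) -> int:
        vals = [self.primary.counts[c] for c in self.primary._cells(key)]
        if vals.count(min(vals)) > 1:         # recurring minimum: trust it
            return min(vals)
        sec = self.secondary.estimate(key)
        return sec if sec > 0 else min(vals)
```

The intuition: a minimum counter inflated by a collision is unlikely to be inflated to the *same* value in two independent cells, so a repeated minimum is probably the true count.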
