
Rustock Botnet and ASNs

TPRC, 24 September 2011. John S. Quarterman, Quarterman Creations; Serpil Sayin, Koç University; Andrew B. Whinston, U. Texas at Austin. Supported by NSF grant no. 0831338; the usual disclaimers apply. Rustock Botnet and ASNs: Spam, Botnets, Security, and Policy.


Presentation Transcript


  1. TPRC 24 September 2011 John S. Quarterman, Quarterman Creations Serpil Sayin, Koç University Andrew B. Whinston, U. Texas at Austin Supported by NSF grant no. 0831338; the usual disclaimers apply. Rustock Botnet and ASNs

  2. Spam, Botnets, Security, and Policy • Starting with some published ASN rankings • Drill down to Rustock and other botnets • Show some effects of a takedown • Specific enough to be actionable by affected orgs • Which they could use to detect and fix vulnerabilities • How to get the orgs to pay attention? • Reputational rankings to produce peer pressure • A few simple policy suggestions

  3. After the Rustock Takedown

  4. Rustock Takedown and Slowdown • December 2010 Rustock Slowdown • 16 March 2011 Rustock Takedown • Which ASNs were affected? • Effects on overall spam? • Using data from the CBL blocklist • Mapped to ASNs and orgs using Team Cymru data • Rankings and graphs by SpamRankings.net
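The IP-to-ASN mapping step above can be sketched with Team Cymru's DNS-based origin lookup, which answers TXT queries at origin.asn.cymru.com for an IP address with its octets reversed. This is a minimal sketch, not the actual SpamRankings.net pipeline: the function names are mine, and the network DNS query itself plus error handling are omitted.

```python
def cymru_origin_query(ip: str) -> str:
    """Build the DNS name for a Team Cymru IP-to-ASN origin lookup.

    A TXT query for this name returns fields of the form
    "ASN | prefix | country | registry | allocated".
    """
    octets = ip.split(".")
    return ".".join(reversed(octets)) + ".origin.asn.cymru.com"

def parse_cymru_txt(txt: str) -> dict:
    """Split one TXT answer into its named fields."""
    asn, prefix, country, registry, allocated = [f.strip() for f in txt.split("|")]
    return {"asn": asn, "prefix": prefix, "country": country,
            "registry": registry, "allocated": allocated}
```

Feeding each spamming address from a blocklist snapshot through such a lookup yields the per-ASN attribution that the rankings aggregate.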

  5. Rustock Takedown Rank Effects

  6. 16 March 2011 Takedown: daily graph (chart labels: takedown, hardware outage)

  7. December 2010 Rustock Slowdown (chart label: slowdown)

  8. Dec 2010 – July 2011 Top Botnets (chart labels: slowdown, takedown, recovery)

  9. Slowdown vs. Takedown • Slowdown: gradual and temporary • During slowdown: • Maazben and Bobax took up the slack • As Rustock returned in January, Bobax went back down • After slowdown: • Maazben also retreated to its old levels • Rustock #1, Lethic #2 • Takedown: rapid and much longer-lasting • But other botnets took up the slack

  10. Dec 2010 Top Spamming ASNs (chart label: increases during slowdown)

  11. March 2011 Top Spamming ASNs (chart labels: 4766 #1, 4766 #9)

  12. March 2011 AS 4766's Botnets (chart label: Lethic)

  13. Dec 2010 AS 9829's Botnets (chart labels: Bobax, Lethic)

  14. March 2011 Top Botnets (chart labels: Lethic, Maazben)

  15. Opportunistic Botnets & Spamming • Knock one down • Two more pop up • Spammers can just rent from a different botnet • Other botnets can use same vulnerabilities

  16. Dec 2010 Top Botnets (chart labels: Rustock, Lethic)

  17. Congratulations Rustock Takedown! • Takedown had more lasting effect than Slowdown • Congratulations! • But in both cases other botnets started to take up the slack • Whack-a-mole is fun, but not a solution • Need many more takedowns • Or many more organizations playing • How do we get orgs to do that?

  18. Cyberwar meets IT Security • Generations of warfare: • 1st: massed troops • 2nd: tanks and heavy artillery • 3rd: maneuver • 4th: IEDs and suicide bombs • 5th: open source gangs in it for the money • Cyberwarfare responses: • 1st: key escrow • 2nd: Internet off switch at CONUS (Maginot Line) • 3rd: CERT, FIRST, etc. • 4th: botnet takedowns • 5th: economic and reputational incentives for distributed diverse commons governance

  19. Spam as a Proxy for Infosec • Most orgs keep security problems secret • Because they think disclosure will harm their reputation • Aha! Publish reputation and they'll care • Need an available proxy for security • Anti-spam blocklists have spam data • Spam comes from botnets, which use vulns • Just as a sneeze means disease, outbound spam means poor infosec • (Other security problems may not produce spam; those need other data; we come back to that later.)

  20. Peer pressure and Medical orgs • Peer pressure is key: rank similar orgs (Festinger, Luttmer, Apesteguia; see paper for refs) • Spam data is for every org on the Internet, not just ISPs; any ESP (Email Service Provider) • We ranked medical orgs (worldwide, U.S.) • Within 2 months they all dropped to zero spam • Confirmation from [confidential] medical org: • 'The listing on your site added additional impetus to make sure we “stay clean” so in that regard, you are successful.'

  21. How Rankings Work • Rankings must be: • Frequent, comprehensive, and detailed • Must compare peers • To be usable: • Marketing: brag about good rankings; bad rankings are incentive to get better so can brag • Sales: good reputation for customer retention • Diagnostics: drilldowns for clues to what to fix • Producing more comprehensive application of existing Internet security methods

  22. Many rankings examples • FT business school rankings • Vehicle Blue Book • Credit ratings: Moody's, S&P, Fitch • And by far the most numerous: sports scores • In leagues, for teams, for players • Detailed: earned run average, etc. • And composite overall

  23. Further rankings from spam data • Botnet rankings: botnets use known vulnerabilities; orgs infested by botnets prob. have those vulns; not good for their reputation • Vulnerability rankings: an org infested by several botnets which exploit common vulns very likely has those vulns • Infosec experiments: an org can change its infosec and watch rankings to see which infosec works • Single IP address drilldowns: which addresses are spamming, which botnets infest them
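The single-IP drilldowns above amount to counting distinct spamming addresses per (ASN, botnet) pair. A minimal sketch, assuming a hypothetical record schema of (ip, asn, botnet) tuples such as might come from joining blocklist listings with IP-to-ASN mappings (the actual CBL and Team Cymru fields differ):

```python
from collections import Counter

def botnet_drilldown(records):
    """Count distinct spamming IP addresses per (ASN, botnet) pair.

    Duplicate listings of the same address for the same botnet
    are counted once, so the result reflects infested addresses,
    not raw listing events.
    """
    seen = set()
    counts = Counter()
    for ip, asn, botnet in records:
        if (ip, asn, botnet) not in seen:
            seen.add((ip, asn, botnet))
            counts[(asn, botnet)] += 1
    return counts
```

An ASN showing up under several botnets that exploit a common vulnerability is the signal the vulnerability rankings build on.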

  24. Derivative rankings • Normalized (addresses, customers, employees) • Susceptibility (speed of infection by botnets) • Recidivism (frequency of re-infestation) • Improvement (change over time) • Composite (weighted average of all the above)
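The normalized and composite rankings above can be illustrated in a few lines. This is an illustrative sketch only: the divisor (addresses, customers, or employees) and the weights are placeholders, not SpamRankings.net's actual formulas.

```python
def normalize(volume: dict, size: dict) -> dict:
    """Spam volume per unit of org size (e.g. per address)."""
    return {org: volume[org] / size[org] for org in volume}

def composite(metrics: list, weights: list) -> dict:
    """Weighted average of several per-org metric dicts with the same keys."""
    orgs = metrics[0].keys()
    total_w = sum(weights)
    return {org: sum(w * m[org] for m, w in zip(metrics, weights)) / total_w
            for org in orgs}
```

Susceptibility, recidivism, and improvement would each be computed from time series of such per-org metrics and then fed into the same weighted average.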

  25. Internet field experiments • We are releasing rankings for one country • Then later for a similar country • Does the second country change similarly? • Can experiment with many rankings • Per country, per org category, per data source • Does peer pressure on disclosure change behavior? • The rankings themselves provide ways to determine how well they work

  26. Policy: other data, other rankings • SpamRankings.net pioneers reputational peer rankings related to Internet security • Available now because spam data is available • Similar rankings could be made with other data: • Phishing sources and servers • Breaches, vulns, etc.: you can think of more • A simple policy suggestion: • Require making other specific data available • Enable multiple rankings by multiple agencies • Transparency for diverse cooperation (Elinor Ostrom)

  27. Needed and Not Needed • Needed: • More data sources • Publicly available • Frequent, comprehensive • More research (data correlation, ranking effects, law, policy, etc.) • Independent ranking and certification agency(ies) • Many diverse, cooperating entities (rankers, ranked, academia, industry, govt) • Not needed: • New Internet protocols • Punitive laws • Reports only to govt • Sporadic reports selected by reporting orgs • Dept. of Homeland Internet Security

  28. Acknowledgments • This material is based upon work supported by the National Science Foundation under Grant No. 0831338. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. • We also gratefully acknowledge custom data from CBL, PSBL, Fletcher Mattox and the U. Texas Computer Science Department, Quarterman Creations, Gretchen Phillips and GP Enterprise, and especially Team Cymru. None of them are responsible for anything we do, either.

  29. iiaradmin@utlists.utexas.edu antispam@quarterman.com Contact
