Sunday, 2 September 2007
Will botnets compete with Amazon S3?
Reading about the Storm Worm's botnet being bigger than supercomputers, I was reminded of a prediction I've been making for a while. Spamming, phishing, and other bad behaviour rely on overwhelming minuscule conversion rates with huge volume, so they have to free-ride on others' resources to actually make money. However, large-scale distributed computing is being commoditised, by Amazon's S3 and EC2 and others. At some point the botnets will realise that they can make more money by competing with Amazon or Akamai to store data in their stochastic cloud of compromised computers. A variant of memcached with a redundant hashing algorithm, or maybe an adaptation of Freenet, would be an obvious place to start; for all I know this already exists.
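To make the "redundant hashing" idea concrete, here's a minimal sketch using rendezvous (highest-random-weight) hashing: each key is replicated on the top-scoring nodes, and when a compromised machine gets cleaned up, only the replica it held needs to move. All names here (`bot-N`, `chunk-42`) are hypothetical, not from any real system.

```python
import hashlib

def node_score(node: str, key: str) -> int:
    # Rendezvous hashing: each node gets a deterministic per-key score;
    # the top-scoring nodes are chosen to hold that key's replicas.
    return int(hashlib.sha256(f"{node}:{key}".encode()).hexdigest(), 16)

def replica_nodes(nodes, key, replicas=3):
    # Pick the `replicas` highest-scoring nodes currently in the cloud.
    return sorted(nodes, key=lambda n: node_score(n, key), reverse=True)[:replicas]

# A hypothetical cloud of compromised hosts; any may vanish at any time.
cloud = {f"bot-{i}" for i in range(10)}
key = "chunk-42"

before = replica_nodes(cloud, key)
cloud -= {before[0]}            # the top replica holder gets disinfected...
after = replica_nodes(cloud, key)

# ...but the surviving replicas keep the same placement, so only the
# one lost copy needs re-replicating elsewhere.
assert len(set(before) & set(after)) == 2
```

The point of rendezvous hashing (as opposed to naive modulo hashing) is exactly the stability shown by the final assertion: node churn only disturbs the data the departed node held, which is what a storage service built on flaky, involuntarily conscripted machines would need.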
2 comments:
no sla, no trust metric. pretty evil i'd say. :)
I dunno.
Botnets, by definition, are parasitic infections on legitimate users' machines; in many jurisdictions they're illegal. (And where they're not illegal, they should be!)
What customers would be crazy enough to trust massive quantities of their data to something so fragile, in legal terms?
Spammers nowadays can store all their important data on Command-and-Control servers, off the botnet entirely; as botnet nodes are shut down, killed off, or filtered, it doesn't affect the *data* in any way-- just the output volume. That's proven to be a key part of the botnet design, as far as I can tell, and I think there's a good reason for it.
BTW, in the past we have seen spammers attempting to justify a semi-legal spam-proxying app as a form of distributed-computation software -- i.e. something their users installed of their own free will. However, it became clear they were talking bullshit -- in fact their software was just plain old malware, iirc, and the users hadn't opted in at all.