Excessive use means data usage that is not characteristic of a typical residential user of the service as determined by Comcast.[...]Comcast currently identifies well less than 1% of Comcast High-Speed Internet customers as excessive users each month. [...]Many excessive users consume more data than a business-class T1 line running at full capacity in a month. [T1 is 1.5 Mbit/sec - Comcast claims to offer 12 Mbit/sec for PowerBoost, and 6/8 Mbit/sec standard] [...] Currently, each month Comcast identifies the top bandwidth users of its High-Speed Internet service by determining aggregate data usage across its entire customer base nationwide.
What they are saying is that they use a crude averaging model and penalize you if you don't fit it - for example, by using the connection capacity they promise more than 10% of the time. Now, this could be called Procrustean, but it reminds me of The Producers, where Bialystock and Bloom sold a hundred people 10% shares of the show, assuming it would fail. Sadly for Comcast, people like Dave are finding new uses for the net's bandwidth, and not just checking email sporadically any more.
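That 10% figure is easy to sanity-check with back-of-the-envelope arithmetic (assuming a 30-day month and the tiers Comcast advertises):

```python
# Sanity check on the "more data than a T1 in a month" threshold.
T1_MBPS = 1.5                       # T1 line rate, megabits per second
SECONDS_PER_MONTH = 30 * 24 * 3600  # assuming a 30-day month

# Data a T1 moves in a month running flat out, in gigabytes.
t1_month_gb = T1_MBPS * SECONDS_PER_MONTH / 8 / 1000  # Mbit -> MB -> GB
print(f"T1 at full capacity: {t1_month_gb:.0f} GB/month")

# Duty cycle needed to move the same volume on Comcast's advertised tiers.
for tier_mbps in (6, 8, 12):
    duty = T1_MBPS / tier_mbps
    print(f"{tier_mbps} Mbit/sec tier: {duty:.1%} of the time")
```

Running flat out, a T1 moves about 486 GB a month; matching that on the 12 Mbit/sec PowerBoost tier takes a duty cycle of just 12.5% - right around that 10% mark.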
Conventional internet service user models are based on users downloading more than they upload, from common big media sites that can be easily cached. However, as Odlyzko pointed out, citing Lesk's now decade-old work, the dominant form of data creation is photographs. Nowadays these photographs are digital, and we want to share them so others can see them. Because the likes of Comcast won't let us run our own servers, we have to upload them to Flickr or Photobucket or Picasa to share them. This gives us an 'upload more than you download' network flow, as we send them up at full multi-megapixel resolution, but browse a few of each other's at thumbnail or reduced size. And that's before we even consider video uploading (which I've noticed Comcast throttles to 0.4 Mbit/sec for me).
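To see how photo sharing flips the usual asymmetry, here's a toy calculation. The photo sizes and counts are my own illustrative assumptions, not measured data - but any plausible numbers tell the same story:

```python
# Illustrative numbers for the photo-sharing flow: full-resolution
# uploads versus thumbnail browsing. All figures are assumptions.
FULL_RES_MB = 3.0        # one multi-megapixel JPEG, sent up at full size
THUMBNAIL_MB = 0.03      # one thumbnail, browsed on someone else's page

photos_uploaded = 200    # a month of snapshots sent to a sharing site
thumbnails_viewed = 1000 # friends' photos browsed at reduced size

upload_mb = photos_uploaded * FULL_RES_MB
download_mb = thumbnails_viewed * THUMBNAIL_MB
print(f"upload: {upload_mb:.0f} MB, download: {download_mb:.0f} MB")
print(f"upload/download ratio: {upload_mb / download_mb:.0f}:1")
```

Even viewing five times as many photos as you post, the flow runs heavily upstream - 20:1 with these numbers - which is exactly the direction the asymmetric user models don't provision for.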
Comcast hit the news before for sabotaging Bittorrent transfers with forged TCP reset packets, but what Bittorrent is really doing is arbitraging around the asymmetric network bandwidth delivered by these outdated user models.
Bob Briscoe recently wrote an interesting proposal on handling congestion by using TCP signalling to reveal its costs. This was spun by George Ou as an attack on P2P protocols, but the underlying principle of penalising those who cause congestion is an interesting one. The question I'd like answered: if I have a gigabit network at home and the internet backbone is multi-terabit, when Comcast throttles my uploads to 400 kilobits per second, aren't they the ones causing the congestion?
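For intuition, here's a toy sketch of the congestion-accounting idea - my own drastic simplification, not Briscoe's actual mechanism. The principle: bill users for the traffic they push through a link while it is congested, not for their total volume:

```python
# Toy "congestion volume" accounting: a user's bill is the traffic they
# contributed during congested intervals, not their total transfer.
# A deliberate simplification for intuition only.
CAPACITY = 100  # link capacity per tick, arbitrary units

# Per-tick demand for two hypothetical users: a steady bulk uploader
# and a bursty one whose burst pushes the link over capacity.
demands = [
    {"bulk": 40, "bursty": 10},
    {"bulk": 40, "bursty": 90},  # congested tick: total 130 > 100
    {"bulk": 40, "bursty": 10},
]

congestion_volume = {"bulk": 0.0, "bursty": 0.0}
for tick in demands:
    total = sum(tick.values())
    if total > CAPACITY:
        # Attribute the excess to each user in proportion to what they
        # were sending while the link was congested.
        excess_fraction = (total - CAPACITY) / total
        for user, demand in tick.items():
            congestion_volume[user] += demand * excess_fraction

print(congestion_volume)
```

The steady uploader moves more total data but is billed little, because most of its traffic flows when the link has headroom; the bursty user who actually tips the link into congestion bears most of the cost. That's the sense in which the principle penalises congestion-causers rather than volume - and why throttling an otherwise uncongested path looks like the provider manufacturing the scarcity.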