WikiLeaks and LimeWire: Stupid on top of Silly

As I discussed in my last post, any number of questions naturally arise out of this whole “revelation” that WikiLeaks obtained at least some of its leaked documents not directly from whistleblowers, but from peer-to-peer (P2P) networks like LimeWire and others. The new allegations originate from Triversa, a private company that has apparently been snooping P2P networks in search of just such “valuable” information. It’s hard to know exactly where to start, but let’s begin with the evidence:

According to two different articles on the subject, including this most recent one, Triversa is hanging its hat on claims that “four Swedish computers had issued 413 searches for file formats among the 18 million or so nodes the company believes is on P2P networks.” There are a number of problems with this claim on its face. First, the numbers don’t really suggest anything all that organized. Four computers? Only 413 requests? If someone was really, seriously trying to find information on LimeWire, they could certainly be much more efficient than this. They would undoubtedly use bots or even botnets – networks of host computers, all working together – to make millions of requests. Even four computers could make requests in the hundreds of thousands over just a few days.
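That “hundreds of thousands” figure is easy to check with back-of-envelope arithmetic. The per-machine rate below is my own assumption for illustration – a very gentle, throttled pace, nothing from Triversa’s report:

```python
# Back-of-envelope: how many searches could four machines issue if they
# were actually scripted? The rate is an assumed, deliberately low number.
machines = 4
requests_per_minute = 30      # one search every two seconds, per machine
days = 3

total = machines * requests_per_minute * 60 * 24 * days
print(total)  # 518400 -- versus the 413 searches Triversa reported
```

Even at that leisurely pace, four machines clear half a million requests in three days – three orders of magnitude beyond what was observed.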

But let’s say for argument’s sake that individuals searched out this information the old-fashioned way: typing into a search bar and seeing the results. What evidence do they have that the individuals in question worked for WikiLeaks? Did the four different computers have different IP addresses? Similar ones? IP addresses identify computers on the Internet, and addresses in similar ranges – like house numbers on the same street – may indicate that the computers they identify have some connection. Is this connection present in these cases, or not?
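The “house numbers on the same street” idea can be made concrete with Python’s ipaddress module. The addresses below are reserved documentation addresses standing in for the unknown Swedish IPs – pure placeholders:

```python
import ipaddress

# Hypothetical addresses for illustration -- we have no idea what the
# actual four Swedish IPs were.
addresses = [
    ipaddress.ip_address("192.0.2.17"),
    ipaddress.ip_address("192.0.2.44"),
    ipaddress.ip_address("192.0.2.101"),
    ipaddress.ip_address("198.51.100.9"),
]

# Are these addresses on the same "street" -- i.e., in the same /24 block?
block = ipaddress.ip_network("192.0.2.0/24")
for addr in addresses:
    print(addr, addr in block)
```

Here the first three share a block (suggesting, say, one office network) while the fourth does not. That is exactly the kind of detail the published claims never tell us.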

In a world filled with information technology and search engines, if we’ve learned anything, it’s that people aren’t that original. This is especially true on peer-to-peer networks, where by definition, users come together to share information of interest to all of them. Just because four computers all searched for the same thing does not mean it was a coordinated attempt by a single entity, even if they used the exact same search terms. Are these four computers the *only* computers making similar requests? Once things as enticing as plans for secret military bases hit the peer networks, it won’t be long before they’re everywhere. And by the way, that they’re all in Sweden is not terribly convincing, either. There are actually rather a lot of people in Sweden… and none of them are Julian Assange.

But hold on: what’s this bit about 18 million computers the company believes are on P2P networks? We have a private company trawling the Internet, looking for what it perceives to be bad guys, now? Halliburton on the web? And only 18 million computers? Seriously? One study suggests 62 million homes in the US alone have Internet-connected computers. So, I’m not even entirely sure that the company is really as good as they would like to believe. ~ Editor’s note: whoops! That would be 18, not 13. Not that this changes much.

Finally, the claim that, “The PDF file, which Triversa claims it observed one of the aforementioned Swedish computers downloading, contained sensitive information and eventually wound up on Wikileaks’ website,” is – in the interest of kindness and discourse – not altogether convincing. I have a copy of Microsoft Office on my computer, an application Triversa almost certainly “observed” being downloaded on a file sharing network or two. But how does the existence of a file in two locations indicate any connection whatsoever? Anyone who knows the most basic things about digital files must surely see this as spurious.
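The point about digital files deserves a moment: every faithful copy of a file is bit-for-bit identical, which is precisely why finding “the same file” in two places establishes nothing about who put it there. A small illustration (the bytes here are invented placeholder content):

```python
import hashlib

# The "same" file, copied to two different peers on a network.
original = b"placeholder bytes standing in for some shared PDF\n"
copy_on_peer_a = bytes(original)
copy_on_peer_b = bytes(original)

digest_a = hashlib.sha256(copy_on_peer_a).hexdigest()
digest_b = hashlib.sha256(copy_on_peer_b).hexdigest()

# Identical content yields an identical digest on every host holding a
# copy -- co-occurrence alone proves no connection between the hosts.
print(digest_a == digest_b)  # True
```

A popular file on a P2P network exists, identically, on thousands of machines at once; that two of them hold it is the expected state of affairs, not evidence.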

It is very difficult to determine how much evidence has actually been revealed to the media. There are certainly a great many other details that could be – and probably have been – left out. But taking the claims as published at face value, they certainly leave much to be desired.

Keep the Kid Away from the Computer

If the claims made by Triversa leave a bit more to the imagination than we would like, that is not to say I doubt that top-secret government documents could make their way onto P2P networks – or BitTorrent sites, for that matter. Information is slippery stuff, to say the least, and really, no one seems to be disputing that such docs are available. But that such sensitive information is so easily obtained online represents, if true, a pretty big black eye for our security apparatus in this country that should probably not be ignored.

I tend to believe that secret documents are available online if only because my years of experience doing deskside support and telephone technical support prove that such mishaps are not only possible but drearily, predictably likely. Windows security gaps, user ignorance and just plain old carelessness rule the day in PC security. It only takes one engineer bringing his work home with him – where his kid installed Kazaa – to suddenly make all kinds of unfortunate information available on the Internet.

As I mentioned in the previous post, all kinds of security apparatus exist and are in very successful use across the Internet. Your online bank account, for example, is just about as safe as houses. Even your email account is, for the most part, reliably private and can be made even more private with very little effort. Similar security measures could have easily prevented, if not the dissemination of the data, then at least the successful reading of the data by people who weren’t supposed to read it. Most notably, Public Key Encryption is not just available to but an invention of our federal government.

PKE basically scrambles the information in a file such that there is only one way to unscramble it: with the Private Key needed to run the decryption algorithm. If PKE had been applied to the secret files in question, they might still have been disseminated on the Internet, but as a useless scramble of characters of no use to anyone. No encryption could have been in place if the data leaked out to the Internet at large in readable form – and at least in the case of the PDF mentioned above, it did.
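To make the “useless scramble” point concrete, here is a deliberately toy-sized sketch of the mechanism using the classic textbook RSA primes 61 and 53. Real public-key systems use keys hundreds of digits long, proper padding, and vetted libraries; this shows only the shape of the idea and should never be used for anything real:

```python
# Toy public-key encryption (textbook RSA, insecure by design).
# Anyone may scramble with the public key; only the private key unscrambles.
p, q = 61, 53
n = p * q                # 3233 -- the public modulus
e = 17                   # public exponent
d = 2753                 # private exponent: (e * d) % ((p-1)*(q-1)) == 1

def encrypt(message: str) -> list:
    """Scramble each character with the public key (e, n)."""
    return [pow(ord(ch), e, n) for ch in message]

def decrypt(blocks: list) -> str:
    """Unscramble with the private key (d, n)."""
    return "".join(chr(pow(b, d, n)) for b in blocks)

scrambled = encrypt("secret")
print(scrambled)            # a useless jumble of numbers
print(decrypt(scrambled))   # readable again -- but only with d in hand
```

Anyone intercepting `scrambled` without `d` sees exactly the “useless scramble of characters” described above; that is the whole point.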

And so we come to a fork in the logic of this article, which is really a fork in the logic of information technology at large: is the lesson here that information is inherently free and not to be obscured, or that better diligence would have prevented a security breach? The history of IT is littered with the corpses of failed security measures. Yet someone is always building the next better mousetrap. Julian Assange and Richard Stallman stand for their ideals of social justice in open defiance of the very idea of information security; billions of dollars are spent every year trying to keep water in a leaky sieve, from Microsoft Genuine Advantage to CIA efforts to bring Assange to “justice.”

Regardless of where you come down on the question of Internet security, national security, journalism, free software, copyright, whistleblowing, corporate security firms or peer to peer networks, it will be interesting to see where this latest ripple in the saga leads us. There seems no end to the various threads and topics we find ourselves weaving into this one story.


Metered Internet, Another Domino Falls: AT&T

Of the Big Telcos that have long declared a tiered Internet the only just system, AT&T has been at the forefront for a long time now. Back when Net Neutrality was a big buzzword in Washington, it was quotes from AT&T executives that really got the dander of the Save the Internet crowd up. Now that broadband companies have found a back-door route into the tiered Internet world (because, do not let them fool you: a tiered Internet and a metered Internet are functionally the same thing), it’s no surprise that the AT&T Broadband folks are looking into making the same switch:

AT&T Considering Metered Broadband – GigaOM

Bend Broadband, Comcast, Time Warner Cable — they’re all considering or going the route of the tiered (aka metered) broadband. Now add AT&T to that list, according to a report in CED magazine.

Let me just state once again the point that I think is the most relevant here: there is a fundamental discrimination inherent in this system, which meters total download volume – a concept completely foreign to most Internet users.

And just as an antidote to this new system, what if there were a site that let you completely use up any leftover bandwidth at the end of the month, just so you’ve gotten what you paid for? Imagine the load on the network if people started downloading 10GB text files at the end of the month – not because the files themselves are of any value, but just to get the full use of their Internet connection.
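For a sense of scale, the arithmetic is simple. The line speed below is an assumed round number for illustration, not any real AT&T tier:

```python
# How long would it take to burn through leftover cap at month's end?
leftover_gb = 10
line_speed_mbps = 6    # an assumed mid-tier DSL speed of the era

# 1 GB = 1024 MB = 8192 megabits
leftover_megabits = leftover_gb * 8 * 1024
hours = leftover_megabits / line_speed_mbps / 3600
print(round(hours, 1))  # 3.8 -- one evening of pointless saturation
```

A few hours of a fully saturated line per subscriber, all landing in the same end-of-month window, is exactly the kind of synchronized load the metering scheme claims to be preventing.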