In a blog post dated August 6th, Google’s head of Webmaster Trends Analysis, Gary Illyes, announced that effective immediately, Google rankings will favour sites serving content from an HTTPS address. This form of communication is encrypted between the server and the client, and so discourages snooping by those with malicious intentions:

For these reasons, over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We’ve seen positive results, so we’re starting to use HTTPS as a ranking signal. For now it’s only a very lightweight signal—affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content—while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.

This all sounds pretty decent so far, right? Still, I’m not sure that it actually is a good thing, when you step back and look at the full picture. In the most positive light, it could be construed as an ineffective distraction from real security. In a more negative light, Google’s new tactic could be seen as strong-arming the Internet, to the detriment of low-income Internet properties.

What is HTTPS?

HTTP stands for HyperText Transfer Protocol, and is the vehicle by which the majority of what people think of as the Internet is delivered. If you look at the address bar for this website, you’ll see that the first few characters are http://. That tells the browser to use HTTP.

If the same traffic is encrypted, which means scrambled so as to be unreadable by anybody but the server and you, the first few characters will be “https://.” The “s,” you see, is for “secure.”
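To make the distinction concrete, here is a minimal sketch using Python’s standard library (the URL is just a placeholder): the scheme at the front of the address is all that tells a client which protocol, and which default port, to use.

```python
from urllib.parse import urlparse

# The scheme at the start of a URL tells the client which protocol
# (and, by default, which port) to use.
plain = urlparse("http://example.com/page")
secure = urlparse("https://example.com/page")

print(plain.scheme)   # "http"  -> unencrypted, default port 80
print(secure.scheme)  # "https" -> encrypted with TLS, default port 443
```

Everything else about the address is identical; that one letter is the difference between traffic anyone on the wire can read and traffic only you and the server can.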

It is fairly routine for your email, your bank and increasingly, your social networks to all be served up in this way. Encrypting your communications ensures some level of privacy from criminals, particularly encrypting the transmission of username/password challenges for logging in.

For the website in question, the price of admission to this secret world is what is known as an “SSL Certificate.” This is a signed credential vouching for the server’s identity, paired with a private key that only that server holds, with which it sets up the encrypted connection with you. Basic SSL Certs with barebones support come in around $9 a year, which is a very affordable bar to entry for most Americans.
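The browser’s side of that bargain is checking the certificate before trusting the connection. A minimal sketch with Python’s standard `ssl` module shows that a default client context insists on exactly that:

```python
import ssl

# A default client-side TLS context: it requires the server to present
# a certificate signed by a trusted authority, and it checks that the
# certificate actually matches the hostname being visited.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

In other words, the certificate isn’t just for scrambling bytes; it is the thing the client verifies to decide the server is who it claims to be.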

Now for the bad news

All of this sounds great, it really does. A more-secure website, especially one with usernames and logins, is a better one. But does that make one website a more authoritative voice or a better resource? Because that is what Google’s mission is supposed to be about, if we’re still concerned with that sort of thing.

Search is about content, not someone else’s priorities

If I wanted Google to decide for me where I “should” spend my time, rather than simply showing me who has the content I’m looking for, I’d ask for that. But that’s not why I use Google and that’s not why, as a publisher, I rely on Google’s rules to get my pages in front of your ocular tissues.

Where spam pages are concerned, Google is well within its mission to cull the herd. I don’t need to find myself in spam hell because I searched for a common term, nor do I want my site listed among the sleazy crop of Russian honey pots. But security is a personal matter about which I can make my own decisions.

Security is a state of mind

While we’re on the issue of the ambiguous term “security,” let’s keep in mind that the fact that no one can snoop your communications with a website in no way means that visiting the site is “safe.” What’s to say the site itself isn’t doing dodgy things with your data? Google can’t guarantee that, nor should it try.

Wait. Google is talking secure communications, now?

Whether or not it was their fault, and whether or not Google was pressured by the government to allow holes in their security that the NSA could snoop through, the fact remains that those holes existed. To hear Google now carping about secure communications on the Internet is rich, to say the least.

Wait. SSL Certificates are secure, now?

Perhaps you recall, and perhaps you do not recall, the big security freak-out of a few months back? Heartbleed? Yeah, that whole thing. That’s when OpenSSL, the free, open-source encryption library behind a huge share of the Internet’s secure connections, was found to have a gigantic hole in what was supposed to be its encryption.

No one with any knowledge of Internet security found it surprising that Heartbleed was discovered in the era of NSA snooping. It was exactly the kind of back-door intrusion loophole the NSA must have been employing. So now, Google wants us to trust certificates that they themselves helped undermine.

The “Google Tax”? $9 a year doesn’t sound like a lot to Middle Class America.

But any new cost of doing business matters, especially for those with lower incomes. And regardless of how much of a burden it is or is not, there is something counterproductive to the “free and open Internet” Google claims to want in requiring yet another fee.

It seems to me that Google’s HTTPS plan is too disruptive in all the wrong ways, and not disruptive enough in the ways they would prefer. I’m hoping this is another Google Wave-esque idea that goes the way of the dinosaur sooner rather than later.

Twitter users – especially power users – love their clients. We get attached to them, almost more than to Twitter in some senses. We rely on the look-and-feel of specific tools to do what we do on Twitter quickly and effectively, and we get pretty nervous when things change.

Such is the case with the recent news that Twitter plans on cutting off support for a couple of its more prominent client versions, TweetDeck AIR and TweetDeck Mobile:

In a blog post, TweetDeck, which was acquired by Twitter in 2011, said that it would be discontinuing support for its AIR, iPhone and Android apps, and the mobile apps would be removed from their app stores at the beginning of May. It also warned that continuing to use the apps until then could be problematic — they rely on an older version of its API which it will be conducting tests on in the future, which could lead to outages for users.

This announcement left me and others in a small panic, wondering what we might do without our favourite client:

… and so on. The problem is: the original article isn’t entirely clear what “desktop version” means and makes no mention whatsoever of the version most people currently use, the Chrome or Firefox extensions. For that, we connect the dots the original article didn’t, and read the original blog post:

Over the past 18 months, we’ve been focused on building a fast and feature-rich web application for modern browsers, and a Chrome app, which offers some unique features like notifications. We’ve recently introduced many enhancements to these apps – a new look and feel, tools like search term autocomplete and search filters to help you find what you’re looking for more quickly, and automatically-updating Tweet streams so you immediately see the most recent Tweets. Our weekly web releases have been possible because we’ve nearly doubled the size of the TweetDeck team over the past six months (and we’re still hiring).

So clearly in the minds of those at TweetDeck at least, the Chrome and other extensions are here to stay. That will come as a comfort to a lot of Tweeps I know.

But this move is part of a wider move on the part of Twitter to focus the use of its API in more limited ways. It has been widely reported that Twitter wants to unify the experience of working with their product, which makes sense: as new users come online, the confusing panoply of clients that all look and function differently is an impediment to wider market saturation. Unifying the experience is great. For them.

Doing so, however, means taking our clients away from us. I’ve searched forever to find a decent Android client and settled on Plume. But I know Plume’s days are numbered, and Twitter’s mobile experience lacks the fluidity of managing columns of lists so I can monitor my news sources and friends effectively.

So the question still needs to be asked: if unifying the experience means cutting the power users whose content drives Twitter’s appeal off from the tools that allow us to do our thing, is Twitter also risking cutting itself off from the quality content that makes it worth reading?

Nathan Yau of @flowingdata writes about the end of Data.gov:

Data.gov in crisis: the open data movement is bigger than just one site | Nathan Yau | News | guardian.co.uk.

Here’s the part that blew me away: there is such a thing as Data.gov! Never knew it was there, but according to the article, it took $4 million a year just to run a website I never knew existed.

Now I grant you: I’m not in the data business in the sense of being any kind of researcher. But at the same time, I’ve done many searches looking for statistical data, including median income levels and employment by state, by occupation and others. And in all that time, I’ve run across lots of useful information from the Bureau of Labor Statistics and Census.gov, but never once turned up a Data.gov result in a Google search. Not one that stood out, anyway.

Open government is good. But it’s only good to the extent that it’s effective. Making each department responsible for reporting its own numbers seems like a wiser course of action, and one that is already demonstrably more effective.

BBC News has an interesting article about the future of “Web2.0” sites and development, interviewing the man who coined the term Web2.0.  Seems like, if he only knew the silliness about to be unleashed on the Internet, he might have named it something different or avoided it altogether.

But much though he may sourpuss at the irrelevance of some Web2.0 applications, the fact is that we are by and large fairly frivolous people with fairly frivolous interests.  It doesn’t diminish the Web2.0 brand to see that silly little social applications have been built; it reinforces the relevance of the Web2.0 evolution, in that powerful concepts have invaded even the simplest forms of communication.  To be sure, loading down browsers with a ton of irrelevant JavaScript crap is not what the originators had in mind.  They had it in mind that we would “harness collective intelligence.”

Weep for the lost opportunity if you must.  But what they didn’t have in mind – indeed, what the visionaries of our society so rarely ever have in mind – is the sheer volume of our collective intelligence occupied at all moments with the research and development of fart jokes.

I’ve been playing around with new plugins to support the mobile side of this website, and came back to my original plugin, but with the latest version 1.3: WordPress Mobile Plugin by Andy Moore.  The interface has improved greatly with this newest edition, including a lot of the metadata such as time posted and category.  Unfortunately, I don’t really use categories, but I’m thinking about hacking this plugin to include tags instead, soon.

So when you have a minute, come check us out at the same URL as you do on your PC!

DFE Mobile in Emulator Window