I’ve been holding onto this article for a week, but I’m finally flagging it for you now. It’s a great article with plenty worth reading. In it, author William Davies asks the question many journalists have been asking since the election: are we living in a post-factual world? He weaves together the history of statistics as a tool of government with what looks like a highly energized, worldwide rightward shift that seems to intentionally fly in the face of statistical and scientific fact.

I’m generally suspicious of any “grand unifying theory” of politics that blends political winds in the U.S., Britain, Eastern Europe and the Philippines. A general trend is worthy of consideration, but trying to blend those movements into some singular force evades the real human emotions and political grievances in play in each of those countries. It also highlights the weakness of a political system that recognizes only two diametric poles: any movement in any direction has to be viewed as either rightward or leftward. Our political polarization has left us bereft of the vocabulary to describe it any other way.

About the best you can say about the combined shift in global politics is: “When the going gets tough, the tough take it out on the less-tough.” Regardless of the individual struggles in any one nation, there’s little doubt that the population of refugees and asylees worldwide has reached the highest recorded levels. The trillions of dollars of global wealth lost in the subprime fiasco of 2008 have continued to trickle down, year over year, emptying bellies among the world’s poorest. Daily reports of terror attacks have eroded the confidence of even the safest people.

People – or at least enough people – in wealthier nations are increasingly saying “no” to pretty much everything. They’re putting their own countries first. They’re withdrawing from unions. They’re reneging on promises. Yes, they are increasingly “clinging to their guns and their religion.”

But back to the central question: is our current culture leaning not only rightward, but also away from science and statistics? It’s worth noting that science – yes, science – has already weighed in on this idea. The truth is that our political persuasion has little to nothing to do with how we reason. Our preferred reality has everything to do with an emotional connection to our beliefs. We generally choose to bolster our beliefs with facts that confirm them. And we do so after the fact.

For those whose beliefs swing right as defined by American politics, there is precious little in the way of scientific or statistical information to support their beliefs. And the number of available statistics is getting smaller every day.

The U.S. actually admitted fewer refugees last year than it has at many points in its history, as recently as 1995. An American is six times more likely to die in a shark attack than at the hands of a refugee. We have a 1-in-49,000 chance of dying in a terrorist attack and a 1-in-400 chance of dying of a gunshot wound. And an ever-shrinking, increasingly unqualified pool of scientists believes climate change is either a hoax or attributable to “god” or whatever Conservatives insist on believing.
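For what it’s worth, the odds cited above are easy to compare directly; a quick sketch, using only the 1-in-N figures quoted in this paragraph:

```python
# Lifetime odds as quoted above, expressed as probabilities
p_terror = 1 / 49_000   # 1-in-49,000 chance of dying in a terrorist attack
p_gunshot = 1 / 400     # 1-in-400 chance of dying of a gunshot wound

# How many times likelier is death by gunshot than by terrorism?
ratio = p_gunshot / p_terror
print(f"{ratio:.1f}x")  # 49,000 / 400 = 122.5
```

In other words, by these numbers you are over a hundred times more likely to die of a gunshot wound than in a terrorist attack.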

Americans generally are not with Conservatives on gay marriage. We’re not with Conservatives on marijuana legalization. We’re not impressed by private school vouchers. No one but a damned fool believes Mexico’s paying for the wall. The HPV vaccine will not make your daughter a slut, nor will vaccines cause autism. Obamacare is not the worst thing that’s ever happened to health care.

So pity poor Conservatives who insist on believing things for which there is no support whatsoever. Their happy-go-angry bullshit train has just elected the man who is already leaving a lot of them gobsmacked and red-faced. Small wonder, then, that the political right of our country is discarding facts, evidence, science and statistics as hokum. We are not living in a “post-factual world” just because your beliefs are no longer supported by facts. You’re living in a bubble.

School graduation rates, crime statistics and home values: these are the things people look up when considering purchasing or renting a home. But along with those, parents especially might be inclined to check the local sex offender registry, to see if there are people in the neighborhood they would prefer not be around their children.

It hasn’t always been this way: it was only in 1994 that the Jacob Wetterling Crimes Against Children and Sexually Violent Offender Registration Act established the requirement that state law enforcement register sexual offenders. It was the more commonly known amendment to that bill, Megan’s Law, that made those registries public.

But a new study by Alissa Ackerman, a professor of social work at the University of Washington, Tacoma, suggests that in the intervening 18 years, those registries have become bloated and inaccurate. Studying five of the largest state registries (New York, Texas, Illinois, Georgia and Florida), she discovered that many registered offenders had either died or moved out of the communities where they were registered.

New York State was the second-worst offender on the list, with a staggering 52% variance between registered addresses and offenders’ verifiable locations. Of 32,930 offenders listed, only 15,950 could be verified. A search of that registry shows 1,387 of those offenders registered in Monroe County.
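The “variance” figure checks out against the raw counts; a minimal sketch using the New York numbers reported above:

```python
# New York registry counts as reported in the study
listed = 32_930     # offenders on the registry
verified = 15_950   # offenders whose current locations could be verified

unverified_share = 1 - verified / listed
print(f"{unverified_share:.1%} of listings could not be verified")
```

That works out to roughly 51.6%, consistent with the 52% figure cited.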

The search page also includes an explicit disclaimer about the accuracy of the data contained therein that would seem to be at odds with the mission of the database.

DCJS attempts to ensure that the information in the Subdirectory is accurate and complete. However, the information on the Subdirectory is reported to DCJS by other sources. As a result, DCJS makes no express or implied guarantee concerning the accuracy or completeness of this data.

Accuracy and completeness

There are a few obvious problems with a registry containing inaccurate information. The first: if Offender A isn’t where the registry says he is, then where is he? It would be difficult to argue that the registry “tracks sexual offenders” if it doesn’t accurately track them when they move. The registry offers what it calls “Sex Offender Relocation Alerts.” How can the public trust that those are accurate?

For the person selling their house or renting out an apartment, the erroneous listing of a sex offender in the neighborhood could be as damaging as actually having one there.

Kristen Munson ( @MrsMunson ), brand evangelist for the rental property search and resource website NewDigs.com, says that requests for sexual offender data are occasional. She stressed that while her company does not get regular requests for offender registry data, the subject comes up often enough that the company plans to incorporate sex offender registry data into its system sometime this year.

“This is the first I’m hearing of inaccuracies, which would definitely be a concern for us,” she states.

While Mrs. Munson does not have any first-hand knowledge of rentals in jeopardy because of registered sex offenders, she does say, “I have heard people say they wish they had known, or they wish their landlord had told them, that a known sex offender lived in their building.”

DFE attempted to contact the New York State Division of Criminal Justice Services for this article. At the time of publication, I had not received any response.

The Pew Research Center conducts regular polling on a number of issues. In particular, they like to gauge the public’s interest in various topics and match it against the total hours of media coverage each topic receives. The idea is to measure the extent to which the media is actually covering what people want to see.

And it’s a good idea. One of the fall-back excuses for the worst excesses of lurid media coverage of Casey Anthony-type subjects is that “people want to see this, so we have to show it.” The polling data shows that the stated desires of the audience are often at odds with this assertion.

But that’s not quite the end of the story. Behavioral scientists will tell you that the minute someone is aware they’re being observed, their behavior changes. Polling is inherently observational and requires a human operator to ask questions. So how do we know that what people say they want to watch on the news is the same as what they actually want?

We don’t. In fact, polling science calls this response bias (specifically, social desirability bias): the tendency for respondents to answer with what they think the person on the other end of the phone call wants to hear rather than what they actually feel.

With this in mind, it’s hard to imagine how a poll asking people what they want to see in the news could possibly be accurate in a literal sense. Nobody wants to read bad news, but many of us feel an obligation to at least appear concerned about things like the economy.

Here at DFE, I recently surveyed my audience about the various subjects I’ve previously covered. Respondents were asked to tell me whether they’d like to read more or less of a given subject. There are a number of reasons a poll like this is not representative, starting with the fact that every respondent had me in common: they all like the same website/Twitter feed, and so share a specific bias that would likely show up in the results.

The poll itself was entirely voluntary, allowing respondents to skip any questions they liked. That means, of course, that it suffers from voluntary response bias: the only people who participate are the people who really wanted to, and they therefore tend to have strong opinions on the questions they answered. On the other hand, the poll was conducted online, so it didn’t suffer the response bias inherent in person-to-person contact. So, while a poll of this nature is far from scientific, I think it points more clearly at people’s actual opinions than at what they think a pollster wants to hear.

And the response on economic news was overwhelmingly negative. This jibes with what I’ve seen in my click-through rate – the rate at which people click on the links I post to Twitter and Facebook each day – which showed very weak numbers when I posted economic news. And while following me on Twitter, responding to the survey and answering specific questions were all voluntary, it should be noted that I’ve covered economic news every single Monday for nearly a year. None of my followers were unfamiliar with what I was posting, but they followed me anyway.
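Click-through rate itself is just clicks divided by impressions; here’s a hypothetical sketch of the comparison, with all topic names and numbers invented for illustration:

```python
# Hypothetical per-topic link stats; every number here is made up
posts = {
    "economy": {"impressions": 5_000, "clicks": 60},
    "crime": {"impressions": 4_800, "clicks": 190},
    "science": {"impressions": 5_100, "clicks": 210},
}

# Compute and print the click-through rate (CTR) for each topic
for topic, stats in posts.items():
    ctr = stats["clicks"] / stats["impressions"]
    print(f"{topic}: CTR = {ctr:.2%}")
```

Even at similar impression counts, a topic people avoid clicking shows up immediately as a depressed CTR.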

All of this is fodder for plenty of arguments and debates, to be sure. Do people really want to hear the economic bad news? Does my poll shed any usable light on the subject? What about the veracity of the Pew poll? However you come down on the subject, I think it’s important to consider these questions when viewing the results of any poll, including the Pew Research poll below. Mainstream news services have the unfortunate tendency to just post the data without critical analysis – or worse, with the invested biases of politicians.

Troubled Economy Top Story for Public and Media | Pew Research Center for the People and the Press.

Via @flowingdata, we have a very interesting map, indeed:

Growth Rings – Maps Of U.S. Population Change, 2000-2010.

As the author points out, there is some notion of flight to the suburbs present in these maps, except that this flight is normally a pretty cyclical thing: parents move out of the city and into the ‘burbs while their kids move out of the ‘burbs and into the city. These maps seem to display a much larger and more prolonged trend. He also points out the following:

Ah, the classic flight to the suburbs, but with a twist! Click through and look closely, and at the very center of the biggest cities – within a stone’s throw of downtown – you’ll see a tiny, resurgent dot of blue. Apparently, at some point in recent history, a home address amongst the skyscrapers became desirable again.

So we have an increase in population in the suburbs and exurbs, accompanied by an equal population boom in the centers of cities. There’s a pretty easy explanation for this: subprime lending schemes.

I recall the map of Monroe County from the height of the subprime fallout, and it looked almost exactly like the maps I’m seeing here. The center of Rochester was deep red, as were Mendon and other southern exurbs, while the more stable neighborhoods of Rochester and Brighton seemed almost unfazed by foreclosures. And that should not be surprising: “subprime,” or less-than-desirable-credit, mortgages happen most often in two populations: the poor, and the upwardly mobile who are overextended.

So, while the author of this blog post interprets the blue patches as revitalization projects in downtown areas – and he may even be right in some cases – the map serves as evidence of the long-term effects that ten years of bad loans have had on suburban sprawl. It will be interesting to see a similar map in ten years’ time.

One of my favourite haunts on the Internet is FlowingData.com, where Nathan posts some of the coolest charts anywhere online. As a political blogger, I’m very used to looking at trendlines for public opinion, economic indicators and the like. But when I get to use some of that – admittedly limited – analytical prowess on completely different types of data, it’s a real treat. For example:

Visual evidence that movies are getting worse.

Nathan’s contention is that, because the polarization of ratings increases over the years, the movies must be getting worse. The theory being that if everybody loved it, the movie must have been better.

That would probably be true if there were no other factors involved. But I rather think that the price of the movie – and its attendant expectation level – is also a powerful driver of the division. If I get to watch a movie for three bucks on a Saturday afternoon, I’m less likely to require it to blow me out of my seat. But if I have to shell out eight bucks? I better get a fucking cameo.
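One simple way to put a number on the “polarization” Nathan describes is the spread of ratings: a beloved film clusters around one score, a divisive one splits into extremes. A quick sketch with made-up ratings for two hypothetical films:

```python
import statistics

# Made-up 1-to-10 ratings for two hypothetical films
beloved = [8, 9, 8, 9, 7, 8, 9, 8]       # broad agreement
divisive = [1, 10, 2, 9, 1, 10, 2, 9]    # love-it-or-hate-it

# The standard deviation captures the split that an average rating hides
for name, ratings in [("beloved", beloved), ("divisive", divisive)]:
    print(name, "mean:", statistics.mean(ratings),
          "stdev:", round(statistics.stdev(ratings), 2))
```

A single average score flattens that difference out entirely, which is presumably why the chart tracks the spread of opinion rather than the mean.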

Which brings up another big thing for me: comedies should be no more than an hour and a half, period. After that, you’ve overstayed your welcome and played the joke out. But I think the pressure to make a movie worthy of the huge sums we’re made to pay compels directors to include more of the movie than should have been included.