Saturday 22 December 2012

Welcome to the New Nanny State


The Tories first coined the term “Nanny State” as a criticism of an overprotective and overbearing government that interferes with the personal lives of its citizens. It’s strange, then, that David Cameron’s government has embarked on monitoring personal activities on a scale never before envisioned. 

I'm not sure about you, but the type of Nanny I want is one that has money set aside for hard times and is supportive of modern technology even though she doesn't understand it, because she trusts you to use it for good. The type of Nanny David Cameron obviously had is one that blames you for your misfortune, checks your internet history to make sure you are looking for a job, thinks you may be a terrorist, and blocks whatever she doesn't trust you to see. 

The idea of an automatic pornography block for children is nothing new. Many other countries have toyed with it. Australia recently decided that such measures would be too difficult and costly to implement, with no guarantee of protecting children. 

Claire Perry MP first raised the issue, which led to a public consultation concluding that parents already have the safeguards they need. “UK Rejects Automatic Porn Filter,” the internet cried. Not so fast. The Daily Mail has been running a year-long campaign to “protect our children from porn” and today crowed that it had won. That’s right: all Dave has to do now is flick the porn-block switch under the desk in Number 10 and the children will finally be safe. Actually...

While the concept of blocking porn sounds easy in principle, it’s very difficult to achieve in reality. The first problem is definition: what counts as porn? How nude does someone have to be before you call it porn? And even if you come up with a rating system, how do you apply it automatically? 

Websites may contain a wide range of language, pictures and video, much of it submitted by users. If you find a nude image posted by a user on a public forum, do you add the whole site to the block list? Any user-generated site is susceptible to porn being uploaded. Should Wikipedia and YouTube be blocked too?
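To see why blocklists punish whole sites for one bad upload, here is a minimal sketch (the domain and URLs are hypothetical, and real ISP filters are more elaborate): blocking typically happens at the domain level, so one offending forum post takes every innocent page on that domain down with it.

```python
from urllib.parse import urlparse

# Hypothetical blocklist: the whole domain was added after a single
# user-posted image on one forum thread.
blocklist = {"example-forum.com"}

def is_blocked(url):
    # Domain-level blocking is all-or-nothing: the path is never consulted.
    return urlparse(url).hostname in blocklist

print(is_blocked("http://example-forum.com/knitting-tips"))   # True: innocent page blocked
print(is_blocked("http://another-site.example/anything"))     # False
```

The same all-or-nothing logic is what forces the Wikipedia/YouTube question: either you block a site millions rely on, or you let the offending upload through.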

On the subject of scanning for porn: scanning imagery for nudity is a massively processor-intensive task, confined to the likes of Google, with their complex algorithms and huge data centres. Even then it’s nowhere near foolproof. Are our ISPs supposed to do this? Even the most basic measures would require ISPs to make an investment and pass the cost on to everyone. I’m not sure that I want to pay more for my internet so that parents can pass the blame for monitoring their children’s activities off onto BT. 
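To give a flavour of why automatic nudity detection misfires, here is a toy sketch of a naive “skin-pixel” heuristic (the thresholds are invented for illustration; this is not any real filter’s method). It flags an image when too many pixels fall in a crude skin-tone colour range, which means an ordinary portrait trips it just as readily as pornography.

```python
def looks_like_skin(r, g, b):
    # Crude RGB skin-tone rule (hypothetical thresholds for illustration).
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def flag_image(pixels, threshold=0.4):
    # pixels: list of (r, g, b) tuples; flag when over 40% look like skin.
    skin = sum(1 for p in pixels if looks_like_skin(*p))
    return skin / len(pixels) > threshold

# A plain headshot (70% skin-toned pixels) gets flagged...
portrait = [(210, 160, 130)] * 70 + [(30, 30, 30)] * 30
print(flag_image(portrait))   # True: false positive on an innocent photo

# ...while a seascape with a small figure in it sails through.
seascape = [(60, 120, 200)] * 90 + [(210, 160, 130)] * 10
print(flag_image(seascape))   # False
```

And this toy version only inspects colour. Doing anything genuinely accurate means heavyweight image analysis over every picture on every page, which is exactly the data-centre-scale workload ISPs are being asked to shoulder.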

You would think that when Claire Perry states that the sexualisation of children must be stopped, she would start with sexually suggestive music videos, unrealistic fashion imagery and sexually suggestive clothing, all of which are directly marketed at children, rather than online pornography, which is not. 

The moral of the story is that the Tories neither understand technology nor trust the public. They create systems based on a pessimistic view of humanity: parents can’t be trusted to monitor and regulate their children on the internet; the unemployed cannot be trusted to find jobs, so they should be monitored; and everyone is potentially a terrorist who should be watched. If you navigate this sea of pessimism using the moral compass of the Daily Mail then you are unlikely to find your way to reasoned shores. I suppose it’s not all bleak though; at the very least one can hope that such a filter will block filth like this, this, this, this, this, this, and this.

Thursday 6 December 2012

The incredible story of Cuban medical internationalism

Source: New Internationalist