Data and Government 2.0

Oct 8, 2008 19:16 · 801 words · 4 minute read

[Cross-posted from the Headshift blog]

mashup*, who bill themselves as a “membership based community of executives, entrepreneurs and investors affected by and working within the commercial application of digital technology, products and services”, put on an event last night looking at “Data and Government 2.0”.

It was held at the splendidly-named Speechly Bircham, which despite sounding like a character from The Avengers is actually a law firm with rather swish offices off Fetter Lane.

There was a range of speakers from different parts of government and the public sector, including a few who are right at the heart of government. William Perrin heads up the Power Of Information Taskforce in the Cabinet Office, which is about as close to the seat of power as it gets. He had a number of examples of the Power Of Information in action - for example, tax advice being handed out through crowdsourced community self-help, covering information that you would normally expect to come from HMRC.

One of the major issues that he faces is turning around the culture of government - getting across the idea that these kinds of innovative projects don’t have to be huge; in fact, many can be done for a few thousand pounds. If one of the large usual-suspect systems integrators had tried to build one of these, they’d have charged somewhere in the region of £8-10m for it, and would have given up 12 months into the project complaining that it was impossible.

Steve Palmer is the CIO of the London Borough of Hillingdon, a large council in North-west London and generally considered to be one of the more advanced public sector organisations in terms of the way that they handle their information. One of his major issues is that he’s dealing with the legacy of past abuses of data - he made the point that Hillingdon weren’t able to use the Electoral Roll as the basis of their smartcard rollout because it’s less than 70% accurate. That’s at least partly due to people being reluctant to be on the Roll in the first place after their details have been sold off to private companies for marketing purposes.

On the non-government side of things was Mike Bracken of the Guardian - and formerly of MySociety, the group who have done more than any other to force the pace of change and shine some bright lights into some very murky areas. His point was that local governments in particular are still stuck in a mindset of websites, when what’s actually needed are APIs rather than services. Rather than trying to anticipate the needs of ordinary people - and often getting it wrong by thinking in terms of the type of measures that SocITM find appropriate - it’s better to allow access to the data and let people take it into their own hands.

Also speaking was John Sheridan, Head of e-Services in the Information Policy and Services Directorate of The National Archives - he spoke about the “Unlocking Service”, a site which acts as a conduit for requests for data and funnels them to the appropriate area of government. The idea of having a one-stop-shop approach seems to me to be a good one, although personally I’m still troubled by some of the concepts of Crown Copyright in the first place. It seems that a lot of the current work is “fiddling around the edges” of a system with more fundamental problems.

The so-called trading funds - the Met Office, Ordnance Survey, Hydrographic Office and so on - came in for a bit of a kicking from the floor when the questions opened up. The problem for many in the audience was that by taking a very narrow view of the value of the data, the trading funds were missing the bigger picture - while it’s (relatively) easy to quantify the revenues that you can make by charging for information, it’s much more difficult - and hence doesn’t get done - to look at the wider societal benefits that might accrue by allowing open access.

I came away from the evening with the feeling that we’re beginning to see a change in attitude in certain sectors of government around the way they handle and grant access to their data - but that progress is at best uneven, and there’s still a long way to go. Partly that’s a legacy of the obsession with competition and markets that’s been a hallmark of government behaviour over the last thirty years or so - but the kind of rapid innovation that Web 2.0 technologies make possible may be starting to change that. Whether the likes of the Ordnance Survey will ever be convinced of the public value of the data that they own remains to be seen, but the trends seem to be heading in the right direction.