Friday, May 30, 2003
Interesting piece on The Register noting that "some research or other" found that 65% of 500 businesses had no idea whether the money spent on their website was good value or not. Research is, as Tim infers, only as good as the questions, the underlying agenda and the researcher, but gut feel says that this is probably right. At a session over breakfast yesterday, one of the attendees was talking about the need to put a credit in the accounts for everyone that uses the website (the more they do, the higher the credit I guess). So, someone who checks the company address maybe credits a virtual 25p to the website coffer (offsetting the expense of build in a sort of double-entry manner); if they go through the FAQs and don't call the help desk, maybe they credit more. Seems to make sense to me - and it wouldn't be hard to come up with some simple metrics that allowed you to figure out what was going on. Ian Dunmore, over at Public Sector Forums, has cottoned onto this notion of "value" in the past as well as more recently. You'll need to register and answer a few nearly-onerous questions (as Ian said to me, this stuff has to pay for itself, so you give some "value" to him and in return he gives some value back - seems a fair trade).
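That breakfast idea could be sketched as a very simple tally. To be clear, only the 25p address-lookup figure comes from the conversation - the other action names and amounts below are my own invented illustrations:

```python
# Hypothetical sketch of the "credit the website" idea: each self-service
# action earns the site a notional credit, offsetting build cost in a rough
# double-entry fashion. Only the 25p address lookup is from the discussion;
# the rest of these figures are made up for illustration.
ACTION_CREDITS_PENCE = {
    "address_lookup": 25,   # found the address online instead of phoning
    "faq_self_serve": 75,   # read the FAQs, avoided a help-desk call (assumed)
    "form_download": 50,    # downloaded a form instead of posting (assumed)
}

def site_value(actions, build_cost_pence):
    """Return (total credits earned, net position vs build cost) in pence."""
    credits = sum(ACTION_CREDITS_PENCE.get(a, 0) for a in actions)
    return credits, credits - build_cost_pence

credits, net = site_value(["address_lookup", "faq_self_serve"],
                          build_cost_pence=10_000)
```

Crude, but it would give you exactly the sort of simple metric that lets you see whether the site is paying its way.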
at Friday, May 30, 2003 Posted by Alan
Monday, May 26, 2003
I know that government gets a lot of stick for its web presence. Some of that stick is even deserved: too fragmented, hard to find things, not enough transactions - that kind of thing. But we're not doing too badly versus the private sector. Here are some examples of plain, downright awful, stupid design from today alone:
- The famous, internationally renowned bank that keeps issuing an error saying "check amount" (and nothing more) when I'm trying to transfer money between two accounts. Thinking this must be some kind of amount limit, I tried all kinds of different numbers. None worked. Do you know what it was? You had (HAD!) to type the pence field in - even though it wasn't separate. So, "£1000" wouldn't work. But "£1000.32" would. How maddening is that?
- The estate agent (many are guilty of this) that lets you search by property value, post code or whatever - but won't let you search by property name. So many new developments are coming up with their own name these days (viz Monte Vetro, Imperial Wharf, Butlers Wharf etc) that I just want to be able to see if they have anything in those developments. Can I do that? No.
- The well known Money Management programme that has released a new version in which everything has changed. One incoming funds transfer was, for goodness knows what reason, labelled "Funds Transfer from Katja" (definitely not any Katja I know - she would not be sending money). From then on, every single funds transfer has the same label. What's that all about?
So, a frustrating day at the hands of the Internet. Someone, somewhere, has forgotten to use these services themselves before foisting them on the unwitting public. Carry on like this and the only brands that work will be the ones that do it well: Amazon, errr ... Amazon? Who else?
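For what it's worth, the bank's "check amount" trap is trivially avoidable. Here's a minimal sketch (purely illustrative - the function name and rules are my own, not the bank's) of an amount parser that treats the pence field as optional and normalises everything to pence:

```python
# Sketch: accept "£1000", "1000", "£1,000.3" or "£1000.32" and normalise
# to pence, rather than rejecting any amount without an explicit pence field.
import re

def parse_amount_to_pence(text):
    """Parse a sterling amount, pence optional; raise ValueError if malformed."""
    cleaned = text.strip().lstrip("£").replace(",", "")
    if not re.fullmatch(r"\d+(\.\d{1,2})?", cleaned):
        raise ValueError(f"could not parse amount: {text!r}")
    pounds, _, pence = cleaned.partition(".")
    # Pad a single pence digit, so ".3" means 30p, not 3p.
    return int(pounds) * 100 + int((pence or "0").ljust(2, "0"))
```

Ten minutes of validation code instead of an error message that tells the user nothing.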
Gerry, I don't know who you are or what brought you to your conclusions (on content management software), but I love what you have to say ...
1. Get real. New software will only solve a small part of your content problem. If you want great content you need great people far more than great software.
2. Put an editor in charge of the purchasing decision. By and large, IT managers don't understand content. Nor do marketing executives.
3. Spend the time to properly specify what you need. What is it about so many organizations that they never seem to have the time to do things right?
4. Get away from the kitchen sink specification. Specify what you need, not what you want. Yes, we'd all love these fancy extra features. But the more features, the more costly to install, and the more complex to operate.
5. Stop thinking you're so special. Standardized solutions can deliver a much faster, cheaper result that is much more satisfactory to the reader.
And so say all of us. Amen. Actually, I do know who you are now that I've read the "about" section on your website. Sounds like a man who should know best.
Saturday, May 24, 2003
Read this. If you have anything to do with your organisation's website, read it. It's an "Online Journalism Review" story telling how BBC News, a site with 2 million unique visitors and 10 million page views a day, planned out its redesign, e.g.:
- Why didn't the BBC use all the white space on the side of the screen? Why was the site 'all squashed up'? Some found BBC News Online 'cluttered and confusing'. It required too much scrolling and took too long to download. And it was less appealing to medium and light news users.
- The accepted wisdom was that multi-media consumption was the engine that drove much site traffic to News Online. BBC News Online, after all, had almost unrivalled access to timely, good quality audio and video. But the stats showed that much of the online multi-media was being ignored.
- When a user approaches an object such as a light switch or a control panel on a car, they form a mental model and anticipate what that object will do. It's fair to say that, until now, our site hasn't offered enough of these clues. It hasn't given an instant system model to the user. In part, it has been a mess, to be honest.
- The use of generic code allows content to be supported and upgraded across different sites and in different languages with far less effort.
- Pages will be constructed from modules, allowing greater freedom than a traditional template. The weight of the home page has been cut from around 160K to 100K.
- The individual journalist will have more responsibility for producing the whole package, including bespoke multi-media and interaction opportunities for the user.
Great stuff. A lot of the same thinking that we have put into DotP. Great to see one of the most popular sites in the world thinking about what it needs to do to improve, rather than resting on its laurels (and no, I'm not going to get into the debate about licence payer money and what should be done with it).
I got some stick for my post about liking Cameron's new Imax film on the Titanic. Something about being the only person on the planet who did like it. Well, I saw the new Matrix on Wednesday night and loved it. So, think twice before you go and see it.
My P800 mobile phone is a bit of a swiss army knife. I wouldn't move to any other phone on the market right now, but that's not to say that it's perfect. In the good old days, when I had a mobile phone and an Ipaq (or even a Palm) ... anyone wanting to book a meeting would call me and, whilst I was on the phone, I'd power up the Ipaq and check my diary, entering the appointment in the right place. Likewise, if someone left a number on my voice mail and I needed to jot it down, I could do it all on the Ipaq. Now, with everything integrated, I can't do that. I have to say "hold on", open up the calendar app, look at my diary, hold the phone back up to my ear and repeat until we find a date that matches. Now I know that some of that would go away with an earpiece, but have you honestly tried to keep your earpiece in whilst doing everything else you're supposed to do during the day? Greg Papadopoulos, a senior guy at Sun, says this way better than me ... here are 3 key points he made at a recent conference: The first is the law (formulated by Gingell) that networks continuously morph logical structure through a process of decomposition, distribution, specialization and re-integration. The second is a basic observation that all consumer-facing technologies become fashion. And the third is a qualitative extrapolation of the physics of computation (Feynman, Landauer, Wheeler) to a world where atoms and bits become more intricated over time, leading to the concepts of "infra-destructuring" and "bitmass." What he means is that we've integrated all the components now because we have to - the services and technologies we've got today aren't ready for a distributed, wireless accessible calendar for instance. 
But pretty soon, we're going to smash that model apart and distribute the components again - so the camera in the P800 will be part of my sunglasses, the phone will be on my belt, the phone directory will be in the ether (so it's accessible from any device I've got) and the pad that accesses it all will be a wafer-thin screen that sits in my top pocket. Everything will communicate with everything else because (i) it can, (ii) it's not expensive anymore and (iii) people will demand it. Profound words.
A colleague from another part of government asked me to comment on the feasibility of two points yesterday. Both excellent questions and very topical in government today:
a. Expanding CMS functionality to bring in Electronic Records Management functionality.
b. Publishing content to Intranet and Internet using the same CMS.
Here's what I answered: My sense on both (a) and (b) is that there is big risk in trying to do too much with any one tool. If you try and add numbers up in Word, for instance, you quickly see that Excel is a much better tool for it. Likewise, writing text in Excel is pretty painful. Tools are usually most effective when they do one thing really well - few people carry swiss army knives around, because you never know what the thing that takes a stone out of a horse's hoof is for, and the screwdriver thing almost certainly doesn't fit the screw that you need to undo. And IT tools are certainly not swiss army knives. They barely do one thing well without a lot of hard work. So ... for instance ... Your internet presence is focused around the citizen. Its navigation, search, look and feel are all designed around themes that make sense to the customer, and everything is accessible. Your intranet is designed around what makes sense to your staff and the processes that they need to follow. It has a different information architecture and, whilst staff should certainly (I think) be able to see exactly what the customer sees, they also need to see more - expenses forms, HR policies, access to internal data and so on. On top of that, intranets need more security - I should not be able to see your pay if you are my boss, but you might be able to see mine, for instance. Ditto with records management. Content for the web is written in bite-sized chunks with occasional documents attached. If you want things to be accessible, you don't do too many big files or PDFs, and you certainly don't add spreadsheets very often.
You also want to keep records for a long time, maybe decades, but you don't want all of them readily accessible - some go off to long-term offline archives, or tape robots or whatever. You will also generate millions more records than you will develop content for the website, so the retrieval and indexing process is probably different. My sense is: get a tool that does one thing really well and leverage it to the max. Don't try and make Word do spreadsheets.
Friday, May 23, 2003
Bear with me for a bit. Here's something to try at home: Stand up. Hold your hands as far apart as you can, with arms outstretched. That represents the scope of a typical project the day it kicks off. Now bring your hands together so that they're a little less than shoulder width apart. That's about the scope you'll be left with once you've gone through requirements review processes, procurement elimination, cost/benefit analysis, etc. Now clap your hands together and leave them an inch apart. That's the set of requirements that your user base will ultimately, desperately need every waking day whilst they are working on the system that you have designed, procured, built, tested and delivered over the last however many months. The noise you heard was the big bang of implementation! Now, if you could get from furthest apart to closest together in a dramatically faster time, would you be prepared to accept the loss of functionality in return for the faster return on investment, the more rapid delivery of business and customer benefit and the overall reduction in risk? Then you could start work right away on delivering the other functions that were less useful. Be honest. What would it be worth to you, your business, your customer base to deliver the right functions ahead of time? Would it be worth enough for you to take what someone else had already done and exploit it? After all, how many of you have written your own word processor recently? How do we get other key applications recognised as the equivalent of "word processors" - i.e. things that you get someone else to build for you and then deploy, rather than things you have to define, procure, build, test and deploy a la waterfall model?
Thursday, May 22, 2003
I was spurred by a link on Voxpolitics to Tom Watson’s blog and from there to a list of Labour MPs with an online presence. I ploughed through a few of them and was a bit startled by the variety – some nicely done sites, some amateur sites, just the one blog as far as I could tell. The other night I had the briefest of chats with a couple of MPs. They were clearly enthusiastic about the Internet and its potential - and were able to recite stats on how many emails they received per week (still in the hundreds, even though the spam is filtered out now) ... but they didn't know the answer to "what next?" So I wondered if there was a deal to be done with DotP here. I give an instance of DotP to each political party (got to be fair here I assume), at cost price plus refresh costs. We design an information architecture for a political party – so cabinet ministers, ministers, MPs, prospective candidates etc get their own areas where they can publish their bio, their contact information, their thoughts on key issues, even a blog if they so wish. We could even get clever and pull in some outside data (I guess, but I don’t know the rules around this kind of thing) to couple it with voting records or gifts received, or whatever is available. Maybe even individual discussion forums and so on. The commonality of format would, I think, allow us (the citizens) to compare different MPs and what their thoughts were on issues that were important to us. And, after all, one of the strongest points of the Internet is the ability to make comparison information available quickly and easily.
Then, drawing on James Crabtree’s biro-introspection post, where Paul Waller (he of OeE e-democracy and an all-round pretty smart guy) ponders whether e-voting would be of democratic benefit because it "realigns the temporal and deliberative act of voting" If he’s right, then people with more and better information available at home than they would get at the polling booth might either make a different (more informed) choice or, ideally, be more encouraged to vote and increase overall turnout - which ought to be the root measure of e-democracy. The common architecture that we’d built for every party would allow someone to type in their postcode (or, more likely, their address as I think postcodes don’t map exactly to ward boundaries) and get detail on the candidates standing, what their background was and so on. The technology should not be the differentiator here – but what certainly should differentiate candidates or MPs is whether they have used what is there to create a presence, foster a sense of community and keep the local people informed. Friends could mail links around noting their MP’s site and his/her views on a given topic and that might get some viral marketing effect going that would bolster e-voting. Now wouldn’t that be a thing to behold? Admittedly, I haven’t the faintest idea how to make something like that happen … but it sounds great to me.
at Thursday, May 22, 2003 Posted by Alan
Wednesday, May 21, 2003
I've been using Cloudmark for several months now and it's proven effective at blocking a good chunk of spam. The stats show that roughly 50% of the mails I receive to my main account are spam and get automatically blocked (I block a few extra manually too, and it bothers me that there are so many). On their home page they show the following stats: 66,502,000 emails processed today ... 14,291,000 spams caught. That's roughly 20% spam, so why am I up at 50% or more? Today, Cloudmark popped up a window offering "version 1.0" (I've been on umpteen beta versions) for $1.99 a month ... if you're new to the product there's a special price of $3.99. Now neither of these is a huge amount - and, if the stats are reliable, the programme has already saved me a few hours that I would otherwise have spent deleting useless messages. But what kind of a model is that? I can see the benefit for Cloudmark: a regular income stream. Too low a charge for most people to bother cancelling, not high enough to worry too many people on initial signup. But where's the incentive to improve things? Now if there were a deal where I put a year's revenue in an account for them and for every message they stopped, $0.01 was debited (say), but for every message that I had to manually block I got $0.02 (say), then I'd be more interested. After all, it regularly seems to let pretty obvious spam through ... ones with very odd titles that I can tell right away are spam. I'd get even more interested if I got $0.04 for every message it blocked that wasn't actually spam. Now that would be an interesting model. I'd be happy to see some different amounts there, and maybe even a performance clause that paid a bonus if they stopped more than 99.9% of spam, for instance. Why I'm thinking about this is because we've got a campaign running now to get more people in the UK to use the Internet. I think the stats are something like 50% use it regularly and up to 62% have used it recently ...
but there's a core of 30-something per cent who don't want to use it, and half of them don't know what it is, I believe. The campaign has its own website of course, although it's not one of mine. So, if (suppose) it was my mother that I was introducing to the web, setting up email for the first time so that she could send me photos of her time as Mayor of Lambeth, how happy would I be about 20% of the messages she receives being about viagra, porn, government loans or whatever? Of course, I'd be completely unhappy. Which means that we have to find a stronger model with more incentives to get rid of it. I've seen some great ideas recently, including an intriguing one that forces the computer sending mail to do a complex calculation every time it sends something (increasing the cost of sending, but having no effect on the casual user). There might also be options for limiting the size of distribution lists in common programmes, or making it possible to send only one message a minute or similar - but I imagine all of these can be circumvented just through writing custom code. But I can't believe the only solution is a specific distribution list that checks whether you are friend or foe. There's a lot to be done here - especially if we want technophobes to come online and embrace the benefits that the rest of us have already seen, but with a lot less frustration.
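That "complex calculation per message" idea is worth a sketch. In the spirit of hashcash-style proof-of-work (my reading of the proposal, not a description of any particular product): the sender must burn CPU finding a nonce whose hash starts with a run of zero bits - trivial for one message, ruinous at spam volumes. The difficulty here is kept deliberately tiny:

```python
# Minimal proof-of-work sketch: sender "mints" a stamp per recipient;
# receiver verifies it with a single cheap hash. Raising `bits` doubles
# the sender's average work each time, at no extra cost to the verifier.
import hashlib

def mint_stamp(recipient, bits=12):
    """Find a nonce so sha256(recipient:nonce) has `bits` leading zero bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{recipient}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - bits) == 0:
            return nonce
        nonce += 1

def verify_stamp(recipient, nonce, bits=12):
    """Cheap check on the receiving side - one hash, no searching."""
    digest = hashlib.sha256(f"{recipient}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - bits) == 0
```

The asymmetry is the whole point: the casual user mints a handful of stamps without noticing, while someone sending a million messages a day pays a million times the cost.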
at Wednesday, May 21, 2003 Posted by Alan
Friday, May 16, 2003
I'm still thinking about that curve showing pages per website. Since the version I posted, we've added some more sites to the scan and found some even bigger websites. It doesn't matter how big they are or who they are really, just that they make the curve even more extreme and reduce the number of sites needed to get to 80% of government content. Out of the 780 or so, the biggest 155 account for exactly 80% of the content (that guy Pareto really nailed it, huh?). So after my questions in part two comes another question ... does anyone ascribe a "value" to any of that content? The storage industry has talked about the value of data for years ... you put your most accessed data in RAM, your next most accessed nearby on a hard disk, your next on a network disk, your next "nearline" (i.e. reachable quickly) and then the least valuable offline (in some kind of tape archive). Does the web change that? Shouldn't we be thinking about a hierarchy of websites and web content that ensures that the most valuable content is the easiest to find? Somewhere in those sites with 100,000 pages is doubtless some incredibly important piece of information that we need to know ... but could you find it if you needed it? And doesn't the value change according to who you are and what you need? Is anyone out there modelling the value of content, how you measure it and what you do when you know it? There's still a PhD thesis in all that, I'm sure.
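To make the storage-hierarchy analogy concrete, here's a toy sketch of what tiering web content by access might look like. The tier names and thresholds are entirely invented - real value would also depend on who's asking and what they need, as above:

```python
# Toy content-tiering sketch, mirroring the storage industry's RAM /
# disk / nearline / offline hierarchy. Thresholds (monthly hits) are
# invented for illustration only.
TIERS = [
    (10_000, "front page / top navigation"),   # most valuable: easiest to find
    (1_000, "prominent section page"),
    (100, "searchable archive"),
    (10, "nearline - reachable via deep links"),
    (0, "offline archive"),                    # tape-robot land
]

def tier_for(monthly_hits):
    """Return the first tier whose threshold the hit count meets."""
    for threshold, tier in TIERS:
        if monthly_hits >= threshold:
            return tier
```

Even a crude model like this would force the question the post asks: which of those 100,000 pages actually earns its place near the top?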
at Friday, May 16, 2003 Posted by Alan
Tuesday, May 13, 2003
So what that last post tells me is a few things:
- We don't need another website (sung by Tina Turner with Mel Gibson blowing up the ones that we have)
- Next time you add a page to your site, think "am I making the problem better or worse?"
- How on earth are we going to get all of that into content management systems?
- Does anyone actually look at it? And is any of it up to date, relevant and useful?
I was going to save this for another day, but it's been on my mind all afternoon. One of the guys at Sapient had the bright idea of feeding a huge number of government URLs into Google and seeing how many pages are spidered for each. Then he put it all in a spreadsheet and with a bit of tinkering from Dan, this picture came out. What it shows (with the blue line) is the size of individual government websites (at least as far as Google is concerned, which might mean that some pages aren't spidered or can't be reached) ... the biggest comes in at 113,000 pages (I so, so want to name names here, but I'm not going to - but it's no site that you would ever think would be that big) and there are 3 sites with just two pages (one of which is my own www.gateway.gov.uk - the rest of the site is hidden behind passwords and such). The drop off in size is stunning - by the time you get to 50 sites, you're down below 10,000 pages; at 100 sites, you're around 5,000; at 200, 2,500; and at 500, you're at less than 500 pages (467 to be exact). There are nearly 800 sites in the list, with only .mod, .nhs etc excluded. The purple line (at least it's purple on my screen), shows the percentage of the total, cumulatively represented. The first 50 sites account for guess how much of the total? 58.3%. The first 100 make up 71.1%. If Pareto was ever in doubt, he could look at this data for extra reassurance. There's something like 2.6 million pages of content represented here. I did go and visit the biggest site - it's nicely put together with all kinds of good features. But two things knocked me out - to get to search you have to click to another page, and when you do use search it's a very simple form version without much scope to focus it in. With a site that is that big, I'm surprised to see search relegated off the home page. 
That said, the navigation is nice - but I know from our own research and evidence that people use search more instinctively than they do navigation, unless they're familiar with your site. My take is that if we can make navigation ever more consistent across government, then there's a chance that people will find things more easily because they won't have to relearn as they go from site to site. I'd always assumed government sites would form a tadpole - big head, long thin body - but this really stunned me. The next trick would be to map usage and cost data against each of the blue dots, or even number of content authors, or uptime, or any of a range of data. There could be a whole PhD research project in this one slide alone. I don't have much of that data and I doubt it even exists for such a wide range of sites, but it's out there somewhere for the Top 50. I'm going to go hunting.
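The spreadsheet exercise behind those curves is simple enough to sketch: take Google's spidered page count for each site, sort descending, and compute the cumulative share of all content - that's the purple line. The counts below are invented toy data, not the real scan (which covered ~780 sites and ~2.6 million pages):

```python
# Sketch of the pages-per-site Pareto exercise: cumulative share of total
# content by site rank, largest sites first. Toy data, not the real scan.
def cumulative_share(page_counts):
    """Return (rank, cumulative fraction of total pages), largest first."""
    ordered = sorted(page_counts, reverse=True)
    total = sum(ordered)
    shares, running = [], 0
    for rank, pages in enumerate(ordered, start=1):
        running += pages
        shares.append((rank, running / total))
    return shares

def sites_for(target, page_counts):
    """Smallest number of sites accounting for at least `target` of content."""
    for rank, share in cumulative_share(page_counts):
        if share >= target:
            return rank

counts = [100, 50, 30, 10, 5, 3, 1, 1]  # invented page counts per site
```

Run against the real data, this is what produced the 58.3% at 50 sites and 71.1% at 100 sites figures above.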
It's been a pretty hard few weeks in envoy-land ... a couple of Gateway code releases, DotP being deployed, a couple of conferences (and a couple more coming up soon - I'm supposed to be doing the slides now), some kind offers from journalists and others who actually want me to write something that they'll publish (someone does read this stuff then!) and some urgent and vital deadlines coming up at the end of the month for which I need to write a lot of papers. When you have your head down so close to the day to day, trying to figure out obscure details of how and why things work and how to stop them from not working, it gets impossible to lift your head up and think about what next - whether "next" is lunch, June, 2004 or 2010. I'm very, very conscious that I am stuck in some weeds now - the ones that have hooks on them that grab hold of you and leave only a trail of blood when you pull away. Lucky for me, I got a day out in Frankfurt this week at the EDS Innovation show (not words that you associate together often, but the tide has certainly turned there) and was pleased to catch some stimulating speakers who have given me some fresh insight into what might be and what I might have to do about it. I have two or three weeks holiday planned in June which should let me take the time to digest all that we have done in the last 12 months and position much more strongly for the next 12, if they'll have me.
Dan at work showed me a neat feature on ukonline today ... if you hold control and move the scroll wheel on your mouse (there's probably another way to do it if you don't have a wheel, but it looks cooler this way), the text resizes across the whole page. Below is a pretty clumsy attempt at showing how it looks - two screens showing the different text sizes. Now, if you go away and try this on a bunch of other sites, very, very few get it right - private or public. BBC.co.uk looks to be right; Kablenet's body text works, but not the navigation; Computerweekly doesn't do anything. Why is it important? Accessibility - for those who find it hard to read small type or who just like big fonts.
I often stare skywards and say silent thanks for not being too technical. There's a whole lot of noise these days about the rights and wrongs of CSS (Dave Winer, as you'd expect, explains CSS far better than I can, so check out his pages if you don't know what it is - and that was posted way back in June 1999). Not being sure whether I had inadvertently drunk the CSS kool aid, it merited some research to see what the fuss is about. Searching Winer's scripting.com shows that he has a lot to say about it, positive and negative, which may have been the start of the fuss. But, thankfully, it seems to be a techno-weenie (like a geek, only smaller) thing, perhaps best encapsulated in this, from the link above ... The huge sense of self importance you detect in most CSS/XHTML evangelists stems from their utter elation at finally pummelling their design expectations and wrestling with buggy CSS implementations until they finally compromise in some Israel/Palestine sense of the word. They become one of the few, and enjoy rubbing our noses in it. "Well I did it, so you must be able to! Unless you're stupid, that is.. you're not stupid, are you?" And this, from the same place: The point is that CSS won't let you do what you want. I'm not sure any of that matters to me. We've used CSS in our DotP platform to what I think is great effect - you can see it working smoothly on ukonline. The pages load fast (faster than anything else I've seen bar Google), and they render on multiple devices and in multiple browsers more simply. Every technology project I've touched since 1999 has had to assemble a vast factory of browsers and operating systems to see if everything worked in every version of every browser. That created an "n-squared" problem - a complex matrix of does/doesn't work ticks and crosses - many of which we couldn't get to, and some of which we couldn't fix even if we could.
CSS seems to be the way around that now - as long as we use the same style sheet, we know that every page using it works fine; if we change the style sheet, then we retest just a piece of the original load. And style sheets are much simpler to edit, it seems, than the old nested "if this browser then do this" code that we had before. Grasping that we could make a single change and watch it sweep through the whole site (or even all the sites on DotP) was a seminal moment for me. I know of departments who have spent tens of thousands on their website just changing their logo, name or whatever. That's all gone. The purest and best demo I have seen of this is the CSS Zen Garden - give it a go and you'll get the message really fast. So, whatever the pros and cons of how hard it is to write in (and I doubt for a minute that it is any harder than any of the other options), what CSS seems to get me and mine is rid of a world of pain. That means we can free up time to worry more about what we say on the web, how we say it and where we say it than how it will look on any one screen or how it will be read in a given screen reader. That's a big plus to me. I'm religious on this too. The technically aware can debate this back and forth endlessly (viz dotnet, Sun ONE, Java, Jini, Bluetooth and any other set of standards, implementations and whatnot). What I know is that it works for me. I think we've pushed the envelope with DotP - we may even have broken it at an edge or two. For the first time, we've deployed something in central government technology that can be rapidly accepted and adopted by departments, and that can give them an edge in delivering services to citizens. It makes things better for citizens by improving the experience, the way that we interact, the way services are offered and the ease with which they can be found.
Not everyone sees it that way, of course, and (for a while at least) some folks will want to religiously debate the feature set versus perfection. But the bottom line is it delivers now and if you start now, you get benefit now. Or you can debate for a lot longer and get no benefit now and probably none in the future. Seems an easy choice for me.
Friday, May 09, 2003
After a lot of hard work, we're ready to move the first user of our pilot on mobile messaging live, as Steve Range picks up in Computing this week. I was especially pleased to see the piece talk about the point of the technology before the technology - it's too easy to get stuck in tech for tech's sake mode, but here we might help sort out quite a significant problem for hospitals, kicking off with Norfolk and Norwich University Hospital. Be fascinating to see the results in a couple of months.
at Friday, May 09, 2003 Posted by Alan
Friday, May 02, 2003
There wasn't a lot of coverage of yesterday's Globex outage in the US. Globex is one of the trading systems used to exchange futures contracts on the Chicago Mercantile Exchange. Details are sketchy, but it went down some time after 10am and didn't resurface. Less than a third of normal daily volume had gone through. The story is that it was caused by problems "with Internet switching technology". Globex isn't vital, but there will be a lot of people caught on the wrong end of trades as a result. Some of them will have hedged, but not all. Interestingly, the Dow was headed straight down until the outage, when it promptly turned around and headed back to positive territory at the close. Ditto the Nasdaq and the S&P. The opening today could be interesting, depending on the morning reports (Friday is jobs day). This shows that technology can still, even in the most watched environment, cause serious problems. People will lose money on this outage. Some will lose big money. So, the argument will present itself sometime in government about whether incidents like this give more evidence that centralisation, rationalisation and consolidation are bad things and we should, instead, encourage all departments to do their own thing - that way, if there's one failure, it won't affect many other services. This is the "don't put all your eggs in one basket" argument, versus my version, which is "put all your eggs in one basket and then watch the basket. Very closely". The technology to deliver the kind of services we are trying to put together - ones that join up multiple back ends across multiple departments - is inherently very complicated. The more bits of string you have, the worse the knot to untangle. But to duplicate the string in several places and then try to thread it through various holes is not workable either. I'm not an out and out centralist - just pro putting things in the middle that would otherwise proliferate and cause pain later.
It will be fun to watch how the debate on central versus distributed progresses.
It's felt a lot like MSS has been working with the mainstream press for the last couple of weeks (who MSS? follow this link). e-government is no good. no one is doing any e-government. even if they were doing e-government it would be no good. there is no evidence whatsoever of anyone doing anything any good with e-government. the programme is doomed and it always has been. we should slit its throat. there is no e-government. Anyone else get that impression? First there was Mike Cross's crummy and misinformed story on the Gateway, then the piece in the Times which they titled "wired at a slightly lower current" or something, another piece in the Independent about the Citizen Space forum (that was an alarmingly misinformed and factually incorrect piece) and now a piece in the Independent titled "System Error - Why UKonline failed". That's just a bizarre title. I don't know a whole lot about how the press works, but it would seem to me that if MSS is not behind all this stuff, then there's an interesting bit of orchestration going on. Who knows who runs what agenda these days? As so often with these articles, it's long on rhetoric and short on ideas, only repeating the tired idea that we should change the target from putting "all services online" to "only the most commonly used". Doh! Forgive me folks, but if Self Assessment, Child Benefit, Tax Credits, PAYE, Corporation Tax, VAT and so on aren't amongst the most commonly used, please enlighten me as to what is. That is by no means the complete list available today and it's certainly true to say that some important transactions are not yet available - tax discs, housing benefit and so on - but they are in plan and will come. I did a brief presentation at a conference yesterday, with perhaps 70 or 80 people from suppliers in the local government market. It was a good audience and I was able to be candid with them about what was going well, what wasn't and what needed to be done.
For a while, I've had an item on my to-do list to write the "1 page e-government" speech and I think I came pretty close to delivering it yesterday. I plan to write it up soon. Progress is not as horrible as might be thought from some of the press. The ukonline folks have made public visitor counts to the website which show 10-fold growth in the last year or so. Comparing April figures on the Gateway over the last three years, we've seen growth of 10-fold from year 1 to year 2 and 20-fold from year 2 to year 3. Pretty good growth rates for any online service, public or private, I would have thought. If that usage rate continues through the rest of the year, and I have no reason to think it won't, then we'll close with a big leap from last year's numbers and really start to make a difference. I'm the last to say it's done. Those in the audience yesterday will have heard me talk about "an absence of pixie dust" to make this happen by itself. There's still a lot of hard work to do, a lot of obstinacy to remove, a world of rationalisation to do, some politics and stuff to sweep away and some really good services to design. But it can all be done. It can all be done with the right people, the right passion, the right commitment. Of course, the way the press is running, it will sneak up on them without anyone noticing that it's actually coming together. All of a sudden, someone will visit a website and see how good it looks, how simple it is to get what you need and how responsive the service is. A bit like the journalist who found the PRO 1901 Census site by accident and commented that it was finally live - he's only about 9 months late, but well done nonetheless and it's a good, balanced article. Still, at least the Guardian knows it's up. The Independent has yet to comment since its article in January 2002 that it wasn't available.
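For anyone who wants to check the sums on growth multiples like those above, the arithmetic is just each April's count divided by the previous April's. The actual Gateway figures aren't quoted here, so the numbers below are purely illustrative, picked only to reproduce the 10-fold and 20-fold multiples mentioned:

```python
# Hypothetical April usage counts for three successive years.
# These are NOT the real Gateway figures - just example values
# chosen to match the growth multiples described in the post.
april_counts = {2001: 1_000, 2002: 10_000, 2003: 200_000}

years = sorted(april_counts)
for prev, curr in zip(years, years[1:]):
    multiple = april_counts[curr] / april_counts[prev]
    print(f"{prev} -> {curr}: {multiple:.0f}-fold growth")
```

With these example inputs the script reports 10-fold growth from 2001 to 2002 and 20-fold from 2002 to 2003, which is the shape of trajectory the post describes.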