Monday, April 28, 2003

Finally content?

Firstgov.gov, the curiously named but highly visited US government entry site, is getting some content management. I guess that's not to say that it had no management before, just that it was all hand-cranked ... "Currently, GSA employees have to manually retrieve relevant information and write HTML and Java code for each individual page." Ugh. The pain of not having such a system is graphically presented with this quote: "When the Columbia shuttle tragedy happened, we took 24 hours to get up what we needed to get up," Jameson said. "If we had had this content management system, the people who do that for FirstGov could have done it from home within 20 minutes."

The OMB have paid $525,000 for the software licence and the new system should be running by summer. "The contract covers the software license and maintenance for one year and includes four one-year options ... The license is governmentwide, so other agencies can use the Vignette system as well," she said. "All of the technology contracts associated with FirstGov are governmentwide, including AT&T's hosting services." Use of the site soared from 7 million unique views in 2001 to 37 million in 2002, a 444 percent increase. Several factors fed the spike.

I'm delighted that the OMB have made this move, for a couple of reasons:

1) It gives me a good excuse to talk to Dan about the issues with implementing content management, what he's going to do about them and what he might learn from what we've done, and vice versa. $525k for a software licence is a big chunk of change (for one year especially); add on top of that the integration and consultancy costs, the need to do the business process work (the hardest part, in our experience), additional hardware and so on, and it mounts way higher than that. I'll also be intrigued to see whether it's planned as an XML delivery system, or whether it will all be Tcl.
2) The "pan-government" nature of the deal is similar to what we've done and are doing in the UK with our DotP platform (check www,ukonline.gov.uk to see DotP in action - particularly look out for the neat use of cascading style sheets throughout). We've gone bespoke rather than package, which is a topic for some other time, but the principle is the same. Delivering "pan-anything" is pretty hard, it needs a lot of commitment (from top to bottom) and a lot of passion. The US doing what we're doing is a vote of confidence in the strategy. Content management is not a solution out of the box, as I've said before (I know, you're bored with hearing that now), yet pretty much everyone goes out and buys the disk expecting a smooth install. The challenges that it presents to the business; the hassles of running a complex system; the need to maintain and manage infrastructure and the pain of moving a big old site to a new site (changing it in flight no doubt) are all big, scary and under-estimated in every project. Anyone taking an implementation on needs to go in eyes open, otherwise there is more pain than benefit. I'm looking forward to watching how firstgov progresses with the implementation and comparing notes with our own approach. Now that we have ukonline live on DotP, we can move ahead with the implementation of other departments - all on the same infrastructure, with no additional licence costs for new entrants.

Wednesday, April 23, 2003

Death of the password

It may be premature to announce (again) the death of the password, but at least for users of Covisint, it's on its way out. Fascinating short piece in ComputerWeekly this week on a programme at Covisint to replace passwords with "tokens" - I assume USB-type tokens or RSA smart cards. The reason it's fascinating (for me at least) is that there are some numbers quoted that I haven't seen before. It costs, apparently, about $100/year to "run" a token for their community of 120,000 users in 11,000 companies (growing to 200,000 this year). Delphi, they note, has 20 staff just to administer IDs, many of them no doubt handling calls to help lines, where 70% of calls are for forgotten passwords and each call costs between $40 and $60. It seems pretty easy to me to do the maths on that and come up with a sound business case.

In the past when we've looked at tokens like that for government, there have been two issues that stopped us moving ahead: training and technology compatibility. Training is reasonably easy to solve in a closed user community, but can you imagine how hard it would be to educate the UK population (or the online one at least) on how to use an RSA token? Technology, though, would screw you first - the wide variety of browsers, operating systems and whatnot would mean that the help desk would be full of calls complaining that the thing doesn't work. This is a problem that needs to be cracked, and lots of people have had a go at it. It might, one day, be smart cards or bank cards with the EMV application in them, or it might be some other kind of technical solution but, whatever it is, training and compatibility issues are going to be big costs.
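For what it's worth, here's the back-of-envelope version of that business case. The $100/year token cost, the 70% password share and the $40-60 per call all come from the article; the number of help-line calls per user per year is the unknown, so the interesting figure is the break-even rate:

```python
# Figures quoted in the ComputerWeekly piece on Covisint; the per-call
# cost is taken as the midpoint of the quoted $40-$60 range.
USERS = 120_000
TOKEN_COST_PER_USER = 100   # $ per user per year to "run" a token
PASSWORD_SHARE = 0.70       # share of help-line calls that are forgotten passwords
COST_PER_CALL = 50          # midpoint of $40-$60

# How many help-line calls per user per year before tokens pay for themselves?
break_even_calls = TOKEN_COST_PER_USER / (PASSWORD_SHARE * COST_PER_CALL)

# Total annual token bill for the current community
annual_token_bill = USERS * TOKEN_COST_PER_USER

print(f"Break-even: {break_even_calls:.1f} calls per user per year")
print(f"Token bill for the community: ${annual_token_bill:,}")
```

So if each user generates roughly three or more help-line calls a year, the tokens win on the call costs alone, before counting the 20 Delphi ID administrators.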

Nice bio if you can get one

Just found Mark Forman's bio while looking for something else on the Whitehouse site ... his key achievements are listed as:

- Simplifying the Firstgov.gov portal using a "three-clicks to service" model that led to Yahoo's recognizing Firstgov.gov as one of the 50 most incredibly useful websites;
- Creation of the first IRS free filing website, using a unique private-public partnership;
- Regulations.gov, the world's first government sponsored e-democracy initiative that allows citizens to easily find, read and comment on proposed regulations;
- Definition and deployment of a rigorous cybersecurity improvement process;
- Consolidation of Federal payroll processing centers to save over $1 billion; and
- Restructuring federal training through the Golearn.gov website, which has trained tens of thousands of federal workers at pennies per course.

That's not a bad set of things to have on your CV.

Tuesday, April 22, 2003

It doesn't write the novel too

Talking to a few people around government over the last couple of weeks, I've realised that there are some misconceptions about central infrastructure - things like the Government Gateway. I've been doing a paper on the opportunity that it presents, with the risks, issues and routes forward. Too many people have the view that it will do pretty much everything for them - "out of the box" as it were. Central infrastructure is like Microsoft Word. It gives you a great environment to write in, but don't expect it to write a best-selling novel for you too. That's where the business comes in, using what's there and exploiting it to deliver great services.

Sunday, April 20, 2003

Kill dud projects

Jay Gardner at BMC says ... "Find stuff not to do. Cross off projects from the list." ... couldn't agree more. Too many projects going on, too few with well-defined end points, even fewer with well-defined benefits. Ugh. It's an ugly world of IT delivery out there.

POVs and ROVs

Points of view - so often so opposed. So easy to find others who disagree. I went off to the London Imax cinema today to see "Ghosts of the Abyss", James Cameron's 3D film of the Titanic. Had I read Edward Porter's review in the Sunday Times beforehand, I might not have gone ... "hard to get hugely excited", "recreations of the ship's former glories ... weakens the verite", "3D effects only compound failing". Thank God I don't listen to other people, especially ones with views as warped as this. Go and see this. It's a beautifully filmed piece that transports you 3 miles down to the bottom of the Atlantic, getting shots of the interior for the first time using very clever remote cameras (ROVs). The overlays showing people moving around on the decks, in the rooms, stoking the fires are exceptionally well done. I was left filled with wonderment, awed at the majesty of the vessel and at how much remains startlingly intact. I could have watched six hours more of footage. It's poignant, transfixing and truly magical.

I mention this here for two reasons: (1) so that you go and see it and (2) because POVs are fine - everyone has one; we all have arseholes too, but we don't get them out and show them to everyone at the slightest opportunity. Care must be taken in expressing a point of view, checking the facts, making sure the argument hangs together. That's not always the case, sadly. I refer to the marker I left a week or so ago on the recent article in the Grauniad. Factually wrong and consisting of a single point of view. Unbalanced and misleading. But there you go. POVs are a vital part of how we make progress. If we all thought the same, no one would ask questions and we would not move forward. That would be no fun at all.

Friday, April 18, 2003

Why do people blog?

Is a blog the same as any other writing? Why do people do it? One view here, from onepotmeal. We're not writing an account of our lives just as a record of our lives, we're trying to say something about our lives, and that can't be done if we stick only to 'facts': the facts that will speak to me about my life—me, with all kinds of insider knowledge and secrets—is hardly going to speak to you in the same way. We can't read weblogs the way we read other literatures—it isn't appropriate. They aren’t the same as novels or memoirs or what not, because of the play of time and because of other factors that would only serve to muddy the waters of my present point. And from burningbird, a response. Be not afraid those who would be afraid of what I write. I try only to write about what I see and what might be done about it.

Little Things Well

The very nature of Government projects, whether they are desktop rollouts, building restacks, or upgrades of tax systems, benefits delivery systems or whatever, is that they're big. That's not to say they're bigger than the biggest global companies - after all, Walmart has about as many employees as the NHS and I imagine that companies like GE have 400-500,000 or so (not that I could find that number on the website - lesson one: if you don't know where to find what you're looking for on the web, you'll almost never find it). The consequence of often running such big projects is that even when dealing with small projects (and here, I put "big" in the hundreds of millions total life cost and "small" in the 5s and 10s of millions, UK pounds), there is an overwhelming tendency to throw additional requirements onto the list until it becomes as big as possible. This may be a result of the procurement processes that give you "one shot" at a given project, preceded by a lengthy buying/decision review; or it may be because public sector people are less directly engaged in a project during its implementation (the hand-grenade-over-the-wall syndrome). I guess it could be that public sector people inherently believe that the private sector will step up and deliver complex projects given the budget - although much recent evidence points to that not being the case. One of the latest "small projects made big" is the whole content management / knowledge management / records management programme. People are looking for a killer, install-once, take-care-of-the-problem solution to run the department's Intranet, the Internet site, internal records management, freedom of information requests and, in some cases, the end-to-end paper publishing lifecycle (including leaflets, staff manuals, public policy documents and so on). The consequence of putting so many "related" things onto your to-do list is absolute, certain, guaranteed, cast-iron solid failure.
A company bidding will be pulled in so many directions by so many stakeholders that prioritising what comes next will never be feasible. Such a project will never be meaningfully delivered - all that will result is a series of bodges as tools are stretched to do things that they weren't supposed to do, and processes are shoehorned into routes that they were never meant to follow. Ugh. I don't know whether this is an apocryphal story or not, but it illustrates the point nicely. A year or so ago, I was doing some work in the NHS looking at how the National IT programme might be structured - the work I'd been asked to do was part of a review of progress so far. One of the people I talked to was a doctor from a regional hospital who talked me through how his people were going about a procurement for new IT services. They'd been instructed by the procurement people to think of everything that they might need in the future and include it in the requirements list - nothing was to be prioritised, it was just a long list. So this doctor had thought about it for a while and decided that a useful thing to have would be a bed blanket temperature controller. He wanted to be able to set a specified temperature for a given patient as part of their treatment plan and then ensure that the blanket was kept at that temperature through an automated monitoring process. I'm sure that's a laudable aim, but can you imagine what it would cost to wire up every electric blanket to a system, keep track of which patient was where and ensure that the temperature was right? Far better, surely, for the patient to carry a piece of cardboard with the temperature setting she needs on it and have anyone who comes into the room for something else check that it's ok? Or, maybe, just put the temperature control next to the patient and let him take care of it?
So, sometimes it seems to me that a bit of concentration on doing some small things well would make a lot of progress towards getting the big things right - rather than trying to make a big thing work and ending up doing it badly. Even in a big project (and it's undeniable that projects in the NHS are going to be enormous), breaking them up into small enough parts and handling them individually, with appropriate linking standards, is likely to produce far better results. Every time I think of this topic, I come back to research figures that the Standish Group quote: 45% of features in any given product are never, ever used; 19% are rarely used; 29% are often used; 7% are used all the time. Just getting the core 7% done for a defined scope would give payoff in record time and allow work to start on the next tranche with money in the bank from productivity savings already underway. So, a few little things done well instead of some big things done badly, and I think we could make a difference. It will require people to think differently about how they structure their projects, and it will require suppliers to work to persuade customers that they don't need all that they're asking for - some real points of conflict might arise there, but how else will we progress?
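As a quick sanity check on those Standish-style figures (the percentages are the ones quoted above; the code is just arithmetic), shipping only what's actually used turns out to defer nearly two thirds of the backlog:

```python
# The Standish usage bands quoted above, as data.
feature_share = {
    "used all the time": 0.07,
    "often used": 0.29,
    "rarely used": 0.19,
    "never used": 0.45,
}

# Build the well-used features first; defer or drop the rest.
core = feature_share["used all the time"] + feature_share["often used"]
deferrable = feature_share["rarely used"] + feature_share["never used"]

print(f"Build first: {core:.0%} of features")
print(f"Defer or drop: {deferrable:.0%} of features")
```

That 64% of deferrable requirements is exactly the stuff that balloons a "small" project into a big one.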

Delivering on the Promise

Steve Ranger at Computing has picked up on one of the projects we've got running right now and published a nice, short piece on it in this week's issue, although DotP has been talked about on the e-envoy website for a while (just search for "DotP"). We'll be ready to go live pretty soon and, from then, government departments will have access to a high-end content management system. What's hot about that? A couple of things really:

(1) CMS implementations are complicated things. A lot of preparatory work has to be done up front to determine what the existing site looks like, how you'll want it to look in the new environment, what the workflow should look like, who will have access to what, how you will manage digital assets and which ones you will manage, and so on. Almost all government websites are static HTML today, so there's going to be a big swing toward worrying about this kind of thing over the next year or two.

(2) If everyone implements a different CMS in a different way, we'll pretty much be back where we started, with lots of different-looking websites, managed differently (although better, I'd hope), with little consistency of design principle. Having one that is ready now, that handles consistent navigation, accessibility guidelines, multiple browsers and customised workflow, and that comes with a large range of templates, makes it quicker and cheaper to implement and gives us a much higher chance of consistent output; which, in turn, makes the citizen's life easier.

(3) The entire engine is built as a database. Most CMS require you to think about "dynamic content" and "static content" and treat them differently. Workflow is hard-coded, as are the templates. Everything inside DotP is inside a big Oracle database - so there's no code to write.
In fact, tests so far show that 91% of a big department's needs can be met from a core range of 15 already available templates, meaning that all that must be done to go live is set up an information architecture, define the workflow and access controls (again in the database, no code) and migrate. That latter point, one simple word "migrate", hides a lot of pain - but at least the technology will be there to underpin it all. This is a pretty big step for content management - what we've done here is ahead of "package" solutions and ahead of much of what has been done in the corporate world. A lot of people have worked incredibly hard to make this happen, from inside my team and from our vendor partners, Sapient and Loudcloud. It's going to be great to turn it on and see how people like it for real, after weeks and weeks of testing!
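The "no code to write" point is easiest to see with a toy sketch. Nothing below is DotP itself (which keeps all of this in Oracle) and every name is invented for illustration, but it shows the principle: when templates and workflow are data, a new page type or a new department means adding records, not writing programs.

```python
# Toy data-driven CMS: templates and approval chains are records, not code.
templates = {
    "news": "<h1>{title}</h1><p>{body}</p>",
}
workflow = {
    "news": ["author", "editor", "publisher"],  # approval chain, held as data
}

# A page is just another record pointing at a template.
page = {"template": "news", "title": "Budget day", "body": "Details to follow."}

# Rendering and routing are generic lookups - no per-page code.
html = templates[page["template"]].format(title=page["title"], body=page["body"])
approvers = workflow[page["template"]]

print(html)
print(" -> ".join(approvers))
```

Change the workflow row and every page of that type picks up the new approval chain; that's the difference between configuring a system and rebuilding one.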

Wednesday, April 16, 2003

Still wired

Nice interview in yesterday's Times with Andrew Pinder. Can't link to it as far as I can tell - searching for it brings it up in a dedicated window ... but there's a robust exchange on targets: Will Government meet the 2005 target? “We’ll be more or less there — what matters to me is that we get it right,” he says. “If 90 per cent of services are achieved, we shouldn’t worry about the last few.” Does that mean the 100 per cent target has slipped to 90? “Come on, give me a break,” he interjects. “It’s about showing Government’s taking it seriously. We could have cheated, come up with broad categories, but we listed every service we’d thought of — up to spraying insecticide on motorway ridges. People should prioritise — and what matters is getting sites really usable, and then altering their offerings to match consumer demand. Let’s go for the 80:20 rule.”

I'm always fascinated by the desire in some journalists' eyes to have the last word, a sarcastic pop. Something that makes them think they're cleverer than us. Can't figure it out; maybe it's insecurity. Anyway, this one follows with: “Presumably he means the business notion that 20 per cent of well-organised time will produce 80 per cent of the results, rather than any further loosening of the target.” Just to show that not everyone's as clever as they think (with me at the top of that list), the 80/20 rule was coined by Vilfredo Pareto, probably around 1916-20 or so, and had to do with the distribution of income, not time. By the by, Pareto also put the "win-win relationship" into philosophical terms by defining Pareto efficiency as the transactional state where at least one party is better off, the rest are as well off, and none are worse off. That's not what we want for e-government; we need a much better skew towards "better off" versus "as well off". Besides, old MSS is way funnier.

Tuesday, April 15, 2003

Killer app?

It's clear, we need this guy. A few pronouncements from him and site traffic will jump enormously. Not really e-government, but couldn't resist.

The importance of e-government

Mark Forman referred me to this note from George W the other day. Key line for me ... Our success depends on agencies working as a team across traditional boundaries to better serve the American people, focusing on citizens rather than individual agency needs Mark can rightly use that as a pretty big stick for any department or agency that doesn't want to play the right game. A useful tool to be able to say (my words), "Now, which bit of this memo don't you understand? Would you like me to ask the President to clarify it for you?" Doubt that there are many people that choose to take up that option.

Independently on a Sunday

I saw the same glass half-empty article as VoxP did, so was pleased that James came in with some positive thoughts rather than draining the glass to the bottom. Easy to throw stones, harder to see some of the light, harder still to balance the two views. Good stuff.

Sunday, April 13, 2003

Government on the web - not as bad a comparison as people make out

There's an interesting article in the Sunday Times "Culture" magazine today, in the Doors section - I can't link to it as it only pops up in a restricted window, but if you're registered, enter "customs and excise" in the search box and you'll find it quickly. The problem is that of paying duty, VAT and charges for importing goods from abroad - not so much that you have to pay (although that comes in for some stick), but how you find out what you might have to pay. It seems that the service offered by uktradeinfo.com is not yet simple to understand (and nor are the rules for import duty and VAT, I imagine). "Tariff information is available online after a fashion, but not at the Customs and Excise website (www.hmce.gov.uk), which is where the phone service directs you. Instead, you must follow a deeply buried link to www.uktradeinfo.com, where there is a search engine under the heading “ICN Online” — but don’t get too excited. The search engine has only recently been refined, and it takes the patience of Job to work out exactly which code applies to your goods." A great opportunity to streamline a problematic process by using technology. I guess the issue is knowing how many people it causes pain for and how much pain it causes each; without good numbers for both of those, there is little business driver. There is, however, one of those timeless quotes about the cost of bureaucracy. Some European law is changing, which means duty and VAT will be payable on a much broader range of goods (with the threshold value starting lower). The Swedes have calculated that the cost of collecting revenues of £128 million will be £332 million. Great stuff. Anyway, what prompted me to write about this was a hugely painful experience with AmEx online today. I haven't seen a bill this month, so I thought I'd check online and pay somehow that way. Registering was a 2 minute job - a nice process indeed, a couple of shared secrets and instant access, no postal delays.
Looking at the account was also simple, although at the first two attempts I was told that the "system was not responding" and that I should try later. But there was no sign of a "pay" button - nowhere at all. Eventually I found a link, "how to pay", but that didn't mention online. Phoning the customer service helpline, I was greeted by a cheery chap who didn't seem surprised that I couldn't pay online, but who also told me that I couldn't pay via the phone either. So, I'm sending a cheque through the post. Now, where's that direct debit form? Aghhhhhh!

A way to deal with identity?

Carol Coye Benson thinks about Liberty, Passport and credit companies doing "shared authentication". And Fidelity struggles with Biometrics. (Thank you Scott)

More on the Accenture survey

The study I wittered on about the other day has collected quite a few comments ... from the BBC, with quotes from Bill Thompson amongst others; The Japan Times, noting that Japan is 15th out of 22; some snooty reporting from Australia, via the Sydney Morning Herald; and Canada's Newswire, in what looks like telex format, heralding their well-deserved first place in the study. I've just started to go through the report in detail. It's 94 pages. Pleasingly, one of the early graphics shows how the journey to transformation involves going through a few stages, crossing difficult barriers - I doubt that I was the first to think about it, but the graphic they use does look alarmingly like the one I've been using for the last couple of years. Worth a read.

Generally Regarded As Safe

I liked this ... a directory of open source projects going on in US government and also in top companies. Not yet populated, but once people see a long list there, it can't help but allay some concerns.

The man in charge of securing the 'net

Howard Schmidt, USA net security chief for the government, interviewed in Wired ... If you're not going to provide good security, and you're not going to provide good quality control in engineering in the products you provide us, we're not going to buy it.

Internet Self Assessment

Delighted to see today that the Inland Revenue are actively encouraging people to send their Self Assessment tax returns electronically. The "notice to complete" form should have arrived for several million people over the weekend and the first thing it talks about is sending the return electronically. I was also pleased to see that the IR have extended the deadline for filing so that tax can still be collected through your pay (which used to be tied to the end of September "calculation" deadline) to the end of December. Watching the returns come in electronically, there were still two very clear peaks last tax year, one in September and one in January. This may have the effect of moving that first peak to December, given that the online forms calculate your tax anyway, so September was always a red herring deadline. Or, with a bit of luck, it will spread the load so that tax forms start to arrive in bulk once everyone has their P60s and interest statements (things which often come out around June or July). Last year, volume was up 4 or 5 fold. This year, a similar increase would take the IR to between 1 and 1.5 million returns. That's about achievable with some good press on the service and a few more services allied to it now available, such as child benefit and tax credits. One million would be around 25% take-up - a figure that no other service has achieved to my knowledge, so a good place to aim at.

Centralisation - A realistic strategy or not?

Corporate strategies come and go. For most of the 80s and the early 90s, many corporates acquired other, smaller companies, creating far-flung global empires where each operation ran largely autonomously. As I joined Citibank in 1992, it was only just emerging from an enormous problem triggered by a mixture of the Savings and Loans problems in the USA, a cost base that was expanding faster than revenues were growing and an excess of staff performing duplicate functions. If you check back on business magazines from then, Fortune or Forbes perhaps, you'll see that one reason Citi survived was that the CEO, John Reed, implemented a series of significant cost-cutting and revenue-driving programmes; at the same time, a Saudi prince bought $1 billion worth of shares at around $8.25. As I left Citibank, those shares were worth perhaps $10 billion, only seven years later. Something worked. Banks are a good example of the "expand at all costs" approach that was popular then, and Citibank was certainly not unique. The consequence, though, was operations in each European country with considerable duplicate functions, all operated autonomously. I had no part in developing the centralisation strategy - that was all the work of other people whom, fortunately for me, I came to work with much later. I worked with a team developing the strategy for how the bank would consolidate and restructure from 1995 to 2000. A fascinating project that taught me much more about every leg of the bank than I could possibly have learned any other way. A generic example of a duplicate function might be trading in foreign exchange. Each country has an IT system, each country books trades, each country receives money and pays money, each country has brokers, sales people and product people. Each country deals with the in-country office of other corporates - the people who need to buy or sell in different countries to pay their bills (salaries and so on). Those are all pretty obvious costs ...
but there are slightly more hidden costs of duplication in this case: individual brokers, trading on the bank's account, may take opposing positions in the same currency - a broker long one currency may be oblivious to another broker short the same currency; one broker may have more currency than he needs and not know that another broker, needing some to settle a customer transaction, is going out into the market to buy more; and the central bank will demand some capital to allow the positions to be held, so each country is obliged to tie up important capital. Adding all of these up creates both huge, stable expenses (based on the cost of staff, the cost of infrastructure and the cost of capital) and then enormous swings as cash inflows and outflows are managed. All of the foreign exchange staff are backed up by teams of people who manage cash flows, money market deposits, collections and more exotic derivatives that take advantage of any of these funds flows. From the early 1990s, banks took an axe to these costs: they consolidated their trading books, implemented back-to-back trades to ensure that risks were centrally managed, managed cash processing through regional centres and put staff into the corporates directly to manage customer money more effectively. The more dramatic programmes included consolidation of legal vehicles, moving all trading to one country, creation of enormous regional centres for all types of cash and currency handling and so on. Most banks completed the bulk of this work in time for the Euro to arrive - a factor which, had they not done it, would have forced such consolidation (after all, 18 countries in Europe trading Euros against each other would not be too smart). Some have continued the work, reducing costs further through consolidating other IT systems or financial products. The stimulus for this back in the early 90s was, clearly, a realisation that revenues weren't going very far but costs were rising.
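A toy illustration of the netting point, with invented figures: two country desks holding opposing positions in the same currency each tie up capital against their own gross book, while a single consolidated book carries only the net exposure.

```python
# Two country desks, run autonomously, with opposing EUR positions.
# Figures are invented purely for illustration.
positions_eur = {
    "London desk": +50_000_000,     # long EUR 50m
    "Frankfurt desk": -40_000_000,  # short EUR 40m
}

# Run separately, each desk holds capital against its own position.
gross_exposure = sum(abs(p) for p in positions_eur.values())

# Netted into one consolidated book, the bank's true exposure is far smaller.
net_exposure = abs(sum(positions_eur.values()))

print(f"Gross (duplicated books): EUR {gross_exposure:,}")
print(f"Net (consolidated book):  EUR {net_exposure:,}")
```

Nine-tenths of the capital in this toy case is tied up only because the two desks can't see each other - which is exactly the cost the consolidation programmes went after.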
It seems to me that it's possible that there is a similar situation in many governments now. The world economy is precarious at best. Government staff counts are high and probably rising. For some countries, tax rises may be in the offing unless there is an economic rebound. Back then, the banks had competition to spur them on - if they didn't cut costs and deliver better service, then someone else would, and revenues would fall faster. In an increasingly competitive Europe where today the UK outshines all others thanks to some solid fiscal management, other countries are looking a little wobbly - and a programme such as the banks undertook could bring them back on track, increasing the pressure on their neighbours, including us. The first steps down this path might be a programme to consolidate what a bank would call "back office functions" - payment of cash and receipt of cash - as, after all, that's only a matter of reconciling money between accounts; with that underway, an aggressive step would be to consider common internal applications such as finance, payroll, HR and expenses management. A far less aggressive step would be common purchasing of everything from staples and paper to desktop PCs. Some of these steps would create problems at the front end whilst they were underway - customer confusion would be one risk, and certainly staff confusion. Much of that could, perhaps, be addressed by using the Internet to mask what was happening underneath. A big programme to not only encourage citizens and businesses to use the 'net but also to get third parties (such as independent advisors, business councils, accountants and so on) to use the 'net for their customers too would go a long way to buying time for the changes to take place. The 'net can do a lot - and it can certainly give the illusion that all is normal whilst underneath there is a lot going on.
Normal may not be enough of course, but it's worth a try if the payoff just a year or two later is much greater effectiveness of services. Such a programme requires significant top-down sponsorship. It doesn't happen where fiefdoms are allowed to prosper, where expenses are managed locally, where baseline budgets are hidden from view. With the top-down leadership in place, there then need to be positions of power spread across the organisation that don't report into any fiefdom - cross-organisation roles, cross-product roles and cross-process roles. The folks in these roles can look across the whole piece to see the inefficiencies and the opportunities without the history of how the org, product or process has worked in the past. The corporates all had problems like those and had to deconstruct them as they moved ahead. For some it took months, for others it took years. Still others had it done to them as they were, in turn, acquired by more agile competitors. No one is going to buy out a government, of course, but in an increasingly competitive economic environment, surely the equivalent is relocation of businesses to the strongest economy with the most educated, most available workforce. Weak economies had better watch out, because someone is going to steal your lunch. It's going to be fun to watch as a new set of people grapple with the same old problems.

Mobile phone and GPS location

There's a new service in town, called "Zingo" that gets you a taxi when you need one. We're all used to phoning a cab company and telling them where we are and then waiting however long to get a cab. That's not what this is. When you call, it figures out your location (via the cellphone network), contacts the nearest cab and then patches him through to you so that you can agree how long he will be and exactly where to meet (I'm not sure what the resolution is, but the taxi that picked me up was no more than 500 yards from where I was when he got the call). For all that, he adds £1.60 to the bill and you're done. Naturally, it relies on you calling with your mobile phone (just in case there was any confusion). A while ago, I talked about some of the more interesting government services using mobile phones that might come about soon. One that piqued my interest was reporting an abandoned car. No different really from the taxi problem - you call a number, the service knows where you are, you speak the type of car, colour and plate into a system that records it and dispatches someone to go and check. Maybe you only check after three people have made the report, or 30 or whatever (like my 888 number stuff). I wondered what other services might work that are location dependent. One is the use of the phone as a proximity device (see my post on Fastchat the other day), perhaps to get access to something. You're allowed access to certain documents, say, only when you're in a given location (maybe the library?) and the phone can handle that; trying to access the data from home wouldn't work. Maybe you're a GP's assistant and you can only look at patient records when you're in the GP's office. As systems increasingly connect to the Internet and the role that you have, the location that you are in and the time of day all become important, this is something that might make a difference.
Ultimately it doesn't have to be a phone - it could be a simple tag on your smart card that gives a signal to a GPS satellite, it's just that phones are more prevalent right now. I wondered about that Inland Revenue problem from a few months ago when staff were looking at tax records that they shouldn't look at. Assuming that some staff are allowed access and some aren't, and maybe that all the staff who are allowed are supposed to sit in one place (you can tell I have no idea here), then such a technique might work. You might, though, be able to solve all of this with RFID tags attached to the smart card, with the receiver set up in the same room or even on the device that is allowed to work - but they are less flexible, as you ought to be able to dynamically update which locations can access what. Some weeks ago there was a service on the web that, when you put your phone number in, zeroed in on your location. I'm pretty sure that The Register pointed me to it, but I can't find it anymore. Besides, shortly after it launched (and they realised what it meant - no late nights in the pub, I mean office) the location data was randomised unless you opted into the service. Ha, not much chance of that. If you're into some of the theory and issues around this, then there's a useful article for you to read by Sami Levijoki at the Helsinki University of Technology. The Zingo service shows that it's viable. You can call them next time you need a taxi on 087000 700 700.
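The role-plus-location-plus-time idea above can be sketched as a simple policy check. This is purely hypothetical - every resource name, role and rule below is invented for illustration, not taken from any real system:

```python
from datetime import time

# Hypothetical sketch: grant access only when role, reported location
# and time of day all satisfy the policy. All names are illustrative.

POLICY = {
    "patient_records": {
        "roles": {"gp", "gp_assistant"},
        "locations": {"surgery"},              # e.g. resolved via the phone network
        "hours": (time(8, 0), time(18, 30)),   # working day only
    },
}

def may_access(resource, role, location, now):
    rule = POLICY.get(resource)
    if rule is None:
        return False
    start, end = rule["hours"]
    return (role in rule["roles"]
            and location in rule["locations"]
            and start <= now <= end)

# The GP's assistant in the surgery during working hours gets in;
# the same person at home does not.
print(may_access("patient_records", "gp_assistant", "surgery", time(10, 0)))  # True
print(may_access("patient_records", "gp_assistant", "home", time(10, 0)))     # False
```

The point of keeping the policy as data rather than hard-coding it is the one made above about RFID: you ought to be able to dynamically update which locations can access what.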

Thursday, April 10, 2003

Hub of the wheel

Just a marker for now. More another time.

Global e-government

Accenture have published their new survey on the status of e-government in various countries, which Kablenet have kindly picked up. We haven't done as well as I would have liked in the UK, but I can see why we might improve our score by only 16% whereas other countries have moved faster. I think I've said before that our vision is on the money, but that we haven't yet managed to execute all the pieces. There are some interesting conclusions in Kable's story (I don't have the final report yet, doubtless that will arrive soon): - Initially, e-government costs money. That's certainly been true where I've looked. There is quite a significant hump of investment money needed to put in place the infrastructure to allow information and transactions to be presented. Worse, that money is typically spent department by department in many countries, making it hard to manage the costs and maximise the chance for return. Unless high numbers of people use the service and then stop calling customer service centres, don't write letters and start sending in accurate forms through the online channel, costs (net) go up. Ages ago, at dinner, Michael Dell talked me through what happened for his company. In 1997 they went on the web, thinking that people would buy online, costs would go down and margins would improve (or sales prices could be lowered ahead of coming competition). What actually happened was that people used the web to look up initial data and then phoned in with more complicated questions. Sales per employee went up, total sales went up, but so did the number of call centre operators needed. It wasn't until the website was updated, incrementally, to address all the usual questions and more that people moved their buying to the web. I still remember the glee when Dell was selling $1 million a day of PCs online. That number must be somewhere between $50 million and $100 million now. - It's hard to move from "service online" to "service used".
True enough and I think many of us would echo that. If the service is hard to find, or doesn't improve on the paper process, or doesn't add additional value that can't be found offline, then usage is bound to be low. Interestingly, my own (very anecdotal) data says that, for the most part, people are not even trying to find services online in many cases. That, for me, means we've failed the neighbour test, i.e. until the service is so good that your neighbour will lean over and tell you what they found using a government website, high usage will remain elusive. That's effectively the FriendsReunited effect. Last time that happened, it was the 1901 Census data published by PRO. The report, which I've just found on the accenture.com website, goes on to list five core conclusions: - eGovernment matures through a series of plateaus. Each successive plateau acts as both a barrier to and foundation for progress to the next. (My note: nothing new here, this has been common wisdom for a few years ... the trick is how to break through each barrier) - Value drives eGovernment visions. There is a growing demand for projects to deliver Return on Investment. (My note: yes, and (to quote Jim Johnson) it's also about "return on requirement". You only get a return when you deliver and people use it, so how do you deliver the minimum functionality with the maximum yield and then incrementally release more?) - CRM underpins eGovernment. Improved service delivery is the key strategic imperative for leading countries and executives. (My note: maybe, but CRM has so many connotations and is not a simple thing to describe, implement or use. Just like you "don't buy content management", you certainly can't buy CRM.) - Increasing take-up is a priority. Driving up usage is one of the key challenges for mature eGovernments. (My note: Absolutely). - New eGovernment targets are needed. There is a recognition that broad-brush availability targets have not matched objectives.
(My note: been there before, the target is not the issue, it's execution against the target. Enough services available that are well designed will drive usage. One service alone, even with high take-up, does not make a government "e") Canada scores highest. That's been true for a while and it seems that they're stretching the lead now, moving into service transformation. That's impressive. Kable note that the developments are led by a strong-minded CIO with a clear central focus. Certainly when I've spoken to folks in Canada, it's been clear that the aim has been to bring things to the middle, be consistent around design, force standards on departments to develop things in a citizen-centric fashion and so on. Looks like it's worked.

Monday, April 07, 2003

Web service broking

I spent the day in a lengthy session with my team working on a budget plan for the next 12 months. One of the topics was whether we should build a "web service broker". Simon Freeman, who designed much of the original Government Gateway and still understands more about it than pretty much anyone, is a fan of this. It feels to me a lot like it did in mid-2000 when we first kicked around the Gateway idea. Everyone vaguely understands what it's for, no one desperately wants one now, but not putting it in place in time risks the creation of tens, hundreds or even thousands of incompatible, insecure and difficult to use web services. It took more than a year for the Gateway to gain proper traction and, if I'm honest, with some people and some departments it hasn't yet got there. The good thing is, for the most part, people aren't doing something else instead of the Gateway, they're just not doing anything. I don't think the same will be true for web services. Many will wonder how to secure them but, eventually, a few will take the plunge and then the snowball will roll. I need to do some studying to catch up with the latest thinking on such things and I plan to set up a session with some of the key folks in industry so that we can figure out what government needs to put in place - both for infrastructure and standards - before we get too far to draw it all together.

Xbox live ... Step forward

Just finished a couple of hours on Xbox live, playing Mech Assault. Just incredible. I've been playing video games for long enough to remember Space Invaders and even Pong when it first came out, to have clocked Defender and to have wasted most of my university education playing Ghosts and Goblins. Playing online and talking to people while you do it should kickstart a new round of innovation and wholesale change in video game mechanics. What an enormous shame that Halo was released without live capability. The 4 player split-screen game, de-coupled so that each of the 4 players has their own screen and no ability to see the others', will be some fun. Why am I talking about this in my e-government blog? Three reasons really. (a) I should have been working instead of playing so I've lost a couple of hours of catchup ahead of meetings tomorrow, (b) Online collaboration, discussion and consultation is something that we haven't got right in government, but it happens right there and then on live without prompting - so if the application is right people will do it (not saying that government should be made a video game), i.e. people will help each other out (as well as kill each other) if you put the right environment in place, see Upmystreet's social conversation for an example and, finally, (c) this will stimulate broadband uptake once people find out how much fun it is. Now all I need to do is to get some longer cables so that I don't have to rearrange my room to make it all work. Oh, and I need to find some friends ... when I kicked off Moto GP and pressed the "friends" button, instead of going out and finding Jennifer Aniston, the xbox told me "you don't have any friends .... press (a) to continue!". Go buy an Xbox, go get live. And if you get killed by a headshot from a sniper rifle, come Halo 2 ... that will be me.

GPRS text

The P800 has a switch that lets me send text via GPRS instead of the usual GSM way. Does that mean that I pay for the amount of data I send? And how much do I actually send in an SMS message, with the header and any checksums etc? And is it cheaper than GSM text? I send 1000 messages a month and few fill all 160 characters. Any clues?

Public Sector slowdown ... on the web at least

Computer Weekly have some survey results this week that show government websites are slower than those in the private sector. Been here before of course, but the message ought to be getting through by now. We've had: low usage (nobody can find anyone that has used a .gov.uk site), poor availability (like buses, they are never around when you want them), poor design (even if you do find one and it's there, you can't figure out where anything is) and too big (unlike elsewhere, size doesn't matter). Now we have too slow. Seems like a full house to me. Except no one ever seems to say "too many". My personal belief is that the reason few people use government websites is that (a) they don't know what they are looking for, so don't know where to start and (b) if they do, by some chance, end up at one, it's often the wrong one and there's no easy way to get to the right one. If we had that right, I think the performance problem would be an issue worth cracking, but it's not right now - because ... Remember, this is taxpayer money being spent here. Every website has a cost, whether it's £1 or £1 million, having thousands of them makes for more than should be spent. As more surveys come out highlighting a weakness here or there, the combined cost of paying attention to the results and doing something about it across the entire domain soars. If the changes are made, of course. And given that this is the nth survey in the last 12 months, I doubt that they are. The present website guidelines are broken - it's not easy to get these kind of things right, it's even harder to get them right and then persuade the right people to do makeovers on their sites in record time to make them work.
The over-riding guideline should be "if in doubt, don't build another website or use another domain name". Public sector people will tell you, I expect, that they spend a lot of time and energy trying to make their site work with a variety of browser versions, which bloats the pages and makes them slow. That's certainly true - and it's one of the most frustrating things about developing web pages and applications. And then they want to be sure that everyone can find the service that they are after, so it's best to put as much as possible on the home page, right? That's just not going to make for accessible, fast pages. The work we're doing on ukonline right now showed us that we can get the load time down from 21+ seconds (where it is now if you try it) to perhaps 11-12 seconds through better design, image management and style sheets. Not 8 seconds, I agree (which is the traditional impatience measure, one that I imagine got forgotten about long ago when the number of browser versions in use went past two) - but checking the BBC website just now (via Site Confidence), it came out at more than 18 seconds. MSN was more than 38 seconds. Consolidate, rationalise, exemplify.
- Consistency of design. If, when the site is there, it looks similar to others, people will forgive slow load time, because they won't have to learn your site.
- Simple pages. Make it clear what's on the page and how to use it. The easier it is to find what you need, the less time you waste.
- One domain name. Expect people to find what they need through search, either on your own site, google, ukonline or whatever. So don't confuse things with dozens of domain names - I expect the search engines pay no attention to what the name of the site is.
Or do we need a few more surveys to tell us that it really is a supply problem, not a demand problem?
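A quick sanity check on those load times: on a 56 kbit/s dial-up line (still the common case), load time maps almost directly onto page weight. This sketch ignores latency and connection set-up, so it's a rough upper bound on page size rather than a precise model:

```python
# Rough arithmetic: assuming a 56 kbit/s dial-up connection and
# ignoring latency, what page weight does a given load time imply?

DIALUP_BITS_PER_SEC = 56_000

def implied_weight_kb(seconds, bits_per_sec=DIALUP_BITS_PER_SEC):
    return seconds * bits_per_sec / 8 / 1024

print(round(implied_weight_kb(21), 1))  # 143.6 KB - roughly a 21-second page
print(round(implied_weight_kb(11), 1))  # 75.2 KB - roughly an 11-second page
```

In other words, halving the load time means stripping something like 70 KB of images, markup and inline styling out of the page - which is exactly where style sheets and image management earn their keep.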

Upmystreet hopefully not going down

It's true, it seems. Upmystreet is for sale, or closure depending, I imagine, on how much cash they have. Let's hope the former rather than the latter. To a good home, please.

Convergence

A while ago, Phil Windley posted something that piqued my interest. He had plans to use his Ericsson T68i as a "presence proxy" so that if he went near his PC, it would sign him into Apple's ichat or start playing his favourite tunes. I filed that, at the time, as possibly one of the most useful things to do with bluetooth - there are few that I have come across, except for headsets which I am told fry your brain more than even GSM phones. The other day, I came across Fastchat who have a widget that allows you to do Instant Messenger on your mobile phone (as long as you have a Nokia 7650 or 3650, although other phones are in the "coming soon" box). So, I wondered: If you combined Phil's presence proxy with Fastchat's software on your phone, when you were within range of your PC, it would sign you into your PC to do Messenger there. If you moved away, Fastchat would sign you out on your PC and sign you in on your phone. Messenger would change its status icon to "Mobile" so that people knew you were around but might be slower to respond (unless you are 14 and can do text faster than you can type). I have no idea how you'd do something like that, but it seemed like a good thing to me. And it would definitely represent convergence. The end of boring meetings is nigh as I might soon be able to IM friends while I look like I am taking notes or allocating tasks on my p800.
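The state machine behind the combined idea is small enough to sketch. The Bluetooth scan itself is left out (a real version might use a library such as PyBluez for the inquiry); everything here - function names, the status messages - is hypothetical:

```python
# Sketch of the presence-proxy idea. The Bluetooth inquiry is assumed
# to happen elsewhere; we just react to "phone in range" transitions.

def update_presence(signed_in_on_pc, phone_in_range):
    """Return the new PC sign-in state and the action to take, if any.

    Phone comes into range: sign in on the PC. Phone drops out of
    range: sign out and let Messenger flip its status to "Mobile".
    """
    if phone_in_range and not signed_in_on_pc:
        return True, "sign in on PC"
    if not phone_in_range and signed_in_on_pc:
        return False, "sign out, status -> Mobile"
    return signed_in_on_pc, None

# Walk up to the desk, then wander off again:
state, action = update_presence(False, phone_in_range=True)
print(action)  # sign in on PC
state, action = update_presence(state, phone_in_range=False)
print(action)  # sign out, status -> Mobile
```

The fiddly part in practice wouldn't be this logic, it would be debouncing the scan so that a flaky Bluetooth signal doesn't bounce you in and out of Messenger every few seconds.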

Sunday, April 06, 2003

Mindshare

Prompted by an old, old posting by Jon Udell that he recently referred to again on mindshare, I wondered what mindshare might mean for government websites. It's a topic that I've mentioned before in the context that government doesn't have any. That is, the more websites you have, the harder you make it for people to find what they need, the lower mindshare you'll have. Jon said if you put something like "link:www.diverdiver.com -url:diverdiver.com" into a search engine (I used alltheweb, which has recently replaced google as my default), you get both a count of the number of sites that link to the site you input (less internal links) and a list of them. Is that a useful measure? Jon actually goes on to do things that I don't understand with scripts to refine the results (something that given a command line I could probably figure out, but you don't want to know the last time that I saw such a thing). I ran this for a few government web sites, first showing all links (excluding internal ones) and then excluding all government links (i.e. .gov.uk). That's not perfect as it doesn't exclude things like, say, parliament.uk or mod.uk, but I think it's close enough. What I was interested in was whether the percentage of non-government sites linking in would vary across the domain. The slide below shows a selection of the answers. You'll see, I hope, that although ukonline.gov.uk has the highest number of links, less than half come from sites that are not government. Given that it's supposed to be the entry point to government, I'd expected it to be much higher. But comparing it with Firstgov.gov (the confusingly named US version of ukonline), it looks normal - 45% versus 42% in the US. Departmental websites in both US and UK score much higher percentages of non-government referrals, meaning (I think) that their brands are better understood. Accountants link to the Revenue and so on. Is that right I wonder?
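The arithmetic behind those percentages is simple enough to write down: take the total inbound-link count (internal links already excluded) and the number coming from government domains, and the rest is the non-government share. The counts below are illustrative, not the real survey figures:

```python
# Non-government share of inbound links, as a percentage.
# Example counts are made up for illustration.

def non_gov_share(total_links, gov_links):
    return 100 * (total_links - gov_links) / total_links

# e.g. 10,000 inbound links, of which 5,500 are from .gov.uk sites:
print(round(non_gov_share(10_000, 5_500)))  # 45 (per cent)
```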
The link count, though, falls dramatically after the Foreign Office (which is getting a lot of links because of SARS, the war and its general value for finding out about travel information I imagine). And the most surprising count of all was Number 10's own website with only just over 1,000. For comparison, I looked at three other sites: Scripting.com (Dave Winer's long-time blog), the Beeb (now there's a well-linked to brand) and also Upmystreet, which gives local information based on postcodes. I thought that would have been higher. Of course, there was no point in excluding government links to those (I did check, doing it for Upmystreet shows practically no change). So, what does that all mean? First off it means, I believe, that the "central sites" are linked to by other government departments (in the UK and the US) because they've been told to, yet they are not trusted as useful or definitive sources by third parties who point their own consumers to the right place in government. That's pretty hard if it's true. If the notion of a "single entry" to government is right, then there needs to be some serious work on (1) awareness, so that it does indeed become the first port of call, (2) content, so that it has what people need, when they need it and that the content is definitively accurate and trusted, and (3) a network of partnerships and interchanges with commercial sites so that citizens can pass back and forth between sites to which they have granted a share of their mind, comfortably and easily. I think it also means that if government is going to establish a serious web presence, then consolidation is the only way to go - the kind of link numbers shown do not bode well for mindshare. P.S. Just catching up on my reading, via NTK.net, and I see that Upmystreet has gone into receivership. I haven't seen this news elsewhere, but I haven't been particularly looking for it.
If it's true then it's a blow to the idea of delivering local information to people efficiently and effectively. Government hasn't managed to do this yet and Upmystreet was one of the few to do it successfully. Coupled with faxyourmp, it will be a crying shame if they disappear.

Enabling access to websites

The Disability Rights Commission is planning a review of 1,000 websites to measure their accessibility. Details are scant, but the plan is to complete by the end of the year - I hope that means several dot releases between now and then, rather than a bulk release. Stefan, he of upmystreet and whitelabel, has commented on this recently and, ignoring the arrogance of his own style (all this clue and will stuff is boring already), makes some good points. My worry is that the standards that need to be adhered to are not fully understood, in either the public or private sectors. It's one thing to say "be accessible", quite another to turn a random 50,000 pages of HTML in a .gov.uk domain into whatever that might be. Few government websites deserve to score well in this survey. Few are readily accessible to the 6 out of 7 without a disability, so the 1 out of 7 are certainly going to be out of luck in many cases. It is going to take a few beacons to stand out so that others can gravitate towards them (moths to a flame?). But let's be clear what the standards are first (and this is not meant to be exhaustive):
- There's no such thing as a "separate easy access site" - ukonline has one of these right now, it soon won't.
- Fonts and colours must be easy to change, by the user directly if possible using their own style sheets (that pretty much rules out HTML sites).
- Screen readers must be able to navigate consistently - so that means changing navigation style at different layers is out.
- Writing must be clear and simple (Stefan has also pointed out that there is a world of difference between "it's" and "it is" for a screen reader).
- Graphics must have alt-tags (the easiest thing to do).
Given that government has limited propensity for joining up its services online, there is a huge risk here that one site in the chain lets the others down - if you start at ukonline but quickly get thrown to some other site (naming no names) and your experience is broken, what do you do?
Applying these standards (plus others that relate to use of video, avatars that might do auto-sign language for the deaf and so on) is going to be a headache unless the act is got together pronto. The issues are big ... 2.4 million odd pages of content spread over 1,800 sites. Few sites are available in any language other than English (a few in Welsh), but colleagues in Camden tell me that there are 300 languages in their borough alone. Accessibility, multi-language, usability ... none of them things that are excelled at right now. There is something on the distant horizon that might help government though, the Cybrarian project which is due to enter proof of concept soon. It's been rumbling for a while so I don't hold out the highest hope, but if it gets there and does the right thing it will be an important part of the migration. There's a lot more to be found on accessibility, but one of the better pieces is here, at the "Making Connections Unit" website.
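Even the easiest of those standards - alt text on graphics - can at least be checked mechanically, which matters when there are 2.4 million pages to audit. A minimal sketch of such a check, using Python's standard HTML parser; real accessibility auditing obviously covers far more than alt text:

```python
from html.parser import HTMLParser

# Minimal check: count <img> tags with no alt attribute at all.
# A sketch only - a real audit would cover much more than this.

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # attrs arrives as (name, value) pairs; an empty alt="" still counts
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

def images_missing_alt(html):
    checker = AltChecker()
    checker.feed(html)
    return checker.missing

page = '<p><img src="crest.gif" alt="Royal crest"><img src="spacer.gif"></p>'
print(images_missing_alt(page))  # 1
```

Run something like this over a whole domain and you get a league table of the worst offenders - a cheap first pass before the expensive manual review.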

Open source procurement

Lots of coverage this week on DWP's implementation of an OGC-sponsored open source solution for online procurement. I'm intrigued by this - some of the stories seem to indicate that bits are open source and bits not (it may be, and I'm not sure, that it's a proprietary package but running in Linux), but that's not what intrigues me. For a long time it's puzzled me how to get departments to work together to save money. After all, how many contracts do we need in government to buy staples, paper clips and pencil sharpeners? This might be the first chance to really get the buying clout that government deserves. Recent deals for, say, Oracle and Microsoft software, have shown toe-in-the-water thinking about blanket licences and have certainly saved some money. But low numbers in the scheme of things. Coupled with the announcement from the NHS about shared service centres, a project which has taken a long time to come together, it looks like there is a sea change in parts of UK government about the merits of joining up. If we could get another big department to use the procurement software, and someone from outside the NHS to join in on a service centre ... Ah, dreaming? Or maybe not. Maybe now is the time. Budgets are being cut, public sector borrowing is up, IT systems are relics from a bygone age, Blair wants reform ... Maybe.

More remote broadband

Meanwhile, The Register notes that BT is planning to work with sponsors to promote wider availability of broadband in remote (not rural, not the same thing at all) areas.

Remote Broadband

I had the good fortune this week to spend some time with Angela Vivian down in Somerset. Angela styles herself TOL, or "The Old Lunatic", although I'd see the world a much saner place if we all had half of her wisdom, energy and passion. First item of the day was a local session to see how local businesses might enable broadband in the area - not ADSL especially, but broadband of one kind or another. Seems that there are quite a few local businesses that could benefit. If you've visited this part of the world, you'll see that Wedmore doesn't seem to lack for big houses so even though it's rural and remote, it's not poor. Tim Wotton of RABBIT (Remote Area Broadband Inclusion Trial) was there with an offer of a small sum of money to every business that wanted to kickstart a local attempt to get some bandwidth. The rules for getting the money appear relatively simple - a rarity in the case of government handouts. Tim's funding comes from the dti's pot of around £30 million announced a few months ago to stimulate remote broadband. Although Tim doesn't steer you towards a particular solution, one of the conditions of funding is to report back to him on how things are going - response time, customer service, contention - and he plans to publish a review of which providers offering what technologies are doing best. To get the money though, you must be about to install - if you've already done it, you don't get a penny. Exenet were also there, presenting on some of the options for getting a faster kick in your net connection, whether it's satellite, some kind of wireless, cable or good old-fashioned fibre. I've been corresponding with some people recently about the relative merits of ADSL over other technologies and also with people who are fed up with the service that they receive from BT on ADSL. These are the kind of people that send my boss abusive mails and, often as not, I get those mails to deal with. Brightens up my day sometimes, other times not.
There are probably two important things to know about ADSL - (1) it has a high contention ratio of about 50:1 meaning that if you're unlucky enough to live near people who are Kazaa-addicts, you may find your bandwidth doesn't look as good as the ads, and (2) the reason it can be had for £30 or whatever these days is because there's next to no customer service with it. If you plan to make ADSL mission critical for your business and are likely to lose revenue if it's not there, don't get a cheap solution - look for other options, because when you send me a mail to complain, I won't be impressed. Back to Angela, who plans to have the community wired up on or before the end of June of this year. Seems like it's about do-able to me as long as she stays focused on it. The problem with organising local initiatives such as this is that they need someone to sit in the driving seat and make it happen. If you don't have someone like Angela, you're going to struggle. And there aren't enough Angelas in the world.
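To make the contention point concrete: a 50:1 ratio means the advertised speed is shared among up to 50 subscribers on the same segment. The figures below are illustrative, and real usage is rarely this bad, but it shows the worst case the ads never mention:

```python
# Worst-case bandwidth under contention: the advertised line speed
# divided among everyone on the segment pulling data at once.
# Figures are illustrative.

def worst_case_kbps(advertised_kbps, contention_ratio):
    return advertised_kbps / contention_ratio

print(worst_case_kbps(512, 50))  # 10.24 kbit/s if all 50 users pull at once
```

Ten kilobits a second is slower than an old modem, which is exactly the "doesn't look as good as the ads" experience the Kazaa-neighbour scenario produces.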

3g? Or not.

I spent some time today playing around with a 3g phone, the NEC 606. I reserved one a few weeks ago and today was the day to put the money on the table. Rather than do that blindly, I thought I'd have a fiddle first. First impressions are not that great (I'm ok with the size of the phone, after all I have a P800). The interface is a bit strange - the opening screen is a grid of icons, but the 4 way navigation button only lets you move horizontally between them, forcing you to click 9 times to get to the last one. Pretty much everything that you do after that seems to involve a download. Click on the "Find" button and you'll see the little flashing envelope icon that says it's getting data. And you'll see it, and see it still and see it some more for quite a while. After several menus I finally got the restaurant I wanted (no T9 typing in the input fields that I could see) and off it went to download the map. Except it didn't. After 20 seconds or so, it popped up with an error. A strikingly Windows-esque error as it happens. The same error appeared a few more times whilst I was trying to get Alien Storm (or was it Swam?) downloaded. It struck me that the virtue of having fast download speeds has meant that the coders have opted either not to cache too much locally, not to store programme code locally or to write without thinking about the consumer. It may be a while ago, but I still remember squeezing out unnecessary code to save space in 16k ZX81s (and even the 1k version for a while). Website coders learnt that too (see Amazon and Google). So with only ISDN-like speeds to play with (and irregularly available ones at that), have the 3g service providers gone and written bloated applications that have to page in and out across the ether regularly? I like my phones fast - if I want to know where a place is, it had better tell me when I want to and not cause me more pain than it would take to phone a friend who probably knows.
The next thing to put me off was the queue of people, 3 deep, at the counter returning phones. One guy was complaining that the software crashed five times a day. I don't know whether that's the case or not, certainly the one that I was looking at didn't overtly crash, but maybe I didn't try any of the hard functions. So, I opted not to flash the cash and went home without a 3g phone. I am sure that more will become available later in the year and I'll take a second look then. It might be a while before we're using 3g phones to get help from our tax adviser on how to fill in the Self Assessment form. 3g is not ready for mainstream, the phones so far available are big with poor battery life and a clunky interface. If you want a good phone for now, buy a P800 and wait a bit for 3g. Once there's a bit of competition in the market with the other providers in there and a few new phones, things will start to shake out. Alternatively, it could all go horribly wrong and a lot of people could lose a lot of money. If you think that's the case, then buying Vodafone shares now could be a good thing as they've wiped a chunk of the 3g debt off their balance sheet so there's room for upside. Not that I know a lot about shares, and you shouldn't take that as a recommendation.

Friday, April 04, 2003

Redesign in Australia

Continuing the theme of the last post, the audit folks in Australia say "no more silos". Progress in Aus is better than in many countries, but the report believes that there is still too much focus on the agency view of the world, as opposed to the citizen's. There are some good quotes ... The current practice of e-government as a provider of online information and online transactions should be regarded as a stepping-stone to a more inclusive and integrated government Web presence. In the past, provision of e-government services has been based on the government's online strategy, rather than business or customer service. The transition from agency-oriented to citizen-centric e-government may be difficult and time consuming and will require leadership and coordination as agencies work towards a common and agreed architecture. There's that technology thing again ... "a common and agreed architecture" ... more of that, and no citizen focus, means more disillusionment.

Redesign it all

Norm Lorentz, Mark Forman's CTO, whom I met last week, said at a conference that there's no part of government that doesn't need to change to deliver better service. That's a punch on the nose if ever there were one. And he wants the authority given to the White House to make those fundamental changes without the need for legislation. That signifies massive change potential, massive risk and a whole lot of fun coming. In the UK, I've been a fan of delivering some services that give the illusion (that word again) of a joined up government through clever integration at the technology layer rather than in the business. If we had that, I've argued, then we could buy some time to do the real work that is needed in the business (at task, process, workflow and system level). There is no system that I know of in a department that is designed around a customer need - they are all "line of business" applications designed to support a specific transaction or piece of legislation - so taking on the commitment to re-engineer (aka transform) agencies or departments is big stuff. The technology isn't there today, and it will need to be put there fast and flexibly to facilitate that change. I'll be watching with interest and certainly cheering Norm and his folks on.

De-nokia'ed

The Nokia in me is gone, at least for a while. My new P800 arrived a few days ago. It's got some serious potential, this phone, with a few flaws that could be enough to drive me nuts. Flaws? How about a touch-sensitive screen, so that when you hold the phone to your ear, you create a bunch of new contacts, meetings or notes consisting of random characters? Or maybe delete some of the existing contacts or meetings? The handwriting recogniser doesn't work quite the same way that the Ipaq or Palm does, so I've got to learn something new, but that's not such a big deal. What probably is, though, is that you can't customise it - so the Ipaq menu that says "when I do an 'L' followed by a 'space', that's not a 'T'" doesn't seem to be there. Still, love the games, love the camera and, above all, the integration between phone and organiser. They're onto something at last - after endless frustrations with the T68 and the "i" version. You shouldn't need another reason to consider buying this phone, but if you do, one word should do it all. MAME. Bloody amazing. Or maybe I'm just getting old and need to relive my teens. Just one small thing please Ericsson, can't you give me a cookie that stops the annoying pop-up ads for the P800 that I see on pretty much every site now? I've got one, I'm not going to buy another.

Deadline doubters

Computing's Andy McCue sees doubts over whether the 100% online target is valid (and there's more on it here). If we were at 89% of transactions online now (right now!) and arguing about, say, whether to put "burial at sea" online, or "online exhumations" or something like that, I'd be right there. I'd be saying that we'd done enough. If I were looking at daily graphs that showed millions of transactions (out of the 5 billion annual transactions done with government) were happening, I'd be pretty clear that we'd done the right thing, it was working, and I'd be arguing that we should shift attention to more fruitful activities. Re-engineering the backend of government systems, say, or rationalising departmental organisation structures. But we aren't, I haven't and so I'm not. Few transactions online, few being used, and still we're arguing about the target. Get a life, folks. Get to work on delivering the next few % so that people use them. If you get to 2005 and half your local population are paying their council tax online, checking the balance online, applying for their housing benefit online, paying their self assessment online and claiming child benefit online (along with the other 500-odd transactions), then argue about the validity of the target. Until you can claim success (and there are one or two local authorities that can claim they are pretty much there), get on with the work. And, to the thinkers who are arguing about the target, how about you ponder why these services are not online now, why the ones that are going online aren't being used as much as they might be, and what needs to be done about it. Like I said, broken things need to get fixed. Arguing about how broken it is doesn't get us anywhere.

Searching hard

To prep for a conference the other day, I spent some time on Google looking for what I thought might be words that people search for to do with government. I picked things like "disability living allowance", "child benefit", "child tax credit" and so on. I also added, just for fun, "public sector IT failures". And I restricted the search to just .gov.uk domains. The flaw in doing this, of course, is that I am relying on people who search to be "government savvy", i.e. to know that there is such a thing as "disability living allowance", but bear with me on that. The number of items found for each was in the thousands - more than 10,000 in several cases. In isolation, that may not be bad - it might mean that many sites have tried to increase their value to the people who visit by pointing to information that might be elsewhere. But, it turns out that the .gov.uk sites have tried to provide even more value by re-explaining what, say, DLA is in their own words. That's a dangerous thing for a few reasons: (1) 10,000 references means 10,000 changes might have to be made if the rules change, (2) the odds of getting it wrong and misleading someone are high, and (3) it's hard to find the authoritative source. I understand why a web manager or a business owner of a website would do this. Maybe the source site is inadequate, maybe the site is trying to be a one-stop shop. But the end result is confusion, duplication, increased costs and more disillusionment.
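For anyone wanting to repeat the exercise, a domain-restricted search like this is just a query URL. A minimal sketch, assuming Google's public search URL conventions (the `q` parameter and the `site:` operator) rather than any official API:

```python
# Illustrative sketch: build a Google search URL restricted to .gov.uk,
# quoting the exact phrase as described in the post.
from urllib.parse import urlencode

def gov_uk_search_url(phrase):
    """Return a Google query URL for an exact phrase, limited to .gov.uk sites."""
    params = {"q": '"%s" site:gov.uk' % phrase}
    return "https://www.google.com/search?" + urlencode(params)

url = gov_uk_search_url("disability living allowance")
```

Run the same helper over "child benefit", "child tax credit" and the rest, and you have the experiment repeated in a minute or two.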

WMD

Post removed.