Sunday, September 29, 2002

I'm a regular reader of newsisfree - a great source for random news (which is how I use it) that will give you some new ideas. It collates news from thousands of sources in several languages. I noticed from John Gotze's site that he's persuaded the guy who runs newsisfree to do an e-government page too. Most of the sources you'd expect are covered, including Kable and, of course, John's own site.
I noticed this week that Simon Moores has changed his web site and started a daily blog - it says that he's been working from home 3 days a week for the last couple of months and he's also managed to find the time to update this (and other) blogs daily - something I've never even got close to. It's pretty good and certainly worth spending time catching up with and reading frequently. It even uses the same engine as I use (Blogger). Simon occasionally covers e-government issues and acts as a roving ambassador for the UK government on technology issues, visiting the Middle East in particular.
I've been asked to write an article for an IT journal, to be published next week. It's a short piece of around 600 words - well, I say it's short, but it turns out that it's pretty hard. The stuff I write here is mostly stream of consciousness. Journals need a different style of writing, I think, so I've been reading other columnists and wondering what to theme the piece on. If I get it right then I hope I'll be able to write a few more. There are a lot of topics to cover in the e-government domain and it would be great to get the story out more widely so that things start to happen faster.
The Inland Revenue made the news again this week, for the wrong reasons. The online Self Assessment service struggled with some big volumes, causing slow response times. The IR responded by asking that users not use the site between 7pm and 11pm. I've heard that volume is perhaps 10x the level seen at the same time last year; indeed, the IR have said that volume to date is already ahead of the volume received during the whole of last year. That said, while wandering the 'net looking for personal health insurance I found that the BUPA site was down - so it's not just government sites that are having a tough time. The ironic thing is, I guess, that no "user" puts a requirement into a system specification that says "when volume reaches 2% of total potential, please crash/slow down". Government departments certainly don't do that. There is a lot to learn with this technology.

Wednesday, September 25, 2002

The Iraq Dossier went live this week (at 8am on Tuesday) ... demand was strong and lots of sites around government felt the pressure. UKonline (which we'd beefed up the night before, just in case) held up well and delivered its share of the dossier without a blip. The BBC noted that some sites had struggled to deliver their quota.
I picked up a link from Jiri's web site to JoelonSoftware - an article noting that NASA is cancelling a project to build a new launch control system that has already been running for 5 years and has cost $273 million. I guess NASA is, strictly speaking, public sector - but not that long ago they were held up as the benchmark for "good" software development (SEI level 5 and all that - I rate it as equivalent to ISO 9001, or the concrete life jacket standard, i.e. it doesn't have to work as long as you build it consistently over and over and over again). Joel wonders why we are still building [multi] billion dollar monolithic systems ... and why they are failing. I've been wondering about this kind of thing myself for a while - across every public sector organisation in the world you will find piles of billion dollar/pound/euro projects building mammoth new IT systems to deliver increasingly finer-grained taxation systems, benefit delivery mechanisms and so on. These systems have web front ends that contain rules engines and are often cloned by third parties (like Sage, Oracle, Peoplesoft) who need to deliver similar functionality to allow companies to prepare their accounts. So, what we really have is a front end (with some rules) ... a back end (with the same rules) and a general ledger (that makes/receives payments and keeps the books). What I don't follow is why we have n front ends (where n is very large), m back ends (where m is probably 1/3 n) and m general ledgers ... what we need is m, 1, 1. Or, at least, one general ledger and one generic rules system that exists in "space" (i.e. out there, not in here) and that can be called upon by third parties and government systems alike. This needs some rethinking about where the data is, how it's stored and who looks after it (dare I say that the citizen could be trusted to look after their own data and make sure it's up to date?). I wonder if anyone is doing this? Anyone know?
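The "one generic rules system" idea can be sketched in a few lines. This is a purely hypothetical toy - the function name and the tax bands are invented - meant only to show the shape of a single shared rules implementation that a government front end and a third-party package (a Sage-style product, say) could both call, instead of each cloning the logic:

```python
# Hypothetical shared rules engine: ONE implementation of a tax rule,
# callable by government systems and third parties alike.
# The bands and rates below are invented for illustration.
def income_tax(taxable: float) -> float:
    """Toy progressive tax calculation - the single shared rule."""
    bands = [
        (0, 5_000, 0.0),                  # personal allowance
        (5_000, 30_000, 0.22),            # basic rate
        (30_000, float("inf"), 0.40),     # higher rate
    ]
    due = 0.0
    for lower, upper, rate in bands:
        if taxable > lower:
            due += (min(taxable, upper) - lower) * rate
    return due

# A government web front end and a vendor package call the same function,
# so their answers can never drift apart.
tax_due = income_tax(40_000)   # roughly £9,500 on these toy bands
```

The point is not the calculation itself but that the rule lives in exactly one place, "out there", rather than being re-implemented in n front ends and m back ends.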
Today I found, by accident, a link to this blog from Scott Loftesness' blog, which in turn took a link from Jiri Ludvik's site ... both of these are well worth spending some time on. They're focused on technology and security issues generally and, particularly, on identity and authentication. Jiri starts his post by noting that he thought the Government Gateway was a country-specific version of MS Passport. That's not a bad first shot and I wouldn't argue with the principle - Passport is about single signon to multiple web sites and the Gateway is about single signon to multiple government online services. But the Gateway isn't in any way related to Passport (and although it uses MS products, the two technologies are far apart). The first job of the Gateway is to provide a way of linking "you" as a person to "you" as something that Government can relate to, using any of the myriad numbers that the average citizen will use in their occasional contacts with Government. But secondly (and, for me, more importantly) it provides a route into Government for transactions (in our case, these are sent as XML documents that can be digitally signed if a certificate is available). That "route" is a common route for any transaction coming from any source - so a commercial web site (say a bank that wants to provide tax services for its customers) can send transactions to government and, if the same bank wanted to move into benefit-related services, it could use the same interface; likewise, applications (such as Sage, which handles small business accountancy) can also send transactions - so a small business need only press "send tax returns" and, provided there is an Internet connection, the return will wing its way to the Inland Revenue to be authenticated, submitted and acknowledged.
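The "common route in" can be sketched roughly as follows. This is a hypothetical illustration - the element names and the `build_envelope` helper are mine, not the real Gateway schema - showing the idea that any client, from a bank's web site to a desktop accounts package, wraps its department-specific payload in one common routing envelope:

```python
# Hypothetical sketch of a common transaction envelope. Any source (web
# site, accounts package) supplies its own payload; the routing header
# stays the same. Element names are illustrative, not a real schema.
import xml.etree.ElementTree as ET

def build_envelope(service: str, sender_id: str, payload: ET.Element) -> ET.Element:
    """Wrap a department-specific payload in a common routing envelope."""
    env = ET.Element("GatewayEnvelope")
    header = ET.SubElement(env, "Header")
    ET.SubElement(header, "Service").text = service      # which service this is for
    ET.SubElement(header, "SenderID").text = sender_id   # the authenticated sender
    body = ET.SubElement(env, "Body")
    body.append(payload)                                 # department-specific XML
    return env

# A third-party package need only change the payload, never the route in:
payload = ET.Element("SelfAssessmentReturn")
ET.SubElement(payload, "TaxYear").text = "2001-02"
envelope = build_envelope("SA-Annual", "user-123", payload)
xml_text = ET.tostring(envelope, encoding="unicode")
```

A digital signature, where a certificate is available, would be applied over the serialised document before sending - that step is omitted here.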
I don't know of any other Government in the world that has independently created a system like this - one that supports both internal government web sites and 3rd parties so that the process of submitting authenticated transactions to Government is simple (well, relatively simple - nothing is easy when you're composing XML schemas, dealing with digital certificates and trying to join up Government departments). Pleasingly, Jiri goes on to ask whether we can succeed where Microsoft (with its original .net vision and Hailstorm services) did not. Here he's talking about our plans to offer notification services (via mobile phones, email and so on), appointment booking and the like. "Tune back in 2004," he says. That should give us enough time ... it's not going to be easy, but it is worth it!

Saturday, September 21, 2002

VNUnet publishes an article by Mike Cross (he of ex-Kable editorship) on 'why public sector IT projects go wrong'. He notes the OECD's warning from last year: "Unless governments learn to manage the risks connected with large public IT projects, these e-dreams will turn into global nightmares." I don't think that's new. The OGC's gate review process and the earlier SPRITE initiative in the UK, more than 2 years ago, noted what needed to be done and the word is that the right things are being done now. Right after the Government Gateway project I listed the key lessons that needed to be learnt if projects were going to work - and used the same slide in a lot of presentations hoping to get the message across. A couple of the lessons are noted below; I want to revisit this and will post some updates here:
- The decision process needs to be rapid, focused and flexible. Decisions made one day can be changed the next - there is not usually time to wait for the next 'project board'. No project can be successful unless the team can respond to changing circumstances.
- Fast decisions need rigorous communication processes. Changing the plan regularly or even occasionally needs to be coupled with a process that makes sure everyone knows what has moved, why and what the next checkpoint is.
Jim Haslam, the president of SOCITM, said this weekend that the 100% by 2005 targets will not ensure that interesting services - ones that people will actually use - get developed. More than £671 million is at risk if the strategy does not get more ambitious: a "compliance culture" is in place, not one that concentrates on service delivery that gives value to the public.
An audit of US government web sites by Darrell West (from Brown University) ranks the sites at both state and federal level and makes some fascinating recommendations. My comments follow each recommendation in >>...<<:
- Employ consistent design and navigational principles so that users of e-government services may move among different agencies and offices without confronting radically different user interfaces, search techniques and other impediments. >>We embarked on this programme in the UK several months ago, developing a set of consistent standards that will gradually be applied to government web sites. The hard part is that many of the sites are 'static' today, i.e. don't use content management tools, making it very hard to apply changes across the whole site. So, coupled with the look and feel standards will be a single platform and toolset that will support the changes, for those departments that don't want to go through the pain themselves.<<
- Integrate state agency websites into their state portal or gateway web site. This enables citizens to locate desired services easily by surfing either the portal page or the agency web site. >>Fewer web sites can only be a good thing. I've heard that there are 30,000 in the US government alone ... there are 1,800 in the UK (which is still about 1,700 too many).<<
- Minimize use of areas that require premium fees. Placing additional charges on governmental services deters free and open access to electronic governance. >>I can't imagine any government web site that would charge for access - only for specific things (such as downloading documents that would normally be charged for).<<
- Increase access to interactive technologies. The public sector has yet to implement successfully two-way communications devices, web site personalization, and credit card payments on the majority of their pages. >>Can't argue with this. It's a big step and must be coupled with the initial recommendation and an authentication gateway that provides a range of web services supporting such technologies - otherwise everyone goes out and builds the same thing over and over again.<<
- Enable foreign language translation through translated pages or software translators. >>This needs to be done ... but it's a big job unless all the tools are in place and the staff are available to do it.<<
- Provide a clear and consistent privacy/security policy. The state of Connecticut, for example, has linked every agency in its borders to a common portal page outlining the state's policy in these areas. >>Yes.<<
Romania has pushed ahead with online procurement, perhaps getting ahead of many other governments, although at a cost of 40 million euro (which seems like a lot of money, but they have already saved 10% of that price through cost reductions). The full system will be procured next - although a 40 million euro pilot seems astonishing if it's going to be thrown away and rebuilt. An open tender is out already.
Kablenet has posted a survey on whether the 100% online by 2005 target should be kept. I tried the survey from my usual PC, which is firewall protected and has strong control over cookies, but I couldn't access the site (hosted by Survey Monkey), so I had to turn off the controls to use it. It's a bit disappointing - there is one question: should we keep the target, yes/no. If no, you get to say which factors should determine what goes online first (ease of use, customer benefit, ease of deployment, cost saved to government etc.). My stance on this (and I've filled in the survey) is that of course the target should stay (after all, Kennedy didn't say 'get a man on or about the moon, sometime in the next 20 or 30 years, and bring him back alive if you can'). What we need to do to get those services online, though, is the prioritisation that the remaining question allows - delivering the key services sooner so that they encourage later take-up of the remaining services.
An earlier article on the US' e-authentication gateway notes that the pilot is funded to the tune of $2 million and will launch on September 30th, but that there is no funding (yet) for the full roll-out. This one notes that agencies may be asked to fund it themselves - which would require "re-tooling" their commitments to date. Couple that with this article on an extra bid for $45 million for the e-government programme - a "critical amount of money" for achieving it. The article goes on to note that the Federal IT budget is $52 billion, against which the 24 "quicksilver" e-government projects sit. The $45 million is for reducing duplication and ensuring that legacy systems are retired. Mark Forman, the US e-envoy, says: "Clearly there are a number of areas where we are overinvested. We saw this with E-Training. When we launched, we were able to buy the technology once and scale it out for everyone to use." Now there's a view I can identify with - build once, replicate many. But I don't follow how $45 million in the middle gets you benefits against $52 billion on the programme. I thought that the US had put control over the e-government projects in Mark's hands - an excellent move, dual key equivalent but stronger. I'll find out. One of the key learning points for any country embarking on an e-government project is that funding control is vital - if you leave it in the "stovepipes" you won't get joined up projects, you will get duplication and you won't see the benefits that you need.
The US are moving ahead with their e-authentication/gateway-type project. Somewhat bizarrely, the folks there seem still to be at the stage of analysing what security levels different transactions will need. Pretty much everyone has to do this and the conclusion is always the same: no security (e.g. payments - after all, government will take money from whoever sends it); a bit of security (user ID/passwords, for transactions where there is limited or controllable risk); quite a lot of security (where you need to be sure the person you are dealing with absolutely is that person - and here the only answer today is digital certificates issued to particular guidelines); and, finally, perfect security (passports, driving licences and so on, as these are the documents that get you pretty much anything else in life, like bank accounts). I'm a bit startled that this work is still going on - policy docs have been written on this the world over; our own UK version, called "t-scheme", has been around for a couple of years now. More importantly perhaps, the article says that they're ready to launch a prototype gateway this month ... I've looked around for it but it's nowhere to be seen yet. Maybe later in the month. There is a quote in the article saying the US might be looking for an industry partnership - vendors build it and charge a transaction fee to government (or, more likely, to the agency that is using it - although that model usually falls down with joined up transactions). But clearly this means that a version has been built, which is only for test ... and another one will be built sometime over the next year, ready for launch in September 2003. The other important quote is "The launch of the prototype gateway coincides with GSA’s announcement that the Agriculture Department’s National Finance Center, the Defense Department, NASA and the Treasury Department have signed up to use the Federal Bridge Certification Authority.
The bridge lets agencies accept other agencies’ digital certificates using a public-key infrastructure to verify users’ identities online". We've been there. Certificates are on life support in the UK. I will watch with great anticipation how that works out. There is a lot to learn on using certificates for online transactions - the issuance process needs to be simple, the technology standards need to be clean and clear, the certificates need to be portable across platforms and so on. I wish them luck.
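The four-tier conclusion described above could be modelled something like this. A sketch only - the tier names, transaction names and mapping are invented for illustration, not any official scheme:

```python
# Sketch of the recurring "security levels" analysis: rank credentials
# into ordered tiers and demand a minimum tier per transaction type.
# All names below are illustrative, not any real government scheme.
from enum import IntEnum

class AuthLevel(IntEnum):
    NONE = 0          # e.g. sending a payment - government accepts money from anyone
    PASSWORD = 1      # user ID/password - limited or controllable risk
    CERTIFICATE = 2   # digital certificate - must be sure who the person is
    FOUNDATION = 3    # passports, driving licences - documents that unlock the rest

# Minimum tier required per (hypothetical) transaction:
REQUIRED = {
    "pay-tax-bill": AuthLevel.NONE,
    "file-vat-return": AuthLevel.PASSWORD,
    "view-tax-record": AuthLevel.CERTIFICATE,
    "renew-passport": AuthLevel.FOUNDATION,
}

def allowed(transaction: str, presented: AuthLevel) -> bool:
    """A credential is acceptable if it meets or exceeds the required tier."""
    return presented >= REQUIRED[transaction]
```

Because the tiers are ordered, a stronger credential always satisfies a weaker requirement - which is why every country that does this analysis ends up with much the same ladder.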

Saturday, September 14, 2002

I visited "Game On" at the Barbican this weekend and spent a good few hours reminiscing over my mis-spent youth. It's a well put together exhibition that, sadly, will be closing in a day or so - so if you haven't seen it already, you've all but missed your chance. 26 years of video game history are there - starting with Computer Space, tracking through arcade games like Centipede, Pac-Man and Defender, on through home computers (there's even a ZX81 there, though it's not running - and no ZX Spectrum that I saw) and then to consoles (including a 4 player, multi-screen version of Halo). Why am I rattling on about video games here? Well, I guess it's a recognition of where we are in the world of Internet technology and e-government maturity. Those early games are still playable, still have great moments and fabulous memories - but they're nothing compared to what we have now: online, multi-player fantasmafests. e-government appears to still be at the Galaxians stage - relatively simple, linear transactions that lack much intimacy, variety or ah-ha! moments. That is changing. But we can't yet say that we're "there" or that we even know what "there" will look like. Could Eugene Jarvis have predicted the evolution of games when he was busy writing Defender, Robotron or Stargate? No. Not with the best crystal ball available could anyone have seen how it would develop. So it will be with e-government. The best in class will progress beyond the Galaxians stage, although it will take a few iterations, into the world of online gaming. But if it took video games 26 years, how long will it take for e-government?

Tuesday, September 10, 2002

I'm increasingly impressed by the rhetoric coming out of Mark Forman's office at the OMB ... see this latest piece updating us on the next stages in the USA's e-government plans. Mark has thought this all through, built a plan and is now executing, step by step. I think there is a significant chance that the US will race ahead of the rest of the world in deed as well as word during the next year. And it will be through doing it right - corralling spending so that departments cannot go awol, delivering real value when you say you will, and focusing on re-engineering so that citizen benefit is clear and demonstrable with each piece of the new systems architecture.

Tuesday, September 03, 2002

I found this in a Mole article on VNUnet from last week: "...these include the formation of a new taskforce with a title so absurdly self-satirising that Mole may be out of a job before the e-Pinder. Please give a warm, bed-wetting welcome to the e-Envoy's e-Delivery Team." It's there along with a point about "e" being the same as "non" ... so the office of the "non-envoy", the "non-delivery team" and so on. Not sure what we've done to offend Mole, apart from talking about a strategy that he doesn't like. Ho hum - delivery is measured in items done per the customer requirement, by the date promised, at the budget agreed. I think we're happy to be measured by those criteria.
I promised some thoughts on the "soup to nuts" piece on VNUnet. The Mole had a few key points:
- The plan we have for e-government is vast, wide-ranging and complex
- There is now a clear sense of direction and common purpose
- But we may be inventing similarities where they don't exist with our ideas on "common engines"
- The timescale is ridiculously ambitious
- It would be better to start from scratch, change processes and apply technology to the new processes
These are all good points and clearly have value. As you work through any planning process, you need to look at the issues from various angles; this is one angle that we looked at over and over again before proceeding. When the Government Gateway was first being put together - when it was only an idea on the back of a 4 page business case - the idea of building a "single anything" for government was difficult to contemplate. It hadn't been done before. Yet now, the Gateway is used by the Inland Revenue, Customs and Excise, the Dept of Trade and Industry and the Dept of Envt, and will soon add the Dept of Work and Pensions, some of the Devolved Administrations and others. At the same time, the functions it offers have grown from transaction routing and authentication (using digital certificates or user IDs/passwords) to secure two-way mail, payments and, soon, interfaces to the mobile networks for text messaging. There has been remarkably little customisation for any single department, although we have made changes at various stages to improve the way the Gateway works or how it does particular tasks. We've now extended the Gateway's "build once, use many" model to the front end - to web sites. Government has hundreds of web sites - I've been known to call them "matchstick Eiffel Towers" in the past - and they all look different today. Nearly all of them are static HTML sites, but the next upgrade cycle, to content managed web sites, is happening now - piece by piece.
If we end up with hundreds of web sites again, all different, then we have wasted money, time and effort and made the user experience just as hard as it is today - maybe even harder than the offline experience. So it makes sense to apply standards, common modules and so on to the front end too. Our work so far tells us that around 80% of what any given government web site offers matches a common core exactly - the remainder depends on the department's function (so some departments might need particular types of feedback forms, or particular ways of presenting policy papers and so on). But if 80% is common and would cost, say, £500,000 or £5,000,000 or more for each department, then building it centrally makes far more sense and can be done effectively and efficiently. So, the next step is to extend that work into other aspects and, as Mole picks up on, functions such as booking appointments or notifying changes to appointments are on the next list. I am not as pessimistic as Mole. The trick is to isolate the pieces that can be built centrally - building them centrally doesn't mean that you only ever have one of them; you can replicate the code widely if needed and update it in releases (after all, doesn't Linux work this way?) - a practice I have seen in many private sector companies. The alternative to build once and replicate is build many and don't replicate. I've watched banks do that in previous lives. The ones that did spent half a billion pounds fixing it for Year 2000. The ones that didn't spent a whole lot less, deliver better service to their customers (across many different countries) and make regular improvements that benefit customers and lower their cost base further. I think there's a good case for this plan - if I didn't, I wouldn't be doing it, of course. But it's not "soup to nuts", nor "carpaccio to crème brûlée".
It's more like a rare steak, cooked close to perfection by a capable chef at a reasonable price - that's where the beef is. And what else can you expect from eDt from here on? In a word, "more". More delivery, more capability - just more.
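The build-once argument above is, at heart, simple arithmetic. A back-of-envelope sketch, using illustrative figures drawn from the ranges mentioned (the count of 20 departments is my assumption, not a real number):

```python
# Back-of-envelope version of the "build once, use many" argument:
# if ~80% of each departmental site is a common core, building that core
# once and replicating it beats every department rebuilding it.
# All figures below are illustrative.
def duplicated_cost(departments: int, cost_per_site: float, common_share: float) -> float:
    """Spend on the common core when every department rebuilds it itself."""
    return departments * cost_per_site * common_share

def build_once_saving(departments: int, cost_per_site: float, common_share: float) -> float:
    """Saving if the common core is built centrally once and replicated."""
    core_built_once = cost_per_site * common_share
    return duplicated_cost(departments, cost_per_site, common_share) - core_built_once

# 20 departments, £500,000 per site, 80% common core:
saving = build_once_saving(20, 500_000, 0.8)   # roughly £7.6 million saved
```

Even at the low end of the cost range the duplicated spend dwarfs the one-off central build - and at £5,000,000 per site the gap is ten times larger again.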

Sunday, September 01, 2002

Finally for today, the PRO 1901 Census web site made it back online - some 8 months after it first went live. Someone at the hosting company told me this week that he had data that showed 40 million people had tried to access it on the first day (back in January). That's an amazing number - for any web site, let alone a government one! This has been a complex project with a lot of issues to resolve and, even now, there are restrictions on how many people can access the web site so that the folks at PRO can see how it performs and do some more tuning. Getting this online has been an impressive feat. If 40 million people try again and it works, then all the work will have been worthwhile.
The PAC report on e-government was widely reported this week, notably by the BBC. Edward Leigh, the chairman of this particular PAC, lauded the Inland Revenue for their progress but was clear that there needs to be faster movement towards an online society. I can't argue with that. He also mentions the recent security issues ... which I have covered before. But now that I am back online [with broadband] I found a story from Bill Thompson that I'd previously overlooked, and it puts the issue in better context than any other article I've read. If you're in the mood for big PDF files, then the UN report on e-government is worth a read too. It ranks the UK 7th in the world in implementation, behind a couple of countries that I would argue are probably not as advanced as they claim - but, nonetheless, not a bad showing. Our ukonline portal comes in for particular praise.
The other thing that's been on my mind for the last few days is the "soup to nuts" VNUnet piece. The author of said piece appears to be Julian Patterson, who also writes the "Mole" column on VNUnet (VNU, funnily enough, is down now ... I don't know whether it's me or something else, but VNU always seems to be down when I'm writing these notes, so I can't point to Mole). Mole's main point was that, although we've moved from "no plan" to "some plan", this plan is not enough and we are doomed to failure. I don't remember the exact quote (and I can't point to it right now) but it was along the lines of "marvel at the complexity ... is it credible ... no ...". Mole tells us that the idea of building central blocks of infrastructure is doomed to failure in a world as complicated as government, where nothing is the same and, moreover, even if we could do it, the timeframe would be more like 10-15 years rather than 3-4. I've turned this over several times, over and over again. It doesn't pay to ignore feedback - and good feedback is hard to find. More on this when I post again later in the week.
A couple of weeks ago I sat through a supplier presentation on web services. The supplier was convinced that it could act as a "gatekeeper" between users and their desires - reviewing, verifying and approving "web services" and then presenting them to the user for consumption. The company's brand is strong, although perhaps somewhat tainted by previous experience the customer might have had with it. But still, it didn't seem to me that this was a model that was going to work. There's been a lot of hype over web services for a long time. I'm proud that more than 18 months ago I was part of the team that brought the Government Gateway online - the first web service to become publicly available (and one that opened up government to the private sector, supported digital certificates and achieved several other firsts). I thought about writing up my thoughts on web services, UDDI, SOAP and all those other good words, but then I found that someone else had already done a better job than I could hope to. So, a visit to John Gotze's web site is worth your while, whatever your knowledge of web services - you want the posts from August 21st and before. John's given me some ideas about what we should look at next in the UK and I'm looking at them now. Don't you just love how much pan-government sharing of ideas goes on with e-government - it's unlike any other discipline in the public sector.
Continuing this theme that e-government can't be just for the techno-literate, it's clear to me that somewhere along the way we have forgotten what the Internet can do for us. Our best opportunity for improved communication and facilitation is right here, on the Internet. But government has not yet mastered the art of simplification - it's put itself online with the same approach it has taken in the offline world: complicated words, numeric forms, segregation between departments, limited customer focus and so on. In the past, I've referred to government's online presence as "matchstick Eiffel Towers" - beautiful designs that have doubtless taken lots of time but that are ultimately not too useful. Government is taking the next step now ... bringing content together, aggregating it around the needs of an individual person. This is going to take some time, but it's going to happen. And when it does, the impact on how any given person deals with government will be huge - available benefits and tax credits will be found more easily, regulations that apply to companies will be clear and so on. Imagine a pair of drop-down menus on the screen: "I am a ... teacher ... doctor ... school child ... nurse ... 2 person company ... UK-based company" and "I want to be ... a headmaster ... registrar ... university student ... 10 person company ... European import/exporter". Where you are ... and where you want to be ... and then all the appropriate government content displayed for you. The first step will only be links to what you need to know, but over the next 3 1/2 years, services will be added so that you can click straight to what you need to do to move from "I am" to "I want to be". I guess "I am poor" to "I want to be rich" will take a bit of time to arrive, but anyone who wanted "I am rich ... I want to be poor" could have been given advice to invest in the stock market over the last two years. I spent some time today shopping on Amazon.
Every time I visit, I find it an easy, painless experience. I have one-click ordering turned on, so I click, select an address and the item is packaged up. Anything I click in the 90 minutes after the first order is added to the shipment. Amazon's web site is the way government should operate ... it's a set of silos, fragmented organisations, complex shipping arrangements, electronic, paper and physical delivery items. Yet they're all accessible from a single interface - different tabs take you between the silos, but you can search across them all, select items from them all and have those items delivered as a single package. Amazon lets you know what you've ordered, how much it costs and when it's going to arrive, and keeps you posted on status. For the hard to find things, it has a special service - count that as a government person answering the 'phone or responding to email on a particular topic where the help is not detailed enough or doesn't cover the particular situation. This is a good model for government to aim for.
Yesterday I promised some thoughts on why the next generation of technology is important for e-government. The latest figures I've seen show that some 29 million people in the UK have surfed the Internet. That's around 1/2 the population, way up on the 1/50th in 1997 (just 5 years ago!) and not that far behind the USA (which is somewhere around 60%). But growth is slowing. Digital TV is in perhaps 7 million homes, around 1/3 of the total homes in the country. Mobile phones are owned by as much as 80% of people in the UK. e-government will need to be delivered across all of these channels, interchangeably, as well as via kiosks, telephone call centres and contact centres and, perhaps most importantly, intermediaries such as Citizens Advice Bureaux and so on. The key word in that preceding sentence is interchangeably. For the next few years, no-one is going to be able to do all of their business purely via the Internet - and certainly not all of their business with government. Sure, the services will be there ... but people take time to make changes, inertia is hard to overcome and real two way contact will take longer than even some of the pessimistic forecasts think (though it will arrive faster than those who say "never" expect, of course). There is little available today that lets that true channel hopping for government take place - authentication on a PC is via user ID and password, on a telephone through a few questions, snailmail relies on a signature and so on. The authentication technology for some of these will doubtless merge - it's easy to see the banking model of random characters selected from a password being repeated over the 'phone or tapped in on either a 'phone keypad, a proper keyboard or even a TV remote control. What this tells us is that the next set of technologies, and certainly the one after that, will be vital in making some of this happen.
What's worrying me right now is that a new technology arrives and forces us to learn a new way of interacting and, much, much worse, forces us to lose a lot of what made the last device inherently personal. So my 6210 mobile 'phone with its one-touch dialling, list of 500 contacts, personalised ring tones, graphics, preferences etc is gone when I move to a new 'phone - in this case a 7650. Of course, the new 'phone comes with new features and things to learn too - including digital certificates, e-mail and so on. So there are two things for a user to figure out: (1) how to make the new gadget personal again and (2) how to make use of the new features and make them personal too. I'm focused on mobile 'phones now, but this could just as easily be set top boxes - right now, some of them have 28.8kb/s modems in them, which is hardly enough to do online anything, let alone online government; they're going to need broadband connections, full browser capability, javascript etc. There'll be a time when the set top box is personalised too - favourite channels, local hard disk content etc - and, when they're networked, a copy ought to be stored elsewhere ... so that an upgrade can take place simply, with preferences carried forward. In a few short months, mobile 'phones will be a part of e-government; potentially a big part. Over that same time period, many of the 80% with mobiles today will be looking to upgrade them to new models with more features: cameras, multi-media messaging, animations and, maybe, digital certificates that support authenticated transactions with government. The transition to new models needs to be smooth and easy for the consumer ... we can't keep throwing away every previous technology we've made our own for the sake of a few new features. Microsoft learnt this ... if you had to re-install a new version of Windows from scratch every time, you wouldn't upgrade too often; but try changing laptops or desktops ...
that's not a lot of fun (and it dissuades me from changing too often). We can't afford for e-government to be for the techno-literate only. It must be for everyone, because the ones who need it most are the ones who won't know how to twiddle bits, bytes and widgets.
What I'm not too happy about today is the amount of time I've spent getting my whizzy new Nokia 7650 phone to work. No bonus points for you guys at Nokia - in fact, no points at all. How any code released in the last month can tell me "this program has not been tested on this operating system and results may be unpredictable" when I'm running XP is beyond me. It took me about 2 hours straight to get the contacts off my old 6210 and onto the new one ... 542 contacts altogether. And that doesn't include the time I spent tracking down an infrared interface for my desktop (I couldn't get the install disc to work at all on my laptop). And now that it's all there, I find the new phone is about 3-4x slower with all the contacts loaded than the 6210 was. How can that be? I'll think some more about this and then tell you why it's important for Nokia, for me and for e-government that this next generation of mobile phones works slickly. I'll post that stuff tomorrow.