Thursday, October 31, 2002
Got rid of the horrible green and found the blue that I used on the main diverdiver site. At last. Of course, the fact that I like it doesn't mean that anyone else will. But it's way better than the green.
at Thursday, October 31, 2002 Posted by Alan
Wednesday, October 30, 2002
I met with Greg Dark from the Australian Tax Office today. Australia and the UK have followed broadly similar e-government agendas over the last couple of years, with each country alternately moving a little ahead of the other. Today, I think the ATO are a bit ahead again. They have a new tax portal that looks pretty good - it offers a lot of interactive services (which you won't be able to see unless you are living in Australia and have a reference number for your business - an ABN). But, most interestingly, they are moving ahead with certs ... not government-issued certs, which is where they've been to date. The big 4 banks are going to issue token-based (i.e. smart card) certificates, along with the readers needed. I am sure that there are going to be hardware problems and operating system configuration issues with this, but it's great to see someone else taking a punt on it. And, because it's coming from the banks, there will be commercial support and useful things to do with the certificate that are not just government-based. More than a year ago, maybe as long as two years ago, the banks here were talking about something similar, using the same vehicle (Identrus) but with software certificates. That just doesn't seem to have got anywhere. Will it ever?
John Gotze's been getting excited about open source opportunities in recent posts, driven by the usual suspects promoting laws to prevent it being given preference over proprietary solutions and a couple of conferences here and there (the latter 'there' being a Danish 'there'). I've been reading all the stuff for a while now, partly driven by pointed comments from John Lettice, partly by our own open source policy and partly by excitement such as John's. I guess I'm struggling a little over some points and, in trying to get clarity on them, all I get is the usual positional arguments. This (the e-government agenda) is not a religion for me - but it is a passion. So, a few points, questions or issues:

Let's say I get some software that's open source - maybe JBoss (an app server that competes with WebLogic amongst other things). Being government, there are bound to be some things we'll want it to do that it doesn't do today - perhaps give it better clustering support, enhanced performance, stronger security features or more advanced administration tools (all problems with the present version from what I can see). It may not be in my best interests, as government, to put the code that I've modified back into the public domain, especially not the security features. If I do, then people know (far, far better than they know today) what we're doing and can look for ways to exploit it. If I don't, then next time there's an upgrade (based on the work of all the people who do put their work back), I've got to do lots of integration testing, regression testing and so on. So ... do I put the enhancements in the public domain or not?

Let's then say that using the software I create a product - like a DIS box that connects departments (and local authorities etc.) to the Gateway. The software that I develop will need to be installed across dozens or even hundreds of departments. Now, I don't do that ...
commercial organisations do that and they handle the integration and whatnot too. But how do they do that if I've built the open source version of a DIS? Do I just give it to them, or can I sell it to them to recoup the costs that I incurred in putting the thing together in the first place? What about if it's not me that puts the DIS together, but a commercial organisation ... how do they recover their costs? They can't just sell the hardware ... and if they sell a support agreement, then isn't it going to cost about the same as the software licence in the first place (on the basis that it must recover costs)?

Something else that is puzzling me is all the talk about open source and not much sight of it actually happening. I hear a lot about people not wanting to go public because they worry that it will send a signal to someone or other and that it might be misinterpreted. This strikes me as crap, but even so there are no major stories every day on new adoptions of open source. Or are there, and I'm just missing them? I mean, the German government announced not long ago that they were going to pretty much mandate it; IBM is putting at least a billion dollars into open source developments ... but what's actually being done? And I mean at scale, on a commercial, fully performant basis. I know that this site runs on Linux - and that's a part of open source, but I don't think it's the big part. For me it's the packages and the integration of systems that are going to be important - how do you take JBoss, an open source content system and some open source caching software and piece them all together to deliver a fully functional portal with no commercial software in it? When it's built, how do you keep it current, add functions and capability, block security holes and deliver scheduled releases with fully tested feature sets? Is it just too early in the programme to expect this?
I don't want to be flamed here - I want to know how to get round (or over) what appear to be the early obstacles in the roll-out and scale deployment of open source software.
Saturday, October 26, 2002
At the beginning of the year I read a piece on Acts of Volition about Cascading Style Sheets, XHTML and other stuff. At the time, I wasn't sure that I fully understood it all, but shortly after we kicked off work on a further refresh of ukonline. Right up front, it was clear that this was the way to do it. We'll be relaunching before the end of the first calendar quarter next year and, although there won't be dramatically visible change for the citizen, the work that's gone on under the hood is significant. Load times will be faster, support for screen readers will be improved, and a greater range of browsers should be supported more easily (let's not talk about digital certificates here). So, having made this move, I was gratified to read that Wired was doing the same. We're doing extra work with uko so that other departments will be able to drop their content into our existing platform and then manage their site directly, without needing to do endless design work, without needing to implement a content management system and without needing to buy hosting environments. Lessons learned and all that. Then my weekend got even better when John Leyden at The Register published a great piece showing how e-commerce sites (banks in this case) struggle to cope with the various browsers that are available today. Browser variety is no fun - a webpage is a webpage to me and having to cope with the foibles that each company's software displays is frustrating. Now there's a good problem to crack.
Been having some fun and games today. My limited knowledge of all things technical meant that I killed the template that I use for this Blog. I couldn't find a way to get it back, so I thought I'd use the loss as an excuse to do a new one - Blogger (it turns out) has quite a big choice. I'm not much of a fan of this green and as soon as I can figure out the hex code of something more palatable, I'll change that. But I've got the titles working, added some links to people I read a lot (in no special order) and got the archives on the front page too - I'd prefer one of those little calendars that you just click the date on but can't find out how to do that. Some other time. As a complete non-techie the process was a bit of a challenge - make a change, publish, check it's ok, (it's not), do it again ... repeat until true. No wonder content management systems exist.
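While I'm at it: a hex colour code turns out to be nothing more exotic than the red, green and blue values (0 to 255 each) written in base 16. A tiny Python sketch, purely for illustration - the function name is my own:

```python
def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Convert 0-255 red/green/blue values to a CSS-style hex code."""
    for value in (r, g, b):
        if not 0 <= value <= 255:
            raise ValueError("each component must be between 0 and 255")
    # {:02x} formats each component as two lowercase hex digits
    return "#{:02x}{:02x}{:02x}".format(r, g, b)

# A palatable mid-blue instead of the horrible green:
print(rgb_to_hex(51, 102, 153))   # → #336699
```

So a colour is just three numbers; the hard part is picking the right three.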
Friday, October 25, 2002
I'm sitting here, on a Friday night, watching old episodes of "Yes Minister". Tonight's episode, first broadcast in February 1980, is about a "big brother" project to build a single database that would store all citizens' details: health records, tax records and so on. The conversation goes back and forth about why you can and can't do a project like this. The exact project doesn't make a lot of difference, says "Jim Hacker" - the issues usually boil down to the same things: legal, technical and administrative. The follow-up episode talks about open government and reductions in civil service headcount. Great fun, hugely prescient and wonderfully warming that so much that has gone before is coming round again! There's a little note at the front of each episode that says something about the stories being drawn from a review of the contents of the Parliamentary Library; I wonder how close that is to the truth? Public sector IT project failures have been in the news recently ... Simon Moores commented on the recent NHS e-mail system procurement, Michael Cross published a piece in Computing, Steve Ranger wrote in Computing and this week's Economist carries a piece on NHS IT, prompted by Richard Granger starting work as head of IT there (you won't be able to get this last piece unless you are a subscriber). I've heard comparisons between public and private sector before, along with appropriate cautionary tales about them not being directly comparable. Maybe that's so ... but not completely. Private sector projects fail, fail dramatically and fail often - you need only look at the 98% of dotcom companies that have gone under or the telecoms companies that have failed for examples.
Public sector projects are significantly more heavily scrutinised than private sector ones - if a project is late, over-budget and under original specification, it is audited at least three times in the public sector (local audit, the National Audit Office and the Parliamentary Advisory Committee), each of which is likely to make its results public - and often within the same year that the issue occurred. When was the last time you saw a CEO stand up and apologise to shareholders for spending, say, 50% more than expected on a project? The answer to that is, of course, only in a bear market - by which time the old CEO is gone and the new CEO finds it in their best interest to surface all the issues that they can so that there is a clean slate to start with. All issues are blamed on previous incumbents. The list of these is legion: Enron, Worldcom, Tyco etc. If you want to include CEOs that have been given a hard time because their strategy was not working, then the list grows longer (C&W, Vivendi and so on). Steve Case at AOL does not tell us exactly how much AOL v8 has cost to develop; nor does Bill Gates tell us how much the latest release of Windows has cost (versus how much it was expected to cost). There is a big shortage of obvious data on private sector failures. Turning (at last, I hear you say) to the public sector: the fundamental issue is a lack of skills in the civil service. Project management is not a highly valued skill; project delivery is even less valued. Large numbers of what today we'd call "intelligent customers" have been outsourced to suppliers over the last few years as the public sector has, rightly, tried to focus on what it does (policy and government) versus what it doesn't (IT).
In the absence of these smart people, requirements are not buttoned down at the beginning, stakeholders are not consulted fully and often enough, technical issues are not explored and opened up early in the process, and suppliers are used for market intelligence and decision processes rather than in-house resource ... so projects kick off, get into difficulty and have few choices but to carry on in the hope that they can sort themselves out. Sometimes they do; mostly they don't. Projects need full weekly reviews, sometimes daily reviews - with all participants. In government, project boards meet monthly, have papers a week in advance and do not usually consist of "intelligent customers" either, so are unable to get into the detail of an issue (how could they? Project boards are made up of people several steps away from the specifics of the project). Project boards are there to give direction - to decide what the scope should be, to make choices between several courses of action, to cover issues that are not absolutely time critical. But if a decision needs to be made TODAY about whether something should be in or out, a project board meeting is just too far away; the issues are too complex and the attendees unlikely to have a grasp of the day-to-day detail. In the private sector, the team is likely to be able to take the decision and tell the board afterwards so that they know. Public sector techniques are improving here - Peter Gershon's OGC Gateway reviews are a big part of the improvement. But there is a need for more ... questions need to be asked in two ways - "is this the right thing?" and "are we doing it right?" - and bad projects can be killed off early with the right controls. Resource can then be focused on the "right things". The second issue relates to "intelligent suppliers". There are few suppliers that are able to manage their client, especially when the client is as complex and difficult as government.
Despite the transfer of resource, the middle tier of managers at the supplier end are not good enough at spotting the issues above either; not good at managing risks, putting in place the right mitigation or dealing with more than one issue at a time. So, both sides bumble on with issues buried at the coder level, deep in the technical architecture or just in plain bad requirements - and they are not realised until it's too late. Typically, we have relied on a "prime" supplier to manage the dependencies between companies. What this has resulted in is prime suppliers that struggle to integrate a range of different providers, add layers of cost on top of the initial quotes and reduce the transparency available to the project management team. The third issue is one of scale. Few private sector projects touch quite the same number of people as the average public sector project. Many private sector companies have reached equivalent scale - eventually. But not so many have to launch on day one catering for benefit payments for 10 million, 20 million or more. Scale delivery requires a different mindset when it comes to building systems and, especially, testing them. There is limited specialist testing capability in government - people that can really think through the end-to-end issues, simulate the scenarios and figure out what the right result should be, all with a vast set of circumstances to consider. So ... what to do?
- The public sector needs to make "delivery" a firm competency that is rewarded - both financially and organisationally. Projects should not be the poor relation to policy. The solution is not to dump more Prince2 documentation on people or provide more training. It is to establish a career structure for technically competent project staff where they are rewarded and even incentivised directly for successful delivery.
- "Intelligent customers" need to be recruited from around government and deployed on the most significant projects.
These are going to be specialist people who will move around from department to department. They will manage big programmes for two to three years at the very most and then move on. They will also be deployed into failing projects (because some will always still be on the critical list) for short periods to put them back on track. A solid career structure will be needed if these folks are to be civil servants, else departments will forever have to rely on capable contract staff recruited at high rates for short-term roles with minimal skills transfer involved.
- Suppliers should be encouraged to move away from the traditional "prime contractor" process where issues are hidden several layers down. Instead they should work in a "top table" fashion - all represented on the project team and equally able to identify, raise and resolve issues. Suppliers will need to work in new ways with the new types of project managers that government needs - the old ways of hiding issues, negotiating from a position of strength (after all, who else is government going to choose?), blaming scope changes on the department rather than on inflexible methodologies and poor discipline ... all of those have to go.
- OGC Gateway reviews are already well entrenched. Their scope should be expanded from "is the project doing things right?" to include "is this the right project to do?"; i.e. does it really make sense to do this project, or can we use something that has been done elsewhere, perhaps with minimal customisation, to deliver faster? Departments should not replicate what has been done elsewhere.
- Departments should focus on deploying technology as it is, not on making endless tweaks and changes to a system so that it fits their process. Changing large systems is more expensive - probably by an order of magnitude - than changing business process to match a system.
- Finally and, I think, most importantly: the lesson from "Yes Minister" is that there is not a single lesson to be learned that government has not already learned. Since the time when Cromwell signed himself above the Monarchy, projects have started and stopped, succeeded and failed ... and government has, somewhere in its files, the details of them all. Those lessons need to be easier to find, widely reviewed, embedded in training, kept up to date and checked up on over and over. If you, as a project manager, can stare at the history of all those who have gone before you and can see what they did wrong, you are unlikely (unless you are very stupid) to make the same mistakes. Suppliers, of course, will get to see the same lessons, whichever supplier was responsible. Without true openness here, we will never emerge from the swamp we are in (for we are already neck deep in alligators). We can but hope.
at Friday, October 25, 2002 Posted by Alan
Monday, October 21, 2002
It's not my day today. I'm trying to get "titles" to work properly on blogger - some blogs have a title ahead of every post. I thought this might encourage me to think up some catchy headings for my posts, but I just can't make it work. And the blogger help pages are all "unavailable" today. Ho hummm. This e stuff has a long way to go, huh?
Over the last 12 months I've made 3 trips to Dubai, each time mixing a bit of business and lots of pleasure - it's a great place for a beach holiday. Kablenet note today that there's a revamped portal available that lets you do 100 transactions and more. When I was last out there, I think the Prince was throwing out the incumbent IT supplier and bringing in a new one, so perhaps this is the fruits of their labour. Sadly, the link from the Kablenet site doesn't seem to work (it's certainly one of the most convoluted URLs I've seen and it even has "http" in it twice ...http://portal.dubai-e.gov.ae/http://web-vgn.dubai-e.gov.ae:8083/egovd/cda/main/Home_Page). So, I tried the obvious www.dubai.gov.ae. That gave me the message "Notice: This is not the official page of H.H. The Ruler's Court. This page was constructed only to provide access to other web pages hosted by H.H. The Ruler's Court". But in turn, didn't actually allow me to "access" any pages. I know that Dubai has strong ambitions to be fully electronic sooner than others and also that, when I last looked, they have done some good stuff. I'll check back some other time and see if they've fixed whatever this problem is. Who knows, I might want to go and work there one day so it would be great if I could do that online!
I got the predictable responses from my article in Computing last week - the ones offering me a tool to catalog this, automate that or publish the other all whilst singing "God Save the Queen" or otherwise relaxing somehow. Predictable that is except for one - a handwritten letter. It's not often that I see such a thing - even my 13 year old goddaughter writes to me on her PC these days. The letter was from someone who used to head the communications team of a big government department. She said some nice things about me and then went for the killer blow - that "Government content is seriously flawed", all written in caps. Maybe there's a point there. I certainly didn't want to make anyone think it would be easy to move ahead - you wouldn't start from where we are if we wanted to get where we want to be, for instance. But, this letter made me think a bit. It's true that our content is flawed - the letter-writer didn't elaborate (I'm going to try and engage in some email dialogue to see if I can draw out the point). It's flawed for lots of reasons:
- There's too much of it. Last week (I think in the FT IT supplement), I saw a quote from someone that said "we need to decelerate the rate at which we acquire data". Even though I laughed about that at first, there's some truth in it. What we need to do is make sure that data ("content") that we are putting on the web is worthwhile, is tagged correctly and that it fits in the information architecture that we have. Most importantly perhaps, we need to have an expiry date on it - so that, if we don't need it anymore, we can mark it as such and move it elsewhere (instead of keeping it on an expensive, content managed front end).
- It's not easy to find. Apart from it always being hard to find something small (how to apply for a grant) in something big (the myriad of government content), it doesn't help if you have to search in a lot of places. 1,800 websites or more. Umpteen search engines.
So, we need to decelerate the rate at which we acquire domain names. And, as quickly as possible, slam the engine into reverse and start shedding them, merging several together and generally thinking about how the citizen might want to access data rather than how we might want to organise it.
- It's not tagged correctly. Most content is tagged with the basics of the Dublin Core - i.e. who edited it, when etc. But it's not tagged with anything that we might recognise as being customer focused. Our metadata standards will need to be expanded to pull this off ... in fact, we will need to accelerate the rate at which we develop the standard, so that it quickly becomes sufficient to describe client segments and sub-segments, geographic locations, age groups and so on.
- It's presented badly. Well, not badly as such. Presented differently every time. Every department has its own look and feel - its own style, navigation, templates and so on. As you move from site to site, you have to look for how to do things, not for what to do. That's not an easy thing to address because ...
- It's static. Last of all, none of this can be easily done if most websites are managed using static HTML files. This was fine to begin with but has become hugely unwieldy - and, in fact, is accelerating in the rate at which it becomes even more unwieldy. A move to content management (remember, something you do, not something you buy) is overdue for the big departments - for it is they that have the biggest problems (and, I suspect, the same holds true for most government sites around the world). I don't see much evidence yet of dynamic content management, although I do see a lot of procurement requests for it coming out. So, yes, Government content is flawed.
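To make the expiry-date and tagging ideas above concrete, here's a sketch of what an expiry-aware, customer-focused content record might look like. To be clear, the field names and the example are invented for illustration - this is not any department's actual schema, nor the metadata standard itself:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ContentItem:
    """A hypothetical content record: the basic who-edited-it-and-when
    fields, plus the customer-focused tags and the expiry date argued
    for above."""
    title: str
    editor: str
    edited: date
    expires: date                                   # mark-and-move-on date
    segments: list = field(default_factory=list)    # e.g. client segments
    regions: list = field(default_factory=list)     # e.g. geographic scope

    def is_live(self, today: date) -> bool:
        # Expired content should come off the expensive front end
        return today <= self.expires

grant = ContentItem(
    title="How to apply for a small business grant",
    editor="departmental web team",
    edited=date(2002, 6, 1),
    expires=date(2003, 6, 1),
    segments=["small business"],
    regions=["England"],
)
print(grant.is_live(date(2002, 10, 26)))   # → True
```

The point is simply that once every item carries an expiry date and customer tags, weeding and needs-based navigation become queries rather than editorial archaeology.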
These issues, as far as I can tell, apply pretty much globally - most governments have 100s if not 1000s of websites; few orient their content to "needs" and fewer still do anything more than aggregate lots of links pointing to the variety of content. Guilty as charged. So we have to do something about it. Something soon. And it's going to take a lot of work. "Better a diamond with a flaw than a pebble without", as I believe Confucius said a long time ago. What we have has the potential to be enormously powerful in helping people get the benefits that they deserve, the services that they need and the information that holds it all together. Time now to start surfacing that content, looking for the gems that are just right and throwing away the pebbles. Maybe our content "beach" looks a bit like Brighton beach right now - but that can be changed.
Thursday, October 17, 2002
Following up on the Ooops and the Dead Cert points below, it's worth thinking about two points in regard to digital certificates: 1) Is the policy wrong? We have different levels of authentication required for different transactions, laid out in a comprehensive policy. The policy also gave the ball to commercial enterprises to stimulate the market for certificates, rather than taking the (perhaps easier) route of central issuance. Differing levels is probably right. Government issuance is probably wrong, although time will tell. There are other countries (notably New Zealand and Australia) who have tried it - NZ abandoned the project and, last time I checked about 8 months ago, Aus had issued a few thousand certs (to businesses only, on a base of over 1 million businesses). Viacode is gone, a victim of the Post Office's need to bring costs under control. Others are coming into the market though - Equifax last year, BT any day now. 2) Is the technology wrong? Digital certificates in the browser make sense. The whole point of plug-ins (I thought) was that they were largely transparent to the user. I have Flash installed in mine and whenever a website with Flash in it comes up, it's all taken care of. Maybe certificates are more complicated, but maybe there's more thought needed from the vendor community about how to make them easy? After all, since we put the Gateway live we've iterated through 2 or maybe even 3 versions of the main browsers - but there have been no changes to the way certificates operate. Just some more thoughts.
at Thursday, October 17, 2002 Posted by Alan
Wednesday, October 16, 2002
"Honest, Alan - I just want to encourage some healthy scepticism, here as everywhere" says Bill T ... Absolutely right. The more, the better. Don't for a minute think that I don't want to hear the questions asked. I'm still reeling from the Kablenet healthy scepticism that said we should abandon the 100% online target. I sat in a meeting today where that view was held by at least a few people round the table. They said 80% would be fine as that's all we'll do (after all, Burial At Sea is probably not going to make it online - although imagine, if it does and people start to want it, the online revolution could rescue the British shipbuilding industry). So ... if 80%, why not 75% or 83% ... and which ones do we not do? Too hard to think about - much better to set a goal and go for it. And with the recent change to the target that added "key services seeing high levels of take-up", we have a bit of extra impetus to make sure that we don't just shovel stuff online and have no-one use it. That would be a waste. And there's already enough of that. What gets measured gets done. No idea who said that, but it seems apt.
I knew I was going to get into trouble over the digital certs piece on The Register, per this posting, "Dead Cert" from Bill Thompson. I think it's worth me putting a bit more background to the story - some of the history, the issues and what might be done. When I started work on the Government Gateway around two years ago (can it really be that long?), one of the first issues was whether to support digital certificates or not. At the time, there was a policy that said that 4 levels of transaction existed (from 0 meaning no authentication to 3, full notarised authentication). Level 1 could be handled by userid/password (where there was minimal risk that information would be disclosed or financial loss would result if it was compromised) and level 2 needed a digital certificate (because there was a higher risk with the transaction). So ... in reality, paying money to government would be level 0 (who cares who you are if you are paying money to government?), 1 would be for something like checking the status of an existing transaction (perhaps finding out when your passport would be ready to collect), 2 would be for claiming child benefit and 3 would be the "gold standard", perhaps for getting a passport or a driving licence. That was all written up in a lengthy document which I inherited to execute against. T-scheme was in its infancy and there were no certificate issuers to speak of - although Chambersign was just starting up, BT were doing some stuff with Verisign and the banks were just getting together around Identrus. What we didn't know was if departments would start off wanting to use certificates for their transactions. MAFF (now DEFRA) had done some trials the previous year with certificates and found some significant challenges. But there was not much else to study for lessons learnt. MAFF decided to come on board with the Gateway from day one, as did Customs and Excise and the Inland Revenue. 
Customs and MAFF opted for certificates - both aiming at business customers (with quite a high overlap; pretty much every farmer is also VAT registered). So, off the team went to figure out how to make certificates work. The W3C people were still working on the standard for signing XML, so we took the early drafts and some very smart people spent a lot of time figuring out how to get the XML into a standard format, sign it, send it to the Gateway, check the signature (against non-existent revocation lists!) and then pass the transaction to the department, minus the stuff that they didn't need to know about (principally the envelope). Sounds easy. Turns out it's not. It was quickly realised that a Java applet was needed, "plugged in" to the browser, that would find the XML, canonicalise it (put it in a standard format ready for signing, so that we could be sure of what we were receiving) and then send it to the Gateway via the submission protocol that we had developed to ensure delivery. The folks that developed IE had put certs in a special place and written an API (the crypto API) that did a lot of the work for us, so the applet was quite simple (and runs to about 70kb if I remember correctly). The Netscape folks didn't put the certs in the same place and didn't have the API, so we had to write all that stuff ourselves (and that took a lot more time, and the applet was over 400kb I think). Other browsers didn't have the capability that we needed, so we didn't support them initially, hoping that some standards would emerge and we would be able to pick them up as they came. Those standards, despite good work all round, are not in place - and Linux browsers, as far as I can tell, don't support certs the way IE and Netscape do. The applets were written by people from Viacode (who were using certificate technology from Entrust); some very smart people at Microsoft designed the signing process and wrote the Gateway end of the deal.
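For anyone who wants to see the shape of the canonicalise-then-sign idea, here's a toy sketch in Python. To be clear about what's assumed: this is not the Gateway's code - HMAC with a shared key stands in for certificate-based signing, and the canonicalisation here is a crude stand-in for the real W3C algorithm (which also handles attribute ordering, namespaces and character escaping):

```python
import hashlib
import hmac

def canonicalise(xml: str) -> bytes:
    """Toy canonicalisation: normalise line endings and strip the
    whitespace between tags, so that sender and receiver end up
    hashing identical bytes regardless of how the XML was laid out."""
    lines = [line.strip() for line in xml.replace("\r\n", "\n").split("\n")]
    return "".join(lines).encode("utf-8")

def sign(xml: str, key: bytes) -> str:
    """Digest the canonical form, then sign the digest.
    (HMAC here; the real thing used certificate-based signatures.)"""
    digest = hashlib.sha1(canonicalise(xml)).digest()
    return hmac.new(key, digest, hashlib.sha1).hexdigest()

def verify(xml: str, key: bytes, signature: str) -> bool:
    return hmac.compare_digest(sign(xml, key), signature)

# Whitespace differences disappear after canonicalisation, so two
# lexically different versions of the same return verify identically:
key = b"not-a-real-certificate"
tidy = "<VATReturn><Box1>1500.00</Box1></VATReturn>"
ragged = "<VATReturn>\n  <Box1>1500.00</Box1>\n</VATReturn>"
assert verify(ragged, key, sign(tidy, key))
```

That last assertion is the whole reason canonicalisation exists: without it, an innocent reformatting anywhere along the pipeline would break every signature.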
We were, I believe, the first people in the world to get digital signing of XML to W3C standards working. I'm proud of that and proud that the people who put so much effort into this from across the various teams were able to get it all working so quickly (remember, Gateway V1 was built from start to finish in about 90 days). By the time we were live, Viacode was selling its certificates through Chambersign. Shortly afterwards, Equifax launched their certificates (for about 1/2 the price of Chambersign, or £25). Then there was a big gap .... no-one else came to market. Take-up was slow - partly because there was an overhead in getting a certificate, partly because you had to pay for them, partly because once you had one there was not much else to do with it. The commercial sector didn't pick up on them - none of the online banks required them (although we had many conversations with all of the banks and they were all "just about to do something" - B2B was a key driver, better security and whatever). All of this came to nothing in the commercial world, frustratingly. So, with few providers, only government wanting certificates and the technology issues that came with the certificates (which caused endless problems and gave the helpdesks some fun and games to deal with), there was not much success. Now, nearly 2 years on from launch it's clear that they are not working ... I noted in the Register piece that for every 6 businesses sending PAYE online (which only needs a userid/password), there is one business doing VAT and perhaps 1/2 a farmer. Not big volumes. We need to change this. 
Either they get easier to use (which means the browser providers standardise - one place for the certs, one API etc); we find something else (smart cards need readers and are similarly non-standard, so please don't suggest those; quizid looks interesting and might be part of the solution; USB tokens with certificates have similar problems to normal certificates but are at least portable; mobile phones don't seem to fit as more than 70% are pay as you go); or we give up and continue to use userid/password and take the risk (after all, today we rely on signatures which every bank will tell you are not at all reliable). But if a simple signature works fine in the offline world, why ask for more online? Because it's an opportunity to improve security, reduce fraud and do a better job. I don't think government issuing certificates gets round this problem. Why? Because certificates are inherently not portable - lots of people use the Internet at work and a certificate today is installed on one PC and is difficult (but not impossible) to move around; because they are not transferable between channels - there is no chance of certificates working on digital TV for a while (although for ages I thought that the second smart card slot on Sky boxes might help) and if you need to do a follow-up phone call we won't be able to use your digital certificate to authenticate you. This is hard. It's going to take some "banging heads together" as John Lettice said and some really good thinking. I'm open to all ideas. Not giving up yet ... but definitely heart massage is required to get this one going.
Monday, October 14, 2002
Kablenet are running another survey today ... should PFI be used in government? I'd like to respond, but the settings I have on my Zonealarm firewall don't allow their Surveymonkey site to work. Last time, I switched everything off to make it work, but I won't keep doing that. I have my firewall set to High security (Stealth) and I block cookies from tracking sites, but allow them for personalisation. What else do I need to do for a rich web experience? Everything else works fine.
I talked to John Lettice today about digital certificates - my view is that certs are on "life support", a quote which he used to head his piece. It seems to me that PKI has taken 30 years to get to where it is today - pretty much nowhere. It's spent 20 of those years in the "trough of disillusionment", to use a Gartner phrase. We need to find a solution that works cross-platform and is portable between locations and devices - so the same "id" that I carry works on the 'phone, on my digital TV, on my PC (both at home and at work). That seems a long way away. John wraps up with a position that we might take in the UK to force something like this to happen. We shall see. Might be right, might be wrong. But something has to change to make this technology work - otherwise we have another WAP: lots of hype, lots of excitement. Zero take-up, zero enthusiasm, zero future.
Bill Thompson (he of andfinally) notes some of the problems with benchmarking over on Voxpolitics. It's not often that I disagree with Bill, but he's plain wrong here. His stance is that benchmarking is wrong because it uses the USA as a reference point and many of the measures used have no relevance internationally - he quotes hospital appointment booking in Canada and Ireland as being irrelevant for instance. I've travelled around a bit for the last 2 years and one thing is very clear. Every country that I visit or every one that visits me in the UK wants to know what every other country is up to. e-government is not easy and we're all struggling with the same issues. Each country, of course, has its own degree of hype and many of those asking the questions are trying to get behind the hype to see if something really does work. If millions are to be spent on delivering transformational government it makes a great deal of sense if lessons learnt elsewhere are applied - first time round! So benchmarking which, after all, is established as a pretty sound business practice in the corporate world, gives us some clues as to who is moving ahead, what's working, what isn't and sometimes (if we're lucky) why. I agree that some of the measures used do not apply to all. But if it works for 10 out of 12 countries, why would you not use it? Besides, everyone loves a competition and who doesn't want to know where they are in a league table? As long as that doesn't stop everyone working together to solve the issues you won't find me complaining. I've been impressed at how much genuine learning is going on as project managers, senior government officials and ministers grapple with the delivery of e-government. None of us want to get this wrong, it's too big, too important. For many countries, it's vital - it's the first chance that they will have to leap ahead of other countries and offer a genuine competitive advantage. 
If the regulatory burden is decreased or the time to market is reduced through these initiatives, there is a greater probability of additional investment in the country. Bill also refers to an article by Mike Cross (former editor of Kablenet) in the Guardian noting the apathy with which Britons seem to regard e-government. This, unfortunately, seems to be true. Mike thinks it's because we describe it in such a boring fashion that no-one can be bothered. He may be right. On top of that though, it's a lot to do with the mass of services that we have available - we are some distance from a critical mass of transactional services. Also, the services that we do offer are not yet wrapped in sufficiently coherent information that we can just find what we need, complete the transaction and leave. A positive experience brings people back, a negative one puts their return in doubt. We need to do a lot more work to simplify the presentation of government to the citizen. Funnily enough, our model so far is pretty close to many other governments' - so it could be that their citizens have a greater tolerance for frustration and boredom than us Brits!
Saturday, October 12, 2002
Another great quote from Phil Windley ... "Government has to be and will be in the identity business. Governments care more than most about tying identities to bodies and will act to ensure that it's possible ... eventually. Just don't hold your breath". Isn't that just the truth.
A few days ago, I talked about the e-government programme in the USA getting some more money and, maybe, a new IT-head for the whole thing. Well, it looks like that isn't quite what's happening. In any government organisation anywhere, getting this stuff done is hard. Without authority (in line management terms), input into the compensation process of all those involved and direct budgetary responsibility for delivering projects (and for retiring old systems), it's practically impossible. Not literally impossible, but it definitely requires superhuman effort - and there aren't too many people able to put that kind of effort in for the length of time needed. If the USA don't get this right, then despite all the great plans and proposals I have seen, I think the efforts will stumble a year from now. It will take a year because it takes time for the enthusiasm to die down.
The OeE comes in for a bit of a burning from John Lettice at the Register this week. The regular monthly report to the Prime Minister was published recently. Well, it's nearly regular - we skipped August because everyone was on holiday and then, I guess, by the time September came round no-one had done much because they'd all been on holiday in August. So, it's October and here's the report. If you've visited the OeE site before, you'll see that it's been redesigned and now looks a lot like ukonline. Good stuff. Anyway, John makes some good points as usual. The important thing, for me, is that demand for e-government services is increasing - whether it's getting the Iraqi dossier online, sending in your Self Assessment form or using the Government Gateway (and, to address the point he makes, the Gateway works fine with Mozilla, always has done. What doesn't work is digital certificates and that's down to certificate technology vendors writing applets to handle it. They haven't). And John's right, the SA system is nothing specifically to do with the OeE ... except that it runs through the Gateway and that we spend a lot of time working with IR to help with the next releases - there's one due any day now.
I saw this just now on DiveIntoMark ... It announces a full redesign of the Wired News site. The folks at Wired have moved to full CSS and XHTML. We've been going through the design changes needed for ukonline.gov.uk to support this change - for us, it needs some upgrades to the architecture and, of course, some significant re-coding. At the same time, we wanted to open up our platform so that the rest of government could manage their websites through the uko environment too. If you are in the UK public sector, you might have heard this project referred to as "DotP". We will launch on Feb 28th next year. Importantly, this change will remove the distinction between our standard site and the "easy access" version - the style sheets will change according to the device being used to read the site, whether it's a screen reader, a PDA or whatever. That's the theory anyway - I am hoping that all the devices out there are smart enough to do this. The other benefit is that page size shrinks pretty dramatically, maybe a 1/4 of the size or less. Less code means faster download and faster access to the information needed. Now that I see Wired doing it, I know that we have to hurry up and get there.
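A rough sketch of where the page-size saving comes from. The two markup fragments below are invented, not taken from ukonline or Wired: the first carries its presentation inline the old table-layout way, the second moves all of that into a shared (and cached) stylesheet, so only the content travels with every page.

```python
# Hypothetical fragments for the same two-column content.
table_layout = (
    '<table width="100%" border="0" cellpadding="4" cellspacing="0">'
    '<tr><td width="20%" bgcolor="#003366"><font color="#ffffff" '
    'face="Arial" size="2">News</font></td>'
    '<td width="80%"><font face="Arial" size="2">Latest items...'
    '</font></td></tr></table>'
)
css_layout = (
    '<div class="nav">News</div>'
    '<div class="main">Latest items...</div>'
)

# The presentational attributes move once into a stylesheet; every
# page served afterwards carries only the smaller fragment.
saving = 1 - len(css_layout) / len(table_layout)
print(f"CSS version is {saving:.0%} smaller for this fragment")
```

The exact ratio depends on the page, but repeating this across every table cell of a large site is how the "maybe a 1/4 of the size or less" figure becomes plausible.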
Wednesday, October 09, 2002
A week or so ago I mentioned that I was writing a piece for a journal. I guess I kept the facts short just in case it wasn't published. Well, I'm pleased to say that it's been published today (at least in the online world) and I imagine it will make print tomorrow morning, in everyone's issue of Computing. It's the first time I've been in print as "just me" since about 1989 - although I did have a piece published in Government Computing last year that was a joint effort between me and several members of my team.
at Wednesday, October 09, 2002 Posted by Alan
Sunday, October 06, 2002
Continuing the authentication theme, while browsing the must-read Scripting site by Dave Winer, I came across a link to Jon Udell's site. Seems there's a digital identity conference coming up and Phil Windley, CIO in Utah, is speaking. Phil is someone that I've tried to meet at conferences in the past and never managed to - much to my disappointment. He's clearly plugged in to the issues - and facing them down pretty much the same way as we are. The debate in Jon's post is whether governments should issue digital IDs or not. Every country or state will face this issue - in the UK we chose not to and have seen limited takeup (principally because of limited usefulness, standards problems and probably lack of interest). If the IDs are issued by government, then it could be argued that a lot of these problems will go away and that, because they exist widely, private sector companies will piggy-back off certificates and use them in their own products, reinforcing their usefulness. My favourite quote is "Lord knows PKI is a can of worms". I've been quoted in the past as noting that certificates (in their 'install to hard disk' form) are on life-support. Sorting this one out is a headache for us all. There are a few new things coming that might make it easier, but nothing that's going to blow the doors off just yet. Phil's weblog on this topic is an excellent read ... his "motto" on the site is "organizations usually get the IT [that] they deserve" and I don't think you can say fairer than that. Just after that post, Phil talks about how journalists seem to make so many factual errors, are sloppy and don't bother to investigate. I've been there - and somewhere in this blog are some similar emotive comments, I'm sure. But the more time you spend with journalists and the more you read, the more you see that it's a tough job.
If you have to take three bits of data, two suppositions, one possible fact and a couple of rumours and blend them together, the story is bound never to be absolutely exact. In my world, where so many things are off-limits and where a story breaking can be quite a coup, it's no wonder that the facts are often wrong. It's the race to the deadline that probably causes the problem - no-one wants to be last to hear about a story, so lots of things make the press earlier than perhaps they should. Now there's a turnabout of views for me. Just shows that I know when I can be wrong. Still, I'm always delighted when the facts are right and, when they're wrong, there's usually a good laugh to be had. Unless, of course, the facts are wrong about one of my projects.
Kablenet also notes that the US have tested the first version of their "Government Gateway"-clone, known as the 'e-authentication gateway'. I'm delighted for them - delivering projects like this is a challenge and it seems it's been overcome in the USA too, at least in a pilot version. The real stuff lies ahead. Getting the standards adopted for transactions using XML, getting industry to adopt it and use it broadly, making digital certificates work cross-platform, making the services easy enough for citizens and businesses to plug into and use - those are the next set of challenges. We've been there - and it's taken us nearly 2 years of hard work to get this far. These kinds of issues can be solved much faster through collaboration than competition. Interestingly, other news reports note that the US lawmakers are lobbying for an additional $200 million to fund e-government work and are looking to appoint a "government-wide IT manager". If that person is to have budgetary and organisational accountability across the whole of federal government IT, then some real progress can be made - and dramatically faster than almost anywhere else. But ... and what a BUT it is ... if the authority is lacking, and the person has to rely on influence, then there is no chance for success.