Blog

These blog posts are maintained by seeDetail employees. There is a technical blog on testing written by Daniel Cottrell, and another blog on wider issues surrounding testing and IT written by Chris Neville-Smith.

Posts from all blogs are collated here.

27 February 2013, 12:16 pm

Security should be everyone’s responsibility

There are two main enemies of security: convenience, and inconvenience. Better public education about the risks would make things safer.


"But I only wanted to check my Facebook."
(Photo: 48states, Wikipedia)

Security testing is a very specialised branch of software testing. Unlike most branches of software testing, where you’re simply trying to iron out things that go wrong by mistake, in security testing you’re fighting people trying to make things go wrong on purpose. It requires a lot of responsibility on the part of the testers and a lot of trust on the part of the clients – indeed, there are suspicions this gets abused – and consequently, many software testers won’t put themselves forward for security testing. Nevertheless, most testers will highlight security concerns as and when they notice them, and therefore take an interest in whichever high-profile security breach is in the news this week. Which brings me nicely on to Hackgate.

Now, in case you lost track of the plot somewhere around episode 4,605 of the Leveson Inquiry, one of the latest developments is a claim that hacking extended to e-mails. At the moment, unlike phone hacking, this has not yet been proven or admitted to. But, quite frankly, it would come as no surprise if this turns out to be true. Like voicemails, the security surrounding personal e-mails has been notoriously lax, and practically an open invitation for hackers to pry into private matters.


In the olden days of workplace and university e-mails, your e-mails would typically be managed on a local server, which was great until you went home and had no e-mail access. This changed with the coming of Hotmail, Mailcity and many other web-based e-mail services that allowed anyone to read their e-mails anywhere in the world. The snag: this also allowed anyone in the world to read your e-mail, if they could find a way round the password protection. And that was scarily easy: even if your intended victims hadn’t been silly enough to set their passwords to, say, the names of their favourite pets, it was often a simple matter to use basic personal information, like a mother’s maiden name, to reset their password on the Forgotten Password page. Worse, it was (and still is) quite normal practice to store every e-mail you have ever sent and received on a server, ready for a hacker to pore over a lifetime of indiscretions. And in case you think this is just paranoid speculation: it’s happened, and it’s been nasty.[1]

In defence of Joe Public, it’s not easy to protect yourself when big IT companies routinely prioritise convenience over security, or – worse still – offer insecure products as standard when safer solutions already exist. When broadband first became popular, the “broadband modems” supplied by most ISPs offered virtually no protection from the outside world, even though routers with built-in firewalls were available at the time. (Windows Firewall and other firewalls built into computers aren’t enough; it only takes one rogue program to switch them off and your protection’s gone.) Routers only became standard when wi-fi became popular, but this introduced the equally bad problem of unencrypted wi-fi: unencrypted connections were the default, and configuring encryption yourself was a nightmare. Internet suppliers have, thankfully, caught up with this and now routinely supply pre-configured encrypted routers, but even now new problems are emerging. Thanks to Facebook, we are being encouraged to put all of our personal information in semi-public view, even though this can be used by fraudsters to impersonate you. Meanwhile, smartphone suppliers make it so easy to put so much personal information on your latest gadget that stolen smartphones are selling like hotcakes – not because of the handset, but because of all the data that can be used for identity fraud.

Large businesses, however, often make the opposite mistake to domestic users. They heavily lock down what users can do on the system, bog their computers down with bloated security software, refuse to consider any new software or upgrade of existing software without an overblown laborious “impact analysis” (meaning in practice that everything new becomes cost-prohibitive), and sometimes even prevent staff from encrypting data because it’s not in line with the security policy.

This Fort Knox-style mentality is just as dangerous, because it gives staff the choice: either work at snail’s pace on inefficient systems, or take short cuts such as bypassing security features or sending confidential documents to their home computers. I can’t help thinking that no-one would have copied poorly-protected data to two CDs that got lost in internal mail had suitable data transfer or encryption software been made available.

No, I think this problem can only be solved through better public education in IT security – and it has to go way beyond just saying “Install a virus checker” as if this takes care of everything. I don’t believe the farce at HMRC would have happened had they understood that password protection is not the same as encryption, but I think we need to go further than IT departments. No-one would claim the latest safety features on a car are a substitute for a driving test, and the same principle should apply to computers. I don’t propose to ban people from using computers until they’ve passed a computing test – however tempting that may be – but we should be aiming for an online world where everyone understands the dangers. We don’t need to be paranoid about everything, but if we took a little more care with personal data – either properly protecting it, or even not leaving it lying around where it doesn’t need to be – cyberspace would be a much safer place.
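
To make the distinction concrete, here is a minimal sketch of what actually encrypting a file means, written in Python with the third-party cryptography library (the function and file naming are my own illustration, not a recommendation of any particular tool). The key is derived from a passphrase; without that passphrase the file contents are just noise, which is a very different proposition from a document viewer politely refusing to open until a password is typed in.

    # Illustrative sketch only: encrypt a file's contents with a key derived
    # from a passphrase, so the data itself is unreadable without it.
    import base64, os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives.hashes import SHA256
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def encrypt_file(path, passphrase):
        salt = os.urandom(16)                      # random salt, stored alongside the ciphertext
        kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=480000)
        key = base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))
        with open(path, "rb") as f:
            token = Fernet(key).encrypt(f.read())  # authenticated encryption of the raw bytes
        with open(path + ".enc", "wb") as f:
            f.write(salt + token)                  # salt first, ciphertext after
        return path + ".enc"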

[1] Okay, the tabloid e-mail intrusion went a bit further than this. It wasn't just cracking webmail passwords, it was outright hacking of people's own computers. But I'll bet it began with the easy opportunist snooping first and went on to more determined hacking once they realised how much information people were leaving around and how profitable this scheme was.

27 February 2013, 12:16 pm

Don’t be afraid to upgrade


Upgrading software in the workplace requires caution – but some companies make this far more complicated than it needs to be.




No, you’re not having a strange dream, Microsoft really is celebrating the demise of a flagship product. Continuing the tradition of celebrating milestones in web browser development with cakes, Microsoft’s latest cake marks the “death” of Internet Explorer 6 – or, more accurately, the decline in US IE6 usage to 1%. Microsoft have made a huge effort to get people off Internet Explorer 6 (obviously, they’d rather you went to Internet Explorer 7, 8 or 9 than Firefox, Chrome or Safari, but an effort nonetheless) through hasty development, advertising campaigns, and now even silent updates to upgrade remaining computers. And with Microsoft themselves admitting IE6 has had its day, and even the die-hard open source fans accepting that IE7 onwards is a big improvement, you’d think everyone would be happy.

If, however, you’re reading this blog from a UK government building, you may think you’re accessing news from a parallel universe. The UK public sector is inexplicably at odds with the rest of the world. IE6, like most early browsers, has a sluggish JavaScript engine that runs at snail’s pace on modern JavaScript-rich pages. Most public web pages have now dropped support for IE6. And yet when the China hacking scandal exposed hugely embarrassing security flaws in IE6, and the French and German governments warned everyone off IE6 (and, for a while, later versions), the Cabinet Office insisted there was nothing to worry about. To be fair, web browser security isn’t the be-all-and-end-all for government buildings – their strongest defence will always be the safeguards within the Government Secure Internet – but the web browser is the last line of defence in a compromised network, and it’s reckless to rely on a web browser written before widespread broadband adoption and the security threats it brought along.


The Cabinet Office does, however, make a reasonable point. Upgrading a system in the workplace is not just a simple matter of waiting for Microsoft / Apple / your Linux vendor to issue an update and clicking “Yes, Upgrade”. The effects of the same upgrade can vary from one computer to the next. Many Mac users were caught out last year when the latest OS X upgrade rendered their pre-Intel software unusable. This is not normally a big issue for most domestic users – the worst that can happen is a few computer-free days until someone can put your old software back – but in a business, even a few hours without working IT can cost thousands of pounds. Businesses also have to consider whether the latest upgrade exposes them to new security threats.

The UK Civil Service, however, takes this to the extreme by refusing any upgrade without a thorough acceptance testing process – meaning in practice that almost everything is ruled out on cost grounds. That is not how you are meant to approach software testing. Instead, you should prioritise your testing based on risk, and the risk of upgrading from IE6 after 7, 8 and 9 have been used by the public for years without problems is minimal (as is the risk of using Firefox or Chrome). You certainly don’t need the extensive testing required for software specially written for your own company. (And okay, if you’re the Civil Service, you also need to think very carefully about the security implications of upgrading – but doing nothing exposes you to the security implications of not upgrading.)
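
As a rough illustration of what prioritising by risk looks like in practice – the changes and scores below are entirely invented for the example – you score each change by how likely it is to break something and how bad it would be if it did, then spend your testing effort from the top of the list down:

    # Illustrative only: a crude risk-based prioritisation of test effort.
    # Likelihood and impact scores (1-5) are invented for the example.
    changes = [
        ("Upgrade from IE6 to a current browser", 1, 3),
        ("New release of bespoke case-management system", 4, 5),
        ("Routine office suite service pack", 2, 2),
    ]

    # Sort by likelihood x impact, highest risk first.
    for name, likelihood, impact in sorted(changes, key=lambda c: c[1] * c[2], reverse=True):
        print(f"risk score {likelihood * impact:>2}: {name}")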

There is also a strange obsession that any change to IT entails expensive training costs. This is sometimes true – I, for instance, would be hesitant to drop an Ubuntu-based workplace straight into the controversial Unity desktop (Ubuntu only got away with this because their user base tends to be tech-savvy) – but most of the time this mentality assumes workers can’t cope with even the simplest intuitive change. I’ve said before that public knowledge of IT could and should be better, but that doesn’t mean ordinary office workers are all IT-illiterate idiots. The equally controversial ribbon that came with Microsoft Office 2007 was a big change from earlier versions, but you’ll struggle to find a workplace that rushed into Office 2007 without training and found its workers couldn’t cope.

Then there’s the problem of workplaces locking themselves into outdated software – and this is a particular problem with IE6. Many workplace applications were written specifically to run through Internet Explorer 6, making an upgrade impossible without a fundamental rewrite of all these applications.[1] This was an easy mistake in the early noughties when IE6 looked set to be Grand High Lord of the Internet forever, but one of the commonest complaints I’ve heard from software developers is that even when IE6 was on the decline and they warned customers of the dangers of locking themselves into IE6 further, companies were still insisting that applications be written to run through IE6 because that’s what they’d always used.

Finally, I can’t help thinking that there’s a mindset that slow and unreliable systems are something normal. When I was last in a government building, I was regularly screaming and cursing that something as simple as checking the price of a train ticket took me five times as long as it would on my (relatively low-spec) computer at home, but this didn’t seem to be considered a problem. When managers are downplaying the negative impact that out-of-date software is having on their workplace this much, the chance of doing something about it slips even further out of reach.

In a way, software testing has a lot in common with health and safety. Good health and safety is all about identifying the risks and concentrating your efforts accordingly, so that you can carry on doing what you’re doing safely (so frequent accidents such as slips, trips and falls, and serious risks such as road accidents, get more attention than the chance of getting a papercut at your desk). Lazy health and safety – the sort which gives the business a bad name – involves overblown risk assessments over the most trivial dangers, to the point where the only practical solution remaining is to not do it at all, which is why you get schools cancelling school trips for daft reasons. The same principle applies to software testing: good testing helps you achieve what you want safely, bad testing stops you doing it completely. And like silly health and safety decisions preventing children playing outside, the risks of not upgrading can often be far greater than the paranoid risks used as justification not to do it.

It’s perhaps unfair to blame project managers for being risk-averse. There is no shortage of botched IT projects out there, so it’s understandable why people would choose to play it safe and stick with what they know, however inefficient it may be. But the paperwork around upgrading is far more complicated than it needs to be, and if we’d focused more on what really matters and less on hypothetical scenarios that don’t, we could have enjoyed Microsoft’s cake much sooner.


[1] Having said that, you can install a modern version of Firefox/Chrome/Opera/Safari alongside IE6 so that you can access the internet on a modern browser whilst still having use of your IE6-specific applications. But given the lack of adoption of this easy solution, I can only assume that companies who mindlessly run everything through IE6 are the same people who obsess over overblown acceptance testing and training costs whenever anyone considers using a new product.

27 February 2013, 12:16 pm

SOPA is not the answer to piracy


Ordinary people’s livelihoods need protecting from copyright theft somehow – but SOPA is too high a price to pay.



Apologies to software testing blog entry fans, but this week it’s another generic IT-related post. This can’t wait because, as you may have noticed, there was a blackout of several websites last week, most prominently Wikipedia. This was in protest over the Stop Online Piracy Act (SOPA) going through the US House of Representatives, and although this is only a US law, like software patents it stands to affect the UK. The participation of Wikipedia has suddenly brought this issue into the spotlight, with pro-piracy activists, pro-control record companies and all sorts of people in between giving their points of view.

Let me be absolutely clear: I have no time for pirates, especially not those who run websites like The Pirate Bay. They are not noble crusaders selflessly standing up for internet freedom – they are big businesses who make a packet from advertising and subscriptions without the tedium of sharing the proceeds with anyone who made the stuff in the first place. Yes, the music industry has survived home taping, CD copying and bootleg market stalls, but file-sharing makes the practice much easier, so the issue must be taken seriously. I couldn’t care less if Jay-Z or the chairman of Sony-BMG can’t afford an extra Mercedes, but they aren’t the real victims. And I’m not talking about the people who work in the music industry (although this is a valid point the record companies make), but the small-time artists struggling to make a living.


I am fortunate. My own small-time artistry is playwriting and directing. I have willingly put a lot of time into theatre for nothing, but I would not have been able to use various theatres for free without the money they get from ticket sales. I have little to fear from online piracy because you can’t copy a theatre visit online, but musicians, authors and computer programmers aren’t so lucky. One of the favourite pro-piracy arguments is that it benefits small-time musicians by promoting their work, but reality does not back this up. In Sweden, where the culture of piracy is strongest, any musicians who complain about loss of earnings are vilified for not sharing the Pirate Party’s views of what’s best for them. An obvious point is that musicians tend to promote themselves through samples on MySpace or YouTube rather than an online free-for-all, but this too falls on deaf ears. My view is that all these arguments about supporting music aren’t reasons for piracy, they are simply excuses.

But the stance of the big record companies does small-time artists no favours. Of course they have to protect their sources of income, but the arguments they use are blatantly geared towards maximising profits first and protecting creativity a long way second. When Prince chose to release an album for free – surely no-one can object to a millionaire pop star giving something away at his own expense? – the record companies were outraged. Half the time, anti-piracy technology seems to have little to do with anti-piracy and plenty to do with restricting how you may use your own products, from unskippable adverts on DVDs through to some highly suspect restrictions on Blu-ray. Until it backed down, the RIAA resorted to mass lawsuits against people who may or may not have illegally uploaded material, based on questionable evidence and scary lawyers. Copyright laws, like most laws, work best when people have confidence in them, and so far the record companies are failing miserably.

This bludgeoning approach is a large part of the problem with SOPA – although, to be fair, it’s largely the big-time pirates’ fault this was considered in the first place. Pirates evade the law either by locating their servers in Belize and claiming their operations are out of reach, or by claiming that their site’s Not For Use By Copyright Infringers (Honest). The latter category is the big problem. Countless sites rely on content uploaded or shared by users, from Limewire and old-style Napster to YouTube to Wikipedia, plus Facebook, Twitter and pretty much any site that allows users to post comments, such as this blog. What they all have in common is that there’s no foolproof way of ensuring uploaded material isn’t someone else’s work. Beyond that the similarity ends: the Newzbins of the world turn a blind eye, and sites such as Wikipedia diligently police themselves. The question is: how, in the eyes of the law, do you tell one from the other?

SOPA’s answer is, at best, vague. And vague laws are dangerous, because they place power in the hands of those with the most expensive legal teams. We’ve already seen US software patent laws used almost exclusively by big companies to keep small companies out of the market and extract money from big competitors, and for all we know SOPA could go the same way. Could a company that cares little about piracy but disapproves of Wikipedia try to put them out of business? Could they claim the upload mechanism "might" be used for piracy? It might seem a ridiculous scenario, but there’s little to assure us this couldn’t happen. It’s little wonder sites like Wikipedia are up in arms about this, and yet the big record companies still dismiss their opposition as irresponsible and pro-piracy.

There are plenty of other possible solutions. I’d take a good look at Wikipedia founder Jimmy Wales’s idea of going after the money rather than the uploaders. I’ll bet that if you take money out of the equation, people running sites like The Pirate Bay will suddenly forget their ideological commitment to “sharing”.  Websites can work with the copyright holders; almost all music videos streamed for free on YouTube now are done with the copyright holders’ blessing, either for a share of advertising revenue or just promotion of the song. Existing laws are getting quite good at telling the difference between bona fide content sharing sites, and piracy sites masquerading as legitimate ones. All of these possibilities could and should be considered before resorting to handing poorly-specified powers to unspecified individuals.

However much idealists want to believe otherwise, the music and film industries are not sustainable in a world where payment is voluntary for everyone, but this is what we will get if the big record labels carry on behaving like they own the internet. At the time of writing, SOPA's passage through Congress has been suspended – whether this really means the end of SOPA as we know it is unclear. But I hope this will be used as an opportunity to go back to the drawing board and think about what really matters. It may take many more attempts to get the balance right, but if we stick with this, it will be worth it in the end.


27 February 2013, 12:16 pm

Give penguins a chance


Would switching to open source software save public money? I don’t know, but we should at least try to find out.

The Windows logo versus the Linux mascot. A little-known but very bloody feud.

I know software testing is a very absorbing activity, but in between bouts of testing you might have noticed there’s a bit of a financial crisis going on. As tax rises, benefit cuts and axing public services don’t go down that well with the public, the government is keen to find less painful ways of saving money. This, in part, was the idea behind the Spending Challenge letters that went out to all public sector workers shortly after the 2010 election asking for ideas to save money. The ideas ranged from the pragmatic to the ridiculous, but one suggestion that caught my eye was to switch proprietary software for free open-source alternatives. This is not as unthinkable as you might expect; the Lib Dem manifesto said they’d look into this, and George Osborne himself is said to be interested.

I’ll be open and upfront here: I use Linux, LibreOffice (effectively the successor to OpenOffice) and other free open-source products wherever possible. It’s partly because I don’t want to pay for software when free stuff does the job, and partly because I have problems with the way Microsoft uses its dominant position to make life difficult for people who use competitors’ products. But I don’t believe in imposing my views on other people, and I’ll help out with anyone’s IT problems whatever software they’re using. (Indeed, a software tester who doesn’t is a short-lived one.) I wouldn't push savings too hard with a charity (Microsoft usually heavily discounts software for them). I’d also be hesitant to encourage a small business to switch to open source when everyone they work with expects them to do all things Microsoft. The public sector does not have that problem – they mostly communicate with each other, and they’re big and ugly enough to insist anyone else works with their software if they wish – but any move away from Microsoft or any other proprietary software must save the public money, and not just be done to prove a point.

Open source is a far better option than it used to be. As little as ten years ago, getting Linux to work was a nightmare for even the most tech-savvy users. Nowadays, however, it’s as easy to learn how to use a Linux-based computer as it is to learn a Windows-based one, and for the most basic layman’s tasks, the differences between the two are trivial. Microsoft Office still offers a lot more features than LibreOffice, but most people don’t use the advanced features anyway. Those people who claim that you’d have to pay if you’re a business, or that they might start charging you later, are mistaken – the licence used makes this impossible. There’s a lot of talk over the costs of acceptance testing or retraining, but it’s broadly similar to the hyperbole against moving from IE6 which I’ve already been over. And if you still think Linux is only for geeks in their bedrooms, the continuing success of Linux-based Android tells a different story.

But Microsoft does make one valid point: there’s more to the cost of IT in business than the licence. The term Microsoft keeps banging on about is the Total Cost of Ownership, and much as I hate buzzwords, it has to be taken seriously. There are labour costs associated with installation, maintenance and fixing problems before they disrupt your business, plus the hardware needed to support your system. Microsoft also claims that if software’s free, there’s no-one on the end of a phone if things go wrong. That’s not really true any more; the major Linux distributors sell enterprise packages that include this support, but the fact remains it costs money. The bottom line is that Microsoft claims their software works out cheaper when you factor in everything. I find some of their anti-Linux claims to be dubious, but that’s just their marketing department doing their job, and I wouldn’t be surprised if Canonical’s does the same.
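
To show why the licence fee is only part of the picture, here is a deliberately crude sketch of the sort of sum a total-cost-of-ownership comparison involves. Every figure below is hypothetical – the point is the shape of the calculation, not the numbers, which in a real procurement would have to come from the vendors themselves:

    # Illustrative only: total cost of ownership is more than the licence fee.
    # All figures are hypothetical per-seat costs over a five-year life.
    def tco(licence, installation, retraining, hardware, support_per_year, years=5):
        return licence + installation + retraining + hardware + support_per_year * years

    proprietary = tco(licence=300, installation=50, retraining=0, hardware=100, support_per_year=40)
    open_source = tco(licence=0, installation=50, retraining=80, hardware=100, support_per_year=60)

    # Whichever comes out cheaper depends entirely on the figures plugged in --
    # which is exactly why the question should be put out to competition.
    print(f"Proprietary stack: £{proprietary} per seat over five years")
    print(f"Open source stack: £{open_source} per seat over five years")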

Anyway, here is my idea. It’s a suggestion which the Government is welcome to take up without any need for acknowledgements or royalties. It’s a tried and tested method which works in every other area of government business when different companies claim to provide the same goods or services for less money.

Without further ado, the solution is …

[Drum roll]

… put it out to tender.

At the moment, public sector IT contracts generally are a choice between company A providing Windows and MS Office, company B providing Windows and MS Office, and company C providing Windows and MS Office. That’s not good enough. I can’t think of a single example other than this where it’s considered acceptable to choose one company without considering any competitors. It doesn’t have to be a choice of all Microsoft or no Microsoft; it’s perfectly possible to run LibreOffice on Windows, Microsoft Office on Linux, or mix and match pretty much any combination of open source and proprietary components. Claiming Microsoft is the only option doesn’t wash any more – government bodies elsewhere in the world have made the switch and managed. Claiming it’s what everyone uses is a poor excuse for any government that believes in free and fair competition. If 90% of motorists drove Skodas, would anyone argue the Government should help make it 100%?

What should we consider when awarding the contract? Anything we think is important, just as long as all sides get to make their case. Does Microsoft believe their software is cheaper to maintain in the workplace? Are their servers easier to maintain? No problem – let Microsoft make their case, let the open source vendors reply. Is there a problem with a Microsoft lock-in? Their licensing arrangements? Let the open source vendors say why there is, let Microsoft say why there isn’t. Does Microsoft or Linux offer better security? Which is faster? Which is more reliable? For all of these questions, we should be asking the vendors to make their case themselves, rather than picking one and dismissing the others out of hand.

And what if the winner is Microsoft, Microsoft and more Microsoft? It will still be worth the paperwork. Experience shows competition is good for Microsoft products. Microsoft moved on from the horribly outdated IE6 because of competition from Firefox. When the Xbox’s standing was threatened by the revolutionary Nintendo Wii controller, they responded with the equally innovative Kinect. There have been advances in Windows and Office in the last two decades, but two things in particular have never really been addressed: why it’s necessary to pay hundreds of pounds for software when you only use 10% of the features, and why the processing power needed to run them balloons as quickly as the processing power of computers. With real competition in the office market, something might be done about this.

Will this happen? On the one hand, if the Cabinet Office consider upgrading from IE6 to be too difficult/complicated/expensive, there isn't much hope. On the other hand, a consultation was launched last year in this area, and although it seems to confuse open source with open standards a bit, there are signs that the Government is starting to recognise the need for proprietary and open source software to compete on fair terms. The Government is in a far better position to bring competition back to IT than any company, and if they stick to this course, it could be rewarding for everyone.

27 February 2013, 12:15 pm

Are web designers the new car mechanics?


Websites are easier to make than most people think. Bear this in mind when a website designer wants a hefty payment.

A joke, obviously. But does this sales pitch work in IT?

Advance warning: this post is another moan. Up to now, I’ve had two pet hates: people who sign up to wildly optimistic cheap/convenient IT projects that turn out to be unreliable and expensive; and at the other end, people who block trivially easy IT projects because of silly overblown cost estimates. I’d forgotten the third type. But we’ll get on to that later.

This story begins with my website – you know, the one in my shameless plug masquerading as a piece on Search Engine Optimisation. Well, my web traffic is still quite abysmal, in spite of pushing up the Google rankings. But through the few people who have looked at the site, I’m quite likely to end up setting up a website for an arts organisation, which I’m happy to do as a freebie; and if all goes well I may get some paid work off the back of that. And in this scenario, the obvious question is: how much should I ask to be paid?

The thing is, there’s nothing special about my web design knowledge. What I created for myself was technically very basic (I was using a free web template and Kompozer if anyone's wondering). I’d rate my skills above those of a 13-year-old who has discovered FrontPage – I do at least understand the importance of Cascading Style Sheets, W3C compliance and not doing fancy animated backgrounds – but ask me to produce a site that handles user-uploaded content, streaming video or credit card payments and I wouldn’t have a clue. And yet paltry offerings to the interweb like mine seem to be regarded as the height of technical genius.

One organisation I know (I won’t give names – it’s not one I’m directly involved in) got its website from a company that advertises a “starter” package of four pages for £150. I’ve seen what they produced, and whilst there’s nothing particularly wrong with the pages, I don’t think I could bring myself to charge that. Even if I valued my services at £30 per hour, that would be five hours of work – five minutes would be a closer estimate. And even that figure was cheap compared to what some other businesses charge. All right, these are small businesses trying to make a living, and they have office overheads and after-sales issues to consider, so it’s unfair to pick on them too much. But it’s a problem endemic throughout IT services: if your customers don’t know any better, you can claim the simplest tasks are laborious and expensive and they’ll believe you.

Oh, and another reason why not to pick on small companies: if you’re serious about over-charging, why stop at £150? How about $18,000,000? Yes, that’s right: eighteen million US dollars. Because that’s what luxury hotel chain Four Seasons paid for theirs. Some websites might be expensive to make – I’m testing a feature-rich website at the moment and I know first-hand how much work can be involved – but $18 million for this one? A secure banking site might cost that much, but this one has a hotel booking facility, smartphone compatibility, and some pretty panoramic pictures of their expensive rooms and beautiful locations: all standard features seen in websites made at a fraction of the cost. They’ve not even done that good a job of it – it’s been criticised for shutting itself off from search engines, poor accessibility for disabled users, and sloppy user friendliness amongst other things. One would have expected a project that expensive to dedicate at least a few million to proper testing to deal with those sorts of problems. I can’t help thinking someone is going round with $17,990,000 in his back pocket.

In a way, website designers can be likened to car mechanics. Just like the unscrupulous car mechanic can make wildly inflated estimates for easy repairs, it is far too easy for website designers to say the IT sales-speak equivalent of: “Right, let me see … that’ll be HTML, CSS, web server rental, domain name, SEO, setting up an FTP server … hmm, you’ll want a contact form so that’s PHP and SMTP as well … oh dear, we’re talking about a lot of work here, ’sgonnacostya”. The difference is that whilst most people know better than to hand over money to car mechanics until they’re satisfied they can trust them, the same is not happening for IT products. From small clubs and societies to the biggest boardroom, people sign cheques first and ask questions later.

I’ve said it before, and I’ll say it again: people – big organisations in particular – making the decisions on IT projects have to understand what they are hiring a contractor to do. You cannot rely on techno-waffle from sales representatives; you need people independent of the contractors who can tell you whether it’s a bargain or a rip-off. Claiming IT consultants are too expensive is no excuse – in most cases, you can get what you need by identifying people in your organisation who understand computers and listening to what they think. I cannot imagine anyone would have paid a motor chain $18,000, let alone $18 million, for a contract repairing company cars without at least getting an opinion from someone who knows about motor repairs.

So there you are, my new pet hate. Joining people who cook up silly overblown expenses as an excuse not to do IT projects are people who cook up silly overblown expenses and then actually pay them. It’s not just websites; it wasn’t that long ago that the House of Commons Public Accounts Committee highlighted government departments spending £3,500 per computer. Many schools are eager to equip every classroom with iPads when cheap netbooks would do the job equally well. And yes, software testing companies are not immune – I’ve read my fair share of sales pitches for test automation tools that I can tell are overpriced and not that useful, but someone must be buying them if they’re in business.

The Government's latest initiative to keep costs down is the G-Cloud programme – it's a good idea in principle; whether it works in practice is yet to be seen. Private companies too have the means to find out for themselves when they’re being overcharged. But individuals aren't so lucky. Many IT companies routinely promote unnecessarily expensive products in the domestic market, such as computer stores bundling expensive security suites into PC sales when a free package off the internet would suffice. Some laptop vendors promote special “school” laptops at twice the price of lower-spec machines, when most school children have no use for the higher specs. And not everybody has a tech-savvy friend to warn them when something is a waste of their money. But you don’t need a PhD in computer science to understand that “expensive” does not necessarily mean “better”, and a little more attention to that principle would go a long way.
