How to measure the success of a new IT system
Posted on 26 Apr 2013 at 10:24
Simon Jones reveals how to measure the impact a new software application has had on your business
When you have the idea for a new computer application – whether written in-house, outsourced or bought off the shelf – perhaps the first thing you ought to do is define what would constitute "success", and how you’re going to measure that. How will you know, in the long term, whether your money was well spent? If you don’t know that, there’s a good chance it wasn’t. Worse still, you’re likely to make the same mistakes again in your next project.
What you definitely won’t want to do is employ the traditional measures of success – being "on time" and "on budget". These merely tell you that you spent X hours and Y pounds on writing, testing and installing the new system, costs you must always expect to incur and which are all on the minus side of the equation. The fact you didn’t spend more than expected is good news, but it doesn’t measure the success of the whole system, only that of its writing, testing and implementation.
What about the positive side? What about the return on investment (ROI)? Has the new system improved your productivity, lowered your costs, made your users’ working lives better? These are also things you have to measure before you can say whether a new system is a success.
A new invoicing system, for instance, might make it easier to send statements by email to all your customers, saving the time and cost of printing and postage. It might also reduce the time taken to introduce a new invoice format. These criteria, therefore, should be written into the original proposal as measures of success:
1. Send statements by email to 90% of customers, saving £1,000 on paper and postage and half a day’s time each month.
2. Reduce time to implement new invoice format from two days to four hours.
Once your new system is running, you can measure its performance against these criteria to see how you’re doing. Depending on your chosen measures, you might be able to assess success from day to day or week to week, or it might take several months to get a clear picture.
For example, statements are usually sent once a month, so the first measure for the new invoice system can only be assessed monthly.
After the first month of running the new system, you might have collected email addresses for 60% of your accounts – quite a big step towards your goal. You could then write to all the customers whose email addresses you don’t have and ask for one. Follow up that letter with a phone call two weeks later and you should be able to achieve the 90% target quite soon.
If you’re still not there by the end of the second month, you’ll see nonetheless how you’re progressing towards the target and can take further action to encourage more customers to supply email addresses.
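The monthly check described above is simple enough to automate. As a sketch only (the customer counts and the 90% target here are illustrative figures, not drawn from any real system), measuring progress against the first criterion might look like this:

```python
# Illustrative sketch: track monthly progress towards the email-statement
# success criterion written into the project proposal.
# All numbers are invented examples.

def email_coverage(customers_with_email, total_customers):
    """Percentage of customers who can receive statements by email."""
    return 100.0 * customers_with_email / total_customers

def assess_month(customers_with_email, total_customers, target_pct=90.0):
    """Compare this month's coverage against the target (90% by default)."""
    coverage = email_coverage(customers_with_email, total_customers)
    return coverage, coverage >= target_pct

# After the first month: email addresses held for 300 of 500 accounts (60%)
coverage, met = assess_month(300, 500)
print(f"Coverage: {coverage:.0f}% - target met: {met}")
```

Run monthly, this gives you the trend line the article describes: you can see whether the letter-and-phone-call follow-up is closing the gap towards 90%, and act sooner if it isn't.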
Had you not defined how you were going to measure success at the outset, none of this would have been possible and you wouldn’t have fully achieved the potential cost savings of your new system. Not defining how you will measure success can often lead to failure.
Sometimes it’s difficult to measure success. The expectation that your invoicing system will reduce the time to implement a new format from two days to four hours seems like a nice concrete measure, but it’s an action that happens infrequently.
The best thing you can do is perform an artificial test as soon as possible. Your company may not need a new invoice format right now, but you could design one anyway and go all the way through the process without taking it live. The point is to test the success of this subsystem before you wind down the development team and put the code away. If you really do need a new invoice format in two years’ time and it takes two days rather than four hours, that’s a bit late to be finding out. Testing for success early gives you a chance to fix such shortcomings; failing to test leaves you with no chance.
A mnemonic for measuring success that’s been used for decades is WILU (pronounced "will you"), which stands for Working, Installed, Liked and Used. It’s just as relevant now, with our self-service app stores on phones, tablets and PCs, as it was back in the days of mainframes and mini-computers.