How to measure the success of a new IT system
Posted on 26 Apr 2013 at 10:24
Simon Jones reveals how to measure the impact a new software application has had on your business
When you have the idea for a new computer application – whether written in-house, outsourced or bought off the shelf – perhaps the first thing you ought to do is define what would constitute "success", and how you’re going to measure that. How will you know, in the long term, whether your money was well spent? If you don’t know that, there’s a good chance it wasn’t. Worse still, you’re likely to make the same mistakes again in your next project.
What you definitely won’t want to do is employ the traditional measures of success – being "on time" and "on budget". These merely tell you that you spent X hours and Y pounds on writing, testing and installing the new system, costs you must always expect to incur and which are all on the minus side of the equation. The fact you didn’t spend more than expected is good news, but it doesn’t measure the success of the whole system, only that of its writing, testing and implementation.
What about the positive side? What about the return on investment (ROI)? Has the new system improved your productivity, lowered your costs, made your users’ working lives better? These are also things you have to measure before you can say whether a new system is a success.
A new invoicing system, for instance, might make it easier to send statements by email to all your customers, saving the time and cost of printing and postage. It might also reduce the time taken to introduce a new invoice format. These criteria, therefore, should be written into the original proposal as measures of success:
1. Send statements by email to 90% of customers, saving £1,000 on paper and postage and half a day’s time each month.
2. Reduce time to implement new invoice format from two days to four hours.
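Criteria like these are easy to check automatically once they're written down as concrete targets. As a minimal sketch (the names, thresholds and figures below are illustrative, not from any real invoicing system), each measure can be recorded as a target plus a direction, then compared against the numbers you collect after go-live:

```python
# Hypothetical sketch: the proposal's success measures encoded as
# measurable targets. All names and numbers here are illustrative.
criteria = {
    # Measure 1: proportion of customers receiving statements by email.
    "email_statement_rate": {"target": 0.90, "direction": "at_least"},
    # Measure 2: hours needed to implement a new invoice format.
    "invoice_format_hours": {"target": 4, "direction": "at_most"},
}

def assess(measured):
    """Return {criterion: True/False} for each success measure."""
    results = {}
    for name, spec in criteria.items():
        value = measured[name]
        if spec["direction"] == "at_least":
            results[name] = value >= spec["target"]
        else:  # "at_most"
            results[name] = value <= spec["target"]
    return results

# After month one: 60% of customers on email; a trial format change took 4h.
month_one = {"email_statement_rate": 0.60, "invoice_format_hours": 4}
print(assess(month_one))
# → {'email_statement_rate': False, 'invoice_format_hours': True}
```

Run monthly, a check like this shows at a glance which targets have been met and which still need action.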
Once your new system is running, you can measure its performance against these criteria to see how you’re doing. Depending on your chosen measures, you might be able to assess success from day to day or week to week, or it might take several months to get a clear picture.
For example, statements are usually sent once a month, so the first measure for the new invoice system can only be assessed monthly.
After the first month of running the new system, you might have collected email addresses for 60% of your accounts – quite a big step towards your goal. You could then write to all the customers whose email addresses you don’t have and ask for one. Follow up that letter with a phone call two weeks later and you should be able to achieve the 90% target quite soon.
If you’re still not there by the end of the second month, you’ll see nonetheless how you’re progressing towards the target and can take further action to encourage more customers to supply email addresses.
Had you not defined how you were going to measure success at the outset, none of this would have been possible and you wouldn’t have fully achieved the potential cost savings of your new system. Not defining how you will measure success can often lead to failure.
Sometimes it’s difficult to measure success. The expectation that your invoicing system will reduce the time to implement a new format from two days to four hours seems like a nice concrete measure, but it’s an action that happens infrequently.
The best thing you can do is perform an artificial test as soon as possible. Your company may not need a new invoice format right now, but you could design one anyway and go all the way through the process without taking it live. The point is to test the success of this subsystem before you wind down the development team and put the code away. If you really do need a new invoice format in two years' time and it takes two days rather than the promised four hours, that's a bit late to be finding out. Testing for success early gives you a chance to fix such shortcomings; failing to test leaves you with no chance.
A mnemonic for measuring success that’s been used for decades is WILU (pronounced "will you"), which stands for Working, Installed, Liked and Used. It’s just as relevant now, with our self-service app stores on phones, tablets and PCs, as it was back in the days of mainframes and mini-computers.