Real World Computing
Evaluation sheet

How to measure the success of a new IT system

Posted on 26 Apr 2013 at 10:24

Simon Jones reveals how to measure the impact a new software application has had on your business

When you have the idea for a new computer application – whether written in-house, outsourced or bought off the shelf – perhaps the first thing you ought to do is define what would constitute "success", and how you’re going to measure that. How will you know, in the long term, whether your money was well spent? If you don’t know that, there’s a good chance it wasn’t. Worse still, you’re likely to make the same mistakes again in your next project.

What you definitely won’t want to do is employ the traditional measures of success – being "on time" and "on budget". These merely tell you that you spent X hours and Y pounds on writing, testing and installing the new system, costs you must always expect to incur and which are all on the minus side of the equation. The fact you didn’t spend more than expected is good news, but it doesn’t measure the success of the whole system, only that of its writing, testing and implementation.

What about the positive side? What about the return on investment (ROI)? Has the new system improved your productivity, lowered your costs, made your users’ working lives better? These are also things you have to measure before you can say whether a new system is a success.

A new invoicing system, for instance, might make it easier to send statements by email to all your customers, saving the time and cost of printing and postage. It might also reduce the time taken to introduce a new invoice format. These criteria, therefore, should be written into the original proposal as measures of success:

1. Send statements by email to 90% of customers, saving £1,000 on paper and postage and half a day’s time each month.

2. Reduce time to implement new invoice format from two days to four hours.
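Criteria like these can even be written down as data rather than prose, so each one can be checked mechanically as the figures come in. A minimal sketch in Python — the field names, structure and figures are illustrative, not taken from any particular tool:

```python
# Success criteria recorded as data (hypothetical layout). Each entry has a
# name, a target value, and a flag for whether lower values are better.
CRITERIA = [
    {"name": "Statements sent by email", "target": 0.90},           # 90% of customers
    {"name": "Hours to implement new invoice format", "target": 4.0,
     "lower_is_better": True},                                      # down from two days
]

def met(criterion, actual):
    """Return True if the measured value satisfies the criterion's target."""
    if criterion.get("lower_is_better"):
        return actual <= criterion["target"]
    return actual >= criterion["target"]

# Example checks against hypothetical measured values:
print(met(CRITERIA[0], 0.60))  # 60% email coverage: target not yet met
print(met(CRITERIA[1], 3.5))   # 3.5 hours for a new format: target met
```

Keeping the targets in one place like this makes the monthly review a mechanical exercise rather than a matter of opinion.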

Once your new system is running, you can measure its performance against these criteria to see how you’re doing. Depending on your chosen measures, you might be able to assess success from day to day or week to week, or it might take several months to get a clear picture.

For example, statements are usually sent once a month, so the first measure for the new invoice system can only be assessed monthly. After the first month of running the new system, you might have collected email addresses for 60% of your accounts – quite a big step towards your goal. You could then write to all the customers whose email addresses you don’t have and ask for one. Follow up that letter with a phone call two weeks later and you should be able to achieve the 90% target quite soon.

If you’re still not there by the end of the second month, you’ll see nonetheless how you’re progressing towards the target and can take further action to encourage more customers to supply email addresses.
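The monthly check itself is easy to automate from your customer records. A short sketch, assuming a simple list-of-dictionaries layout — the customer names and data structure here are entirely hypothetical:

```python
TARGET = 0.90  # the 90% email-coverage goal from the success criteria

def email_coverage(customers):
    """Fraction of customer records with an email address on file."""
    return sum(1 for c in customers if c.get("email")) / len(customers)

def to_chase(customers):
    """Names of customers still to contact for an email address."""
    return [c["name"] for c in customers if not c.get("email")]

# Hypothetical month-one data: 3 of 5 accounts (60%) have an address.
customers = [
    {"name": "Acme Ltd",    "email": "accounts@acme.example"},
    {"name": "Brown & Co",  "email": "billing@brown.example"},
    {"name": "Carter plc",  "email": "ap@carter.example"},
    {"name": "Dean & Sons", "email": None},
    {"name": "Evans Ltd",   "email": None},
]

coverage = email_coverage(customers)
print(f"Coverage: {coverage:.0%} (target {TARGET:.0%})")
print("Chase for an address:", to_chase(customers))
```

Run monthly, this gives both the headline figure for the success report and the follow-up list for the letters and phone calls.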

Had you not defined how you were going to measure success at the outset, none of this would have been possible and you wouldn’t have fully achieved the potential cost savings of your new system. Not defining how you will measure success can often lead to failure.

Sometimes it’s difficult to measure success. The expectation that your invoicing system will reduce the time to implement a new format from two days to four hours seems like a nice concrete measure, but it’s an action that happens infrequently.

The best thing you can do is perform an artificial test as soon as possible. Your company may not need a new invoice format right now, but you could design one anyway and go all the way through the process without taking it live. The point is to test the success of this subsystem before you wind down the development team and put the code away. If you really do need a new invoice format in two years’ time and it takes four days rather than two, that’s a bit late to be finding out. Testing for success early gives you a chance to fix such shortcomings; failing to test leaves you with no chance.

A mnemonic for measuring success that’s been used for decades is WILU (pronounced "will you"), which stands for Working, Installed, Liked and Used. It’s just as relevant now, with our self-service app stores on phones, tablets and PCs, as it was back in the days of mainframes and mini-computers.


Simon Jones

Simon is a contributing editor to PC Pro. He's an independent IT consultant specialising in Microsoft Office, Visual Basic and SQL Server.
