CRM six months on...
There’s an awful lot of hype in the world of CRM, and it can be an insightful, and potentially humbling, exercise to revisit a project to see whether you’ve genuinely added value to a client’s business. I spent a morning last week reviewing how a client had fared since their system was rolled out in January, and since our last formal progress review in May. The feedback proved to be pretty startling.
The project in question was focussed on improving the operation of the customer support function, and our involvement covered planning the project, defining requirements, and project managing the implementation.
When we initially helped them shape the business case for the project, it was clear that better visibility of their performance against service levels would put them in a much stronger position to tune their quality of service. Eight months after go-live, this proved to be accurate, though, it must be admitted, they were significantly outperforming our expectations. The SLA performance statistics generated by the new system were now studied religiously by the management team, and had become one of the key areas of focus at the weekly review meeting.
When the data was first reviewed, it was apparent that some aspects of performance were falling outside the company’s published standards. They were now able to quickly identify and target problem areas, and by the time of our meeting the number of overrun calls was well under 1% of the total. Because of the commercial arrangements the company had in place, this improvement would repay the initial investment in the system (including our involvement) more than twelve times over in year one, which didn’t seem, to me at least, too shabby a return on investment. The client also flagged a host of softer benefits, which I’ll cover in a later post, but from an SLA perspective the project confirmed the wisdom of W. Edwards Deming’s observation: 'What can't be measured can't be managed, and what can't be managed can't be improved.'