Jump to section
- Q. Why have you used Socitm’s benchmarking services?
- Q. Have you seen particular benefits from the Socitm service?
- Q. What is the benefit of using several parts of Socitm’s benchmarking services?
- Q. Does benchmarking make a difference to how an IT service is perceived?
- Q. How much preparation do you have to do for a benchmarking exercise?
- Q. Did the satisfaction survey confirm what you already knew?
- Q. What benefits have you seen from the other benchmarking modules?
Q. Why have you used Socitm’s benchmarking services?
I think it is important to have independent reviews of your IT services so that you can assess the environment and service provision, see if there are any areas where you are not performing well and could improve, and find where you need to focus your attention. We have had many years in challenging financial positions, meaning our senior managers want to know we are running cost-effective services. This kind of benchmarking assessment really helps put into context what you are doing, how much you are spending and what the impact is in terms of service delivery and performance.
The fact that it is independent means your senior managers can be sure they are getting the facts, and repeating the assessments over time means that when you put in improvement plans you can see what the trends are.
Q. Have you seen particular benefits from the Socitm service?
Socitm is the membership organisation for those of us in local government and the wider public sector, and it has no vested interest in skewing the results towards sales. It is independent of particular suppliers and is not going to turn around at the end and say, 'here are your results – hand it over to us and we'll knock 20% off', which is usually what happens with some of these services.
I think Socitm is uniquely positioned to benchmark against other public sector organisations. You can refine the results by filtering by type of local authority, so you can compare apples with apples rather than oranges. I like the format and structure – it is quite easy for us to complete and we can see our results against our previous assessments. And it is cost-effective: if I were to get this from other organisations it would cost us a lot more than it does from Socitm.
Q. What is the benefit of using several parts of Socitm’s benchmarking services?
I have always used a set of four – user satisfaction, cost, delivery and performance – because you really need to look at the metrics together: if you are very low cost, how is that impacting user satisfaction or performance? I pull together results to analyse them; for example, on incident resolution and performance I might include workloads, statistics on how many customers we have per IT worker and our user satisfaction scores. That might show that you are cheap as chips but have terrible performance. This sort of collective perspective gives you the insights that you need to make decisions about what you need to do with the service.
Q. Does benchmarking make a difference to how an IT service is perceived?
I think so, because it is factual and done independently. I have certainly heard over the last few years that local authorities have a desire to benchmark themselves. It is hard to see, pragmatically, how I would achieve that were it not for Socitm. In a few other service areas I have asked neighbouring authorities for some data, but it is very limited – you can't ask them to do an entire benchmark activity. It is more practical and cost-effective to use Socitm.
Q. How much preparation do you have to do for a benchmarking exercise?
You are asked for quite a lot of information. The questions cover various aspects of the service and no one person is going to answer all of them, so we created a spreadsheet of the questions, allocated ownership of each one and, once the data was in, uploaded it to the portal. Socitm processed it within days and the output was then available through a Power BI portal, which allowed us to benchmark against other organisations and against ourselves. Matthew Fraser was very patient because we did go backwards and forwards looking at the data. Some of the results didn't look quite right, so we asked, 'have we actually provided the right data?', and in some cases we hadn't.
Once we had a full set of results in the portal, I needed to put these into a format that was digestible by my senior leadership team. I used the Power BI dashboard to produce graphs and analysis, then I could highlight the key findings with evidence as to why I was coming to my conclusions. The user satisfaction survey was slightly different because you also get a set of individual comments, which I categorised by theme and put on our website to show a 'you said, we did' approach.
Q. Did the satisfaction survey confirm what you already knew?
What we got back was not a complete surprise. We had spent a lot of time and effort trying to move people from telephony to our service desk's online portal, and the first time we did the survey at Worcestershire, people said that was an area they didn't like. We then did a lot of work to make the portal easier to use, including user engagement on what people did and didn't like, and we put in a chatbot. The second time, this issue was not a key finding, which was great. Instead, users said they wanted more training, which was not quite on my radar, and we will do some more customer engagement on this.
You don't know until you ask people. You think you know, but ultimately with all services you have got to ask what they want and whether you have designed those processes and services from their perspective.
Q. What benefits have you seen from the other benchmarking modules?
The cost review both times has shown that we are running a cost-effective service, although it did indicate that we perhaps need to be a little bit sharper about getting the best value for our devices. Interestingly, and I’m not repentant about this, we spend quite a lot on training because we have a ‘grow our own’ approach to developing staff and I think that is money well invested. It also showed that we need to have another look at what we are paying for our network, which was useful.
On delivery, we have really high numbers of smartphones, so we need to get a grip on that. We also perhaps need to look at the ratio of laptops to tablets, as we have a lot of tablets. Those kinds of insights allow me to have a focus which I would not be able to have without the benchmark.
On performance, we were in line with the pack even though we are reasonably low cost, and our projects were looking good in terms of how we were completing them. Overall I was quite happy with the results, and we have addressed the issue of people finding it difficult to contact us following the implementation of the service portal.
This article is an edited version of a Socitm webinar on benchmarking recorded in November 2023.