Jump to section
- Q. As a relatively new unitary authority formed in 2019, what were your reasons for using Socitm’s benchmarking and customer survey services in 2021?
- Q. What was the impact of the merger on how you used benchmarking?
- Q. What is your experience of working with Socitm on benchmarking?
- Q. What so far have been the key results, what actions did you take and what outcomes have you seen?
- Q. What is your key takeaway on using benchmarking?

Q. As a relatively new unitary authority formed in 2019, what were your reasons for using Socitm’s benchmarking and customer survey services in 2021?
As an organisation we were emerging from the fires of unitarisation and Covid. The political administration wanted to be assured that the council, in its newly unitarised state, was operating effectively, so it started to develop an internal value for money process. I had been appointed as head of IT at the point of unitarisation, having worked as a service manager for the former Dorset County Council, where we had used Socitm benchmarking at various times.
I volunteered the ICT operational service as a pilot for Dorset’s new value for money process. I was looking for a recognised, structured benchmarking approach that would assess the organisation consistently and effectively against similar organisations across the sector, and my previous experience of using Socitm benchmarking fitted that need. I also wanted to use Socitm’s customer survey, and we ran the two in parallel.
The need for such structured data gathering demonstrated the complexity of assessing value for money, but from my perspective the Socitm benchmarking provided important insights, as it demonstrated that the operational service was effective, low-cost, high-performing and highly available.
Q. What was the impact of the merger on how you used benchmarking?
Benchmarking gives you the same information for a merged unitary as for county, district and borough councils. The value comes from how you use that information to articulate the context you find yourself in. Following the significant organisational upheaval of merging six councils into one, we were looking for assurance that services continued to be effective, and for help in understanding any improvements we might want to make as a merged council.
We did the benchmarking two and a half years after becoming a unitary, and the Covid year had allowed us to get on with a lot of infrastructure convergence, so we perhaps didn’t see that much disruption in the results. The county council was by far the biggest of the former authorities, and in order to minimise disruption to public services Dorset Council adopted much of the former county council’s infrastructure. The benchmarking data showed this process had been effective and that we had delivered a lean, coherent and consolidated infrastructure.
Q. What is your experience of working with Socitm on benchmarking?
Socitm works hard to make it a supported experience. We had regular online meetings where we could ask questions about interpreting the guidance on establishing the right data, to make sure we were being consistent, so that the data we fed in could be compared as closely as possible with data from other organisations. The form itself was easy – you can put data in, save it, and go back in as many times as you like as you build the data up over time.
There were output sessions to help us interpret the data correctly, including various indices, particularly in the customer survey, which allow you to understand gaps between your customers’ perceptions and those in other organisations. It’s a polished offer.
Q. What so far have been the key results, what actions did you take and what outcomes have you seen?
Our operational service was in good health – we were number one for availability of services, in the lowest quartile for cost and cost per employee, and had high customer satisfaction for operational support. But the customer survey showed that our senior management and service leadership held a markedly different opinion of the IT service’s support for change and transformation from that of everyday consumers of the service, with a lot of frustration and very low scores.
That helped to drive an understanding that, at unitarisation, the council had set out to deliver a sound operational function but hadn’t invested in the capacity to provide strategic technological leadership.
We are now on the cusp of a new target operating model, which will be underpinned by a new customer relationship management system, a change of enterprise resource planning system and wholesale use of generative artificial intelligence, such as in the contact centre and social care.
There will be significant investment both in technologies and in the council’s ability to wield them capably: not only the capacity to configure and develop on these platforms, but also to manage alignment with them over time, rationalise towards them, adopt architectural principles and standards, and put more structured governance in place.
Some of this stems from our ability to demonstrate that we had effective operational IT with no friction in day-to-day operations, but that we were not helping the organisation push forward to achieve greater service outcomes and financial sustainability through the adoption of technology. Evidence of how people felt about the service was one of the key things we got out of this.
Q. What is your key takeaway on using benchmarking?
The value we got from it came from taking not only the benchmarking service but also the customer survey service in parallel. Bringing those two datasets together gave us the strategic insight that operational services were good but support for transformation was less so. We would not have got that from doing the benchmarking on its own.