Oh happy day! The UK civil service recently published its 2010 staff survey results. 325,119 people (62% of the 528,729 who were invited) took part. What did they tell us? Well, for starters, 32% believe “I have the opportunity to contribute my views before decisions are made that affect me”. A whopping 38% of the people who replied believe that senior management will take action based on these survey results, and 27% believe change is managed well in their organisation. What that says about achieving the seismic changes that are about to hit the civil service is anyone’s guess. Pick a number – and make it a low one. There are more questions and answers in the survey than you can shake a stick at. Each one registers considerably lower on my interest scale than the last.
I posted a link to these results over on David Zinger’s Employee Engagement Network. Jean Douglas was kind enough to get in touch. She notes:
You have to wade through the methodology to find out that the engagement index is calculated in a manner different than what you might think – I am still trying to understand what they did – and this is my field.
Here is their description:
The employee engagement index is calculated as a weighted average of the response to the five employee engagement questions and ranges from 0 to 100. An index score of 0 indicates all respondents strongly disagree to all five engagement questions and a score of 100 represents all respondents strongly agree to all five engagement questions. The 2010 benchmark is the median (midpoint) engagement index of the 103 organisations that participated in the CSPS 2010.
The engagement score is listed as 56%; however, the “%” is misleading. There is no 56% of something. The score is simply 56 (the highest number is 100 – which does not automatically mean it is a percent). It could have been from a range of scores running from 0 to 157.
They have also “mooshed” together the scores in a department (“moosh” is my new statistical term when numbers are added and divided to come up with another difficult to understand index).
They missed some real opportunities here to get at some good predictive results.
The individual departmental results are more meaningful (except for the engagement score) as they have not done all that mooshing.
Thanks, Jean. So basically the civil service is frigging around with numbers and mooshing stuff. That figures.
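For the morbidly curious, here is roughly what that mooshing might look like. A health warning: the survey document doesn’t publish the actual weights, so the Likert-to-score mapping and the equal weighting below are my assumptions – a sketch of the idea, not the real CSPS calculation.

```python
# A rough sketch of a weighted-average engagement index like the one Jean
# describes above. The Likert-to-score mapping and the equal weights are
# assumptions of mine; the actual CSPS weighting scheme is not published
# in the survey document.

# The description implies a five-point scale mapped onto 0-100:
# all "strongly disagree" -> 0, all "strongly agree" -> 100.
LIKERT_TO_SCORE = {
    "strongly disagree": 0,
    "disagree": 25,
    "neither agree nor disagree": 50,
    "agree": 75,
    "strongly agree": 100,
}

QUESTIONS = ["Q1", "Q2", "Q3", "Q4", "Q5"]  # the five engagement questions


def engagement_index(responses, weights=None):
    """Moosh a list of respondents' answers into a single 0-100 index.

    responses: list of dicts, one per respondent, mapping each question
    to a Likert answer. weights: per-question weights (assumed equal here).
    """
    if weights is None:
        weights = {q: 1 / len(QUESTIONS) for q in QUESTIONS}

    per_person = [
        sum(weights[q] * LIKERT_TO_SCORE[r[q]] for q in QUESTIONS)
        for r in responses
    ]
    # Average across respondents: one number for the whole department.
    return sum(per_person) / len(per_person)


sample = [
    {"Q1": "agree", "Q2": "agree", "Q3": "neither agree nor disagree",
     "Q4": "disagree", "Q5": "agree"},
    {"Q1": "strongly agree", "Q2": "agree", "Q3": "agree",
     "Q4": "neither agree nor disagree", "Q5": "agree"},
]
print(round(engagement_index(sample)))  # a plain 0-100 number, not a percentage
```

Note that what comes out is just a number on a 0-100 scale – which is exactly Jean’s point about the spurious “%”.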
Beyond the survey we find…the initial findings. The initial findings – there’s a title to stir the soul. The initial findings are about the rationale behind the survey and why it is important to measure engagement. Apparently it is important to measure engagement because:
Rotten vegetables aside – this whole project is dull and unimaginative. Trying to measure engagement sucks. Sucks like a vampire. It sucks cost and it sucks time (I estimate that the completion of the survey alone took over 6,000 person days). And having gone to all the trouble to measure – the evidence shows us that few believe action will be taken, fewer still believe that any action taken will be managed well. This sucks. Sucks in a way that the good Count Dracula himself would be proud of.
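That 6,000 figure is back-of-the-envelope stuff: the ten minutes per survey and the eight-hour working day below are assumptions of mine; only the 325,119 respondent count comes from the published results.

```python
# Back-of-the-envelope check on the "over 6,000 person days" estimate.
# The ten minutes per survey and the eight-hour working day are assumed;
# only the respondent count comes from the published survey results.
respondents = 325_119
minutes_per_survey = 10      # assumed average completion time
hours_per_day = 8            # assumed working day

person_days = respondents * minutes_per_survey / 60 / hours_per_day
print(f"{person_days:,.0f} person days")  # about 6,770 - comfortably over 6,000
```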
Stop measuring engagement and just start doing it.