Recognise This! – Cumbersome yet limited, 360-degree reviews tried to solve the problems of the traditional annual performance review but missed the mark.

Regular readers know my CEO, Eric Mosley, recently published his latest e-book, The Crowdsourced Performance Review, which explains why the traditional annual performance review is broken and how it can be fixed through crowdsourced positive feedback.

I’m often asked how this is any different from 360-degree reviews. Let me be frank: 360-degree reviews have as many problems as the traditional annual appraisal process:

  1. 360-degree reviews are not truly anonymous. Sure, those providing feedback through a 360-degree review process are told their feedback is anonymous, but everyone knows (or assumes) it’s easy to figure out who is giving the feedback. As a result, answers are often less than honest or heavily skewed.
  2. The sample size is much too small and usually limited to those directly within the daily circle of the subject. This is one reason why anonymity is impossible, but more to the point, not enough people are providing feedback. Why limit 360-degree reviews to just a few? That’s easy – because it’s a cumbersome tool to administer and manage, especially as more people are added.
  3. The 360-degree review process is too structured. Both in terms of the type of feedback solicited and who it’s solicited from, the feedback results in a “score,” which feels like a win/lose proposition. Depending on the survey tool, the data is often normed, which can also feel like a judgment. What happens if you score in the 10th percentile on communication skills, for example?

Don’t believe me? Here’s an excerpt of what Marcus Buckingham (author of First, Break All the Rules) has to say about it in a Harvard Business Review article:

“I still think all but a very few 360 degree surveys are, at best, a waste of everyone’s time, and at worst actively damaging to both the individual and the organisation. We could stop using all of them, right now, and our organisations would be the stronger for it…

“No, my beef with 360 surveys is more basic, more fundamental. It’s the data itself. The data generated from a 360 survey is bad. It’s always bad. And since the data is bad, no matter how well-intended your coaching, how insightful your feedback, how coherent your leadership model, you are likely leading your leaders astray…

“The bottom line is that, when it comes to rating my behaviour, you are not objective. You are, in statistical parlance, unreliable. You give us bad data… Each individual rater is equally unreliable. This means that each rater yields bad data. And, unfortunately, when you add together many sources of bad data, you do not get good data. You get lots of bad data…

“Your raters are a non-random group of people who happen to work with you or report to you. In statistics we call this a ‘skewed sample.’ Add up all their ratings and you do not get an accurate, objective measure of your leadership behaviours. You get gossip, quantified.”

A crowdsourced review done in the way Eric proposes in his e-book, on the other hand, solicits regular, ongoing, positive feedback throughout the year from all employees. That feedback takes the form of detailed, specific recognition for those who demonstrate and achieve what is most important to your organisation – your core values and strategic objectives.

The “sample” size is much larger, feedback doesn’t need to be anonymous, and the big data generated reveals many other interesting truths for analysis.

Does your organisation use 360-degree reviews? Have you participated in one (as a reviewer or as a recipient)? Do you think they serve a valuable purpose?