There has been much hullabaloo about Facebook’s recent foray into testing the hypothesis that social media can affect our moods. This is presumably so that Mark Zuckerberg can persuade advertisers to spend more money on the social media network that brings cats, people and pictures of your breakfast together. In case you are not aware, Facebook conducted a social experiment in which it manipulated the news feeds of more than 600,000 users to find out whether positive or negative emotions could be transmitted like a virus via a social network. For one week, the news feeds of one group were manipulated to contain a high proportion of ‘uppers’, whilst another group were given the ‘downers’.

I write as someone with a scientific, business and artistic head on my shoulders, having started life as a scientist, taught MBAs, run a business for 20 years and been a musician all my life. With this in mind, the questions I ask are:

From Science: Is it good research?

From Business: Is it legal?

From Art: Is it decent?


Putting my scientist’s head (and glasses) on, Facebook’s experiment is exceptionally good research. In all normal situations, such as a clinical trial, questionnaire or focus group, the subjects know they are being ‘experimented upon’. This can produce all sorts of spurious effects, such as social desirability bias, which distort the research. In this study, nobody knew they were being experimented on, which is indeed rare. On the question of numbers, many research studies have relatively few subjects, so the results are not always statistically reliable. In this case there were 689,003 people in the study. By anyone’s standards, these are the hallmarks of a good piece of research.

Turning to the other scientific question, “does the study tell us anything?”, the answer is rather more mixed. Because the sample was so large, even tiny differences show up as statistically significant. Facebook did not really find out anything remarkable from this study. The researchers reported:

“Results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

Yet this conclusion is based on extremely small shifts in how many positive or negative words were used in posts by the subjects who received the positively and negatively manipulated news feeds.

So, big numbers don’t always make for big data or big intelligence. Statistical significance is not the same as importance.
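To see why a huge sample makes even a trivial shift ‘significant’, here is a quick sketch in Python. The numbers are entirely invented for illustration (a hypothetical 0.1 percentage-point shift in positive-word use, with a spread of 2 points); this is not Facebook’s data or analysis, just a demonstration of the statistical point.

```python
import math
import random
import statistics

random.seed(42)
n = 689_003 // 2  # roughly half the study in each feed condition

# Hypothetical per-user scores: percentage of positive words in posts.
# The 'treated' group is shifted by a trivial 0.1 percentage points.
control = [random.gauss(5.0, 2.0) for _ in range(n)]
treated = [random.gauss(5.1, 2.0) for _ in range(n)]

diff = statistics.fmean(treated) - statistics.fmean(control)
se = math.sqrt(statistics.variance(control) / n
               + statistics.variance(treated) / n)
z = diff / se  # two-sample z-statistic for the difference in means

print(f"difference in means: {diff:.3f} percentage points")
print(f"z-statistic: {z:.1f}")
```

With roughly 345,000 users per group, the z-statistic comes out far beyond the 1.96 threshold for significance at the 5% level, even though the underlying difference is a shift almost nobody would notice in practice. That is the gap between statistical significance and importance.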


Now I put my business head (and wig) on; I do have a Latin O Level, which is apparently sufficient to be a judge. If we want to be pedantic, Facebook broke the law, since its terms and conditions did not tell people that it was entitled to conduct research on them. Facebook introduced this clause some time after the study was done. Does that matter? Probably not, because the social contract with Facebook implies that it can more or less do what it likes with and to us. Most people tacitly accept the deal by using the social network, until someone wakes us up.

There is one exception to this point. Apparently Facebook did not exclude people under 16 from the study. One could argue that it was foolish to include minors, given Facebook’s ability to select the subjects it wanted to research. It would perhaps have been more interesting to run a number of smaller experiments on groups of people segmented by age, gender and the vast amount of psychographic data that Facebook holds about us. It could lead to some more fascinating insights, for example:

and so on …

Facebook may have been foolish to do the study without telling people, but overall I have to say that this is the price we pay for social media.


So, I find little wrong with Facebook’s experimental design, even though the findings are unremarkable. In legal terms, Facebook strictly should have modified the terms and conditions to include ‘research’ before conducting the study. But Facebook is a free network, and there is a fairly well understood social contract in operation here:

“We can chat to our friends, share cat pictures and so on in exchange for whatever Facebook chooses to do to us or with us”

So, the remaining question is “Should Facebook have done this to us?”. Given the magnitude of the findings and the perception by many that this was an immoral position to take, probably “No”. Does it matter that they did? Probably “No”, in my humble opinion. It hardly compares in importance or gravity with other social experiments such as the Milgram experiment, which asked subjects to administer what they believed were electric shocks to another person, to test theories about obedience to authority figures. Sure, a few people may have felt marginally more positive for a few days and another group a bit down, but that is hardly earth-shattering in a world where we are bombarded by mood-altering stimuli every day.

So, let’s get over this and learn from it.

This article is to be published in a forthcoming book by Kandy Woodfield on the social research implications of social media in general. Please click on SOCIAL RESEARCH to see if this is something you would like to contribute to, or comment on this draft if you wish me to include something in the final article.