Facebook vows to be more careful when toying with your emotions


Facebook has altered its research guidelines following outrage over a test that monitored how changes to News Feed algorithms affected users’ emotions.


The firm was heavily criticised for a psychological experiment conducted on 700,000 people, which found users were more likely to post positive or negative status updates depending on the overriding mood of their News Feed.


Today the firm admitted it was surprised by the backlash and says it has “taken to heart the comments and criticism”. Facebook says it is now adding more checks and balances and also considering alternative methods of research.


“Although this subject matter was important to research,” Facebook's CTO Mike Schroepfer wrote on the company's Newsroom blog, “we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism. It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research.


"The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.


“Over the past three months, we’ve taken a close look at the way we do research. Today we’re introducing a new framework that covers both internal work and research that might be published.”


Facebook went on to say it has given its research teams much clearer guidelines, claiming an “enhanced review process” will have to take place when studies relate to content that may be “considered deeply personal (such as emotions).”


Does the knowledge that Facebook will only manipulate your emotions when it believes it has due cause and a moral obligation settle your concerns? Or are you still freaked out by being a guinea pig? Share your thoughts below.
