OK, I'm late to the party here...
But I think this latest news has finally persuaded me to open a Facebook account,
Facebook let researchers adjust its users' news feeds to manipulate their emotions – and has suggested that such experimentation is routine, which is seemingly how the idea got past the advertising firm's ethics committees. In 2012, the trick cyclists, led by the company's data scientist Adam Kramer, manipulated which posts from …
"But I think this latest news has finally persuaded me to open a Facebook account"
Me too, I want to be part of all these experiments, just did not know what I was missing :-)
I wonder which of the various things I called him did it?
(yeah, I know, if you're not paying you're the product not the consumer...)
These people definitely need the 'trick-cyclist' label. There is no way they qualify as boffins.
One of the most annoying things about this for me is that it gives a bad name to social scientists in general. I know there is not a lot of love for social scientists on these forums, but this really doesn't help anyone.
I cannot believe that someone signed this off - informed consent is something every social science undergraduate has drummed into them from the first day.
What will be interesting to see is whether any of the US regulators step in; there have been suggestions that this breaks federal law in the US: (http://laboratorium.net/archive/2014/06/28/as_flies_to_wanton_boys)
0% creepy, rilly. You get more emotional response manglement whenever a politician or a spokesperson appears on the tube and uses newspeak to whip up a frenzy. Extra buffer stuffing when it's an economist working for our top clown court.
Oh, on the contrary, you manipulated people's feeds to show them negative posts in the hope of making them negative (as that was the outline of your research) whilst doing the opposite to the other half - therefore you fully intended to upset half of your research 'volunteers'...
Yeah - that's the way I read it too.
Studies like this are always problematic, as there's no real way to conduct them without keeping people in the dark. When a new drug is tested it goes through double-blind trials against a placebo. Participants are fully aware that they may be taking a placebo, but it works because there is no way to tell whether you are in the control group or not.
With an experiment like this, telling people the parameters of the study would expose the whole thing. Sure, it's possible that all your friends are just sad all the time but if you notice that your feed is now a bit less or more upbeat than normal then you're going to have a pretty good idea which group you are in.
So, this research is near impossible to conduct while still getting informed consent. The question has to be asked, then, whether the research is valuable enough to warrant what has happened. I would suggest not. It's important that science not simply settle for 'common sense' answers but I don't think it would be too detrimental to our understanding of emotions if we just assume that people exposed to predominantly negative information take on some of that negativity themselves.
After all, advertising works so it's really not a stretch at all to assume that 'emotional advertisement' works too.
Indeed a specious statement - cognitive dissonance generation at both ends.
It could have been done reasonably well. Throw up a notification asking if people are willing to be part of an experiment on social behavior, which may alter their experience of Facebook for the next week. Explain that providing any more details about the experiment would alter people's behavior and invalidate the results, but provide more information about the study and which group people were in after it's over. Not complete information, but enough for reasonably informed consent, and far better than how they provided no information and obtained no consent.
Second comment: Creepy indeed, and it's going to get worse.
Third comment: I propose we name this kind of thing "inverse spam", i.e. "$BIGCO is blocking messages that I explicitly signed up for! What absolute assholes!"
Comment the fourth: Consider Chevy paying Farcebook/Twatter/et alia to block all positive posts about Ford vehicles. Or vice versa.
Comment the fifth: I fear we have reached the no going back side of the slippery slope when it comes to personal privacy. George Orwell was only off by a couple decades.
Next it will be the individually crafted reorder of your newsfeeds being monetized for ad-slinging (actually product placement) purposes. Welcome to the world of tomorrow, which Orwell, Huxley, Ira Levin and Gibson could not even dream of.
"Next it will be the individually crafted reorder of your newsfeeds being monetized for ad-slinging (actually product placement) purposes."
Which is exactly what the "top stories" newsfeed already does if you allow it.
Take it you don't have a Googlemail account then?
Were psychologists able to determine if they have a greater influence on news feeds than Facebook bugs do? I think they're running one of those "eventual consistency" databases, where "eventual" extends to infinity.
If you walk into a sheep pen and behave like a sheep, you should expect to get sheared.
Or end up served with mint sauce.........................................Better to be a wolf really...
I just got more irritated and pissed off with Facebook for not giving me what I want to see in my newsfeed, which is ALL posts that I've elected to receive, most recent first - in other words, with none of their fancy filtering applied to try to determine which ones I might want to see. It was long ago that I decided I don't fit anyone's standard profiles, and FB is not an exception to that.
I think on this occasion you fit the standard profile perfectly. That is what ALL their users want from their news feed, but Facebook won't give it to them.
So, people are happy when their friends are perceived to be happy and sad when their friends perceived to be sad. Well done. Now where's my 'research' money?
IOW "Nobody likes me, everybody hates me, I think I'll go eat worms."
Probably made worse by the "do-over" generation.
There will probably be a video going "viral" by morning, PST.
The long thin juicy ones slip down easily,
The short fat furry ones stick.
It is an advertising platform.
Regardless of what a few mind-boffins do, advertising is intended to mess with your mind and get you to buy crap you don't want, and is MUCH worse than a slight emotional warping due to the order you receive news in.
Getting uppity about this while placidly sucking on the advertising teat makes no sense at all.
A benign advertising platform Facebook isn't:
"They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. "
Horseshit Mr Manning, it's an experimental advertising platform, and doing more and more of this creepy shit isn't going to bring new disciples!
Yup, just another day on the treadmill for the Facebook lab rats.
The _Atlantic_ article states, in part: "The backlash [against the research methodology] in this case, seems tied directly to the sense that Facebook manipulated people -- used them as guinea pigs -- without their knowledge, and in a setting where that kind of manipulation feels intimate." The outrage, then, seems hinged on people thinking that fecebook does NOT manipulate people - that a company whose revenues derive in no small part from adverts would not manipulate its users. Hokay.
Not sure how this is different to measuring the effectiveness of said adverts ("using illustration A, 24% of viewers clicked the link, while illustration B only got 15% of viewers to click") - viewer sees stimulus, viewer takes action, fecebook keeps track, and correlations are hypothesized. As someone posted above, if a service is free you are not the customer, you are the product.
Having said that, if fecebook does manage to get its hands slapped this time, hallelujah.
Tracking your response to adverts is one thing. Going out to purposely manipulate people's news feeds to see if you can make them angry or happy or depressed - without telling anyone that they have been opted into an experiment they have no way of opting out of - is a completely different thing.
Kramer: "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
So, this was actually a brand building exercise. Just like a toothpaste company's research, except that it made you and your kids not so cheery.
If one of those 700,000 goes on a shopping mall killing spree, will they be punished in even the slightest way, like GM is facing? Ah... no... we are the product...
700,000 subjects, half being pushed negative, half positive. So with 350,000 being made more depressed than they would otherwise have been, I'd expect at least one to have committed suicide that wouldn't otherwise have done.
1. Set your News Feed to show "Most Recent" rather than what some algorithm wants to promote.
2. Go through all your friends and individually set who you want to follow (there will be some people you want to keep on your friend list but don't feel the need to read about their breakfast every day).
That's it. Now you get the stuff you want.
For some reason, many Facebook settings only work for a few days before mysteriously resetting to defaults.
The Newsfeed settings vanish so quickly that I don't use it at all anymore.
Also, 'most recent' still has a scrambled order. Random posts are selected to be ordered based on the date of the last comment, not the date they were posted. How this order differs from the top-stories order I'm not sure, but it's not what I'd expect from 'most recent'.
If you append ?sk=lf to the end of the facebook URL you should always get latest posts (and comments, unfortunately) first.
Dunno why they show me stuff that friends have commented on but that I'm not allowed to, though.
Don't participate in so-called "social" media.
Instead, cook dinner for your loved ones, or go have a beer with your mates.
"Social" media is anything but social ...
...and it is channel 3
Adam Kramer of Facebook is reported by BBC.co.uk as saying
""At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
The guy is missing the point. It's Facebook, with its arrogant attitude to people, and the fact that not all of us feel the need to plaster the minutiae of our lives on the internet, that leads many to avoid visiting Facebook - a fact I am relieved to see is dawning on some of the Facebook users I know, who are cutting down or ceasing their use of the data-mining service. Many of us can see NO value in joining. For my part, I feel that joining would lead to a deficit in privacy and in the quality of contact with friends. I prefer to relate to my friends individually - I feel they are worth that time and effort, rather than employing the electronic scattergun approach of spraying dull details of everyday life across t'interwebz.
Simply put, Mr Kramer, Facebook really isn't as compelling as you feel it should be, and - happily - people are starting to question their participation in it. I have never been and never will be part of it. Companies that increasingly demand a Facebook or Twitter account to interact with them should take note of the fact that I am part of a sizeable section of web users, one that may well grow in the future.
This is as sinister and underhanded as subliminal advertising, and it is time that this "service" was more strictly controlled - those running it clearly have no morals when it comes to messing with other peoples heads.
So they acknowledge direct psychological manipulation of unknowing users? That's a pretty stupid thing to do considering that amongst the vast number of FB users there must be at least some who are at one time or another in a mentally fragile state.
The Ts & Cs may refer to use of data for research purposes, but this amounts to use of the users' minds for research purposes - and that sure isn't covered.
It doesn't surprise me that FB doesn't see any difference, or care if there is, but it worries the hell out of me that the researchers didn't see any need to consider the ethical implications of what they were doing.
All the people who responded positively to negative stories and vice versa get tagged as potential psychos and then get nothing but gun and knife adverts.
They should have consulted the Labour "advisors" on how to manipulate people's feelings.
I see, they work for the current government now!
And this kind of nonsense is why anyone with a hint of sense has FaceBook Purity (FBP) installed and set to remove that waste of space called the newsfeed from sight.
If I want the news I'll look at a news website.
Actually it's called fluff busting purity, because otherwise Zuck + pals might get a tad... litigious about it.
A vote from me for fbpurity: the function that hides posts containing your chosen keywords is a $deity send.
OK I signed up and accepted that they could use data I provided for research.
Where is it written that actually that means I'm signing up as a test subject? This is an experiment I could have been involuntarily involved with. If a drugs company did it there would be hell to pay.
Doesn't this actually contravene human rights legislation? Maybe the EU needs to get involved and find out how many EU citizens were used as human guinea pigs without consenting.
What more proof do we need that Facebook is garbage — as are the sites that use it as their commenting platform? It lives down to the lowest ideals of its founder.
to ditch FB.
It was Adam Kramer's comment "it was in 2012, and we've come a long way since then" that really sent the chills down me.
You do know that the more recent mobile phone versions have your permission to read your texts, right?
What happens to those "volunteers" who suffer with any kind of depression, and who rely on facebook to provide their "warm fuzzies" and help get them through the day?
As many here have pointed out before, if you get it for free you are not the customer you are the product.
What Farcebook are forgetting is that their members are not only the product, they are the resource. Any company that hopes to have any kind of longevity in its chosen business knows that managing and shepherding its prime resources is fundamental to success.
When you treat resources as a limitless, maintenance-free supply of product that will always be there, someday you are going to wake up to a nasty shock: it will all be gone.
Let's Hope Eh!
.. Simply refuse to have any dealings with a company that advertises on Facebook. If it's your bank or power supplier, then switch, and tell them you are switching because by running adverts on Facebook they have supported extremely dubious social engineering experiments.
No one will of course... Such is the ennui
Makes you wonder how many idiots (people already suffering manic depression etc.) went out and injured others (the normal trolls, in many cases) or themselves because they were being manipulated. It sounds as if they were 'practicing' (researching) psychology/psychiatry, which to the best of my knowledge, information and belief requires a license.
Our government has been known to 'experiment' with what it seems to think is ITS property (People) for years.
next they will be shaping mainstream news to change the way we think..........
I'd get my coat but I'm a coward!