So, by inference, these guys wouldn't mind if someone secretly & surreptitiously decided to deliberately provoke them emotionally without regard or consent?
Facebook's “creepy” feed-manipulation experimentation, which has generated an avalanche of outrage among users, isn't without its chums. A growing collection of psychologists and tech pundits is linking arms, standing next to Mark Zuckerberg, and singing “We Shall Overcome” in the direction of mobs carrying metaphorical …
"So, by inference, these guys wouldn't mind if someone secretly & surreptitiously decided to deliberately provoke them emotionally without regard or consent?"
Actually that test has been done (at least once) when (IIRC) a professor systematically bullied and belittled one of his research students.
I can't recall if it really ended in murder (as an episode of CSI does) or just violent assault.
The US Psychological Association was very relaxed about USG interrogation methods at Guantanamo Bay.
Rather more like their Psychiatric counterparts in the former Soviet Union in fact.
There is no such thing as the "US Psychological Association." You might be referring to the American Psychological Association which forbade its members to participate in the Gitmo "intelligence gathering" and voted to sanction any member who participated. That's not really very laid back.
You might also be referring to the American Psychiatric Association, whose members are MD psychiatrists. They deferred to the American Medical Association's sanctions against any physician who participated in the Gitmo interrogations.
I have no idea who USG is, so I can't respond to that. The only USG that I've heard of is the building materials company.
Any academic who used human subjects in any way without informed consent would have been summarily fired. It isn't that it doesn't happen, it does, but academics will end careers in a heartbeat when it is found out. If there is an ethical, legal, or moral issue, even tenure does not help you a tiny bit.
So if Tal Yarkoni has a family...would he mind if someone told him they had just all been killed in an auto accident...just to test his emotions? Even though they were all safe at home?
After all... “The manipulation had a negligible real-world impact on users’ behaviour”, Yarkoni writes.
or in his case...
“The manipulation would have a negligible real-world impact on users’ behaviour”, Yarkoni writes.
Isn't that right Tal? No problem is there?
Since the 2012 election, FB has been touting that it will have a "package" ready for sale by the 2016 election. This software is being heavily pitched to Republican groups as a last great hope to not only influence the perception of issues and candidates, but, by introducing tiny false data bits, to change the outcome of elections. In other words, pay Sucker-Man enough money and he will be able to get you elected.
Huh? Zuck is hard core Democrat. I am sure he is pitching a product to help the Obama Republicans win in the next election.
Cantor's loss has terrified a lot of career politicians and their supporters. I am sure Zuckerberg is working very hard to make sure that doesn't happen again.
Bootnote: Back in the Reagan days there were Democrats known as Reagan Democrats.
"And El Reg can't help but wonder why informed consent is a concept that requires scare quotes."
Simple. If they have to act responsibly, that's a barrier to arbitrary action, usually intended to benefit them financially in some way, even if indirectly. I very much doubt Facebook worked with these folks purely out of interest in science, even if they had received no or minimal compensation. Knowing these sorts of things helps them figure out how best to monetize their users.
If companies and/or academics working with them had to request consent or, possibly, be completely barred from such research, then they get to investigate how to monetize more slowly, or not at all, and they will resort to exactly this sort of red-herring argument to try and hedge against that risk.
It's really quite disgusting.
The fact that requesting consent alters the outcome of the observation is well established in psychology. It's part of why they typically write the consent forms in a vague way before you sign them. Then they tell you they are testing for something when in fact they are testing for something else. A prime example is the experiment where they told the actual test subjects that they would be assisting a certified authority in questioning people who had committed a crime that resulted in its victims being in imminent danger. The subject was then asked to run the dials on an electroshock mechanism to encourage the criminal (who was in actuality an actor/actress) to tell the truth. They got the test subjects to turn up the dial to lethal levels with fair ease. If they had told the test subjects it was a test of their own morality, they would likely not have gotten those results. (I'd contend they'd have gotten different results if the alleged perp had been connected to a real machine and the pain and suffering had been real instead of an actor, but that's not relevant to the main point.)
The thing is, even though we KNOW informed consent is an obstacle to gathering accurate data, we as a society have decided it is more important to require informed consent than to gather the information. This isn't merely theoretical. You can hunt down quite a few horror stories from when informed consent was not required. The most prominent one to recently get a public film is probably the Tuskegee syphilis experiment, and of course the ultimate extreme is some of the experiments the Nazis ran on "undesirables" in WW2.
For more of this sh*t, in fact an onslaught of it! But they needed to gauge the public opinion first, and then steamroll over it with implicit consent, like 'manufacturing consent' for the Iraq War... i.e. Desensitise you and break you down drip by drip with false flag data...
I believe the announcement is more an indication of completion, and an effort to soften the blow of what FB has been doing for quite some time. The fact they announced the news indicates that they want posting behaviour to change, which in turn will skew any future results.
What FB psychologists did went far beyond just FB content. From my observation, they did a mixture of FB content with real-life interjections. For a psychological study you need some sort of real-life interaction to gauge conclusive results (phone, in person, etc). No doubt they read messages, viewed posts and pretty much all the personal information shared to formulate a case study. No doubt they caused a lot of severe reactions in the real world with their invasive, privacy-violating methods.
<No doubt they read messages, viewed posts and pretty much all the personal information shared to formulate a case study.>
No, they didn't do those things…
" Researchers did not view any names of users or even the words posted by users. They relied on automated text analysis, through a software program called the Linguistic Inquiry Word Count, to measure the emotional content of each post."
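For readers unfamiliar with how that kind of automated analysis works: tools like LIWC essentially count how many words in a post match predefined emotion word lists, without any human reading the text. Here is a minimal illustrative sketch of the general word-counting approach; the word lists below are made-up stand-ins, not LIWC's actual dictionaries, and the function name is hypothetical.

```python
# Sketch of LIWC-style word counting: score a post by the fraction of
# words that match small positive/negative word lists. The lists here
# are illustrative stand-ins, NOT LIWC's real dictionaries.
import re

POSITIVE = {"happy", "great", "love", "wonderful", "good"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "bad"}

def emotion_scores(post: str) -> dict:
    # Lowercase and split into word tokens (letters and apostrophes).
    words = re.findall(r"[a-z']+", post.lower())
    total = len(words) or 1  # avoid division by zero on empty posts
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return {"positive": pos / total, "negative": neg / total}

print(emotion_scores("What a wonderful day, I love it"))
```

Note that no names or message content need to be surfaced to a human; only the aggregate fractions per post are kept, which is consistent with the quoted description above.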
<No doubt they caused a lot of sever reaction in the real-world with their invasive and intrusive private invading methods>
This is closer to the real issue, but still wrapped up rather in pitchfork talk. The research might have triggered a negative reaction sufficient in some vulnerable participants to precipitate significant harm. Did the researchers anticipate and mitigate that possibility?
I say this not because I am defending the apparent absence of consent in this research, but because there is a baby in this bathwater.
It's almost certainly not some ghastly conspiracy in order to get permission to do even more hideous things in future.
It's far more likely to be that no-one at Facebook gives a damn about privacy, and it never even occurred to them that there'd be a problem, or that anyone would object. After all, they're used to people giving them all their most intimate data, with almost no restrictions. So it's little wonder that they feel they have the right to do whatever the hell they like with it.
"It's almost certainly not some ghastly conspiracy in order to get permission to do even more hideous things in future. It's far more likely to be that no-one at Facebook gives a damn about privacy, and it never even occurred to them that there'd be a problem,"
I think it's a combination. Clearly they do care about privacy, but only from a fallout, backfire, loss-of-monetization angle. As regards conspiracy? No, but they do keep pushing the button, i.e. "what can we get away with doing with all this personal info"; it's just too irresistible to FB data-mining gimps...
The author appears to have a very poor understanding of research methods regarding human subjects.

First, I'll point out that the author likely uses behavioural priming regularly in article titles, and benefits from previous research in this area demonstrating its effectiveness.

Second, the researchers could have known in advance that the effect would be small, given the extensive literature on emotional priming conducted in both controlled lab environments and field studies. In addition, a tiny effect probably means shifting the average person less than 0.10 on a 1-10 scale measuring happiness.

Third, the business version of informed consent is the agreement each user accepts when registering, and most major companies perform some type of research. In fact, I think we should applaud Facebook for being willing to publish their findings rather than keep them in house, and for being willing to team up with academics to interpret the data properly; otherwise the in-house spin would have ensured this occurred even more frequently without the public's knowledge.

Lastly, the author should be ashamed for throwing all psychological researchers under the bus without understanding that there are far more topics and studies being conducted that greatly benefit people. Creating a sense of distrust by writing an inflammatory article without fully understanding the topic is a woeful excuse for journalism.
"Lastly, the author should be ashamed for throwing all psychological researchers under the bus without understanding that there are far more topics and studies being conducted that greatly benefit people. Creating a sense of distrust by writing an inflammatory article without fully understanding the topic is a woeful excuse for journalism."
You sound quite excited by the work.
Funny you can't quite bring yourself to put your name on your comments, isn't it?
If this were the first ethically questionable piece of psychological or psychiatric research, you might have a point, AC. But it's not. Research standards and ethics are not just there to protect the individual but to attempt to give some credibility to the research. In psychiatry and psychology this credibility is often spurious, as research is not an open question (what's good for depression?) but a closed one (does our new wonder drug cause problems?). Results, even with standards and ethics committees, are manipulated until a small statistical anomaly becomes a therapeutic advantage. The research is hard to reproduce with patented proprietary chemicals because of the financial implications. This is partly because research is carried out for, or on behalf of, public companies, so shareholders' needs must also play a part in the skewing of results. For an interesting historic perspective on fraud in research, and some really gross examples from psychology and psychiatry, see Betrayers of the Truth by William J. Broad and Nicholas Wade, ISBN-13: 978-0671447694.
There is no "business version" of informed consent. That is complete nonsense. Many of the unwilling participants were underage children who are legally incapable of giving informed consent in any fashion. The informed consent that FB is claiming is for "internal operations" use only, not experimental academic research.
The results will likely be: 1) Adam D. I. Kramer will be fired from FB. The lawyers have already added after-the-fact language to the TOS giving permission for academic research, but even this will not hold legal water. 2) The four low-level researchers will be fired and their careers essentially terminated for ethics violations. They also face jail time for lying to the IRBs about the funding and nature of the "research." 3) Cornell, UCSF, UCSD and Yale will likely be hit with research sanctions ranging from requirements for increased IRB scrutiny, to IRB training, to limits on research similar to football recruiting sanctions. 4) There could possibly be repercussions from the illegal use of federal and state tobacco settlement money. 5) There will likely be serious repercussions in Europe from the use of non-consenting underage experimental research subjects. Huge (multi-billion dollar) fines are a likely result.
Mostly these things happen behind closed doors, but academia has ways of punishing people who violate moral, legal or ethical rules and laws. Bad things are going to happen to a lot of people's careers.
They are likely working for the Government as part of the research for how to make us believe all the bullshit that the politicians would like to tell us.
I haven't forgotten this - http://www.theregister.co.uk/2012/05/29/science_and_maths_knowledge_makes_you_sceptical/
Now the psych boys have to show that they can make us believe this bit of "It's OK" crap.
The actual outrageous deed could have been anything at all for this purpose.
Really? Is that a valid excuse? Isn't that the sort of stupid concept that Blair and Bush used to argue that we should invade Iraq... Look what a success that turned out to be.
But then again it is from someone in Texas and they do seem to have a completely distorted view of reality there.
My understanding of Facebook's actions is that they did it because they could.
And that it would make them richer than Croesus, or possibly even Apple, if they would be able to manipulate the emotions of their users as posited. It would take advertising to a new level and really, really stick it to Google.
Just think of implications in the political sphere if Facebook had a preferred candidate.
Just what is the cost of political mood?
"The TL;dr version? You're all wrong, quite possibly ignorant, routinely manipulated, and why should we let ethics get in the way of science?"
I guess he's fine with the famed "monster" study of Wendell Johnson, or the equally notorious sex-change operation carried out on infant David Reimer that led, years later, to his suicide. Or, moving to animals, what about Seligman & Maier's experiments on dogs, which involved repeatedly subjecting them to electric shocks, or the experiments on monkeys carried out by Harlow - both explicitly designed to cause gross psychological distress in the animals under their care?
I am not going to claim that there have never been useful results out of unethical experiments and studies, but I think most people would agree that the trade-off is not necessarily worth it.
The words of Spock - the needs of the many outweigh the needs of the one, or the few - have some merit but it all depends on how they are applied.
It is one thing to choose to give a vaccine to a group of people rather than a single person*. It is another thing, entirely, to purposefully infect a healthy person so you can study a disease and thus help cure a group of people.
I am a supporter of science and I believe that one of the hallmarks of a civilised society is the pursuit of knowledge. I also believe, however that another hallmark of a civilised society is that it has left 'the ends justify the means' behind long ago.
* - Perhaps they are in different locations and you only have time to reach one.
"The words of Spock - the needs of the many outweigh the needs of the one, or the few - have some merit but it all depends on how they are applied."
Careful there. That's only a Planck length away from things like "the need of the society as a whole to feel safe outweighs the right of an alleged criminal to be judged fairly, so if we're not sure, we should just lock him up to be on the safe side" - after all, the 'weight' would surely be on the side of the masses...
Me, I'll just stick to the "the needs of no matter how many can never outweigh the rights of even the single one" version. Sorry, Spock.
Absolutely, which was the point of "depends on how [it's] applied".
I certainly agree that the needs of the many doesn't (don't) outweigh the rights of even a single person, but that is not what I was talking about.
Of course, not all rights are of equal importance and so some are set down with limits. I have a right to say what I want but only in so far as that doesn't overly impact another's right to be left alone.
Yahoo / Flickr rolling out crappy beta versions of websites to an unsuspecting and random handful of subscribers, then foisting a basically shit user experience on people sometime later? I mean, in the end, the service provider picks what experience the user will have and they try out different versions on experimental subsets of subscribers.
"There's not much risk, for example, that re-weighting sponsored posts that pop up in a user's feed will make someone with clinical depression feel worse."
Actually, all those ads are making people—depressed or otherwise—feel worse. We are manipulating the human mind into spending money on unnecessary shit in order to make a few people more wealthy.
As the AC pointed out above, the business version of informed consent is the agreement each user accepts when registering, and most major companies perform some type of research. It makes me wonder if the user agreement would really cover this, at least in a legal sense. It's not as though in signing up for a service you are expecting to be experimented upon.
Also, academics have to put their experiments through a review process before going forward. Part of that process is an ethical determination of expected or possible harm to the subjects put against the expected gains in knowledge. From a business practice, I wouldn't be surprised if this came down to "Is this likely to cost us more money than it is likely to generate?" Just a thought.
These academics DIDN'T put this experiment through a review process or it would not have gotten off the ground WITHOUT informed consent.
And no, no business can ever include this kind of legally binding consent in their terms and agreements for service delivery. This kind of thing gets the same multiple-places-initialed-and-signed treatment you have on a standard mortgage agreement.
Sadly, you're right ... not me though ...
Oddly, you actually "apply" for an account closure.
Best e-mail, ever, follows ...
------------- snip ---------------
We have received a request to permanently delete your account. Your account has been deactivated from the site and will be permanently deleted within 14 days.
If you did not request to permanently delete your account, please login to Facebook to cancel this request:
The Facebook Team
The author's bias is absurd.
1) The "research" was funded by Facebook and by money stolen from smoking cessation research funding. The "researchers" lied to the Cornell and UCSF IRBs about the funding, claiming that it was funded by the Army Research Office and private foundation money. Once the claim of Army funding was made, even if it was false, it brought the "research" under the aegis of the Office of Human Subject Protection.
2) The "researchers" lied to the Cornell and UCSF IRBs about whether it was experimental or observational science. They claimed falsely that this was only analysis of pre-existing data that had already been collected. This is against the law.
3) A significant number of the participants (into the tens of thousands) were underage children who cannot give consent under Federal law. In fact, even looking at the data for such children is a crime. Children get extra protection under the law because they are exceptionally vulnerable.
4) There is a vast difference between observational research, for which informed consent can be waived after review, and experimental research, in which informed consent, opt-out, and other laws must be followed. It becomes experimental research if anything in the subject's environment, including his news feed, is changed.
5) The "permission" cited by Facebook is for "internal operations" only. Other language was added to the TOS after the fact and hardly counts since it was not there and is essentially an admission of wrong-doing on the part of FB.
6) The PNAS action editor characterized the "research" as "creepy" and "likely a very bad example of experimental social research" but published it anyway. No one ever questioned that the paper did not disclose the sources of funding and the serious conflicts of interest involved.
7) Cornell has changed its story several times --- to the point that every point it is making is, at some point, a lie --- to try to escape liability. UCSF, UCSD, and Yale are keeping quiet and hoping it blows over.
If you want to see how these things can go awry, look up the Milgram Experiment. By having "volunteers" faux-electrocute other "volunteers", the experimenter sought to demonstrate the extremes to which people will follow authority. But over half the "participants" refused to cooperate and were dropped from the study. Of the remainder, a number suffered serious psychological harm from the "harmless" and "valuable" experiments. Many heads rolled over this one, and a lot of the regulation goes back to it and to the Tuskegee Syphilis Study.
AVoiceForMen has this to say about the kinds of manipulative scum who use these misandrist attacks to goad men into compliance with their schemes:
All your life you are told by others what it means to be a real man. And you are told how worthless you are if you don't measure up.
Just know this. Anyone, man or woman, sending you this message is trying to shame you into their service. They are manipulating you to carry their load, to take on their hardships; even to bleed and die for their cause... or their profit.
Don't buy the lie. No one but you can define you as a human being or measure your worth. Never trust anyone who puts an adjective in front of the word MAN.
As someone who does research and is vaguely interested in the findings, there are two sides to this. In FB's defence, it seems pretty unlikely that anyone would actually really be hurt by this. There is much worse stuff on the news, even in El Reg. I think it weakens people's arguments against this if they say it's just like prisoner/warder experiments or mind control, etc. It also seems artificial to say that "proper" research that goes into journals has to go through ethical review and full informed consent, but "commercial" research - people ringing you up to ask how you feel about cat food, web surveys, etc. - doesn't. I'm not sure how you change this, but from my experience ethics committees are supposed to reflect community standards, yet often seem to search for problems which are only theoretical - e.g. in a study we did asking older users to use tablets or smartphones, we narrowly avoided having to provide counselling for users who may have got distressed. Advertisers also try to associate their products with positive things - "Soap gets you clean" is only seen in Billy the Fish.
On the anti-FB side, journalists ask lots of offensive questions, and shops etc. effectively run A/B tests all the time by offering discounts to some people and not others.
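The discount example above is the standard A/B testing pattern: users are deterministically bucketed into variants, usually by hashing a stable user ID, so each person consistently sees the same treatment. A minimal sketch, assuming a hypothetical shop with a 50/50 split (the function name, experiment label, and split are illustrative, not any particular retailer's system):

```python
# Sketch of deterministic A/B assignment: hash a user ID into a bucket
# so each user consistently sees the same variant (discount or control).
# The experiment name and 50/50 split are illustrative assumptions.
import hashlib

def assign_variant(user_id: str, experiment: str = "discount_test") -> str:
    # Hash the experiment name together with the user ID so different
    # experiments produce independent assignments for the same user.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # stable bucket in [0, 100)
    return "discount" if bucket < 50 else "control"

# The same user always lands in the same group on every visit:
assert assign_variant("user42") == assign_variant("user42")
```

The design point is that no consent dialog is involved anywhere in this flow, which is exactly the parallel the comment is drawing.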
Also on the anti-FB side: they are staffed by the most annoying people in the world (TM), and they didn't ask me if I wanted to be involved. That said, you can actually get permission for deception in research from most ethics committees as long as you justify it, and I think you probably could have here with a fairly small redesign and possibly the offer of support to people who may have been distressed - although again, distress seems unlikely.
However, I would be prepared to serve on an ethical review board for large Internet companies wanting to do this sort of work, as long as the meetings were held somewhere convenient to unspoiled beaches with rather nice restaurants and first-class travel available.
Biting the hand that feeds IT © 1998–2020