Looks as if that hooded, oh-so-clever Zuckerberg might now be "hoist by his own petard".
The Electronic Privacy Information Center (EPIC) has filed an official complaint with the US Federal Trade Commission (FTC) over an experiment in which Facebook "purposefully messed with people’s minds." "Facebook altered the News Feeds of Facebook users to elicit positive and negative emotional responses," the complaint …
I doubt very much that anything will really come of this to disturb the status quo. Maybe some more stringent requirements around disclosure, maybe a fine, but nothing that will upset the shareholders or trouble Zuckerberg's yacht fund too much.
I do award you a point, though, for getting through a whole post (short though it was) without mentioning:
* Zuckerberg's age (at least not directly)
If possible, I'd give you a second point for making it through without a single ellipsis, pun or gasp!
Even now I'm not sure that "We can use your data for research" means "we can manipulate your news feed just to see if we can make you happy or sad, or suicidal or mad enough to go out and shoot people" (and no, I don't think those last two happened as a result of their manipulation... but we don't know that they didn't, and neither does Facebook).
I still think they're playing with us all - my FB feed keeps flicking back to "Top Stories" from "Most Recent" and my privacy settings keep putting my posts back to "public"... so I wonder if this is another "let's fuck with their minds" experiment.
> "research" is one thing but surely this falls under the definition of "experiment".
Yes, "research" is broad to the point where this doesn't look like informed consent (the crucial word being "informed").
I'm more disappointed by the universities and academics for getting involved. I'm a psychologist at a UK university and I doubt our ethics committee would approve this (nor would I try). I don't think the British Psychological Society's code of ethics would permit this. Any kind of deception, or lack of informed consent, would have to be very well justified in a cost/benefit analysis. Afterwards you'd need to fully debrief people and give them an opportunity to withdraw their data from the study. I'm less familiar with the American Psychological Association's research guidelines and code of conduct, but if any of the academics are APA members then aggrieved parties might consider filing a complaint with the APA and the relevant universities.
Even if this hasn't caused much actual harm, it has arguably brought psychology and institutions into disrepute. It saddens me because my colleagues and I have spent years trying to clean up psychology. Psychological research is very reliant upon people volunteering their time. How can we expect anyone to participate in psychological research if confidence in ethical standards is eroded?
"That's a company, sold for countless billions, with another supposed billion or two users, that is trying to control people."
To be fair, that's kind of what advertising is.
The troubling part here is how they went about it. Presumably the idea is not to manipulate people's information in the future but, now that they have confirmed that positive/negative posts can change someone's mood, to look at writing software to process posts, judge whether a person is seeing predominantly positive or negative stuff, and then advertise accordingly.
TV advertisers can do a similar thing simply by choosing which programs their ads are shown alongside.
I don't think it's a good thing, but it's not right to think that what FB is doing is really that far out.
Much of the debate seems to be missing the point that this is *not at all* the usual "passing on harvested data to third parties" privacy issue (bad enough as that is), but that it lifts the game onto a whole *new* level by actually *changing* data which do not belong to FB, without either sender or recipient even being made aware of it.
IANAL but I can't imagine that the boilerplate "It's OK that we sell your data" T&C language covers that level, too. Even "Use for research" of data cannot possibly include their clandestine *modification*.
Yep, "whoops, too bad" is about the worst apology you can give.
A couple of days ago a company I've bought e-cigs from decided that the best way to market their crap to me was to give Twitter my email address and full name, so that Twitter can invite me to register and subsequently follow my retailer... it took multiple email exchanges before they figured out that I was upset that they had spaffed my personal details to a 3rd party, not that they sent me a marketing email every 2 days.
Their subsequent "apology" was along the lines of "Well, we've done it now, can't really take it back". Fortunately, they are a UK subsidiary of a US company, I only dealt with the UK company, so I'm seeing how toothless the ICO actually is in dealing with idiots like this. Accidental data losses are one thing, this was wilful.
"We were wrong and we're sorry that we upset you" -- Wrong, that is exactly what they wanted to do (and for the record, they also tried to make some others happy).
Do we honestly think that Facebook is the only organisation that deliberately attempts to manipulate our feelings? Manipulation like this is at the core of how our culture works and (especially) of how our economy works. Think about those charity adverts on TV which talk about suffering children in Syria or sick children in Great Ormond Street Hospital: they wouldn't get any (or at best very little) financial support if their TV slots were full of laughing and fun times. They need to make people feel sad about what we see on the screen (I'm not being cynical about this; we need bad news in life in order to make decisions for the better).
Similarly, the government depends on invoking fear in order to get away with removing certain freedoms. Much of the green movement depends on fear (of the consequences of our carbon-vomiting actions) and guilt (for being such carbon-vomiting gits).
Businesses, governments and all manner of (non-government) organisations depend on being able to manipulate our feelings; all that's happened here with Facebook is that we found out about it.
But I know that adverts are trying to manipulate me... I know politicos in their broadcasts are trying to manipulate me.
What FB did was hide good news stories from people's friends just to see if those people reacted positively or negatively. This wouldn't be a problem if they had asked for consent and offered an opt out but they didn't.
But it goes further than that. By "hiding" your friends' news, your friend doesn't know that you did not see it. So they are now wondering why you didn't "like" or comment on their new arrival/exam pass/engagement. Or are disappointed that you have nothing comforting to say about grandpa dying/their divorce/redundancy/horrible car accident.
So they were also messing with their victims' friends' minds, and their relationships, too. You'd think that if Facebook wanted to become a key means of staying in touch with people, they'd ensure that these things didn't get "lost".
Think about those charity adverts on TV which talk about suffering children in Syria or sick children in Great Ormond Street Hospital:
In most cases those are recognisable as adverts. News items, with footage (partially) provided by the organisation that would benefit from heightened awareness, less so. News providers themselves (sites, newspapers and broadcasters) can and do filter in accordance with their worldview, but one could say that the viewer has the choice to select their news provider, or use several to get a less biased view.
Facebook surreptitiously manipulating exposure of certain messages to certain viewers is something not at all like the above.
Newspapers choose which stories to print to match their own agenda. That's a problem, but we can mitigate it by choosing what to read. It's a problem that's already there, but we wouldn't want it to get worse.
Communications companies deciding to change, amend and/or block communications between users to match their own agenda is not an existing problem, because until now no one did it (AFAIK the post office does not rewrite your mail to "persuade" your voting decisions), whether due to difficulty, cost, lack of technology or just plain morals... now it seems that might be changing?
People gave Facebook permission to do this the moment they signed up.
If they don't like it, then they shouldn't use Facebook - it's not compulsory.
But no, crying all the way to mommy because the big bad mans did something they gave him explicit consent to do.
Wise up, cretins. On Facebook **YOU** are the product.
No, I did not... I gave FB permission to use my data for research. That covers things like seeing if my posts change in frequency or attitude around external events such as the World Cup. It will cover seeing what sort of news feed messages I block, ignore or respond to.
It does NOT mean they can go around messing with what I see in my news feed just to see how I react.
Completely agree - a total non-issue. Besides they weren't changing actual 'news', just what stories you receive. And if your sole source of news is Facebook, then you pretty much deserve to be manipulated, as you've already signed away all requirements for impartiality and truthfulness.
Besides, adverts attempt to do the same sort of thing, and, as we all know, adverts are the arterial blood of the WWW. Every site uses them extensively after all.
So stop whining. Don't like Facebook or what it does with you? Then take charge of your life and don't use it.
Anyone a parent to moody teenagers or 20-somethings who spend inordinate amounts of their time online?
I don't need someone else deliberately messing with our heads. It's unethical, perhaps illegal. I use FB to communicate business, to chat and to keep abreast of social events. It's a good tool, but when it's manipulated to make you feel negative, that's a problem.
I'm OK though, my head is already messed up, and I don't need FB to feel "complete".
The effect may not be significant, it's just something we need to be aware of.
Biting the hand that feeds IT © 1998–2019