Remember that one? Just use a beheading-the-infidels video instead (or whatever)....
A Labour MP is pushing the UK government to introduce strict time limits for tech giants to take down illegal content as part of draft counter-terror legislation. Taking a leaf out of Germany's playbook, Stephen Doughty, a member of the Home Affairs Committee, wants to enshrine in law how search engines and social media …
If I post a complaint that a video is illegal,
(a) are Google, FB etc obliged to take it down as *I* made a complaint and said it was illegal?
(b) do the companies themselves have to decide whether the material referred to in a complaint is illegal then take it down?
If (a) then the entire system is open to abuse and fraudulent/frivolous complaints.
If (b) then we are opening the entire system to open censorship by private companies.
Either way it stinks.
What should happen? I make a complaint, an officer of the state (police, judge or whatever) decides on the validity of that complaint and issues a take-down/block request to the company, followed necessarily by legal action against the perpetrator.
Andy - this isn't how it works in any other industry. In all the companies I've worked for, none has the attitude that they'll just let illegal stuff happen and wait until the feds show up with a warrant before they stop. They all have legal, H&S and HR depts, plus specialist consultants when necessary, who manage to keep them on the right side of the law. The Reg, like any other publisher, wouldn't publish anything it considered libellous and wait for the court case, and I assume its journos are well trained or experienced enough to recognize dubious content and get advice before hitting the button. This might result in a "when in doubt, don't publish" attitude, but a quick glance at the UK press doesn't leave one with the impression of a heavily self-censored fourth estate.
The defence that Google, FB and the rest put up is that there's so much content that they couldn't possibly be proactive and therefore it's too hard and they can't do it. No other industry gets away with that approach and neither should Google, FB, etc. If their business model allows people to do illegal stuff unchecked then they should be treated as accessories and prosecuted in the criminal courts. If the law were changed to make the platform owners criminally responsible for their content then I bet they'd find a way to fix it very, very quickly.
If their business model allows people to do illegal stuff unchecked then they should be treated as accessories and prosecuted in the criminal courts
Then so should the network companies that carry it.
That's why we need to ban encryption, so that Cable & Wireless can check all the packets going down its fibres. We also need the Post Office to open all mail to check that it isn't illegal.
@Yet Another Anonymous Coward
Then so should the network companies that carry it.
That's a complete non-sequitur. The Royal Mail doesn't get prosecuted for delivering dodgy materials in a plain brown wrapping. Vodafone doesn't get prosecuted if you shout abuse at someone over the phone. Nobody prosecutes Highways England for maintaining the roads that criminals drive on. It's the publisher who's responsible, not the carrier.
Nonsense. All of those publications have highly trained journalists, and editors, who know to an inch what the libel law says - and more importantly, what it doesn't say. I suggest you do some research on it yourself, then maybe you'll understand why they do things the way they do.
Policing forums is not at all related to publishing. The publisher is the person putting up the post. All Google/FB/etc are doing (in these cases) is providing transmission capability (just like BT and the Royal Mail).
That's like saying "the journalist is the one publishing the story, the newspaper is just transmitting it". It's - just wrong.
Facebook and Google - their entire business model is based on drawing people's attention to content that they wouldn't otherwise be aware of. If that's not publishing, what is?
The Reg. like any other publisher, with the exception of the Daily Mail Group*, the Sun, Express, Mirror, etc. etc. wouldn't publish anything it considered libellous and wait for the court case
There you go, FTFY.
*From my understanding, the Daily Heil is particularly adept at publishing total bollocks of the dog-whistle variety on the front page, and then making quiet retractions in small print on their website at midnight on a Sunday night, or similar.
"The defence that Google, FB and the rest put up is that there's so much content that they couldn't possibly be proactive and therefore it's too hard and they can't do it. No other industry gets away with that approach and neither should Google, FB, etc."
Have you actually looked at how many people would need to be employed full time to watch every video uploaded to YouTube?
I had a quick google, and what seems to be a recent estimate says that 300 hours of video are uploaded to YouTube every minute.
So you would need 18,000 people full time doing nothing else. At least 3 shifts a day. Weekends. Holidays. And how long could anyone stay doing that job? So you'd need constant replacements.
You'd need people who understood all the languages being spoken, including colloquialisms.
Then you've got all the other social media.
It is impossible.
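For what it's worth, the arithmetic above can be sketched out. This is just a back-of-envelope check using the figures quoted in the thread (the 300 hours/minute number is the commenter's estimate, and the 8-hour/3-shift staffing model is an assumption for illustration):

```python
# Back-of-envelope estimate of moderators needed to watch every
# YouTube upload in real time, using the thread's quoted figures.

UPLOAD_HOURS_PER_MINUTE = 300  # commenter's estimate, not verified

# 300 hours of new video per minute = 18,000 video-minutes per minute,
# so 18,000 people watching concurrently, in real time, just to keep up.
concurrent_viewers = UPLOAD_HOURS_PER_MINUTE * 60

# Cover 24/7 with 8-hour shifts (3 shifts/day), ignoring breaks,
# weekends, holidays, languages and burnout turnover, all of which
# only push the number up.
SHIFTS_PER_DAY = 3
minimum_headcount = concurrent_viewers * SHIFTS_PER_DAY

print(f"Concurrent viewers needed: {concurrent_viewers:,}")   # 18,000
print(f"Minimum headcount: {minimum_headcount:,}")            # 54,000
```

Even this deliberately optimistic model lands in the tens of thousands before accounting for turnover, which is broadly consistent with the ~100,000 figure discussed below.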
Christoph - it's not impossible. In fact, you've solved it for them: just go and employ about 100'000 mods plus a bit of infrastructure. That sounds very possible for an organization like Google whose turnover is about 90% of the value of NHS annual funding but has less than a tenth of the headcount. If it makes a dent in Google's meagre profits then they might have to change their business model by, say, charging people to post and charging them for hosting. That might also have the benefit of reducing some of the dross and the number of reviewers required.
It's not impossible - it's difficult, expensive and, perhaps, a tad inconvenient, but it's not impossible.
No other industry gets the right to say "We'll choose which laws we obey based on how much it's going to cost us to implement them." Why should Google, FB, etc.?
"In fact, you've solved it for them: just go and employ about 100'000 mods plus a bit of infrastructure"
Not really. The workforce would be 10% effective at best so you would need a million of them. It soon becomes cost effective to just pull the plug on the countries that require it by law.
A more probable scenario is that they (Google) would develop an AI system to do the task. We would then be in the position of an AI dictating to us what we could see and know. Be careful what you ask for.
they (Google) would develop an AI system to do the task
They did - it simply blocked any video with Arabic script or black flags.
I foresee two possible outcomes: either the pirates will see off the AI baddies and strike a blow against online censorship, or the pirates end up as collateral damage but rising sea-levels from the resultant global warming will flood the censorship data centres. I'm hedging my bets by land-banking hill-top sites with planning permission for factories that manufacture Pirate Requisites.
The problem isn't that it's difficult - it's that Google (and others, but take that as read from here on) don't want to do it, because it will reduce clicks and dwell time.
If the HSE come calling to talk to me about why those people fell off the roof I can't say "sorry, mate, I'm too busy to do risk assessments and buy PPE - do you know how hard it is to do that stuff when I've got to do the monthly management powerpoint". Why should Google be able to do the equivalent? If their business model and practices aren't compatible with the laws of the country then either they change, or fuck off.
When Germany changed the law, backed up with fines of up to €50M for not taking content down in 24 hours, Facebook managed to do the "impossible" with about 1200 mods and some trauma counsellors.
"In fact, you've solved it for them: just go and employ about 100'000 mods plus a bit of infrastructure"
It won't work. People get disability for workplace PTSD from screening for pornography.
Somehow I expect stress and burnout rates for screening for terrorism would be higher.
Then again, if this stuff is as insidiously effective as the angst about it seems to indicate, then you are just recruiting tens of thousands of new terrorists and supporters from among your million+ (with constant turnover) censor pool.
China seem to manage it.
China's approach to censorship is more sophisticated than that. Yes, they do have the Great Firewall, but they've also realised - like the Russians - that the truth can be just as effectively suppressed by bombarding people with false, or simply irrelevant, information, so that the information you want to cover up gets - literally - covered.
That's why China has huge and thriving social media companies of its own: they're part of a deliberate government strategy to do to its own population the same thing that the populations of the West are doing to themselves voluntarily.
Sorry, Headley, your analogy is completely wrong. Policing forums is not at all related to publishing. The publisher is the person putting up the post. All Google/FB/etc are doing (in these cases) is providing transmission capability (just like BT and the Royal Mail).
A better analogy for FB/Google/etc is a hotel. Of course a hotel doesn't want people using its rooms to conduct illegal activities (e.g. run a criminal operation). But it doesn't employ people to spy on all the rooms all the time, monitor what people are doing, etc. It waits for the police to call about some activity and then it may terminate the room hire.
I am no fan of Google/FB/etc (in fact, I do not use them), but in this case they are right. This is critical because Google/FB/etc are the town square nowadays, whether I like it or not. If I have a complaint (against the government or against a company) I need to be able to air it on FB/etc. FB/etc should not be allowed to interfere with that unless my activity is illegal.
Graham - the analogy with the hotel is interesting, and made me think.
If Google were like the hotel then it wouldn't know anything about its customers other than name, length of stay and whether they'd stayed before - and in that case I'd agree with your analogy. Google would be a channel, just like the hotel, the Royal Mail and the local bus.
If the hotel were like Facebook then it would know lots about the customer - voting and sexual preferences, shopping likes, friends, acquaintances - and would have a history of messages, contacts, etc., etc., and it *would* have cameras in every room, and also on the streets around the hotel, and in the local buses and taxis. It would know where you'd been, who you'd talked to, what you've just bought and where you're going tomorrow. If it wanted, the hotel could guess to a high level of certainty whether its customers were up to nefarious deeds or not before they even set foot in the room.
OK - so that's over-egging the pudding a bit, but Google/Facebook/etc. are not passive channels in the way that a hotel, Royal Mail and the local bus service are. They can, and do, read, analyse, synthesize and profit from the information which users post on their platforms. In my opinion (others are available) it is this agency which differentiates them from the notice board in the window of the local newsagent, and with agency comes responsibility.
BTW - I'm not that bothered whether we should have an x-hour takedown law. My interest in the debate is simply that I don't believe Google et al should be above the law. I agree that bad law would be a bad thing, but the argument that it would be too difficult for the Googles to police needs to be tested a little more than simply believing it - and they haven't walked away from other countries who have implemented takedown limits.
but this analogy shows that they DO get away with not following the law. If you own a shopping mall, you can't say "oh, there's just too many people coming for a stroll, we can't possibly police it". And yet, the malls DO police their yard. Ineffectively, but they DO have this legal duty.
In effect, the Big Boys (of the Internet) say: our domain is so vast and punters so numerous, we can't take the burden of policing it (they can, and it would be eye-wateringly expensive to them, and horrible, and might well break their business model, i.e. the internet as a whole, but they CAN do it; whether successfully is another matter). But they squeal, because they know it takes time (years) to introduce legislation, and legislators change (and yes, some can be bought outright), and times change, so another makeshift solution might turn up in the meantime. So they delay by employing all possible resources to delay (and, from their evil, no doubt, viewpoint, it's the best tactic). But they should not be above the law, they should not be too large to fail (although, in reality, this is exactly the case with large organizations, as we all know).
That would be New Labour, essentially Tories in a sluttier dress and with a slightly different definition of the term "pearl necklace". Far removed from the actual Labour party.
I am not in favour of the Tories or Labour, they're both rampant Muppet farms, but New Labour and the Blairites are/were even worse than either.
It's a shame there are still so many Blairite Labour MPs, isn't it?
Don't know this one, but I see he was an adviser to one of the big Blairites before he entered parliament, so I would expect he'll be one of them.
Why are you so fixated on him being Labour and, horror of horrors, a Bl**rite. You should be more concerned about whether this is a good thing or a bad thing. PS. since it attempts to constrain the power of the megacorps, it's a good thing.
@cynic56, does it really constrain the power of the megacorps? Does it? Legislating for a specific use case is short-sighted (albeit with good intentions). Just remember how much local councils abused RIPA in the name of service delivery despite it being designed *not* to be used for such purposes (funnily enough, RIPA was to deal with terrorism...).
Please. "This was New Labour" is the standard bloody excuse that any Labourite of the more lefty persuasion comes up with. Without New Labour, Labour would've been consigned to the loony bin for decades already. Thanks to New Labour, Labour *had* its time in the sun for the better part of 17 years. So no. "This was New Labour" doesn't cut it. At. All.
One thing I *do* and *will* agree with you on is one of your closing comments... Yes, yes they are muppet farms. British politics is infested with them. Muppets, that is.
... And then it's a Labour MP who pulls this stunt. Oh, yes, and we know who put some of the original surveillance laws in place too... It wasn't the blue party either.
Putting aside the discussion of what colour weasel came up with this, the proposals aren't inherently bad. In their current form, they are unworkable for some pretty obvious reasons. However, the problem of online terrorist material isn't going to go away without some form of action.
Assuming you don't believe in a total libertarian world, where everything is permitted, and free speech has no bounds, there have to be laws about what you are not permitted to say. (The archetypical example is the freedom to shout 'fire' in a crowded venue - it is not infringing freedom of speech to make causing mass panic a crime).
With the advent of youtube et al it has become trivially easy to publish terrorist material, and indeed criminal material of all types. There is no real argument about whether it is harmful to society, so someone has to tackle it. It follows that it should be incumbent on the content providers to take down material that has been flagged as illegal.
The technicality comes in how, and by whom, this flagging takes place, and what oversight there is of the removal process to ensure it is not abused, as well as the question of what recourse someone has if their material is wrongly removed. The big players (Google, Apple, Microsoft, etc.) have not come up with any sensible suggestions as far as I know, hence these proposals.
Bear in mind that they are, currently, at the committee stage, prior to a white paper, and prior to any debate in parliament. Whilst it is entirely correct to be concerned about the potential impact on civil liberties, it might be a bit premature for those concerns to be substantial. Right now, I'd be more concerned about the likes of May being on record as wanting to get rid of the ECHR.
I'm sorry to sound trite by saying all politicians are the same, but in some fundamental ways they are the same. They would hate the other party to get more votes, but as both parties seek the same votes, from the same voters, they appeal to the same voters' hopes and fears. They might appeal to different ideals, but essentially, all people want the same. And we all fear TERRORISTS, and PAEDOPHILES, so the politicians appeal to those fears (and make themselves the Custodians, the Ultimate Protectors). And if the fear isn't strong enough, well, we can remind the plebs it is a CLEAR AND PRESENT DANGER by introducing legislation against it (Act on Evildoing and Other Wrongs).
No party has a monopoly.
Let's keep in mind CMD got in at least in part on dumping Blair's beloved National ID Card scheme and deeply Orwellian "National Identity Register" cradle-to-grave database of what, when and where you were/are doing stuff.
Too many people forget that "authoritarian/freedom" is a completely orthogonal axis to political left/right. Unfortunately, becoming an MP mostly attracts more authoritarians so they are over-represented in the House of Commons. We need more civil society types appointed to the House of Lords to counter this.
...and here is the future basis for permanent retention of all your internet activity.
Idiots competing over piety in any field result in an incremental slide into dystopia. We're only recognising it in the internet because it is so new we can see the process from the outset.
It needs to be caught and stopped at every step, because it's like a ratchet; every loss is kept, every respite is temporary.
This is a terrible idea. The retention required, the personnel involved and the processes to follow would alone take longer than the proposed maximum time limit (6 hours) needed to determine whether it's illegal material or not.
When you delve into it - who defines illegal material? I mean, there was a video going around of Mossad throwing an ISIS fighter off a cliff... ISIS is a terror organisation, but Mossad are meant to be allies. The video is terrifying. Who draws the line?
Governments are still using “terrorism” as a scareword to get any insane law passed – like Britain’s digital book-burning law. But with its other hand, those same governments are expanding the definition of terrorism way beyond what the public could possibly imagine: the government’s own training material says that peaceful street protests in disagreement with administration policies are examples of terrorism. (Source : https://www.privateinternetaccess.com/blog/2017/10/reminder-in-government-training-material-terrorism-includes-peacefully-disagreeing-with-administration-policy-in-public/)
So, legally speaking, any disagreement of the government policy, or process, that is protested peacefully, can still be deemed a terror act.
Striking (unions) would also come under that surely?
Do Facebook/Google etc. have to hire people specialised in UK law, then EU law, then USA law, AUS law, SA law, etc.?
Which laws are they protecting? The local laws where it was posted, local laws where it was viewed from, or "business laws" created to ensure a poor image of the company isnt propagated?
Where are they prosecuting from? Local laws in the UK differ to Ireland, what may be ok here isn't there - and that is where the DC is.
Where does the line get drawn? This just goes to further demonstrate the fundamental lack of understanding our lawmakers and policymakers have in the digital age, and the lack of preparation. Almost as good as Amber Rudd saying she doesn't need to understand anything about encryption to know it's bad (though she will be happy to know her bank is keeping her money safe, even if she is trying to undermine it all...).
Another line-drawing problem is the continuum between public and private messages. There's a public web page at one extreme, and an encrypted message to my spouse/solicitor at the other extreme, or a Google Docs document that I haven't shared with anyone at all (so it's not even communication? unless several people are sharing a single account?), and somewhere in between there are things like a message sent to an invitation-only mailing list or "Google Group" or whatever.
@TechyLogic: your link (that isn't linked) points to a story that links to a story that doesn't link to alleged materials allegedly used by the US Department of Defense. That's pretty tenuous. Even if it's true (and no final link means that can't be verified), it's (a) a completely different government and (b) a department of that government that has nothing to do with domestic law enforcement.
The phrase "So, legally speaking" is therefore gross exaggeration. Training materials produced by the DoD have zero legal weight even in the USA. In the UK, it's hard to imagine anything less relevant.
I think that the entire thing is being approached from the wrong direction. Take all the politics and beliefs away and you are left with:
Speech that is inciting hate/extremism/violence
These are the fundamental things that need dealing with, and for the most part the current crop of US-based companies shirk any responsibility. There is as much incitement coming out of Christian or other teachings that are deemed acceptable because they are Western in origin as there is from the Muslim-aligned faiths. The problem with taking this approach is that it does not align with the political rhetoric that if you are not a friend of the US then you are evil. The freedom of expression that an uncontrolled Internet provides is also its downfall. There is so much junk returned in searches that it is getting more and more difficult to actually find what you want. Even simple searches can return huge quantities of useless results that land you on advert-infested pages.
My biggest gripe is the number of click-bait links for items that are discontinued or not available (if they ever were) at a price. The latter is often Amazon links that show an item at one price, yet if you click on the link it is double that; yet because you visited the page, all the metrics mean someone makes some money, just not from selling anything. We appear to have reached the sorry state where there is more money to be made from people visiting a page than actually buying things.
...I don't know where to begin.
But I'm sure that it will be another weapon the Police use to terrorise us. Effectively, it provides the Government with a giant moderation power of ALL online discussion - which effectively means ALL DISCUSSION in the UK.
I have fought against thought control in the UK all of my life, and I've never seen it get so close as it is now...
Wait until this is all settled in with a fixed time limit in which they have to respond. Then organise a mass reporting where huge numbers of bots make near-simultaneous reports of different items. What are they supposed to do, other than simply delete every single reported item without checking it?
"How can they prove you watched it? For all they know a malicious website opened a new tab or a pop-under window that started streaming it in the background."
So is it illegal before anyone evaluates it?
Does viewing before evaluation for forbidden content count against your three lifetime strikes?
Does streaming after following a deceptive or ambiguous link count?
Does streaming as items in a linked set of videos, the first of which is not forbidden content, count? (which you might not even be watching on your unattended monitor while you feed the cat / answer the door / make dinner)
What about streams that open on a page in an advertising spot, automagically? (again, possibly in a window that is not being watched for some reason)
How can anyone not an officer of the court or a government trained and approved censor evaluate the edge cases accurately?
exemptions should include academics and journalists, as well as people who were viewing to gain a better understanding or did so "out of foolishness or poor judgement".
That seems to cover a wide base, ignoring the 'professional' exempt, the amateur exempt covers the span from the curious that seek knowledge to the naive and feckless.
They'll be wanting to use it to cut the former out of society; the latter are proles and can be safely ignored because they merely clicked on it 'cause it was there, while the former 'actively searched for it'.
Rule one of taking away power from the people... start small and go from there. There is absolutely no way of policing what people say online; those that want to cause harm to us do not use public websites. This will simply be a launching platform to take down copyright-infringing websites. Just like the war on drugs, this will fail.
They are trying to redefine it like they did with people downloading illegal porn, where they now charge people with making an illegal image because a copy was found in their internet cache. Even just a thumbnail image in your cache can get you charged with an offence, and it could easily have got there without your knowledge.
And in cases of terrorism or sexual offences, people often assume you're guilty until proven innocent, which isn't helped by the police failing to hand over evidence in recent rape trials that could have undermined the prosecution case.
Um, 'illegal' content? 'Illegal' _where_? This is the Internet - illegal in the place it was posted from? Illegal in the place it is observed or read? Both? Or is the expectation that content carriers will be able to create country-specific filters to block content based on - er, what? The receiving device's IP address? VPNs seem to make that likely to fail. IP spoofing isn't difficult. And what about posts that are legal where the poster is, and perhaps protected by forms of local freedom-of-expression legislation? If someone somewhere else, for example the UK, posts a complaint, are the content carriers supposed to take down the material, possibly violating said poster-local legislation? For example, if an American citizen posts video of them using a gun to shoot at things, possibly targets with pictures of people on, and someone in the UK complains said video is a terror incitement?
Sigh. I think I'll go and sit in a corner and grump some more.
if an American citizen posts video of them using a gun to shoot at things, possibly targets with pictures of people on
If it's brown people shooting at pictures of white people then it is taken down as terrorism
If it's white people shooting at pictures of brown people then it's ok as 1st and 2nd amendment
If it's white people shooting at pictures of black people then it's taken down as copyright theft of police training videos
oh dear, I misread that this is to limit my watching of illegal online content to 6-(hours)-a-day...
p.s. I would also like the honourable gentlemen (and the rest) to introduce a law PUNISHING anybody watching or hearing a crime being committed in front of their very own eyeses (also applies for the cases of stumbling upon a bill for a garden extension put on mp's expenseses)
Biting the hand that feeds IT © 1998–2019