Also, roughly the rate you can expect to actually receive from a 70 Megabit connection from any UK telco.
"Windows 10's growth rate since the official end of the free upgrade period has barely managed to outpace Windows 7's growth rate, according to Netmarketshare.com. "
Not that I disagree with your overall point, but this simply isn't true.
Win 10's growth is about 2% over the past 3 months (an amount equal to all Linux users across all distros, according to the same site; about the rate MacOS has grown in the past 8 years - operating systems that plenty on this site will happily insist are about to take over the world based on these figures), while Win 7 grew 0.3%. That's nowhere near the same growth rate.
I doubt that the PC market is suddenly about to explode back into life, but there are a few things that say it might. Some software now being released is actually taxing hardware again - 64-bit CAD programs in particular are actually demanding all the grunt you can throw at them. Thin client solutions which Gartner were predicting would take over the world two years ago have proven to be just as disappointing as they do every time they rear their ugly heads; the touch UIs developed for tablets and phones have proven to be mostly useless for any serious workloads beyond emailing and light text editing; and most European economies are finally growing, meaning that businesses might just go ahead with hardware refreshes that have been long-postponed.
While I doubt that Win 10 will be driving any expansion in the market, it does stand to benefit disproportionately if it does happen.
"My new dev laptop for work will be Win10 only because Win7 has been withdrawn from the corporate build."
Exactly this tbh. Win 10 will grow because corporates are never going to go to Linux (sorry penguinistas, but let's try to focus on reality rather than the impossible dream - if you have 20 MS support staff on permanent contracts, you're never, ever going to switch your shop to Linux) and the 1.5 year New Microsoft Operating System Quarantine Period has now passed. They're the vast bulk of the people keeping Win 7 alive atm, and they'll begin switching to Win 10 now because it's not an absolute abortion of an OS like Win 8 was.
Plus, most of the more objectionable parts of 10 (telemetry, no-choice updating) can be deactivated with the Enterprise license. Funny, that; it's almost like MS knew that IT professionals would hate those parts...
"If the market for spinning drives reduces too much, there is not enough of a market to justify the expense of building a factory."
That's just circular logic. You're saying the market for spinning rust will shrink because it's out-competed by flash, and it will be out-competed by flash because the market for spinning rust will have shrunk.
As for refuting DougS's points, I'll leave that to the sage words of, um, DougS in this thread, from when he argued the exact opposite: https://forums.theregister.co.uk/forum/1/2014/12/09/no_flash_datacentre_takeover/
"Why wouldn't hard drives completely disappear, once the price premium disappears?"
First, the premium will never entirely disappear, simply because the fixed capital requirements for SSD production are orders of magnitude higher than for HDD, and storage demand grows faster than SSD production capacity can expand. Even at current price multiples, it takes 4-5 times as long for an SSD fab to pay for itself compared to an HDD plant, and it's probably not possible for the premium to sink below 2x ever, unless someone finds a NAND tree.
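To make the payback argument concrete, here's a back-of-envelope sketch in Python. All the figures (plant costs, revenues, margins) are hypothetical placeholders chosen only to illustrate the 4-5x payback multiple; real fab economics are far messier.

```python
# Illustrative payback comparison. Every figure here is hypothetical,
# picked only to show the shape of the argument above.

def payback_years(capex, annual_revenue, margin):
    """Years for a plant to recoup its capital cost from gross margin."""
    return capex / (annual_revenue * margin)

# Assume an HDD plant costs ~$1bn and a NAND fab ~$10bn, with the fab
# producing roughly twice the revenue at similar margins.
hdd = payback_years(capex=1e9, annual_revenue=2e9, margin=0.25)   # 2.0 years
ssd = payback_years(capex=10e9, annual_revenue=4e9, margin=0.25)  # 10.0 years

print(round(ssd / hdd, 1))  # -> 5.0, i.e. the fab takes ~5x longer to pay off
```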
Second, there are some circumstances where raw capacity is always preferable to speed, so as long as the HDD is at a lower $/TB, it'll be the better choice. Consider, say, FB photo storage, which is rarely accessed but needs vast capacity and increases enormously and continuously. You just want to stick in the largest capacity item possible at the lowest price possible. That's going to be HDD.
There probably is a technology coming in the near future which will obsolete HDD, but it's not going to be SSD; we need something that achieves higher speeds at HDD-equivalent prices. Either that, or some breed of NVRAM will come along and simply eliminate the concept of separate storage.
"Ive appears to be phoning in designs now"
I never really saw any evidence of his supposed genius for design anyway tbh. Flat UIs, round corners, the entire iPhone 4 disaster from start to finish... he's not the world-class design maestro that people often proclaimed him. Very Apple, in that he'd do something profoundly obvious and call it brave, or profoundly unpopular and then just call his market idiots if they dared to dislike it... but not actually a particularly impressive or innovative designer.
Agreed on the rest of Cook's Apple though - he's just a bit of a duffer, isn't he? He's mostly coasted along by iterating existing product lines, and when he's picked a new idea to follow it's been derivative and usually a failure. Jobs regularly stole anything he could find, but at least had a good eye for what he should steal and how to improve on it. Cook appears to just jump on whatever bandwagon is going (wearables, self-driving cars, Apple Music) with little idea of what it's supposed to be for, and then delivers an overpriced and underspecced version years later, only to quietly drop it 18 months after that when no-one but the most die-hard Apple fans goes for it.
For all his many faults, Jobs understood that a piece of tech should have a clear use to solve an existing problem (MP3 players for digital music on the go; iPhones for mobile net access; tablets for mobile consumption). Cook just doesn't seem to grasp that, and so goes for any old tat that a competitor is doing just for the sake of competing with them. He's slowly converting Apple into modern-day HP, run by the bean counters and lawyers rather than the designers and engineers.
Updates aren't really that relevant, though. For starters, most higher-end Droids do get prompt updates throughout the phone's lifespan now; my S6 still receives updates within a week or two of Google releasing them. And for seconds, most people who consume flagship models don't actually notice either way anyway; they have the most expensive handset they can find for prestige purposes rather than because of the specs or the security. They almost certainly couldn't tell you the difference between iOS's approach to FDE and Android 7's.
The fact is, phones only last about 2-3 years at any price point regardless of whether it's from Apple, Samsung, Sony or whatever, and the actual performance difference between an £800 iPhone and a £200 OnePlus is now meaningless for 99.99% of tasks. Flagships are slowly dying because the marginal performance improvements no longer justify a 4-5 times higher price, which is why Apple's phone shipments fell in 2016 and Samsung's rose despite Sammy producing a flagship-grade phone that literally exploded in customers' pockets.
Apple will continue to cling to the market, but ultimately the iPhone is headed the way of the Mac - it's going to become a pricey niche device with a small but massively loyal following who would never, ever consider using anything else, even if Apple produced a Samsung-esque exploding handset. Meanwhile the rest of the market will move into low-margin commodity devices; I'd expect the variety of phones in the £600+ range to shrink each year from now on. Samsung might keep producing high-end flagships, but it'll increasingly be a matter of innovation prestige rather than a serious profit source.
And with that kind of modesty and self-deprecating humour, I'll bet you don't rub people up the wrong way or strike them as arrogant at all.
Actually, it might.
It's an accepted cultural thing in business that a man with a bigger salary carries more weight in discussion. If your CISO costs a million quid a year, he has much more clout to argue for whatever his expert staff say should be happening than a 50k CISO who is there simply so the company can say it has a dedicated CISO.
Sure, the real way of stopping individual bits of malware is to have good techies... but the CISO ought to be able to take what the techies say, convert it into boardspeak, and then successfully make the case that it ought to happen. His odds of successfully making that case are at least partially contingent on his salary being big enough to earn the respect of the other executive board members.
"I'd love to get some people who think..."
No-one is suggesting that the guy on a million a year doesn't work hard. We're just also suggesting that the guy on 100k works just as hard, but isn't listened to because he's not costing as much.
Unfortunately, that's largely the logic on some boards. You can ignore the CISO if he only cost 60k, but if he costs a million then he outranks the lesser boardlings. Which is, of course, why the beancounters and the lawyers tend to rule the roost.
"Some kind of fookin' tape drive that must have bought! "
Sounds more like live backup to off-site SAN storage to me, tbh. Which could easily top 600k if you have enough data. Not sure how it would fail to prevent the data loss, though, aside from possibly if they put it in the same server room as the main systems...
Thing is, it could have just been 1 PC at 1 pharmacy in the arse end of nowhere, connected to the 'net via the dial-up modem they received in 1998. No firewall. No decent security setup. No ports blocked and no updates, because they take all week to download. Then you VPN into N3 and suddenly it spreads to the whole system, because you're inside.
I have literally seen exactly this setup in the NHS when I worked for them about 5 years back, so it's not far-fetched. Many of the trusts themselves have very good external security, but are helpless if someone can gain access from inside; it's still run on fortress-style security principles rather than compartmentalized ones.
" I wonder when we will see the first hybrid malware:"
IIRC, we already saw that exact scenario a couple of years ago; an SMS virus targeting 'droids which then dumped a payload onto Windows machines once the phone was connected to them. It's not actually that effective a vector, though.
"I don't. I blame the network administrator."
This. Or whichever idiot overruled him.
Blaming Microsoft for someone else failing to secure the product and failing to install patches that were released months before the outbreak is like blaming a door manufacturer for a break-in because you failed to engage the lock. Saying 'But the door was unlocked when it was delivered!' is not going to win you any court cases.
" First, document the risk, then identify the likelihood and the impact, then identify the mitigation strategy, and then consider the cost benefit for those mitigation strategies (which is where you get to constraints)."
And then email that information to literally everyone above you in the chain of command. And then take a copy of that email, and save it somewhere off-site where only you have access. Because when they ignore you completely and the shit hits the fan, you will need it.
"Have to disagree there. It was mostly because way too many people seriously over-hyped the actual risks and made it look like the end of the world while those risks were in fact minimal if not non-existent."
Actually, the reason those risks appeared minimal or non-existent is precisely BECAUSE a lot of people were paid a lot of money for a long time prior to the millennium in order to fix it. Huge chunks of the banking, education, healthcare and other sectors noticed big problems with the bug prior to the big event; my mother worked for a major educational establishment which literally re-wrote their entire student enrollment systems because of it, since they couldn't input 4-year students starting in 1997 - their graduation dates fell past the rollover.
Looking back, it's seen as a big fake scare in the popular imagination, but vast amounts of code were re-written in the decade leading up to it to prevent massive disruptions to very, very large areas of the world's major industrialized economies. Planes might not have fallen out of the sky, but it wouldn't have mattered since no-one would have been able to withdraw cash to pay for a flight.
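The enrollment problem described above is easy to reproduce. A minimal sketch of the classic two-digit-year bug, with hypothetical function names (not any real system's code):

```python
# Minimal sketch of the two-digit-year bug: years stored as two digits
# wrap at the millennium, so date comparisons go backwards.

def graduation_year_2digit(start_yy, course_years):
    """Buggy: stores years as two digits, so 97 + 4 wraps round to 01."""
    return (start_yy + course_years) % 100

def enrollment_valid_2digit(start_yy, course_years):
    """Buggy sanity check: 'graduation must come after enrollment'."""
    return graduation_year_2digit(start_yy, course_years) > start_yy

# A 4-year student starting in 1997 'graduates' in year 01, which the
# two-digit comparison thinks is before they enrolled.
print(enrollment_valid_2digit(97, 4))  # -> False: the system rejects them
print(enrollment_valid_2digit(93, 4))  # -> True: fine before the rollover
```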
The problem is, nothing turns people's brains off quicker than a lecture from your average IT security consultant. I've had some success by introducing humour, which helps keep them engaged, and by emphasizing that they're also at risk outside work.
But in general I agree. The long tendency of Infosec to just say 'Thou Shalt Not' rather than explaining 'you can't go there because it will literally destroy the company and your job with it' has led to most people perceiving them as a nuisance and a cost centre to be worked around, rather than expert professionals doing a difficult job to keep them safe. Pointing out that they access their bank details/personal email/whatever on their work PC and so need to keep it safe helps a little, but you really do need to sit down and explain what the threats are and how they work.
That's not really an option for a lot of jobs now. You try and diagnose an Exchange server fault without being able to check the error code from Microsoft's KB. Or get email via Office 365 without a web connection. Or get the latest set of medical papers to read through without access to JSTOR. Or any one of a hundred thousand jobs that aren't minimum-wage helpdesk crap.
Sure, low-end ITIL helpdesk staff who are mostly call centre flunkies given permission to reset passwords can be kept off the 'net, if you're attempting to drive them to suicide through frustration and boredom - though you'll quickly discover that the side-effect is their morale dropping through the floor. But most people have jobs where productivity is increased by web access, and restricting it too far will damage both your ability to recruit and keep the best people and the company's productivity and competitiveness over the longer term, while your staff struggle to answer questions that could be solved with 5 minutes on Google.
Any decent web filter or proxy can weed out serious productivity killers, like Youtube or flash games (though the best way to do so is actually motivating your staff, rather than treating them like convicts). Cutting the cord to the 'net altogether is not a sensible alternative for most users.
I'm sure he reckons that the sacrifice will be worth it, if only the apocalyptic scourge of people texting during films can be vanquished forever.
Guaranteed that HCI will kill SAN within 18 months then, judging by how well Dell's similar proclamations about Cloud have played out.
Agreed. We honestly cannot sit here and spend 20 years whining that MS suck at security, and then complain about them suggesting maybe there should be some effort to do security properly. Especially since MS's security infrastructure is no longer the joke it was in 2003; charges of hypocrisy are a little unfair when the company has been spending a lot of money and throwing a lot of effort into moving away from its previous bad practices for over a decade now.
"by saying "our security is the best on the market", which is odd that they aren't when you think of the opportunity."
It's not odd that they aren't, because on aggregate consumers don't care. They really don't.
Oh, if you ask them if they want their stuff to be secure, they say they do. But if you market the product as 'the most secure on the market', then watch it sink like a stone in the face of a rival who instead markets their product as 'and it comes in black!' or 'look, it has rounded corners!'.
This state of affairs will continue until someone actually makes the effort to explain why their security is good, and why best practice security is better than random bullshit. But for the most part, security is explained with magic wands, so is about as meaningful to a consumer as the 'science' portion of a shampoo advert.
"I still won't be able to give you a good answer apart from "that's how markets work"."
Increasingly it isn't. Flagships are a dying breed, and making them even bigger and more expensive while the actual performance difference from a <£300 phone is shrinking ain't gonna save them.
"Apple's whole raison d'être is usability and utility"
The 3.5mm headphone jack called. Apparently it thinks you're a liar.
For Vista onwards, Windows moved to a different driver framework that required things like 'keep kernel-mode drivers and user-mode drivers separate', 'don't just use the highest privileges possible by default', and 'why not try reading up on security standards before calling yourself a programmer'. All the things which had made earlier versions of Windows so unstable and fundamentally insecure, in other words, were now to be forbidden, so we could discover exciting new types of instability and insecurity rather than just bluescreening because your joystick decided to write into the kernel space for no reason.
This was basically why Vista didn't seem to work with anything initially - it demanded properly written drivers at a point when basically no-one had bothered writing them to any decent standard. The reason Windows seemed to work with literally all the hardware in the world (to a given value of 'work', at any rate) is that prior to Vista it didn't stop you from doing stupid and insecure things with your driver code. You could let the work experience kid cobble your driver together based on his Art History degree and 20 minutes of training, and companies literally did. From Vista onwards it did stop you, and the immediate result was that 90% of existing device drivers were suddenly forbidden from working.
Unfortunately, lots of the devices used by the NHS (think MRI scanners, X-ray machines, and other hugely expensive medical equipment designed with a 50-year lifespan) have horribly-written drivers created by companies that ceased to exist 20 years ago, and so new drivers were never created. So the NHS kept using Windows XP on the machines connected to those devices, but also hooked them into the network so they could transfer those scans around - via port 445, using SMB v1, which is precisely the protocol which this worm used to spread itself.
Which basically explains the whole situation, tbh.
"WannaCrypt software was developed by the NSA and leaked "
Ish. Bits of Wannacry were taken straight out of the leaked NSA tools. Other bits look like they've been written by a 16 year old with a limited grasp of infosec. For example, the 'just check if this domain has been registered' killswitch which stopped it is the kind of thing that a state-backed group would not include, full stop - whether that's the Russians, the Chinese, the Norks or the NSA/GCHQ.
This mostly seems to have been a cut-and-paste job tbh. Someone who didn't really know how to program but did know how to slam chunks of code they found on Stack Overflow together blundered onto the dark web and just cobbled together whatever they found to do what they wanted. The result was an odd combination of high-end and low-end features.
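For the curious, the kill-switch behaviour mentioned above boils down to something like the following sketch. This is a simplification of what the real binary did, and the domain here is a deliberate placeholder, not the actual kill-switch domain:

```python
# Rough sketch of the kill-switch logic: before doing anything else, the
# worm fetched an (unregistered) domain and bailed out if the request
# succeeded. Domain below is a placeholder, not the real one.
import urllib.request
import urllib.error

KILLSWITCH_URL = "http://some-unregistered-gibberish.invalid/"  # placeholder

def killswitch_tripped(url=KILLSWITCH_URL, timeout=3):
    """Return True if the domain resolves and answers, i.e. 'stop running'."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True          # someone registered the domain: abort
    except (urllib.error.URLError, OSError):
        return False         # lookup/connect failed: carry on

if killswitch_tripped():
    raise SystemExit("kill switch active")
```

Registering the domain flipped that check to True worldwide, which is exactly why it was such an amateurish thing to ship.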
"Why's that such a bad thing?"
You run a large estate of 600+ machines. If you have all of them on one O/S, you hire one engineer on 50k and 3 technicians on 20k. If you have them on 12 bespoke OSes, you hire 12 specialist engineers on 50k each, and each of them spends 90% of his time doing nothing.
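Spelling out that arithmetic (salaries as quoted above, purely illustrative):

```python
# Staffing cost for the same 600-machine estate, single-OS vs 12 bespoke
# OSes. Salaries are the illustrative figures from the comment above.

one_os = 50_000 + 3 * 20_000   # one engineer plus three technicians
twelve_bespoke = 12 * 50_000   # one specialist engineer per bespoke OS

print(one_os)           # -> 110000
print(twelve_bespoke)   # -> 600000, over 5x the wage bill, mostly idle
```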
"they had the patch ALREADY written before the NSA hack was leaked.............."
Yes. They have customers who pay extortionate fees for support to continue to get security updates for obsolete O/Ses. The idea is to encourage them to get off ancient systems. Microsoft don't then general-release these eye-wateringly expensive patches, since that would completely remove the motivation for anyone to pay, and everyone who could get away with it would still be on Windows 95.
I honestly don't really see the justification for blaming MS in this, sorry. They told us years in advance that XP would be end of life in 2014. They told us to get off XP in that timescale. They even had a decent OS in Win 7 to migrate to. They then extended that support while again pointing out that you should GTFO XP. And then a bunch of organizations running XP get hit by security holes. What more do you want MS to actually do here? Keep releasing security patches for XP until the end of time?
" it sort-of was a fairly pointless/useless product when I bought it - tho' it allegedly is now quietly wiping the floor with all the other wearables."
Worth noting that's still not exactly a high bar. Wearables still haven't actually found a good reason to exist yet, so wiping the floor with the competition is like being the hardest kid in preschool.
Didn't you forget to dilute it with 8 trillion leading 0s?
There is literally no evidence that email was a vector here. The cryptolocker spread by copying itself out to every machine in the subnet over port 445. So no, beefing up email defense would not have had any impact.
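For illustration, subnet-wide spreading over port 445 needs nothing cleverer than this kind of sweep - a benign reachability check of the sort any network auditor runs. Only point it at networks you administer:

```python
# Enumerate a /24 and test whether TCP 445 (SMB) answers on each host.
# This is just a connectivity check, the same primitive a worm would use
# to find targets. Subnet below is a placeholder example.
import ipaddress
import socket

def port_445_open(host, timeout=0.5):
    """True if we can complete a TCP handshake to port 445 on host."""
    try:
        with socket.create_connection((host, 445), timeout=timeout):
            return True
    except OSError:  # timeout, refused, unreachable, no route...
        return False

def scan_subnet(cidr="192.168.1.0/24"):
    """Return the hosts in the subnet with SMB reachable."""
    return [str(ip) for ip in ipaddress.ip_network(cidr).hosts()
            if port_445_open(str(ip))]
```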
"Anon because I might be right"
Firstly, a state actor attack would be far better targeted. Stuxnet, for example, actually checked the serial numbers of the centrifuges it targeted to ensure that it only hit ones created in the right date span to impact only those bought by Iran. The vector on this attack, on the other hand, literally just spammed itself out to every available IP address that had port 445 open.
Second, US retaliation would almost certainly involve using a few zero-days. If you want to prove that you have vastly more power than your opponent, then you want to do something that literally resembles friggin' magic from his point of view. You want to show him that he can do nothing whatsoever to defend his critical infrastructure from your attacks. This did not: everything in it had already been discovered and patched. If the best thing the US can throw at Russia could be taken out by just switching on your WSUS server in the past three months, then there's no point even doing it, because it would make them look weak, not strong.
Thirdly, and most importantly, most of the original bits of this were actually quite shittily written. Oh sure, there was a genuine bit of high-tech NSA code in there from the Shadow Brokers leak... but there was also a fair load of primitive crap there too. It's a bit like a 16 year old came into possession of an F-16; it was destructive as hell but he didn't really know how to fly it.
I've just finished a webinar on the incident, and there are literally five different layers of my SMB's security that blocked this (patching, permissions, firewall, commercial AV, VLANs). And we're not exactly cutting-edge - just running best practice.
In short, if this was state-backed, then the state in question would have to be somewhere like Honduras, not one of the big-league infosec powers.
It's international. UK, Spain, Italy, China, Russia, Vietnam, Kazakhstan and Taiwan so far reporting massive numbers of infections.
"Curious then that it has affected so many dispersed bits of the country. "
The term you're looking for is 'continent'. Or possibly 'world'; Russia has millions of infections right now, with Taiwan and China both heavily hit too. Half of Europe is being hit. The list on the BBC's breaking news site currently says UK, Spain, Italy, China, Russia, Vietnam, Kazakhstan and Taiwan. Avast alone has 36,000 infections going live right now.
This is fucking massive.
"Eventually this will become a business school case study of a potentially great company destroyed by management arrogance"
And what an astonishing level of arrogance there's been. Uber's regular flouting of laws and regulations, actual attempts at obstruction of justice, willingness to spy on its own customers for no reason beyond their own entertainment, contempt for employees, rampant price-gouging and the rest of the litany of horrors that pours out of the company every week means there's already a dozen fair cases for imprisoning half the upper management.
The business model adopted by Uber tries to bankrupt all competitors through burning vast quantities of venture capital, in the hope that they can then become a monopoly and jack up prices to insane levels (while simultaneously slashing wages). This is just writing abuse of markets into your business plan, and is the kind of thing which can only occur when there's an excess of available investment capital floating around.
Frankly, it's the same kind of behaviour that was discovered to be going on at Enron. But at least Enron were more competent about covering it up.
"I think Mark Thomas had the best idea in one of his routines - People in elected office should be taken outside and shot at the end if their five years term. "
Actually, that could work. Assemble the firing squad at random from the whole non-mad, non-criminal population, and then specify that there's no penalties from killing the politico, but they don't have to do so. A popular politician who governed fairly would find that most people tried to miss him. A politician who targeted specific groups for unfair treatment, on the other hand...
Yeah, there's whole warehouses in India being used to run this kind of scam. Wasn't there one that was busted a few months back, and the people working there were amazed to discover that they didn't really work for Microsoft?
Does anyone else get a sneaking suspicion this responsibility is rapidly heading for Jared Kushner's desk? After all, he probably needs something to do between reforming the whole federal government and sorting out the middle east.
Would it make any difference if he did read them? He has demonstrably no understanding of policy and no understanding of IT security, so even if someone can get him to sit down and read an IT security policy document, it's hard to see anything resulting from it.
Can't tell if the downvoters were Microsoft fans, or just people who don't believe there'll ever be a working version of the fall update.
And the working version of the Fall creator's update can be expected the following spring.
"Your contempt of "most users" notwithstanding (albeit probably not far off the mark)"
I wasn't being contemptuous. It's literally true. Most users cannot do anything outside their usual daily tasks on a computer, and there's no reason to assume that they should be able to, or that it makes them stupid for not being able to. I can't operate half the software that they use either, since an in-depth knowledge of CAD software simply never comes up in my day-to-day life. That doesn't make me stupid either.
The vast majority of users don't have the slightest idea of what a proxy is, or how you'd tell your computer to use one, or even why you'd want to. As for programming a spam bot... why would you assume that they're doing that, as opposed to just downloading one off the shelf?
Look, if you want to talk slacktivism, look at actual slacktivists rather than postulating the existence of some kind of hyper-intelligent superengineer with an in-depth understanding of networking principles and a working knowledge of Java. Thankfully, we have loads of examples from the real world of who they actually are. Take a look at Anonymous's Project Chanology, for example, the infamous attack on the Church of Scientology back in 2008. They used a network stress-testing tool called Low Orbit Ion Cannon to launch DDoS attacks. The guys using it didn't write a DDoS tool, they just downloaded it. The majority of them had no idea that their IP addresses could be traced from it, and that they'd need to use VPNs or proxies to mask the traffic. They were not IT engineers and had no clue what the tool itself was actually doing; they just knew that if you put a name in the address bar, it would knock the website offline. They were mostly just kids, and many of them were horrified when the police turned up at their door, because they simply didn't know that the traffic could be traced back to their computers.
THAT is the vast majority of the people we're discussing here. They're accountants, or architects, or customer service advisers, or shop assistants. They want to do something about political subject X, and they know that they can download this tool to enable them to do it, but without the slightest understanding of the context or consequences of their actions and with no idea how to cover their tracks or evade even basic countermeasures. They'll use the tool which achieves the objective, but they're not going to know about the other tools which will enable them to hide the fact they've done it.
"Contractors are supposed to be people with years of experience and skill, who can come into a job and hit the ground running with merely a 'here's your desk'."
That's the ideal, yes. The actual reality of the situation is that many companies hire 'contractors' to cover first line support. They're putting kids who just left high school on contract - hell, my first job, at the age of about 19, was a contract role. That's not some seasoned expert coming in and being paid top dollar for highly specialized skills. It's companies exploiting contracting in order to avoid having to offer full-time positions. I knew guys who were stuck on contracts for 5+ years in jobs that barely paid enough to feed them week to week, because there were no other options. And that was prior to 2008, during the boom years - now it's actually worse.
It's easy to look at shit like this and think 'I'm a contractor, I bill $8k a month, I don't see why they're having trouble', but actually a lot of contractors aren't really self employed by any sane measure, don't want to be self employed, and just don't have a choice in the matter.
"I got out of that gig because suddenly there was a "dispute" and no one got paid"
It's very commonplace. I started out as a contractor, because at the time if you got into IT where I was in the UK you more or less had to be - permanent jobs were only ever offered to people who'd been contracting for a company for ages (long enough to plausibly sue for employment rights). And yeah, there's a couple of dozen jobs that I simply never got paid for - including ones from massive, big-name companies who could easily afford it but simply elected not to pay me for 2-3 days work because I was unlikely to take it to court for a few hundred quid. Hell, one time, a local NHS hospital failed to pay us due to office politics between our boss and our boss's boss. Nothing we'd done wrong - just one dilbert trying to screw another one over.
That's why I gave up on contracting as soon as I could.
With interest rates on UK 30-year gilts currently 0.8% lower than inflation, lenders effectively pay the government to borrow from them. Our 30-year gilt fixed rate is presently 1.5%, so if the economy picks up and inflation rises the amount that the debt shrinks in real terms will increase. So it makes perfect macro-economic sense to borrow and invest now, and bugger all sense to pay off debt at present.
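A quick back-of-envelope using the figures quoted above (1.5% nominal, inflation 0.8 points higher) shows how the real burden of that debt shrinks over a 30-year gilt's life:

```python
# Real interest rate on a 30-year gilt, using the figures from the
# comment above: 1.5% nominal coupon, inflation 0.8 points higher.

nominal_rate = 0.015
inflation = nominal_rate + 0.008   # 2.3%

# Fisher relation: real growth of the debt per year.
real_rate = (1 + nominal_rate) / (1 + inflation) - 1
print(round(real_rate * 100, 2))   # -> -0.78, i.e. negative in real terms

# Real value of 100 quid of debt after 30 years at that rate:
print(round(100 * (1 + real_rate) ** 30, 1))  # -> 79.0, a fifth inflated away
```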
And 'Labour's last set of debts' - or, more accurately, all debt ever generated by all Labour governments in the UK, ever, combined - is about the same amount that the present Conservative government have added to that debt in the last 7 years. Partially this is because austerity has never worked to clear up an economic mess, and partially it's because most governments manage to borrow more money than their predecessors because inflation is a thing that happens.
There's some really, really, really good reasons not to vote for Labour right now - the fact that the party is about as united as a combined Israeli-Palestinian football team, for example - but the supposedly superior economic competence of the Tories is certainly not one of them. Macroeconomics is not the same as balancing your household budget.
"So that they can access their list of logins even if they don't have access to their home PC or the drawer containing the USB key I assume."
Well, except for when there's literally any connection problem between you and wherever the hell the company has decided to dump your data. Like, y'know, what just happened.
Combine that with the increased attack surface when your password DB is always online, relying on the security regime implemented by the work experience kid at passvault company A, which in turn relies on the security regime implemented by the work experience kids at Cloud Company B, and you're looking at a whole bunch of downsides for the sake of not having to carry a 7 gram USB stick around with you.
"Thus being able to sketch on a tablet is of little value if the sketch isn't easily and quickly available (ie. with little if any user intervention) on another device eg. CAD station"
Even there, as the construction industry moves away from old-school CAD-based technical drawing (eg, Microstation and suchlike, which basically just allow you to do the same thing architects have done for 300 years only faster) and toward 3D Building Information Modelling, the usefulness of a hand-drawn sketch or markup is slowly disappearing too - the buildings are being literally built in Revit from programming-style classes instead of drawn, so interaction with the model is becoming increasingly dependent on a fully interactive data-rich environment.
Tablets not only lack the sheer grunt needed to open such models, but would be a nightmare to try and interact with effectively when what you really want is an 8-button mouse and a full keyboard of shortcuts.
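The 'programming-style classes' point can be sketched roughly like this. This is a hypothetical, heavily simplified stand-in (the class and field names are invented for illustration - real BIM tools like Revit expose far richer, and different, APIs), but it shows the difference between a data-rich model element and the bare lines of a 2D drawing:

```python
from dataclasses import dataclass

# Hypothetical sketch of a BIM element: the object carries performance
# data, not just geometry, so schedules and clash checks come from the
# model rather than being read off a drawing by a human.

@dataclass
class WallType:
    name: str
    thickness_mm: int
    fire_rating_mins: int
    u_value: float  # thermal transmittance, W/m2K

@dataclass
class Wall:
    wall_type: WallType
    length_m: float
    height_m: float

    def area_m2(self) -> float:
        return self.length_m * self.height_m

party_wall = WallType("Party wall", 215, 60, 0.3)
w = Wall(party_wall, length_m=12.0, height_m=2.7)
print(f"{w.wall_type.name}: {w.area_m2():.1f} m2, "
      f"FR {w.wall_type.fire_rating_mins} min")
```

Interrogating and editing that kind of structured data is exactly the sort of work that wants a full keyboard and a many-buttoned mouse, not a finger on glass.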
I don't see why this is Facebook's problem. Like, at all. That's like arguing that a paper company is responsible for the Daily Mail printing bullshit on their product.
News should always be consumed from multiple sources and cross-referenced. Failing to do that is pretty much on the consumer's head, not the medium he found it on. If you can't tell the credibility difference between the NYT and Breitbart... well, that's an education system issue. For example, if I paste a fictional story about an alien invasion on FB, and someone is stupid enough to think that it's true, then it's not Facebook's fault for failing to flag it up as fiction. It's your fault for not being able to check other news sources over trusting a completely unknown dude on the internet suggesting Arkansas has been taken over by the Vogons.
This is what happens when your education system doesn't provide any critical thinking skills until halfway through university - which is more or less how all Western education systems have been designed for decades. There were tight oligopolies controlling the dissemination of news, most of whom had comfortable but subservient relationships with the state (even when they were nominally hostile to it), so you just didn't have avenues for fake news to go global. Meanwhile, you only wanted, say, 20% of the population to be involved in managerial tasks that required the ability to critically engage with data; everyone else was bound for the factory floor or mostly just needed to know not to stand under the steam hammer while it was in motion. So it was fine - and actually desirable - for 80% of the population to have no serious critical thinking skills whatsoever, as otherwise they might start questioning official narratives - like 'We're always the good guys, even when we're dropping napalm on peasant villages'.
The internet (along with aggressively partisan media outlets that overtly challenge mainstream narratives for political ends, like Fox News) is slowly breaking that shit down. Iraq 2 was when it really started to kick off, as the official narrative (Saddam is connected to 9/11 and has WMDs) simply wasn't convincing to those who did have critical thinking skills, and they could circumvent the controlled official channels to talk about it rather than being shut out in the cold. Every major newspaper supported the war at the start. It was bloggers who were saying it didn't add up. That caused a collapse of trust in the 'official' narrative-forming channels (the newspapers etc), but people didn't suddenly develop the ability to critically analyse information themselves - they disliked the media because it had proven to be lying AFTER the event, not because they didn't find the arguments convincing while they were being made. So now they don't trust the Lying New York Times, but they don't have the skills needed to determine whether Infowars.com is a pile of fictional dogshit until some idiot goes and shoots up a pizza parlour to prove it wrong.
The answer to this is not 'we must re-empower the gatekeepers of information', because the gatekeepers are inevitably corrupt, whether they're Facebook, or the NYT, or the Washington Post, or the Guardian. It's to make sure people can actually tell if an argument is actually convincing or not. Which is what Facebook's ten points are actually trying to do - point out a basic university-level critical thinking process which you really need to understand to survive in an information-rich environment.
So no, I don't want Facebook to suddenly declare itself the editor in chief of the internet, and yes, it should be down to the user to check.
No, since the demand for real news is (and always has been) much smaller than the market for fake news. People like to be told what they've always thought is correct, and so will pay more to have their existing biases confirmed than they will for something which challenges them.
See literally any post from Big John in a Trump thread for examples.