Cambridge Analytica was a small player.
Give me enough money, and watch what I can do with only the three people that work at my company. Welcome to the Big Data era, sonny: it's truly amazing what you can do with a few scripts these days.
Even if all of this is true - and I honestly have no way of knowing if it is - I am not sure Supermicro would be to blame for this. Or, if they are, at least not any more than Cisco is when US.gov tampers with their stuff.
I don't know...if this is true, and the Chinese have both the technology and the expertise to pull this off...why would this be limited to Supermicro in any way? Wouldn't they be doing this to all vendors that supply their high value targets? Who else might be doing it? I have so many questions.
The one thing I don't believe, however, is that any of the brass at Supermicro would be in on it. This isn't because I don't think it couldn't happen...but because I don't think the brass at Supermicro are all that plugged in to the factory floor. Supermicro is run by engineers. They like R&D. They like design and testing. They all seem to absolutely loathe logistics.
And there's evidence of that in how Supermicro comports itself. Supermicro is absolutely terrible at parts and component distribution in a number of geos. They have significant delays on system sourcing to partners in almost every geo, and I'm pretty sure their shipping department couldn't find their own ass with two hands, a roadmap and a sherpa. That's not the sort of thing you'd expect if top executives were concerning themselves with details like product assembly or distribution.
So all of that brings me back to: I'm unsure what to make of all of this. I just don't have enough data, and I've poked my contacts at Supermicro to see what I can uncover. From what I know of the people involved, I think it's entirely possible that Supermicro's production facilities could have been compromised and the executive layer be completely unaware of it. Whether they have or not, who can say? It will be interesting to see it play out.
Microsoft discovers webmin.
"What you say is only correct because the defenders use standard processes that are predictable: "common practice". Once you depart from this predictability, an attack becomes much harder and potentially less effective."
Yeah, and rolling your own crypto is a great plan. Pfffffft.
Standard security processes and procedures actually do work, except against exceptional (read: statistically extremely unlikely) threats. The problem isn't standard security processes and procedures. The problems are lazy administrators who don't implement them, and companies that don't pay for them.
You wake me up when you're running a fully containerized and microsegmented environment with complete data path inspection, automated baselining, baseline deviation sensing, and automated incident response that includes at the very least auto-quarantining.
Unless and until you manage to get your security solutions to at least the above level, you have no place disparaging standard security procedures. If you understood today's IT security and were able to implement it, you'd understand the huge gap between today's best practices and the poor bastards cowering behind an edge router like it was 1993.
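To make "baseline deviation sensing" concrete, here's a toy sketch in Python. Everything in it - the class, the thresholds, the quarantine action - is illustrative only; real environments do this with dedicated tooling, not twenty lines of script.

```python
# Toy sketch of baseline-deviation sensing with auto-quarantine.
# All names and thresholds here are illustrative assumptions.
from statistics import mean, stdev

class BaselineMonitor:
    """Learns a per-host traffic baseline, then flags deviations."""

    def __init__(self, sigma_threshold=3.0):
        self.sigma = sigma_threshold
        self.history = {}          # host -> list of observed rates
        self.quarantined = set()

    def observe(self, host, rate):
        samples = self.history.setdefault(host, [])
        # Need enough samples before the baseline means anything.
        if len(samples) >= 10:
            mu, sd = mean(samples), stdev(samples)
            if sd > 0 and abs(rate - mu) > self.sigma * sd:
                self.quarantine(host)
                return False
        samples.append(rate)
        return True

    def quarantine(self, host):
        # In a real environment this would push a microsegmentation
        # policy change; here we just record the decision.
        self.quarantined.add(host)

mon = BaselineMonitor()
for r in [100, 102, 99, 101, 100, 98, 103, 100, 99, 101]:
    mon.observe("web01", r)
mon.observe("web01", 5000)   # wildly off-baseline: gets quarantined
print(mon.quarantined)
```

The point isn't the code; it's that "baseline, detect, respond" is a loop, and the response has to be automatic or it doesn't count.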
Damn it, I'm not old!
I'm just alt-young.
Disrespect Mars and I'll go through you like a door.
Dinosaurs remain among the most populous animals on the planet. Perhaps you've heard of their modern incarnation. They're called birds.
And there are reportedly half a trillion of them out there.
So next time, pick a better analogy. One, perhaps, that doesn't demonstrate you failed to evolve your thinking to incorporate scientific knowledge that is now decades old.
I should point out that ad block penetration has surpassed 30% already.
Crayon does seem the appropriate MID for Microsoft fans.
Maybe if Microsoft put in a "stop fucking spying on me, you lousy git" button, people would be more inclined to adopt it. At least with Windows 7 I can murder the bloody call home with a hosts file. In Windows 10, they hard coded the bastard in so that even this doesn't work.
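For reference, the Windows 7-era hosts-file trick amounts to entries like these. The domain list is illustrative and almost certainly incomplete; Microsoft's telemetry endpoints change over time.

```
# C:\Windows\System32\drivers\etc\hosts
# Route known telemetry endpoints into the bit bucket.
0.0.0.0 vortex.data.microsoft.com
0.0.0.0 telemetry.microsoft.com
0.0.0.0 watson.telemetry.microsoft.com
0.0.0.0 settings-win.data.microsoft.com
```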
Fuck Microsoft. Fuck them with Lennart Poettering.
Come in May. OpenStack Vancouver. It's a great reason to hit the continent, and also it's Vancouver, not Edmonton. That's always better.
It's Red Hat. They'll give the whole thing to Poettering, and he'll build it into systemd. It will end up riddled with bugs and with security as an afterthought's afterthought. When the first logoed vulnerability emerges, he'll blame end users. Nobody will be able to fix anything because all the logs will be binary, and corrupted by the attacks.
Red Hat will snicker and go "who else are you going to buy from". Three more RHEL derivatives will emerge the next day. Oracle will pretend it didn't hear anything about any of this.
So, you know, a standard Tuesday in IT.
Why do you assume CPUs have to be "safe" for people to buy them? Do you honestly think that Spectre has slowed down CPU purchases? Do you think anyone but the handful of nerds that haunt these forums and some security nerds that read the Grugq a lot actually care about any of this?
Make no mistake, Intel is still the seller of server CPUs. They'll keep on being the seller of server CPUs through the lawsuits, and they'll emerge from this as the seller of server CPUs.
I'm as much of a fan of the underdog - and hence AMD - as anyone, but let's be realistic. Intel has an iron-fisted monopoly, and unless someone goes in there with the Almighty Axe Of Anti-Trust and cleans them out, they'll continue being a monopoly for at least the next decade.
You know it. I know it. Everyone except a few deluded die-hards knows it. So let's not pretend about this, shall we?
The people who buy things don't give a fnord about "security", and they never have. If they did, the Internet of Things wouldn't be such a gor'ram security dumpster fire. Nobody would ever buy Cisco or Supermicro again, and the list goes on and on and on.
But you know what else? It's not their asses that end up in front of the judge. It's us. The hoi polloi at the coal face. Nobody sends suits to jail. They make some poor bastard working in ops the lightning rod and ruin his life, and the lives of his family instead.
So yeah, Intel's dominance isn't going anywhere. Nobody's going to do a bloody thing about it. We're going to be responsible if/when it all goes horribly wrong, and we should know about this ahead of time so that we can take precautions and/or run the hell away in terror. (Depending on how you view risk.)
For suits, the only risk they care about is "will this cost me some of my bonuses?" For nerds, the risk we need to worry about is "will this land me in front of a judge?" I leave it as an exercise for the reader to work out how likely (or not) they feel that outcome is.
But we should all have eyes open here and understand that this issue isn't going away, and that we, as the plebeians, have no choice but to deal with it.
AMD and some ARM chips are affected by Spectre. But let me be 100% clear on this: AMD is completely irrelevant to this discussion.
AMD chips might power a smallish percentage of endpoints, but they power almost none of the existing fleet of deployed servers. Even if, for some reason, we all decided to buy AMD tomorrow, AMD couldn't deliver. At full ramp, AMD would struggle mightily to put out enough silicon to cover 10% of planetary server capacity during the next refresh cycle, and there is zero indication that demand exists for them to invest in that many wafers.
I'm sorry, but in the real world, AMD just isn't part of any discussion about server chips. There's only one player in that market, and they'll be the one everyone buys replacement chips from.
"And Google's, obviously."
Fuck off with your whataboutism. Who are you, Sean Hannity?
Jibbers garlic-covered Crabst...
Yes, I'm sure. I'm sure because already "serverless" doesn't mean "Amazon". Serverless has a lot of ardent followers, and they're building open source solutions that do what Amazon's Lambda does.
Similarly, machine learning, AI, BI and other BDCA tools are seeing a lot of open source growth. In short: a commercial entity (like Amazon) might come up with the initial concept and get to milk it for a while, but all technology eventually ends up democratized.
The basic approach of serverless - write simple scripts (or have a UI/digital assistant write them for you), and have those scripts simply pass data from one black box to another - isn't going away. Humans don't typically uninvent things, especially in IT.
The increasing adoption of digital assistants also makes this seem like a permanent thing to me. The black boxes used by serverless types are really no different than the "skills" one can build for Alexa. Indeed, many of those skills are nothing but serverless scripts that call black boxes, and I've already seen Alexa used to create new serverless apps, which could be published as Alexa skills...
The whole thing has already reached critical mass and started a cascade. While a bunch of whingy nerds who can't disconnect "application development" from "enterprise apps" might not get the importance of an ever-increasing library of digital capabilities that can be used by any Tom, Dick or Harry, non-nerds seem to get the importance really quickly.
So we, IT nerds used to the way things were, might not see the utility, or ever get around to using serverless to do our jobs. Our kids, however, will use serverless-style tech to do all sorts of stuff. Using - and trusting - those black boxes will be as natural to them as smartphones are to my generation, or staring blankly at VCRs flashing 12:00 was to my parents' generation.
So sure, in the short term I expect some of these black boxes to disappear. That will lead to a backlash, and to standardization, to the development of "skill libraries" and all of the predictable evolution of responses to this problem. Corporate greed is eventually overcome in tech, even if it takes a decade or so for us to get our shit together.
It is very early days for this technology yet, but the basic approach is sound. And those who have used serverless in anger tend to become adherents pretty quickly. Even the disenfranchised and cynical nerds.
@criagh yes, but you are doing enterprise work. Application development for commercial purposes. What you - and all the rest of the angry nerd mob of commentards - are missing is the part in my article where I very specifically said "It is the commoditisation of retail and consumer application development".
Serverless isn't going to replace traditional development for organizations looking to build their own middleware anytime soon. VMware isn't going to make their next vSphere UI using serverless. That's not where the revolution comes in. The benefit of serverless is not "helping highly trained nerds do what they already do better", despite the inability of the commentariat to conceptualize anything different.
Serverless is going to let people who are not nerds create applications that solve problems that nerds and commercial entities are not normally interested in.
For example: Let's say that I want to grow cannabis plants at home. (In a few months we can legally grow 4 plants per household here.) Cannabis is a particularly persnickety beast to grow. Much more so than the bell pepper plants that I have all over my house.
With serverless, I could take data from my house - images of my plants, or perhaps sensors similar to these - and feed that data into a black box up in the cloud. I could set the thing up so that if A happens, the plants get watered, if B happens, a fan turns on and if C happens, it sends me an alert.
Traditionally, if I wanted something like this I would have a few options:
1) Build something myself involving an Arduino (lots of work)
2) See if a vendor has already built a pre-canned solution, probably involving their own sensor ($$$)
3) Hire a human ($$$$$$$$$$$$$$$$$$$$$)
With serverless, however, I just need someone to write a "black box" that can accept some form of data I can provide (images, sensor data, etc) and spit out simple data about the plant in question. That black box does not, to my knowledge, exist today...but I am sure it will soon. (I could even train my own black box using machine learning, but that's another discussion.)
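The glue logic for that kind of setup really is script-grade. A hedged Python sketch - the sensor names, thresholds, and actions are all hypothetical stand-ins for whatever black boxes a provider might expose:

```python
# Illustrative grow-op rules: if A, water; if B, fan; if C, alert.
# Every sensor name, threshold, and action here is a hypothetical
# stand-in, not any real provider's API.
def evaluate(readings, actions):
    """Map sensor readings to actions; return the actions triggered."""
    triggered = []
    if readings.get("soil_moisture", 1.0) < 0.30:    # A: too dry
        actions["water"]()
        triggered.append("water")
    if readings.get("temperature_c", 20) > 28:       # B: too hot
        actions["fan"]()
        triggered.append("fan")
    if readings.get("leaf_wilt_score", 0.0) > 0.70:  # C: in trouble
        actions["alert"]()
        triggered.append("alert")
    return triggered

log = []
actions = {name: (lambda n=name: log.append(n))
           for name in ("water", "fan", "alert")}
print(evaluate({"soil_moisture": 0.25, "temperature_c": 31}, actions))
# → ['water', 'fan']
```

That's the whole "application". Everything hard - reading the sensors, scoring the images - lives inside somebody else's black box.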
Essentially, serverless is a scripting platform allowing access to an ever-increasing number of "skills" or "capabilities", in a marketplace-like format. A platform I don't envision being used to replace super-niche industry development, but one I envision being used by civilians to automate and enhance their daily lives.
Application development is already a thing that is out there enhancing businesses that can afford qualified nerds. Now it is going to start being something we can use in our day-to-day lives to collect, modify and act on the data around us.
The individual applications won't last long, it's true. But the black boxes themselves can. Oh, they'll change and evolve as they're maintained, but as a general rule, once invented, they won't be uninvented.
If someone creates a black box that does facial recognition, we aren't going to find ourselves 50 years from now without a black box that does facial recognition. We may find that someone has created better facial recognition systems, but we will always have at least the original black box's capabilities.
The digital skills being created accumulate until, one day, you'll be able to say "computer, watch the roses in the garden and water them if they start to show wilt", and it will happen.
That phrase spoken to the computer is a script. It told the computer to gather a dataset consisting of images of the roses in the garden. It told the computer to send those images through a black box that determines if there is wilt. If there is wilt, it will trigger the garden's watering system. That, right there, is serverless. The only difference is the interface used to create the serverless script.
The end user doesn't care if the black box that checks images of roses for wilt is the same today as it was yesterday. They only care that it works. All that matters to them is that a black box that performs that action exists, and that it doesn't stop existing.
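Sketched as code, that spoken instruction might compile down to something like this. Note that detect_wilt() here is a fake stand-in for a cloud vision black box; only the glue logic is "the script":

```python
# Hypothetical serverless-style handler for "watch the roses and water
# them if they wilt". detect_wilt() fakes the vision black box; the
# handler itself is the entire "script" the user would own.
def detect_wilt(image_bytes):
    # Stand-in scoring function: pretend anything containing the
    # marker b"wilt" scores high. A real black box would run a model.
    return 0.9 if b"wilt" in image_bytes else 0.1

def handle(event, water):
    """Glue: images in, scores out of the black box, water if needed."""
    for img in event["garden_images"]:
        if detect_wilt(img) > 0.5:
            water()      # trigger the garden's watering system
            return True  # watered
    return False

print(handle({"garden_images": [b"rose ok", b"rose wilt"]}, lambda: None))
# → True
```

Swap the interface from typed code to a voice assistant and the script is identical; only the authoring tool changed.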
If we did call it gluten-free, we'd probably get a whole bunch of new people interested in application development! :)
There's a whole industry full of supposed trained experts with loads of experience and education and all our information is ending up on haveibeenpwned anyways. You'll excuse me if I don't feel that people doing the application development equivalent of drawing lines between black boxes that are designed and maintained by large multinational public cloud providers could possibly do any worse.
Hell, maybe those public cloud providers will actually hire enough security people that the stuff they offer is secure by default. It would be a nice change from the shitpocalypses created by the "I know better" or furiously cheapskate crowds.
Ease of use is the difference. Yes, serverless is basically "scripting for noobs". But the "for noobs" part is really critical.
Yes, really smart, well paid, highly experienced nerds can write a bunch of gobbledygook scripts, or write applications in proper development languages that call libraries or other application components. That's hard. And it requires expertise. And if you are incorporating some black box (application, library or other component) into your solution you first have to find it, download it, install it, make sure it's in the right place to be called, etc.
Almost all of that goes away with serverless. You write what amounts to a script, but you have access to this (constantly growing) library of tools and features supplied by the cloud provider. Making use of those tools and features is generally much easier than a bash script, and you don't have to manage or maintain the underlying infrastructure in any way.
When you take something that requires significant expertise and make it something that nearly everyone can use, that's a pretty big deal.
I'm sure that there were a bunch of nerds who knew how to build and maintain their own electrical generators who also said that national grids were a stupid idea. If they can run their own generation capacity, and wire up their homes/businesses exactly according to their own specifications, why can't everyone? Standardization means everyone isn't using the optimal plug/wire gauge/whatever for the job! There might be inefficiency! A national grid won't be revolutionary or change everything, because people can use electricity now!
I, for one, am glad nobody listened to them.
@Craigh You are correct: there is no reason for various cloud features (such as BDCA tools) to be tied to serverless. On the other hand, serverless doesn't offer you much of any real use except the ability to push data into and pull it out of various cloud features.
The point here is that serverless basically allows one to lash together some pretty complex applications about as easily as one could write a (reasonably simple) batch script.
Real developers will build the various black boxes. Trained IT practitioners will be able to use those black boxes either through serverless or other cloud solutions. But for the milling masses, serverless will be their gateway to application development.
Right now, today, in order to build serverless applications you need to actually type a few lines of code. But the code you need to write in order to pipe data from A to B to C and back out again is simple. It's the sort of thing that you could build into a drag-and-drop GUI and have 4 year olds use.
Nerds think about serverless from the viewpoint of "how does this help me do what I do now"? Non nerds don't do any of this now, and when shown how easy serverless is, say "wow, look at all the neat things I can do that I couldn't do before".
So serverless - in the form of AWS Lambda - is itself pretty useless. But it is the gateway to all the other things that Amazon's packed into AWS. It isn't the only way to use all of AWS's goodies, but it absolutely is the means by which individuals who aren't specialists, trained practitioners or experienced cloud architects will set about making tools for their own needs.
@Sir Runcible Spoon well that comment certainly created a combination of wry smile and crushing, crushing depression. A little bit too on the nose...
Maybe if I had been, you'd have learned a few things. :P
"Minix used it and look how well that didn't do out in the wild"
Minix is easily the most widely deployed OS in the world. It's in every Intel IME, and a whole heap of other embedded systems.
Think before engaging keyboard, mate. There's a lot more to tech in this machine-to-machine world than end-user-facing OSes.
KVM: Now powering two of the big three public clouds. What's that VMware? KVM is "not ready for the enterprise"? I think you know better.
"Camping with a projector to watch a movie of a campfire do you mean ?"
Where's the fun in that? Who doesn't like fire? I like fire. Hmmmm. Fire.
"Given the general lack of flat white walls in the general area when out camping in the woods/fields/wilderness, won't the average millennial fairly-well-off youngster be a bit pissed off lugging an 8ft projector screen around with them?"
I'd just bring a white blanket and hang the thing off the side of the car, but hey, that's me. "Flat" is a luxury. You're camping, eh?
"Isn't the point of camping to rough it a little?"
Depends on whom you ask. I think of camping more as "getting away from people and all the noise of a city". If I could, I'd live on an acreage surrounded by trees all the time. I'm too poor. So I go camping.
Camping with a projector to watch a movie while I poke the campfire? Sign me up.
Now, if we could just extinct mosquitoes...
They're less "technically challenging" than they are "a damned lot of tedious, boring, thankless work". I.e., a miserable pig.
You have to go through a lot of data before you even get to a candidate. Long after the fun work's been done, and the challenges of building the 'scope are long past, there's just crunching image after image. If you're really lucky an algorithm can help some. Even then, that's an awful lot of faint smudges and maths.
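For flavour, the crudest conceivable version of that algorithmic help - flag anything a few sigma brighter than the background - fits in a dozen lines of Python. Real source-extraction pipelines are vastly more sophisticated; this is only a toy to show the shape of the work:

```python
# Toy candidate finder: flag pixels that stand out from the background
# by a few sigma. Values and threshold are illustrative only.
from statistics import mean, stdev

def candidate_pixels(image, sigma=3.0):
    """Return (row, col) of pixels well above the background level."""
    flat = [v for row in image for v in row]
    mu, sd = mean(flat), stdev(flat)
    return [(r, c) for r, row in enumerate(image)
                   for c, v in enumerate(row)
                   if sd > 0 and v > mu + sigma * sd]

# Mostly noise, one faint smudge at (1, 2):
frame = [[10, 11,  9, 10],
         [10, 10, 60, 11],
         [ 9, 10, 10, 10]]
print(candidate_pixels(frame))
# → [(1, 2)]
```

Now imagine running that over millions of frames, then manually vetting every hit. Tedious, boring, thankless.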
G09 83808 isn't the galaxy spotted earliest in the universe's history; it is, however, spotted very early on, and we've gotten a chance to get a slightly better look at it than we could with previous instruments. Any and all objects that we can spot which are from about 1B years after the universe formed or earlier will get press, and rightly so. They're a miserable pig to find, and they can tell us a great deal about the early universe.
This is a feature, not a bug.
Oracle Microsoft Amazon don't have customers, they have hostages.
Mine's the one with the empty wallet -->
While the benefits of current robot-assisted surgery are somewhat questionable, this doesn't mean surgery robots are useless...or at least that they will remain so. Consider, for example, this prototype.
This prototype robot uses machine vision - amongst other modern techniques - to create a (mostly) autonomous surgery bot that is actually better than humans. Yes, it has some bugs, but it's an early prototype. I expect to see some rapid enhancement in this area.
"Should Google fact check Presidential Tweets before showing them on its site?"
Yes. Fuck yes. Absolutely yes. Yes a dozen more times and yes.
So should everyone else.
No they weren't. Drop a sandworm in water, watch it shatter into sandtrout. In relatively short order they would sequester all your planet's water beneath the surface and convert the planet into a desert filled with sandworms.
Do not fuck with Shai Hulud. He will take your entire planet.
Except, you know, for most of us. Microsoft isn't trustworthy, and likely never will be.
I have some questions. Where can I reach you?
"people like them give women a bad name"
So a handful of female CEOs that screwed up "give women a bad name", but thousands of years and millions of male leaders of companies/governments/transcontinental empires etc. don't give men a bad name?
Look, I question the validity of affirmative action as much as the next guy, but dude, that's some completely unsupportable sexist bullshit right there. What the fuck.
"religion muslim or jewish or some recognized religion other than anything those based on Jesus"
Um...Jesus is an important prophet in Islam...
"Are most IT jobs actually bullshit jobs where the performance in the role actually has little impact on the org?"
I second the "yes" comment.
Speaking as a Canadian - and one who apparently has quite a bit more knowledge of what you're babbling about than you do - I would like to, in the kindest, politest possible way convey my feelings about your inane babble:
No, we can't. And the reason is that politics, privacy and security are tightly wound up into an inextricable morass.
Regardless of your politics, and what side you take in the multi-tentacled debate, politics affects the balance chosen between security and privacy. Or whether the needs/desires of individuals (as opposed to corporations and/or states) should be considered at all.
So put on your big boy pants and welcome to the real world. Politics is everywhere.
Thank you kindly for choosing to engage directly with the community. It is nice to see anyone from any vendor taking the time.
I do have one small question however: does the rest of Microsoft know you're doing this? You're harming their ruthlessly customer hostile image. (Well, not a lot, as the Windows team exudes so much animosity towards literally everyone that it's hard to overcome...but you are denting the evil overlord image a little...)
I am told by some of the hard core gamers in my sphere that if you want to do VR at 240Hz then having 8+ cores @ 2.8GHz or better is usually required. As I'm poor, and still working on a video card from 3 years ago and a Sandy Bridge-era CPU, I cannot confirm this.
Apparently VR is a thing that some people do. I don't understand. Why do you need VR to play Scorched Earth?
Containers = multiple apps, single OSE
VMs = multiple apps, multiple OSEs
In car IT, there are a lot of very specific differences between OSEs for the different apps.
@John 104 Of course not. Though you are demonstrating that far too many of us cling, terrified, to the past.
Biting the hand that feeds IT © 1998–2018