W.O.P.R
I misread it on my phone, and thought W.O.P.R!!!
P.
Rogue NSA sysadmin Edward Snowden says his former employer has developed software that will automatically attack foreign computers deemed to be a threat – without checking in with a human first. The system, dubbed MonsterMind, is designed to detect strikes against key US servers and block the assaults as quickly as possible. …
>but Snowden said incoming salvos on US servers are often routed through other countries,
If anyone is going to attack the USA, they will probably do so from within, as the following statement helps to illustrate.
>The NSA's internal audits were so lax this wasn't picked up, and to this day Snowden says he believes the NSA has no idea exactly how much data he took.
(Does the U stand for "United" or "Useless"?)
Here's my take on things...
1. Politician abuses power and lies..
2. Wannabe in opposition or on a committee sees an opportunity to call them on the lie, promoting their own position whilst at the same time damaging the standing of the other person.
3. Newly promoted/now in power politician goes on to abuse power and lie.
Rinse and repeat...
Thank god (I'm not religious, it's a figure of speech) for whistleblowers.
>The system, dubbed MonsterMind, is designed to detect strikes against key US servers and block the assaults as quickly as possible. But it is also designed to fire back to take out the perceived attacker without anyone giving it specific authorization.
How does it retaliate? Does it use a botnet for a DDoS attack, scan the systems for zero-day holes and automatically hack them, send multiple robots back in time to kill mothers, nuke them from orbit??? Also, where can I get one?
Will these 'foreign computers' accept RPC calls to any ole open port? Because unless they do, software that automatically runs on 'foreign computers' doesn't exist outside of Person of Interest.
"algorithms would scour massive repositories of metadata and analyze it to differentiate normal network traffic from anomalous or malicious traffic" ref
I thought this fricken traffic was encrypted. So unless this hypothetical network was - like Swiss cheese - designed to be full of holes, I don't see how this is even feasible. Unless, that is, the switch makers stop asking the NSA for help in designing secure networks.
Thumbs up for remembering that gem of utter psychosis and paranoia.
Sweat made Stafford’s forehead slick with moisture. “Do you know what would cause a Genux-B to conclude that we’re under attack? A million separate factors, all possible known data weighed, compared, analyzed—and then the absolute gestalt. In this case, the gestalt of an imminent attacking enemy. No one thing would have raised the threshold; it was quantitative. A shelter-building program in Asiatic Russia, unusual movements of cargo ships around Cuba, concentrations of rocket freight unloadings in Red Canada…”
“No one,” the man at the controls of the flapple said placidly, “no nation or group of persons either on Terra or Luna or Domed Mars is attacking anybody. You can see why we’ve got to get you over there fast. You have to make it absolutely certain that no orders emanate from Genux-B to SAC. We want Genux-B sealed off so it can’t talk to anybody in a position of authority and it can’t hear anybody besides us. What we do after that we’ll worry about then. ‘But the evil of the day—’ ”
“You assert that in spite of everything available to it, Genux-B can’t distinguish an attack on us?” Stafford demanded. “With its manifold data-collecting sweepers?” He thought of something then, that terrified him in a kind of hopeless, retrospective way. “What about our attack on France in ‘82 and then on little Israel in ‘89?”
“No one was attacking us then either,” the man nearest Stafford said, as he retrieved the tape and again placed it within his briefcase. His voice, somber and morose, was the only sound; no one else stirred or spoke. “Same then as now. Only this time a group of us stopped Genux-B before it could commit us. We pray we’ve aborted a pointless, needless war.”
Hi, Destroy All Monsters,
Reading that particular and peculiar part of that gem of utter psychosis and paranoia had me wondering whether Uncle Sam was being supplied programs and executing actionable intelligence delivered by a Genux-B.
Certainly the last couple of decades have provided media and news organisations with everything spoken of in the extract posted, albeit with the nation states attacked going by different names to France and Israel.
Snowden is correct that the proper solution to surveillance is technical, not political. Relying on a political solution to mass surveillance is pointless, because if they can snoop then surely they will snoop. Universal encryption is the preferred solution.
I have given up disbelieving anything said about the NSA, but a Yottabyte? Seriously? Did nobody at Wired do a sanity check before going to print?
That's multiple trillions of dollars in storage cost alone, and will be for the foreseeable future. The entire US Black Budget is only* ~$50 Billion.
*"only"... heh...
IDC "Overall HDD unit demand will likely remain relatively flat through 2018, but industry revenue should grow modestly, exceeding $40 billion by 2017"
http://www.idc.com/getdoc.jsp?containerId=248130
So it's not just the cost - who the F&*^ is making them? That amount of drives should show up. Even if the yottabyte is the maximum expected size over x years, a bit of investigation should show huge amounts of HDD capacity going some place, be it GCHQ, the NSA or any other country for that matter.
It can only be on tape.
That's still half a trillion LTO6 tapes. How large is that cube?
I can only imagine the NSA has some very special compression algorithms, or this is the target number for the end of the decade.
Could that much data have even been pumped into Utah over fibre up to now?
>However, based on blueprints of the facility obtained by FORBES – and published here for the first time — experts estimate that the storage capacity of the data center is lower than has previously been reported given the technology currently available and the square footage that the center has allocated for its servers.
>(...)
>Even that reduced number struck Internet infrastructure expert Paul Vixie as high given the space allocated for data in the facility. He came up with a lower estimation. Assuming larger 13 square feet racks would be used, factoring in space between the racks, and assuming a lower amount of data storage per rack, he came up with an estimate of less than 3 exabytes of data capacity for the facility. That would only allow for 24-hour recordings of what every one of Philadelphia’s 1.5 million residents was up to for a year. (But who would want to watch that?) Still, he says that’s a lot of data, pointing to a 2009 article about Google planning multiple data centers for a single exabyte of info.
> Could that even be pumped to Utah?
Ha! Good question. I just did some digging for figures:
No... No it could not.
Total _Global_ IP traffic per year: 1.1 zettabytes in 2016, according to Cisco.
(http://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/VNI_Hyperconnectivity_WP.html)
So that's either a millennium too much storage, or the NSA are actually the Men In Black.
Warfare becomes super-cyber, run by cyber-spook-geeks.
Human fighters (army / navy / air force) are side-lined.
Human fighters (with actual, cell-mangling guns) get fed up and take their actual guns to shoot at spook-geeks.
War returns to historical norms - shoot-em-up.
<endov>
<repeat, ad infinitum>
And we should learn how to prove code.
We've known how to prove code for decades. The trouble is - it's ruinously expensive.
So when a PHB is presented with the choice of a £50 piece of code that isn't proven and a £50,000 piece of code that is, guess which one is chosen?
Thus anyone proving his code goes out of business...
The only solution is to make PHBs responsible for their actions. Yeah, and that'll happen.
Vic.
> We've known how to prove code for decades. The trouble is - it's ruinously expensive.
Well, not necessarily. Certainly hand-proving your code, which is done routinely in some areas, is rather expensive. However, there are little things where automatic proofs are already extremely common. Many languages, for example, will make sure that your stack pointer has the same value it had before a function call (unless you do some really weird things). This may seem trivial, but it helps prevent certain problems.
There is research going on into how to make proving more complex things easy. One idea, for example, is to have useful types. Your compiler and system could know that a certain memory location contains an integer which is a prime number. The compiler could check the code path and insert checks of this condition where necessary. It could even throw a compiler error if you try to store in it the return value of a function that does not produce primes.
Furthermore, you could add tags to your types indicating how the data can be moved. For example, every word of memory could carry a type tag which allows you to mark that word as "private". A "private" word would stay private through normal operations - if you add 5 to it, the result is still private. Your network card would refuse to transmit words marked as private. However, a special privileged function could, for example, encrypt it and turn that information public. That way you could guarantee that no information marked "private" ever gets out unencrypted.
Essentially, the current attempts boil down to giving your compiler hints on how it can check that the code is right. An early start in this direction is the "const" attribute on variables in C.
> Many languages, for example, will make sure that your stack pointer has the same value it had before a function call.
Most of that sort of protection only operates at run-time; it might prevent the buffer overflow, but it will cause an exception from the bowels of the code. How that exception is handled is a problem unto itself - and if unhandled, simply kills the process. So although you can obviate the effects of certain types of bug, you cannot prove that the code will do anything sensible (like not crashing) without a lot more analysis. And that is expensive...
> Essentially, the current attempts boil down to giving your compiler hints on how it can check that the code is right. An early start in this direction is the "const" attribute on variables in C.
Yes - there are many things you can do to improve code reliability. But that's a long way from having properly proven code...
Vic.
Actually, the thing with the stack pointer won't help against stack overflows, and is in fact done by the machine itself. In short, it guarantees that pushes to and pops from the stack will be symmetrical.
C also does things like automatically making sure that if you add a float to an integer, the correct conversion is performed before the addition, instead of just treating the bits of the float as an integer or vice versa.
> C also does things like automatically making sure that if you add a float to an integer, the correct conversion is performed before the addition, instead of just treating the bits of the float as an integer or vice versa.
Sure - but all of this is still run-time checking; it might stop things being mis-interpreted, but the code still malfunctions in some manner. If this is how the code is being checked, there needs to be *loads* of pre-launch testing - and testing is rarely performed in such a manner[1] as to find these bugs.
Formal methods are important in code design and verification - but they're a lot of work, and that means cost.
Vic.
[1] One of the first questions I always ask is "what coverage was achieved in testing?" It's not uncommon for the test team[2] not even to understand the question :-(
[2] It's a common myth that testing is somehow the poor relation of development. In fact, the reverse is true - if you want high-integrity code, you need to put your best engineers onto the testing. Any fool can bash out something which is vaguely to spec, but it takes a skilled engineer to determine that it meets spec in every case...
Throughout history, every mind-blowingly expensive, unbelievably complex new offensive weapon has many times been trumped by some unbelievably less expensive, much simpler defensive weapon - or in some cases, rendered useless by its own impossible complexity. Soon, every bit of data stored and sent anywhere and everywhere will be impossible to decrypt in transmission or in situ in any sort of timely or useful fashion. We'll be reduced back to spying on people with binoculars and microphones hidden in a painting. Good.
Superiority by Arthur C. Clarke.
Written at a time when electronic valves were a thing.
Fine, the definition of lying is not telling the truth. Anything more to say?
People in positions of responsibility should be accountable for their actions, and it is not for them to come up with an excuse not to be. The day accountability went out the window is the day democracy went down the creek without a paddle.
I get that National Security is important, I get that good people are doing dangerous jobs abroad and I get that threats do indeed exist and must be found. Unfortunately, the US government and all its departments have lost the trust of The People they are supposed to be (theoretically) working for, and telling more lies is just digging the hole deeper.
But no matter, the Government actively demonstrates utter contempt for the very notion of democracy at every point - and gets away with it due to public apathy.
Well, we currently have the problem that compiling software is a slow and error-prone thing - that's why installing Gentoo will typically greatly increase your power bill - but let's imagine we lived in a world where compiling is fast and easy. So fast that your operating system and applications could actually compile from source while starting. Sure, that sounds crazy, but it's what the Forth and JavaScript people are doing.
Now what if, whenever you did an update, you actually got to see the changes? Just as in Eclipse, you could simply see the diff between the old and the new version. Such changes are much easier to understand than the whole of the software. Of course 99.99% of all people would just accept them without even looking. However, with millions of computer users, the remaining 0.01% still amounts to hundreds or thousands of people. Compare that to the few people who look at patches today.
There are lots of people who, while not proficient enough to actually write their own code, know enough to be able to spot code they may not want. Those people can then just refuse to accept certain parts of the code.
For other people this may be an introduction to reading code. If it's just a click away people might start looking at it, and it'll gradually make sense to them. Computing would change from some "magic box" to something we all can take part in.