Re: Evidence
If there is enough circumstantial evidence to that effect, then you can blow the whole thing away in round #2 as a mistrial, I reckon. Koh out.
They are shutting it down to upgrade the hardware so that collisions can go from (I think) 7 TeV in the center-of-mass to 12 TeV in the center-of-mass without blowing the tunnel sky-high. This is still lower than the hoped-for 14 TeV, but that value is no longer considered safe (for the copper interconnects, that is). It will be back up in 2 years or so. I suspect the software will have been upgraded too, new experiments and filters suggested, and the existing data hadooped something fierce.
> Nevertheless, upping the significance of the particle find brings us a step closer to knowing that this is a new and Higgs-like elementary particle that has never been seen before. ®
HARRRRUMPH!
"Nevertheless, upping the significance of the particle find brings us a step closer to knowing that our mathematical descriptions of nature (based on Lagrangians, Gauge Symmetry, Lie Groups and yadda yadda) are unreasonably correct even though they cannot in the end be totally correct."
Indeed, "single quantum systems" can very well span whatever 4-vector separation you choose. If Alice holds one end of an entangled qubit pair and Bob the other, then relative motion between Alice and Bob means faraway observers will disagree about the relative time ordering of their measurements, thanks to the relativity of simultaneity in the Special Theory of Relativity.
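For those who want the relativity-of-simultaneity point made precise, it drops straight out of the standard Lorentz transformation (textbook special relativity, not anything from the article):

```latex
\Delta t' = \gamma\left(\Delta t - \frac{v\,\Delta x}{c^2}\right),
\qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```

Two measurements at Alice's and Bob's ends that are simultaneous in one frame ($\Delta t = 0$, $\Delta x \neq 0$) are separated by $\Delta t' = -\gamma v \,\Delta x / c^2$ in a frame moving at speed $v$ along the separation axis, so their time ordering is frame-dependent.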
Nature is prodigious at keeping track of all possible connections between anything, anywhere, anytime, down to near-infinite detail at every point in spacetime (is that still the cardinality of R?) but tries to avoid fixing on any actual value for as long as it can be avoided (as things get larger, it cannot be avoided).
Informatics people, used to the cardinality of N, try to fix things immediately (if need be by asking the user) and want to throw away any detail as fast as possible. This is practically the reverse.
That was a creative mash-up born in hell.
It's like one of those articles where some prof tries to demonstrate that consciousness actually, objectively exists and can neither be explained by science nor imitated by application of engineering. He then sets up a strawman ("experiments with brain-damaged patients SHOW that qualia are not generated in the BRAIN!"), which is then meticulously destroyed.
Here the need is expressed to show that creativity actually, objectively exists, has nothing to do with plagiarism or imitation and needs the guns and badges of the state for some monopoly action on its products. Then a strawman is set up (some dude from the NYT, who he?) which is then messily destroyed.
Many questions remain!
Am I being creative if I remove the milk bottle from the fridge in a special way while fighting off my cat? Is it creative to think about the neurological basis of creativity? Is finding a creative algorithm that creates for me actually being creative or is it being meta-creative? Do I get a meta-copyright?
"never seen anything written in Ruby that couldn't have been better coded in C"
I've never written Ruby. But either you don't know much about programming languages, or the things you see are the half-pagers you get as exercises at uni. Even then, programming these in C would be painful and wasteful. It might even result in a Ruby interpreter.
The mind boggles, indeed.
Some sort of injection attack? Does the terminal create SQL queries from unsanitized strings sucked off the card's chip? Does it look for a .jar or a .dll file and think it would be a good idea to call up the main entry point with max privileges (considering the error messages one sometimes sees, as the Windows Administrator user)?
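Purely for illustration, here is a minimal sketch of the injection scenario speculated about above, using Python's built-in sqlite3. The table, the cardholder string and the whole setup are invented; nobody knows what the real terminals do.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cards (pan TEXT, holder TEXT)")
conn.execute("INSERT INTO cards VALUES ('4111111111111111', 'Alice')")

# Hypothetical string read off a card's chip -- attacker-controlled.
card_holder = "x' OR '1'='1"

# Vulnerable: the terminal splices the raw string into the SQL text.
vulnerable = "SELECT pan FROM cards WHERE holder = '%s'" % card_holder
leaked = conn.execute(vulnerable).fetchall()   # matches every row

# Safe: a parameterized query treats the string as data, not as SQL.
safe = conn.execute("SELECT pan FROM cards WHERE holder = ?",
                    (card_holder,)).fetchall()  # matches nothing

print(len(leaked), len(safe))
```

The point being: one `?` placeholder is all it takes, which is why "unsanitized strings off a chip" would be such an embarrassing failure mode.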
Is this some kind of backdoor for State Security, The Terminal Maintenance Team and/or crooked Developers?
I suppose these must be terminals of the "bold" nature. ANYTHING might happen. You could be maimed by an exploding keyboard. What's the status on their voting machines lately, btw?
Really, it says there "up to 20x faster queries". The people in charge of the moneybags DO KNOW that this MEANS "we ultimately managed to find a query that, if optimized in a very special way (compiler options include at least -abcdefgh), performed 15 times faster than a similar query on a non-optimized software-hardware combination from the competitor, which had about half the CPUs and half the RAM of our machine and which ran generic software"
Right?
Additionally:
"Each A5 processor has a unique identifier that is fused into the chip which cannot be changed and this is used to authenticate the device with software."
So when Intel does it, hell breaks loose, but with Apple it's all right? Very well, then.
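For the curious, the general shape of using a fused-in, immutable chip ID to authenticate a device is a challenge-response keyed to that ID. The sketch below is a generic hand-wave in Python, not Apple's actual scheme; every name and the shared-secret arrangement are invented.

```python
import hashlib
import hmac
import os

# Hypothetical: the per-chip ID is burned in at the fab, and the vendor
# records it alongside a device secret provisioned at manufacture.
CHIP_ID = "a5-0123456789abcdef"
DEVICE_SECRET = b"secret-provisioned-at-manufacture"

def device_response(challenge: bytes) -> bytes:
    # The device proves it holds the secret bound to its fused ID.
    return hmac.new(DEVICE_SECRET, CHIP_ID.encode() + challenge,
                    hashlib.sha256).digest()

def server_verify(chip_id: str, challenge: bytes, response: bytes) -> bool:
    # The server recomputes the expected response for the claimed chip ID.
    expected = hmac.new(DEVICE_SECRET, chip_id.encode() + challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)
ok = server_verify(CHIP_ID, challenge, device_response(challenge))
bad = server_verify("a5-ffffffffffffffff", challenge, device_response(challenge))
print(ok, bad)
```

The immutability of the ID is what makes the binding between "this physical chip" and "this account in the vendor's database" stick, which is exactly why it raised eyebrows when Intel tried it with the Pentium III serial number.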
I still have to read the REAMDE, it's on the TODO stack occupying most of my in-house BossDesk.
> slowing pace of technological advancement, had contributed to the science fiction genre shifting from the optimism of Clarke and Asimov to a darker, more dystopian bent
It might also have to do with the depressing conveyor belt of fools, psychos, political entrepreneurs, showmen and socialist economists ruining the economy as well as foreign lands, and transforming people into sizzling steaks, to remain in power or in the spotlight a few months more.
Even this type of work doesn't *deserve* patents, because it is built on the work of many, most of whom will be swept under the rug and forgotten when the patent is granted to the player who acts first (generally by having a lawyer run to the bureaucracy with an obfuscatory paper in hand, making grandiose claims while the real deal is still being worked out in the labs). Then prices are set to "high", lawsuits fly, industry uncertainty reigns, he-said-she-said bullshit occurs and progress may be stifled for decades.
"The intelligence community had heard of Osama bin Laden for years, but it wasn't until the world watched the planes hitting the buildings that the threat was taken seriously."
Looks like the memory hole sucked up the remembrance of the FIRST bombing of the WTC. Same guys. Flagged up by Able Danger. Dossier went to the shredder for bureaucratic arse-covering, IIRC.
They are all buffers. We also have "pre-buffering", to load up on slow incoming data before playing it back fast, and the dreaded "buffer underrun" when burning a CD, whereby the buffer holding the data to be written was awkwardly found to be empty just as the laser hit the virgin plastic, resulting in a coaster.
> the need for buffers indicates a good network, not a bad one.
It most certainly indicates a shunt between a data source and a data sink that are working at differing rates, where someone wants to temporarily cover up this ugly truth.
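That rate-mismatch shunt can be sketched as a bounded queue between a fast producer and a deliberately slow consumer. This is a toy model in Python, not any particular network stack; the sizes and delays are made up.

```python
import queue
import threading
import time

buf = queue.Queue(maxsize=8)   # the buffer: a shunt between source and sink
produced, consumed = [], []

def producer():
    # The fast data source: blocks on put() whenever the buffer is full.
    for i in range(20):
        buf.put(i)
        produced.append(i)

def consumer():
    # The slow data sink: drains more slowly than the source fills.
    for _ in range(20):
        time.sleep(0.001)
        consumed.append(buf.get())

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```

Every item arrives in order and nothing is lost, but only because the producer was stalled whenever the buffer filled up. The buffer hides the rate difference; it does not make it go away.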
"Otherwise how can anyone else avoid inadvertently infringing a patent?"
You *cannot* avoid inadvertently infringing a patent. That is what it is about.
They are state-granted monopolies in the space of ideas.
Anyone defending this kind of bullshit needs to be on the receiving end of serious correction of attitude.
I don't know how many lorries you will need to cart off the amount of posturing and hypocrisy generated during this discussion round. Maybe we can get some Hillary Hysterics and Mitt Foppery out of this, though.
Do I need to link to
article again?
But when all is said and done...
"Apple gets flaccid $2.5 Samsung patent payout"
"Lawyers get $25.25m handout"
Meanwhile in the real world, Apple will again be un-hipsteriffic and shit-tier while the product cycle will be firmly driven by far east manufacturers.
Also reminds me of the old times when the USA still had some hope of growing up in style. We didn't know the truth about what horrible things were going down even back then, but who would have foreseen the full-scale devolution into the ridiculous corporatist-socialist-imperialist-warfare state that we have now, with a droning harpy and a black narcissus as figureheads?
I will be at the pub.
"Fortress has a unique syntax that resembles mathematical notation, the native language of computer science."
I don't know of a "native language" of computer science (maybe predicate calculus? or lambda calculus?), but does this mean Fortress is more like APL, more like Haskell, more like a LISP or more like Prolog?
Expect Larry to raise a stink about "valuable IP" once someone uses this to make a penny.
IBM discount /n./
"A price increase. Outside IBM, this derives from the common perception that IBM products are generally overpriced (see clone); inside, it is said to spring from a belief that large numbers of IBM employees living in an area cause prices to rise."
"Making money" and "cutting the workforce" are of course orthogonal concerns out in the real world (i.e. away from "progressive" talking circles). But still...
> Just so unfortunate that their support base are so dumb as not understand that the President did not cause the economic problems or the over 10 trillion increase in the national debt.
Not singlehandedly, no.
But the Narcissus-in-Chief didn't exactly help. He listened to his economic advisors and thought he could influence the economy ("just another switchboard with dials; we can handle that..."). Instead, it's on a par with the Utter Fail level of FDR and the New Deal, followed by the New New Deal, followed by War. The latter is looking increasingly likely. It might not stay "regional" either, but at least now that we have nukes, there won't be so much pointless drama. We can settle our differences rather quickly.
Isn't that the "Lair"
You also have to install the mantrap on top of a large, transparent, piranha-filled tank, into which members of PUTA[1] regularly lower hapless St. Bernards, causing great consternation among the suits, BOFHs and PFYs trapped in the mantrap.
[1]: People for the Unethical Treatment of Animals
Only if you don't realize that "leaving to the runtime system the task of freeing up memory that can never be used again" and "keeping all the memory reachable, forever" are two totally different things.
Why are we back in the early 90s discussing GCs? It's tiresome. But at least, today there is Jimmy Wales' Big Bag of Trivia. Have a gander:
http://en.wikipedia.org/wiki/Garbage_collection_%28computer_science%29
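To make the reachability distinction concrete, here is a small sketch in Python (which uses reference counting plus a cycle collector; the principle that "unreachable" is what gets freed, not "everything ever allocated", is the same across collectors):

```python
import gc
import weakref

class Node:
    pass

a = Node()
alive = weakref.ref(a)   # observe the object without keeping it reachable

b = Node()
b.self_ref = b           # a reference cycle: self-referencing, soon unreachable
dead = weakref.ref(b)

del a                    # no references left -> freed at once by refcounting
del b                    # only the cycle still holds it -> needs the collector
gc.collect()             # the runtime frees memory that can never be used again

print(alive() is None, dead() is None)
```

Both weak references come back dead: the collector reclaimed the cyclic garbage even though it still pointed at itself. Nothing reachable was touched, and nothing unreachable was "kept forever".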