Re: Just One Of Their Brainfarts
"Just taking away property rights to books completely incompatible with human rights"
Well, MY books are safely stored in a case made of wood in my living room. Problem?
"Publishers are the big beneficiaries."
Only if the publisher gets a monopoly grant for issuance of said work. Otherwise FTP servers can spring up anywhere. Does he? Possibly. Not clear.
"And the French Free Software movement, recognising the freedoms of software libre depend on strong copyright"
Oh yeah? Howzzthat then? I would rather think it depends on strong copyleft.
Break out the SS regalia. Reanimate Judge Freisler. Open the star chamber.
Then again, what would one expect from a nation in which the president arrogates to himself the absolute freedom[tm] to terminate random citizens (who would thus be anti-citizens, I guess) if a case can be made in a secret legal document that he can, in fact, do so?
The US is currently shit tier and heading deeper into the brown bog. It doesn't need to be this way. Things can be changed with judicious application of crowbars.
>lack of knowledge
Don't be bashful. It just means: "by all evidence, problems that are hard to solve on a classical computer will STAY hard to solve on a quantum computer (more generally, mother nature won't hand over the keys to the car no matter what), except for some niche stuff like factorization - and simulation of physics"
>processing data at speeds we can barely imagine
Unfortunately, the space of NP-hard problems *will* stay out of reach. You will have P plus a few extra tendrils that become practical (as far as we know).
But you will at least be able to build efficient simulators of quantum systems.
This should open a lot of doors in materials science, and possibly in experimental mathematics.
"...found a way to extend the quantum coherence of the qubits by up to 100 microseconds, two to four times greater than previous records"
Which means that they managed to have the qubit not leak information about its current state of superposition (a.k.a. its current wavefunction) into the surrounding system (the "observer") for > 100 μs (i.e. avoid it getting "measured"). Apparently at that point you can do quantum error correction (something that keeps the quantum state of interest in its superposition, I suppose - like a sacrificial anode keeping a ship's hull from rusting).
[Mr. Holmes comes nearest to the pipe-smoking, tweed-jacket-with-leather-elbow-patch-wearing professor, so there...]
I can tell you that operations staff may well be at the same level, what with 24/7 operations that have to run on a shoestring budget powered by hairy rats running in treadmills in the server room. You then get "agility" pushed in your face, which basically means that an upgrade will be forthcoming at date X and operations has to somehow choose what to leave out to nevertheless reach an acceptable acceptance level [which basically means the application stack will crash after 2.5 hours instead of immediately - great prospect]. You write up an assessment of the problems to be solved [most of them of an as-yet-unknown or even unidentified nature], nobody reads it or reacts to it, marketing makes fun of you because you seem to pressure your coworkers for no good reason, and at date X-2 you get told that you should have "informed" people beforehand that there actually were problems (you don't say?) and that your assessment is worthless anyway because there is no precise planning in it, just about thirty blocking points, of which two have recently switched to "green". Come again? THIS IS CLOWN INDUSTRY.
Yes and No! We are talking MACHOs. But there are limits on these...
See
http://en.wikipedia.org/wiki/Baryonic_dark_matter
http://en.wikipedia.org/wiki/Nonbaryonic_dark_matter
"A small proportion of dark matter may be baryonic dark matter: astronomical bodies, such as massive compact halo objects, that are composed of ordinary matter but which emit little or no electromagnetic radiation. Consistency with other observations indicates that the vast majority of dark matter in the universe cannot be baryons, and is thus not formed out of atoms. It also cannot interact with ordinary matter via electromagnetic forces; in particular, dark matter particles do not carry any electric charge."
> There are OpenSolaris forks out there
http://www.theregister.co.uk/2010/09/14/openindiana_launch/
"Then Oracle changed the licensing terms for the Solaris 10 freebie distribution, which only allows those who download the operating system to use it in test and development environments; if you use Solaris 10 in production, you are supposed to pay Oracle $1,000 to $2,000 per socket per year, depending on the scalability of the server. Then the OpenSolaris community died of neglect and eventually committed ritual suicide. While the Illumos Project, launched in early August to create an open source alternative to the OpenSolaris and Solaris kernel and core network features (called OS/Net in the Sun lingo), Illumos did not go so far as to create a full distribution."
The future of MySQL also is murky.
Looks rather like standard work in precision measurement.
Noisy people (including sundry Sunday-physicist bloggers) pushing extra-dimensional neutrino travel, string-pretzel phenomena or imaginary mass will have to wait for another measurement discrepancy, and in the meantime might fall back - if they are testosterone-fuelled enough - on neutrino-denial conspiracy theories. But really, this is all as it should be. A suspicious result leads to publication, which leads to checking of gear and statistics, which leads to the result disappearing. Thus the telenovela of "still no kinks in current physical models" continues.
Then again, it might reappear after even MORE checking.
> http://technet.microsoft.com/en-us/library/cc770873(v=ws.10).aspx
OTOH, the darkest times of helpless despair and groveling at the feet of the Beast are indeed upon us when people need a howto about copying configuration files over.
Biblical apocalyptic genocide at the command of Jehovah icon is missing? Nuke will do.
So what is the problem? No need to discuss basic facts, this is not show & tell.
-> If you are on an open-sourced Unix, check the implementation of /dev/random [blocks until the kernel's entropy pool can supply random numbers derived from system state] and /dev/urandom [non-blocking, a cryptographic PRNG reseeded from that pool]
-> If the random-number generator comes with the package [in whatever form], check that. See also the Debian event: http://www.schneier.com/blog/archives/2008/05/random_number_b.html
-> If you see "RSA Data Security" on the package [in whatever form] and the PRNG is part of said package, ask for a signed statement of the CEO that this PRNG is correctly seeded and passes sundry statistical tests etc.
-> If you have a hardware number generator, use that, but stay away from naive implementations like "as simple as a CMOS gate input left open circuit". See here: http://spectrum.ieee.org/computing/hardware/behind-intels-new-randomnumber-generator/0
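On the first point, a minimal sketch in Python, assuming a platform where the stdlib `secrets` module is backed by the OS generator (/dev/urandom or equivalent) - the point being to let the kernel do the seeding rather than rolling your own PRNG:

```python
import secrets

# Pull key material from the OS CSPRNG (/dev/urandom on Linux) via the
# stdlib "secrets" module, instead of a user-space PRNG that you would
# have to seed correctly yourself.
key = secrets.token_bytes(32)      # 256 bits of key material
nonce_hex = secrets.token_hex(16)  # 128-bit value as 32 hex characters
```
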
"Copyright law" is just another tool to pull in the rent.
Really, how can something called "The estate of the deceased writer Horselover Fat" pull in the monies on his production thirty years after his heavenly ascent and accuse others of "greed"? It's like a carpenter hammering away and accusing other carpenters of f*cking him over because they also use hammers.
Take those hammers away.
Fast chips running off solar cells? Genius. This eliminates THE bottleneck to embedded intelligence / ubiquitous computing and makes shipping adequate compute nodes to villages in the steppes or wherever actually possible.
On the other hand, the NSA must be fapping hard at the prospect of putting sensors everywhere and increasing the compute density of their data centers without a higher electricity bill.
"but they have been running this nasty Internet dirty-trick campaign with taxpayers' money"
As opposed to pushing frankly bad legislation with taxpayers' money, which gives everyone a hard time and burns even MORE taxpayers' money?
[Anyway, how is this Harper government still in session? From here, Harper is permanently trying to outdo the US in outrageous neocon behaviour like a pimply-faced kid pandering to the schoolyard bully.]
>>Just for the record, you were probably not even working on the internet pre-2000, and were still in your nappies.
Don't know why you would want that for the "record", nor for whose record exactly, but I think you mistake me for someone from a younger generation. Connecting with a Tektronix 4014 to the Unicos machine down the street? Yup, been there, done that.
I don't know whether I should be dismayed because that Dziuba guy sounds like a maintainer of 4chan popping speed pills or moderately in accord because underneath all the Kid Rage Talk there are these points:
1) Separation of concerns is good, moving applogic into the webserver is bad.
2) Your code is not as non-blocking as you naively might think it is
... but then we knew all that, right?
I still don't get where his REAL problem is.
"Like many attracted to the original Node.js, Kang likes the non-blocking architecture .. Node.js combines all user requests as a single thread but offloads I/O operations that can slow things down for things such as disk or database operations from that main thread."
But doesn't everybody do it this way?
In this world of multithreaded, multicore hardware, you really do not want a "main" or "single" thread which requests I/O [including the ugly, ugly RPC calls that are everywhere these days] and then waits until the I/O is done.
You want a pool of worker threads checking new work as fast as it comes in.
You want SEDA : http://www.eecs.harvard.edu/~mdw/proj/seda/
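A toy sketch of that worker-pool model in Python - blocking "I/O" handed to a pool instead of serializing behind one main thread; `fake_io` and the timings are illustrative stand-ins, not anything from SEDA itself:

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Each call stands in for a disk read or RPC that would otherwise
# stall a single "main" thread for its full duration.
def fake_io(req):
    time.sleep(0.05)               # pretend disk/RPC wait
    return f"response:{req}"

# A pool of workers picks up requests as fast as they come in; the
# eight 50 ms waits overlap instead of adding up to 400 ms.
with ThreadPoolExecutor(max_workers=8) as pool:
    start = time.monotonic()
    results = list(pool.map(fake_io, range(8)))
    elapsed = time.monotonic() - start
```
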
>>I've had enough problems with the java jvm in production environments never mind using javascript!
Yes, I suppose so. Your "production environment" wouldn't be an Acer laptop running in your cellar hanging off the fridge power strip and pumping bad early-2000 websites with blinking gifs to innocent punters?
> A strategic partnership with IBM
From first-hand experience, it CAN work. But:
1) Take what the salesmen say during the tendering process with a large grain of salt.
2) Secretly prepare political ammo for justifying:
a) Doubling the project length.
b) Doubling the project cost. In particular, lots of "included" features will turn out to be "extras".
This is standard in particular for GovOrg where no-one wants to admit to real cost or complexity.
3) Keep your own management away from meddling and re-planning once it's started.
4) In particular for GovOrg: set recent hires / young people on the project - people who don't endlessly dodge taking a decision and actually have enough Courage Wolf inside to do some work.
> Calling them "random" was a guarantee of the imminent meeting of a chalk board eraser and my head.
But then again, if your string generated by a good PRNG is "short enough", it will be "random enough for your purpose".
Note that good definitions of randomness are quite recent too, so your prof may have been exceedingly brutal to you:
http://en.wikipedia.org/wiki/Algorithmically_random_sequence
The PRNG algorithms may be good, but it seems they are started off with bad "seeds". Suppose you have a range of devices that hash the current output of "ps", then use the resulting value as seed in "new Rand(seed)". If these are devices in which ps "more often than not" just lists the "init" process doing nothing and a "routerd" with low activity, then these seed values will tend to not be randomly distributed. As the values returned by "Rand" will be used in taking stabs at possible 1024-bit primes [testing for primality being easy], the primes found will tend to not be randomly distributed. Thus the RSA modulus pq (and the totient (p-1)(q-1)) will tend to not be randomly distributed.
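A toy illustration of that failure mode, with a hypothetical seed_from_ps helper standing in for whatever the devices actually do - two boxes whose quiet ps output is identical derive identical seeds, so their PRNGs emit identical prime candidates:

```python
import hashlib
import random

# Hypothetical helper: derive a PRNG seed by hashing the ps output,
# as the comment above supposes the devices do.
def seed_from_ps(ps_output):
    return int.from_bytes(hashlib.sha256(ps_output.encode()).digest(), "big")

ps_a = "PID CMD\n  1 init\n 42 routerd"
ps_b = "PID CMD\n  1 init\n 42 routerd"   # a clone with the same quiet ps

# Identical low-entropy inputs -> identical seeds -> identical
# "random" 1024-bit prime candidates on both devices.
cand_a = random.Random(seed_from_ps(ps_a)).getrandbits(1024)
cand_b = random.Random(seed_from_ps(ps_b)).getrandbits(1024)
```
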