Re: Good for them & the judge
Other media reports have suggested the convictions have been passed to the Criminal Review Board on the basis of this case.
Mostly we use it for caching. That stuff will be recalculated - which will make for a slow app start and force the installation wizard to rerun, but it won't otherwise cause harmful effects for anyone who has the internet.
Users without internet, however, will lose access to their purchased features until they're back online. At that point, Play Services will renotify the app that they own the feature and it will be recached. (I've just had a panic about subscriptions. But I think that's true even for them.)
Ironically we use a file on iOS because storage was far more flaky. (And also because iTunes wouldn't renotify us of purchases and because on iOS we had to store the level of consumables.)
"Everything in the Solar System, from the massive hot burning Sun to the blue ice giant Neptune and all asteroids and bits of dust and rock in between and beyond, are made from dead, leftover stardust."
Most of the hydrogen (and helium) kicking round the solar system will be from the big bang - as it is in general for the interstellar medium. It's true no supernova fuses all its hydrogen, and stars and planets shed protons like a Christmas tree sheds pine needles. But that's a small addition to the stuff that has never been incorporated into a star.
"Based on experience of using web apps, I’d say that JavaScript is rubbish and slow."
Javascript can be very fast or very slow. The transition between the two can seem inconsequential at the source level and can vary between engines. And that's before you get to the giant anchor that is the DOM, and devs who insist they're going to do it their way because they don't care about speed, and then pile on React, jQuery and the gods know what else, only to code using a functional paradigm. (I don't have a problem with functional programming - but javascript ain't optimised for it.)
Back in the day you could find C++ apps that ran like overweight dachshunds and apps that ran like greyhounds. The dachshund guys can write code that's even slower using javascript. But I used to be able to get core maths-heavy stuff to run within a factor of 2 of native code. It doesn't seem to have got any slower - although increasing mobile phone screen sizes have put the old code under more strain.
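The sort of change that looks inconsequential at the source level: a toy sketch (engine behaviour varies, so I make no timing claims) where the only difference between the two arrays is whether the objects share a property layout - the kind of thing V8-style engines optimise for:

```javascript
// Two arrays holding the same data. In the first, every object has the
// same property layout ("shape"); in the second, the layout alternates -
// a pattern that typically forces the engine down a slower, polymorphic
// property-access path. The results are identical; only the engine's
// job differs.
const uniform = [];
const mixed = [];
for (let i = 0; i < 1000; i++) {
  uniform.push({ x: i, y: 2 * i });
  mixed.push(i % 2 ? { x: i, y: 2 * i } : { y: 2 * i, x: i });
}

function sumX(points) {
  let total = 0;
  for (const p of points) total += p.x;
  return total;
}

const a = sumX(uniform);
const b = sumX(mixed);  // same answer; potentially a slower path
```

Nothing in the source warns you the second version might deoptimise - which is my point.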
Bennu isn't an interstellar object; it's local rubbish. So it will have enjoyed a balmy couple of hundred kelvin for most of its life and never known the chill "4K" of the interstellar medium - obviously varying with its distance from the sun and with strong diurnal variation between day side and night side.
And sabre-tooth tigers aren't tigers. But as an ambush predator, smilodon probably had slit eyes.
Let me point out that cats' pupils are circular in limited light. And while dogs may not have slit pupils, foxes do. But this turns out to be a fascinating and poorly researched rabbit hole. And according to the go-to paper, the large cats evolved circular pupils from an ancestor with slit pupils. It's not clear whether smilodon would have had slit pupils or whether its lifestyle fulfilled the conditions to evolve them - its eyes weren't as forward facing as modern cats'. But it's certainly possible.
"...a situation where a single protocol contains unrelated resources, like file: containing /usr and /home;..."
As I read it (disclaimer: I've not used the OS) that's exactly what he means. You seem determined to view it as a hierarchy. But if I give you the Cartesian coordinates (3,4) would you insist they're hierarchical and that the y value is subservient to the x value? Likewise, (I'm guessing) the protocol is part of a co-equal tuple describing a resource - i.e. behaving exactly like a URL.
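For illustration, javascript's built-in WHATWG URL class (global in modern node) already treats an address as exactly that sort of tuple - protocol alongside path, not above it:

```javascript
// The URL class decomposes an address into co-equal components
// rather than a strict hierarchy.
const u = new URL("file:///usr/local/bin");
const parts = { protocol: u.protocol, path: u.pathname };
// parts.protocol is "file:" and parts.path is "/usr/local/bin" -
// a tuple describing a resource, much as (3,4) describes a point.
```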
If I'm understanding the spec correctly, HDMI <= 1.4 has signalling rates of up to 1.6GHz; HDMI <= 2 has signal rates of up to 6 GHz; and HDMI 2.1 heads off into the near infra-red. But I'd suspect the cable before the board - maybe wrap it in foil?
Somebody suggested the foil should be grounded. But I wouldn't have thought it necessary as we're trying to attenuate an outgoing signal rather than shield a cable. The skin depth of a >2GHz signal in aluminium is <2μm so, theoretically, 0.1mm of aluminium should reduce it by a factor of at least 1E-22. Although I guess ungrounded foil might generate unhelpful interference in the cable itself.
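The back-of-envelope sums behind those numbers (the resistivity is the textbook figure for aluminium; treat this as a sketch, not an EMC analysis):

```javascript
// Skin depth for a good conductor: delta = sqrt(2*rho / (omega*mu))
const rho = 2.65e-8;             // resistivity of aluminium, ohm-metres
const mu0 = 4 * Math.PI * 1e-7;  // permeability of free space
const f = 2e9;                   // 2 GHz
const omega = 2 * Math.PI * f;

const delta = Math.sqrt((2 * rho) / (omega * mu0));  // ~1.8 micrometres

// Field attenuation through 0.1 mm of foil: e^(-thickness/delta)
const thickness = 0.1e-3;
const attenuation = Math.exp(-thickness / delta);    // well below 1e-22
```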
"While the...U.K. [has] corporate tax rates of....28%..."
I think you'll find it's 19%. [SOURCE: gov.uk]
Imagine all the services we could fund if it was 28%. Or, shudder, the American 35%. But, no, the world will collapse if we don't get it down to 17%.
Yes, I know the Tories are going to stick at 19%, for the time being. But this is a party political broadcast on behalf of the ABC party.
"But that would on no way be in the ball park of the energy we get from the sun. Orders or magnatude off."
I'm not sure what you're saying here. The problem is there is too much energy around a black hole. (The paper says 10^10-10^12 L⊙ - with planet formation inside 10^6 AU.) The authors' problem is finding a region cold enough for planet formation to take place, given observational evidence of warm dust. They do that. But it's not inconceivable that the planet could migrate to somewhere slightly warmer - it's a turbulent environment.
I still don't think there'd be life there, but.
Oscillating wavelengths are a feature of the old analogue signals. That's what causes the interference. Digital wavelengths don't oscillate at all. They're rock solid and stick to a single frequency wavelength. (Or possibly two, and switch between them. I've just got to check that with Derek out back.)
That's what NAT does - it adds two extra bytes (the port number) to the address. That's a kludge.
We could have added an "options field" to the header with the extra address info, but that wouldn't have been backwardly compatible. Although it probably could have been made to work and we might even have upgraded our systems by now.
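To illustrate the kludge (a toy model, not real NAT code): the router keeps a table mapping an external port back to the internal (address, port) pair - which is where those two "extra bytes" of address effectively live.

```javascript
// Toy NAT table: the 16-bit external port is, in effect, extra address
// bits identifying which internal host/port a reply belongs to.
function makeNat() {
  const table = new Map();
  let nextPort = 40000;
  return {
    outbound(internalIp, internalPort) {
      const externalPort = nextPort++;
      table.set(externalPort, { internalIp, internalPort });
      return externalPort;
    },
    inbound(externalPort) {
      return table.get(externalPort);  // undefined = unsolicited packet, drop it
    },
  };
}

const nat = makeNat();
const port = nat.outbound("192.168.1.10", 54321);
const dest = nat.inbound(port);  // { internalIp: "192.168.1.10", internalPort: 54321 }
```

Note the side effect baked into the sketch: an inbound packet with no table entry has nowhere to go, which is why unsolicited inbound connections die at the NAT.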
Would that be a case of Poincaré recurrence?
This joke depends on recognising that the "Lorentz equations", taken as the Lorentz group, are a subgroup of the Poincaré group, the full symmetry group of spacetime without gravity, and we can therefore pun on the notion of Poincaré recurrence for someone who intends to return on Lorentz equations. Yes, we need an icon for "this joke requires a subscript". I now, mainly, write javascript.
The usual source (my emphasis):
Compton scattering...is the scattering of a photon by a charged particle....Inverse Compton scattering occurs when a charged particle transfers part of its energy to a photon.
"Inverse" is a convenience label which tells you what is giving the energy to what. And, at these kind of energies, they're both particle-like.
"That said, a permanent base on the moon could yield a tremendous amount of science that is useful in the near future as well as down the road."
I imagine most of the science that could be done could be done more cheaply by robots.
Although, I admit, science that might not otherwise be funded will happen because we have people up there and we need to find something sciency for them to do.
"No ability to launch any executable other than the application required (SAP or similar) on a device without Internet access and someone from IT babysitting you whilst you are on the device."
I'm in a browser. I hit F12. I now have access to the console and the ability to write code. I can start automating the scraping of any data I can access from the browser.
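The scraping half is a few lines pasted into the console. A sketch (assuming, purely for illustration, that the interesting data sits in <td> cells; in the console you'd feed it document.body.innerHTML):

```javascript
// Crude extraction of table-cell contents from an HTML string.
// In the browser console: extractCells(document.body.innerHTML)
function extractCells(html) {
  const cells = [];
  const re = /<td[^>]*>(.*?)<\/td>/gs;
  let m;
  while ((m = re.exec(html)) !== null) cells.push(m[1].trim());
  return cells;
}

const sample = "<table><tr><td> Smith </td><td>£4,200</td></tr></table>";
const data = extractCells(sample);  // ["Smith", "£4,200"]
```

No executable launched, no IT babysitter alerted.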
Our colour system is the wrong way round. We've made the highest energy colour the "safest"; and the lowest energy wavelength, the most dangerous. So if something is serious I have to go, "Boss, this one is infra-red." Or "Boss, this one is radio waves."
But if we matched danger with energy then I'd always have the option of going "Boss, this is ultraviolet." Or "Boss, this one will vaporise half your fucking face."
"You wouldn't do it that way!"
It wasn't my suggestion; I was pointing out it was a silly way to do it!
I suspect the problem here is that these are libraries which live in the process space of the app and are called by userspace libraries the app is allowed to use. It would be a bit like saying you don't want code calling printf()
directly but using authorised output routines that call into it on your behalf. But printf()
is there, in my process space; all the OS can do is hinder me finding it - it can't stop me calling it.
That's actually quite a hard problem to solve. Maybe the function could validate the return address is in allowed code range? But I bet that could still be exploited. The only way to block this is to move the library into a separate process. So I return to my original point: imagine printf
means IPCing to a separate process which then calls back into the kernel on behalf of the first process and IPCs back the result. I'm a fan of message-passing and microkernels, but that's a step too far.
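For a sense of what that costs, a toy sketch of printf-by-message-passing (purely illustrative - a real implementation would cross a process boundary, which is the expensive part):

```javascript
// The "server" owns the real output routine; the client can only post a
// request into a mailbox and collect the reply - every call becomes a
// round trip instead of a direct function call.
function makePrintServer(write) {
  const mailbox = [];
  return {
    request(text) { mailbox.push(text); },  // client side: "IPC" the request out
    serve() {                               // server side: drain the mailbox
      let written = 0;
      while (mailbox.length) written += write(mailbox.shift());
      return written;                       // "IPC" the result back
    },
  };
}

let output = "";
const server = makePrintServer(s => { output += s; return s.length; });
server.request("hello, ");
server.request("world");
const bytes = server.serve();  // 12; output is now "hello, world"
```

Even in this in-process toy you can see the bookkeeping; add two context switches per call and it's clear why nobody sandboxes printf this way.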
s/companion/black hole/
in the above - I'd just been reading the paper where the black hole is the companion to the observed star.
Fair point about neutron star mergers. My point was (AIUI) the 4 solar mass limit is down to process rather than a hard theoretical barrier. And, yeah, given that neutron stars can't be that heavy to start with, it's entirely possible they could produce low mass black holes. So maybe that's what we're seeing. (But I'd still bet it's just the black hole is heavier than they think!)
The best way to make a small fortune? Start with a big fortune. Black holes are the same.
Getting to a black hole requires a supernova from a star with a mass many times that of the final black hole. AIUI the smallest stars that will go supernova and still produce a black hole, produce a black hole "weighing" in at at least 4 solar masses. So getting a black hole this small is an engineering problem - not a theoretical one. (AIUI.)
That said, if I had to put money on it, I'd say they've underestimated the mass of the companion - their error bars go above the mass gap.
I was willing to give generic deserialization a pass, in case I was being an old fogey. ("Bah! Deserialization! Use proper file formats, you punks!")
But the verdict is as bad as my instincts feared. You cannot include an arbitrary data schema in the file format and let the software reconstruct the data according to that schema.
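A concrete javascript example of why (the classic prototype-pollution trick; the merge function is a deliberately naive stand-in for "reconstruct whatever structure the file says"):

```javascript
// A naive deserializer that trusts the file's own structure.
function naiveMerge(target, src) {
  for (const key of Object.keys(src)) {
    if (typeof src[key] === "object" && src[key] !== null) {
      if (typeof target[key] !== "object") target[key] = {};
      naiveMerge(target[key], src[key]);
    } else {
      target[key] = src[key];
    }
  }
  return target;
}

// JSON.parse creates an *own* property literally named "__proto__"...
const hostile = JSON.parse('{"name":"bob","__proto__":{"isAdmin":true}}');
const config = naiveMerge({}, hostile);

// ...which naiveMerge walks straight into Object.prototype: now EVERY
// object in the process claims to be an admin.
const innocent = {};
const polluted = innocent.isAdmin === true;  // true

delete Object.prototype.isAdmin;  // clean up the damage
```

And that's the gentle end of the spectrum - formats that can name arbitrary classes to instantiate give you remote code execution, not just polluted flags.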
I suspect 10 year-old algorithms would be completely owned by a modern SEO website.
And you can search for an exact phrase by putting it in quotes.
"I think the problem is that bureaucrats want to make their personal mark on their area."
With the added bonus that, if you're a British bureaucrat, you can blame the EU when your personal innovations prove to be hopeless (cf. every instance of Westminster gold-plating EU directives).
:=)
Does Forth not cut it? :P
This is bread and butter stuff and I'd've thought any modern language will do what you want, if you understand how. So use the one you are most familiar with (does modern Fortran not do this?) as detailed knowledge of the language far outweighs the gain from shifting to one that makes it marginally easier. (I suspect knowing how to do this might be the real problem.)
You can use 64 bit integers. But double precision floats can handle integers up to ±2^52 without loss of precision.
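Easy to check from a javascript prompt (2^53 is where a double first fails to distinguish an integer from its neighbour, so anything up to ±2^52 is comfortably exact):

```javascript
// Doubles have a 52-bit mantissa (plus an implicit leading bit), so
// integers are exact up to 2^53; beyond that, gaps appear.
const exact = 2 ** 52;
const stillExact = exact + 1 !== exact;  // true: 2^52 + 1 is representable
const edge = 2 ** 53;
const lost = edge + 1 === edge;          // true: 2^53 + 1 rounds back to 2^53
const max = Number.MAX_SAFE_INTEGER;     // 2^53 - 1
```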
If you really don't have a favourite language, and C++ is off the agenda, I'd be tempted to point at node.js. (Downvote away.) It has a crude, record-free approach that feels like a throwback to the 8 bit micro days. Load a file with const fs = require("fs"), buffer = fs.readFileSync("filename");
After that const text = buffer.toString("ascii", byteIndex1, byteIndex2);
will retrieve an ascii string between two byte indices. "ascii" can be changed to "utf8", "utf16le" and probably other well known encodings. If that's a hex value turn it into a number with let value = parseInt(text,16);
The Buffer API lists methods that will allow you to peek at binary values by file index.
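Putting those pieces together (the field layout here is invented for illustration - substitute your file's real offsets, and readFileSync's buffer for the hand-built one):

```javascript
// Self-contained: build a buffer in place of readFileSync's result.
// Hypothetical layout: 4 ascii chars of magic, then a 32-bit
// little-endian record count, then an 8-char hex value stored as text.
const buffer = Buffer.concat([
  Buffer.from("DATA", "ascii"),
  Buffer.from([0x2a, 0x00, 0x00, 0x00]),  // 42, little-endian
  Buffer.from("00ff00ff", "ascii"),
]);

const magic = buffer.toString("ascii", 0, 4);                    // "DATA"
const records = buffer.readUInt32LE(4);                          // 42
const hexField = parseInt(buffer.toString("ascii", 8, 16), 16);  // 0x00ff00ff
```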
Piece of piss.
As others say, keep the data importing separate from processing.
But sometimes things are only complicated because nobody has tidied them up.
Software can end up like a matryoshka doll with layer built upon layer built upon layer. Each layer is constrained by the one below it, probably includes convoluted hacks to work around the limitations of the lower layers, and introduces quirks and deficiencies of its own. Removing some of the intermediate layers can often reduce complexity, improve performance and eliminate bugs.
Likewise, when you end up with a ragtag bag of libraries and tools they often have a lot of overlap that can be deduplicated. You don't have to go all SystemD to realise even giving them a consistent interface can make things simpler. (I just removed a bug where two libraries were using two subtly different coordinate systems and nobody had noticed.)
No matter how many times I read "Orion", I can't get the Project Orion spaceship out of my head. I just expect it to be a massive, nuclear-powered juggernaut.
"...if you can not debug your design then how quickly you got to gates does not matter one iota."
In my book, "time to market" includes ironing out the biggest bugs. And the kids will be able to use these tools to do that. Yes, they'll miss or dismiss the more subtle bugs. But the management will be happy. And the more talented ones will learn the lower level stuff.