The word "Tinker"
Bit of a pejorative term this side of the pond. I reckon he's better off with "Tinkerer".
That, or the court should rule that the DHS was engaging in vexatious litigation.
Might I suggest free hammocks for all users?
The young man stepped into the hall of mirrors
Where he discovered a reflection of himself
Even the greatest stars discover themselves in the looking glass
Hmmm. I didn't know that Adapteva were bringing out a new model. Last time I checked on their website (around a month and a half ago?) the whole effort looked pretty moribund. While reading this article I was tempted to start messing around with my 16-core board again.
Flappity, floppity, flip
The mouse on the mobius strip;
The strip revolved,
The mouse dissolved
In a chronodimensional skip.
Well, one of them anyway.
Hopefully people will start looking at GTK3 in general and systemd. Unity (along with Mir) and GTK3 were the main reasons I abandoned Ubuntu long ago. There's still that fucking metastasis that is systemd to extricate from the heart of Debian-based distros.
Reading that paragraph, and the following one (rebutting his point by looking at the cost of the service they offer) you completely missed out on the more salient fact: You don't have to use Google to search, whereas most people are effectively in thrall to one or two ISPs. Google represents a captive market only insofar as customers are unaware of the options available to them.
Ahh. I have seen the light, and ze goggles, zey do nothing!!!
What the hell is that message (plus all icons set to trollface) all about?
When subtracting two numbers that are the same except for some transposed digits (say 35 vs 53), the result is always a multiple of 9. The difference between 53 and 35 is around 20, so pick the multiple of 9 that's just less than that. So 53,000 - 35,000 can be quickly calculated as 18,000 with no need to do awkward carrying or the like.
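The trick above can be sketched in a few lines of Python (the pairs and the function name are my own, for illustration):

```python
def transposed_difference(a: int, b: int) -> int:
    """Difference of two numbers whose digits are transpositions of each other."""
    return abs(a - b)

# Transposed-digit pairs always differ by a multiple of 9,
# so 53,000 - 35,000 drops straight out as 18,000.
for a, b in [(53, 35), (71, 17), (53_000, 35_000)]:
    assert transposed_difference(a, b) % 9 == 0

print(transposed_difference(53_000, 35_000))  # prints 18000
```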
Actually, I was also thinking about Bitcoin. Since the ledger is public, you can encode your "go/no-go" message using a transaction of a certain amount. I assume that wallet IDs are stored in the ledger, although it's impossible to know who they belong to unless you find them on someone's PC, which shouldn't happen if you're doing it right.
As someone mentioned above, it's much easier to use specific pre-arranged codes, preferably one use only. Something simple like posting an animal picture or video on a certain day, with the choice of animal (or no post at all) giving a traffic light-like status update or selecting from a set of targets, or whatever. Assuming you can meet up in person at least once without being bugged/spied on, it's trivial to pre-arrange this sort of thing and no amount of technology or anti-encryption laws can defeat it.
(Hmm... I didn't see that post that's (now) right above mine, suggesting exactly the same thing)
It's like you make one sexy Star Trek role-play video for your own private amusement, and everyone's like "I know they call it the naughties, but that sort of thing isn't acceptable"
Plus, role-play is generally all that this is.
So, pretty much the Socratic method of pedagogy.
No data checksumming, unlike ZFS.
I've no idea about whether this is true or not, but could it be that the flash controller includes bad block detection and recovery (where possible; returning an I/O error otherwise) at a lower level? If so, perhaps there's no need for duplicated functionality. Besides, I think that read errors are much less of a problem with flash: it's write endurance that's the main problem.
the original Nexus 7
The solution there is to root it and periodically run fstrim. I don't think that they ever fixed that problem and even with the workaround, my tablet still falls off a performance cliff before I can run the fstrim, followed by a complete crash/reboot.
That's what I want. I want a building block that slots in nicely, is easy to switch out, doesn't phone home (not that I'm particularly singling out Ikea on this), and connects to just the stuff that we want it to. All the ancillary stuff to do with data collection and actuation (as well as the logic glue that holds sensor nets together) should be done in a secure manner using your own properly-firewalled home network subnet (and possibly a portal via a secure VPN for secure control when outside). All this stuff about connecting to some mothership can go stick its head in a pig.
Causing excruciating pain when you step on it barefoot is optional.
Then it might be a goer (or blower, with an added SIM)
The last prime minister called this election to pacify his own right wing, safe in the knowledge he'd comfortably win.
This, a hundred times. He gambled the entire country for the sake of party politics and to cement his own position. It was a spectacularly stupid gamble to take, and now all the dominoes are falling. I'd say that this is not unlike the domino effect that caused the First World War, with parochial local politics somehow managing to ensnare the whole fucking continent. The sort of clusterfuck that the EU originally set out to avoid happening again, I might add.
They're intransigent no matter how much it benefits them
That has still to be seen. If May, as she indicates she will do, goes into negotiations effectively saying "fuck you, we're prepared for these negotiations to fail, despite the massive collateral damage this will cause to both sides," then it's hardly a good strategy for dealing with the other member states. You brought them to the negotiating table so if you want to soften their perceived intransigence, this strategy is pretty much guaranteed to achieve the opposite.
Maybe Trump could ride in on Rocinante and save all those Murcan jobs?
ditto, but Minter was just channelling Floyd.
> The home secretary said it was “completely unacceptable” that
> the government could not read messages protected by end-to-end
So not only was the cloud-related stuff mentioned in the article a bit fluffy, but so is the home secretary's grasp of what "end-to-end" encryption means. If WhatsApp is actually end-to-end, then what the hell is ranting at the company going to achieve? They surely wouldn't be able to decrypt the messages even if they wanted to.
The most annoying thing on Windows is automatic updates. It goes and downloads something and pops up an alert at some random time. If you're typing at the time and just happen to be hitting enter, it's always "yes, do shut down my machine, ignoring any unsaved work that I have, and while you're at it, why don't you make the machine unusable for the next 20 minutes". Aaagh.
Also, speaking of UI components that jump around the place for no good reason, whoever designed the UI for Netflix in a browser deserves to be shot. Stop fucking moving shit around when I mouse over it!
You dismiss the possibility of interfering with the supply chain, but how does that square with more recent events:
OK, it's apples for oranges (lemons?) and different animals on your free Chinese takeaway calendar, but still...
Better to think a bit before hitting back at someone who corrects your English by criticising their comprehension skills... You really should have said "since-defunct" (though I'm not 100% sold on the need for the hyphen here). I didn't downvote you, by the way.
But it's a fair comment, AMBxx. Trump has filled many posts with people who aren't experienced or educated in the area that they're supposed to be managing. I'm not going to look up a list, but Rex Tillerson, for one, has no experience in the public sector or the military, and yet he's been doing the rounds trying to be a diplomat. There are many other examples. The guy with control over the EPA doesn't "believe" in man-made global warming/climate change, for one. Maybe faith/belief trumps science?
I don't underestimate Trump or many of his appointees in the same way that I don't underestimate a rabid and unpredictable dog.
(too late to edit my last post above, so ...)
Come to think of it, chord typing could be pretty comfortable using touch sensors rather than buttons, and two-handed operation instead of one hand. Use your four fingers on the rear for ASDF + G on one side, with thumb up/down on the front to select a different row, or move to top left/top right for numbers/punctuation. Use pressure sensitivity rather than taps to "squeeze out" the chorded key. Same thing for H + JKL; on the other hand, and two thumb squeezes together for space. It might take a little getting used to for someone used to touch typing (since the thumb moves instead of the fingers going up/down, or in/out relative to the edge of the screen), but it might be sufficiently similar to transfer your muscle memory over.
Take a Nintendo Switch...
Funny enough, I had a similar idea when I was reading the article, except I was thinking of something a bit older (MicroWriter, ca. 1980). Figured it'd be best used in a two-handed configuration so you can grip the device while looking at the screen. Is eye tracking good enough that you could use it for moving a cursor or tabbing?
is performing within operational parameters.
"wipe it and reinstall" is indeed a painless way ...
But, Apples aren't (Windows) PCs.
If it happens, apple are bad for not paying [...]if it doesn't apples are bad
Oh, no. bad apples make Apple bad? Sounds like perps are trying to shoot fish in a barrel, but not using any core vulns to pip them post paste.
Go Bambi! Go Thumper!
(xmas rule 34.007 and all that; maybe get some cheese-smelling action in later... )
More to the point, can we have 愛 and セックス at the same time?
And not just the police, as it happened:
(There may be oil / under Rockall)
> To all those saying bah to the concept of dual hashing...
I think that those doing the pooh-poohing are imagining the scenario hash2(hash1(message)), and they're right to do so. If you use hash = hash1(message).hash2(message) then the strength of the resulting hash should be the product of the strengths of both constituent hashes.
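A minimal sketch of the hash1(message).hash2(message) construction in Python, using MD5 and SHA-1 purely as stand-in examples (my choice of functions, not from the post):

```python
import hashlib

def dual_hash(message: bytes) -> str:
    """Concatenate two independent digests of the same message,
    rather than chaining one hash through the other."""
    return hashlib.md5(message).hexdigest() + hashlib.sha1(message).hexdigest()

digest = dual_hash(b"attack at dawn")
# 32 hex chars of MD5 followed by 40 hex chars of SHA-1
assert len(digest) == 32 + 40
```

An attacker would now need a single input that collides under both functions at once; with hash2(hash1(message)) you are instead capped at the strength of whichever hash in the chain is weakest.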
> An interesting feature of a perfect compression is that the output bit stream is (if one did not know that it was a compressor's output) perfectly random.
Not true, for two main reasons.
First, you're not defining what "perfectly random" means. You have a general idea of what this means, but I'm pretty sure that you're not able to back that up with maths. Also, why the hedge "if you don't know it's a compressed file"?
If you take a simple compressor that takes an input stream and does something like Lempel-Ziv (LZ77) compression, then you have a stream of output tokens that either encode a literal (verbatim) section or a pointer + length symbol that refers back to a previously-seen section of the file. Both have patterns that make them distinguishable from random data.
Second, there's this notion of "perfect" compression. Again, there's a terminology deficit here, but a fundamental theorem of compression is that not all data can be compressed by a given compression algorithm. If a compression algorithm is to be reversible (and accept arbitrary input), then some inputs will compress to smaller outputs (or the same size) while others will compress to larger outputs. These larger outputs (and even the smaller ones, to some degree) will tend to be at least partly systematic, meaning that some input symbols will appear verbatim in the output. The only thing that running something through a compressor proves is how good the compressor is at exploiting certain kinds of redundancy/structure in the input file, not how much entropy the input/output files have in absolute terms.
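A quick illustration of the pigeonhole point with Python's zlib (my example, not a formal proof): already-random input can't shrink, while redundant input shrinks dramatically.

```python
import os
import zlib

# Incompressible (random) input: Deflate falls back to stored blocks,
# so the output ends up slightly *larger* than the input due to framing overhead.
random_blob = os.urandom(100_000)
assert len(zlib.compress(random_blob, 9)) >= len(random_blob)

# Highly redundant input compresses to a tiny fraction of its size.
text_blob = b"the quick brown fox jumps over the lazy dog " * 2000
assert len(zlib.compress(text_blob, 9)) < len(text_blob) // 10
```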
This false line of thinking (that compression outputs have to be random and uncorrelated with input) has been used before in attacks on SSL connections:
> dominated by Intel because there was a standard configuration
Perhaps this is part of it, but I think that the more salient point is that not many attempts have been made to build and sell "server-class" ARM systems. The PC desktop/server world has been constantly evolving its Industry-Standard Architecture, bringing in new types of memory, buses and peripheral interconnects the whole time it has existed. By contrast, ARM systems tend to favour the system-on-a-chip approach, with features that are much better suited to embedded applications than to sitting at the centre of a peripheral-focused, interconnected ecosystem. So you tend to see soldered-on RAM instead of pluggable DIMMs, vendor-specific eMMC storage (there are no standards surrounding how to interface with this class of flash memory), and no sign at all of standard PCI or SATA buses unless they're bolted on as an afterthought (a daughter card going over USB, say).
For years, this has been fine. Nobody (apart from uninformed end users who, eg, expected their Windows ARM tablets to be a drop-in replacement for their x86 equivalents) expects the ARM systems that they buy to have a PC-like ISA, apart from some obvious consumer-level interfaces like VGA/HDMI/USB. Also, all these ARM vendors have been working in their own niches with little incentive to rally round some kind of ISA for more "internal" components (equivalents to DIMMs, PCI, SATA, RAID controllers, discrete GPUs, etc.). It's only recently that ARM chips/SoCs have begun to be viewed as potential competitors in the PC-like/data centre role, thanks to constant improvements in single-core performance, plenty of cores, and the step up to 64 bits. And, of course, energy efficiency compared to legacy x86 systems.
I'm sure that ARM in the data centre is only a case of "when", not "if". I think that the author is right that this SBSA initiative will be a huge step forward in getting vendors to rally round and produce more PC-like architectures, but I think that it's only part of the way. You need standardisation not only at the SoC level (having a standard build configuration so you know how to address all the MMIO registers and such), but also at the level of standard physical/electrical interconnects for pluggable DIMMs and PCI-like peripherals.
Eh, they're all perfectly cromulent words round my way.
And the classic: "Do not meddle in the affairs of wizards, for they are subtle and quick to anger"
> "fixed" would be a better word choice all round
But, "Buffy the Vampire Fixer"? Eew. Who'd want to watch that?
Well said, AC. I hate it when people take offence on others' behalf.
For one thing, if it's a de-dupe problem, then it's much more efficient to use a hash of a file than to do a pairwise comparison of all files that could have the same contents. The problem of finding duplicates would be pretty intractable otherwise. Secondly, the total number of duplicates will most likely be very small (compared to the full population), and the de-dupe step needs to be done only once, so I can put up with a bit of extra overhead if it increases safety and finds me extra disk space.
As it happens, I actually use SHA-256 (using a tool similar to shatag in Debian), but notwithstanding that, I don't think that there's a problem using MD5 as a kind of heuristic to find identical files, so long as you have a second line of validation after it. In fact, you could use one or more different hashing functions as part of the validation step here, before you delete and create hard-linked copies...
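A rough sketch of that two-stage approach in Python (the helper names are mine; the point is the cheap hash pass followed by a paranoid byte-for-byte compare before anything gets deleted or hard-linked):

```python
import filecmp
import hashlib
from collections import defaultdict

def file_digest(path: str, algo: str = "md5", chunk: int = 1 << 20) -> str:
    """Hash a file in chunks; MD5 is fine as a heuristic first pass here."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def find_duplicates(paths: list[str]) -> list[list[str]]:
    """Group candidates by digest, then confirm each group byte-for-byte."""
    by_digest = defaultdict(list)
    for path in paths:
        by_digest[file_digest(path)].append(path)
    confirmed = []
    for group in by_digest.values():
        first, rest = group[0], group[1:]
        # Second line of validation: guards against hash collisions (and my bugs)
        dups = [p for p in rest if filecmp.cmp(first, p, shallow=False)]
        if dups:
            confirmed.append([first, *dups])
    return confirmed
```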
... DevSkim would show a pop-up telling the user they're making a critical error
Maybe, maybe not. What if I'm aware of its shortcomings and decide that they don't matter in my case? For example, I could be using it in a program to de-dupe a filesystem, but I know that before hard-linking files together I'm going to do a bit-for-bit compare on them, because I'm paranoid about accidental hash collisions and my own programming errors.
Right now, I wouldn't be too concerned about using MD5 in a HMAC (hash-based message authentication code) implementation. The Wikipedia page here states "attacks on HMAC-MD5 do not seem to indicate a practical vulnerability when used as a message authentication code." Likewise, I wouldn't be too concerned about using it in a Merkle tree implementation where hash collisions are only advisory (like the file de-dupe example above) or I have other explicit measures that prevent pre-image (or whatever) attacks.
> I guess what happens to them is the same thing that happens to all those decade+ old laptops and desktops.
Yes, the ASIC hardware is ultra-specialised, so it can really only calculate sha256(sha256(message)) < D, where "message" is the concatenation of the previous block's header, a proposed new block and a "nonce" (effectively a random number, though they are scanned in sequence).
This kind of thing is no use for, eg, breaking passwords, so if Bitcoin dies, the hardware is effectively useless.
https://www.bitcoinmining.com/ has a nice block diagram explaining this. See the "What Is Proof of Work" section.
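A toy sketch of that check in Python (the real thing serializes an 80-byte block header and encodes the target differently; the easy difficulty below is my own, purely for illustration):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, target: int) -> int:
    """Scan nonces in sequence until double_sha256(header + nonce) < target."""
    nonce = 0
    while int.from_bytes(
            double_sha256(header + nonce.to_bytes(4, "little")), "big") >= target:
        nonce += 1
    return nonce

# Toy difficulty: top 8 bits of the 256-bit hash must be zero
# (~256 attempts on average; real targets demand dozens of leading zero bits).
TARGET = 1 << 248
nonce = mine(b"previous block hash + proposed block", TARGET)
```

The ASICs do nothing but race through that inner loop, which is why they're scrap metal for any other purpose.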
> One wonders how the arrival of quantum computing will upset these crypto-currencies.
Presumably not at all. A quick check on how Shor's algorithm (which could potentially defeat RSA) works tells me that it relies on the quantum Fourier Transform, which isn't applicable to SHA256 or hashing functions in general.
I think that anyone who uses this phrase (non-ironically) should see the film "Ikiru".
systemd'oh! DNS lib underscore bug bites everyone's favorite init tool, blanks Netflix
Biting the hand that feeds IT © 1998–2017