Re: Prior art
Wasn't the Frumious Bandersnatch a Victorian era creation of Dodgson's?
Indeed it was.
I think this is what's known as an homage.
because it was closer to their mouse cursor, any number of reasons.
"downvoting from a tablet or smartphone" would probably be one of those other reasons, presumably.
... due to the lack of atlatls. But then, lo!, there they are in paragraph 3. Top notch!
RISC OS was for the ARM-based Archimedes, not the 6502-based BBC Micro. It just happened to have a similar BASIC.
Oops. I just assumed that it was the same BBC Basic on both BBC Micro/Archimedes. I got that impression from this (excellent) guide to ARM assembly:
giving your hippocampus the chance to dredge up some very old syntax from deep memory.
BBC Basic has an embedded assembler, so you've actually got two different syntaxes to remember (or, in my case, learn).
to get rid of Apple(s).
we still don’t have any reported incidents of confirmed security breaches with them.
With respect to bugs and backdoors in voting systems: absence of evidence is not evidence of absence.
This is particularly true with closed, proprietary black-box systems that are not double-checked after the election. That's assuming it's even possible to validate the result after the fact, and that the box didn't just silently change things without leaving a permanent record.
They'll be modifying perl scripts before we know it!
You might be thinking of Parrots.
Or as I like to call it, "Seldonomics"
I was curious about this so I did a little test. Opened up an incognito tab to point at el Reg, then hit home, then recent applications. The result? The list shows a thumbnail screenshot of the "incognito" page.
I didn't really expect any other behaviour--as far as I know an app can ask the OS to keep its window out of screenshots (the FLAG_SECURE window flag), or failing that it could blank the screen in response to the task switch/leave activity message (or whatever they call it). I know it's a very minor oversight, but beware of not closing the incognito tab before going back to the home screen!
It was a pretty good year.
I second that. In fact, it's not even clear whether we're talking roughly 3x more likely or 4x. My reasoning: "100% more likely" means twice as likely--the 100% baseline plus 100% extra. So is "314% more likely" supposed to mean about 3.14 times as likely, or 4.14 times (100% + 314%)?
Whatever it is, the whole sentence (including the "whopping" part) is too confusing.
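The two readings differ by a whole baseline. A quick sanity check (with made-up numbers, obviously):

```python
# Ambiguity of "314% more likely": does it mean 3.14x or 4.14x the baseline?
baseline = 1.0  # arbitrary baseline likelihood

# Reading 1: "314% more likely" = baseline plus 314% of the baseline
more_likely = baseline * (1 + 3.14)   # 4.14x

# Reading 2: "314% as likely" = 3.14x the baseline
as_likely = baseline * 3.14           # 3.14x
```

Either way, "100% more likely" unambiguously means 2x, so the two readings really are a full baseline apart.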
My copy of Watchmen has a copyright date of 1986. I'm thinking, of course, of Ozymandias's multi-screen display allowing him to absorb a multitude of information streams at once.
I wish I had a secret Antarctic lair :(
You can choose to have an alternate opinion, and you can choose to believe that screen size is a relevant factor in this.
You mean like "big screen size is so important that we're not going to make small ones" to "we made a small screen size (with crappy resolution/dpi) and it's brilliant" to "big screen size is everything, man", to ... (you get the picture).
Any link to a doc that describes what these new capabilities are? I'd dearly love to see a CPU that could do arithmetic over GF(2) fields, as used in AES, among other schemes. It doesn't even take much to do this in silicon, though I guess chip designers probably think it's too specialised to bother. There's always FPGA, though, I suppose...
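For anyone curious, the arithmetic itself is simple enough to sketch in software (which is exactly why it's frustrating there's no instruction for it). A minimal multiply in the GF(2^8) field AES builds on GF(2), using the AES reduction polynomial x^8 + x^4 + x^3 + x + 1 (0x11B) -- a sketch, obviously, not the hoped-for hardware support:

```python
def gf256_mul(a, b):
    """Multiply a and b in GF(2^8) modulo the AES polynomial 0x11B.

    Addition in GF(2) is XOR, so this is shift-and-XOR rather than
    shift-and-add, with a conditional reduction after each doubling.
    """
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B  # reduce modulo x^8 + x^4 + x^3 + x + 1
        b >>= 1
    return p

# Worked example straight from the AES spec (FIPS-197): {57} * {83} = {C1}
assert gf256_mul(0x57, 0x83) == 0xC1
```

Eight shifts and a handful of XORs per byte -- trivial in silicon, tedious on a CPU without a carry-less multiply.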
"Wizard needs food, badly."
sudo cp /var/games/nethack/save/$UID-$(whoami).gz "$HOME/scum.sav"
"My, that was a yummy Slime Mold!"
There is the theory of the Möbius... a twist in the fabric of space where time becomes a loop. When we reach that point, whatever happened will happen again...
Combustion is so important, they have to mention it twice
Obviously a Blazing Saddles fan...
"Physics, combustion, materials science, nuclear energy, and combustion."
"You mentioned combustion twice."
"I like combustion."
I quite fancy the prospect of them adding deep-fried Mars bars to the in-flight menus.
The open source (usually libdvdcss2-based) solutions tend to fail spectacularly when the DVD publisher has included some arsey extra copy protection measure that makes the movie appear to be 99 titles, or that breaks open source players unless you skip the first megabyte or so of the disk.
Agreed. I could never figure out why the default players/libs on Linux don't disregard the TOC info when it disagrees with the physical information on the disk. I had a look at the source code myself and figured out what would need to be changed to defeat the standard copy protection schemes on the DVDs that I own, so that I can transcode them and have them available on my DLNA server.

I think there are probably two reasons. First, these bits of software are generally written to comply with the standards, and they don't deal well with the copy protection schemes, which deliberately throw in junk. This often leads to disks that work fine in a DVD player but won't play on a computer.

Second, I suspect there might not be the collective will to get around the "copy protection" (in quotes because almost all of these schemes are laughable in how they operate--essentially, as I said, adding junk so that faithful implementations of the specs, as on computers, can't read/play the disk) because of fear of litigation. Even though it's trivial to get around the schemes I've seen (and I'm not even an expert), I'm sure the big media companies have patents on exactly how they fuck with the standards (and break them--I don't think they should get away with using the DVD mark on these), so any open source distro would get slapped with a patent infringement suit if it shipped the changes needed to ignore the copy protection mechanisms.
It's all a bit sad really, especially considering how technically stupid DVD copy protection is...
And full ACLs throughout from the ground up - not as an afterthought - like, for instance, in UNIX-type OSs.
I don't know what you have in mind when you say that Unix only has security as an afterthought. It was built from the ground up to be multi-user, with strict separation among those users (both for in-memory applications and on the file system). It also had the novel setuid mechanism and the associated su and chgrp functionality pretty much from the outset. I think the creators actually got a patent on the setuid mechanism, possibly combined with its use in the passwd program, which allowed each user to change their own password in a single shared system file without being able to change anything else in it.
Almost anything that can be implemented using ACLs can also be implemented using the user/group and setuid/setgid mechanisms. About the only area that I can think of where Unix is perhaps more permissive than it should be (for a paranoid sysadmin) is in allowing network access for all users (*). But then again, Unix wouldn't have been such a resounding success without networking, I think. If the designers had wanted to include some sort of "access rights" for the network, then they'd basically end up with something like VMS's security model instead. But then, it obviously wouldn't be the Unix that we know and love :)
* Actually, I realise that this can be done in modern Linux using iptables' owner match (along the lines of iptables -A OUTPUT -m owner --uid-owner UID -j DROP) to drop traffic for a given user. I don't actually know how early Unix implementations implemented network access. For all I know, the network access functions might have used a device file at the lowest level. If so, it actually would have been possible to restrict net access on a per-user basis using the standard user/group security mechanisms...
Whereas with ARM there is no opcode translation to do at all!
No, but there is still a decoding stage, so there's still something there to "get hot". As ARM CPUs have a fairly orthogonal instruction set, though, this stage is vastly less complex than the decode stage on CISC CPUs. The regular encoding also leaves room for, e.g., a few bits for conditional execution and a few more for whether and how to rotate one of the instruction operands. These features are available with most, if not all, instructions and effectively come "for free" from the programmer's perspective.
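To illustrate the "few bits for conditional execution" point: on the classic 32-bit ARM encoding, the top four bits of every instruction word select a condition. A toy decoder (Python rather than assembler, and only looking at the condition field) might look like:

```python
# Condition mnemonics for bits 31-28 of a classic 32-bit ARM instruction word.
CONDITIONS = [
    "EQ", "NE", "CS", "CC", "MI", "PL", "VS", "VC",
    "HI", "LS", "GE", "LT", "GT", "LE", "AL", "NV",
]

def condition(instr):
    """Return the condition mnemonic encoded in an ARM instruction word."""
    return CONDITIONS[(instr >> 28) & 0xF]

# 0xE3A00001 is MOV r0, #1 -- condition AL (always execute)
assert condition(0xE3A00001) == "AL"
# The same word with cond=0000 executes only when the Z flag is set (MOVEQ)
assert condition(0x03A00001) == "EQ"
```

Because every instruction carries those four bits, short branches can often be replaced with a couple of conditionally executed instructions, keeping the pipeline full.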
As you might have guessed, I quite like the ARM architecture. It's one of the nicest CPUs I've coded for, though 68000 is really nice too, and I've also got a soft spot for the PS3's Cell architecture. All of these are a complete joy to write for compared to the abomination that is the x86 architecture!
Except they already do.... Comment is "Free"
I've been wondering when they're going to rebrand that "Talk is Cheap" ...
Maybe he was just channelling Nostradamus and typo'd "Hitlers country" for "Hister country", aka "the land of the Danube".
Bestes farouches de faim fluves tranner:
Plus part du camp encontre Hister sera,
En caige de fer le grand fera treisner,
Quand Rin enfant Germain observera.
(Roughly: "Beasts wild with hunger will cross the rivers; the greater part of the battlefield will be against Hister; the great one will be dragged in an iron cage, while the child of Germany watches the Rhine.")
Within 6 months it will be cracked wide open...
That's not a given. It's possible to have a DRM system with mathematically provable security. That doesn't mean one is easy to implement, though, and the state of the cryptographic art keeps improving while your DRM has to stay static.
The biggest problem with DRM (from a technical standpoint) is key exchange and key management. You could theoretically make a perfectly secure DRM system, but it means that every user or device has to have their own personal key, and the hardware has to be resistant to tampering. In practice, this makes it totally impractical.
The biggest problem with DRM in 3D printers (as in this case) is that it's impossible to prevent them from making a new printer that simply doesn't have any DRM in it :)
And Lo, on the eighth day, after his day of rest, Lord Jobs invents cartography.
Ah, but did he invent the Dymaxion projection?
They don't have any cell phones any more?
Oh no! Where is mein Handy?
And we are RIGHT IN THE MIDDLE?
Unlikely. That's just an effect of isotropy. If everything is receding at an equal pace from everything else then every place seems to be the centre of the universe. No doubt many other planets have astronomers that are still struggling with heliocentricity, so I wouldn't feel too bad about the mistake you made :)
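A toy illustration of that isotropy point (made-up positions, arbitrary units): with a Hubble-style law v = H*d, it doesn't matter which galaxy you call "home"--everything still recedes from it at H times its distance.

```python
H = 70.0  # hypothetical Hubble-style constant, arbitrary units

# 1-D toy universe: positions of a few galaxies along a line
positions = [0.0, 1.0, 2.5, 4.0, 10.0]
velocities = [H * x for x in positions]  # recession as seen from the origin

# Re-centre on galaxy 2 (position 2.5): relative velocities still obey
# v_rel = H * d_rel, so its astronomers also think they're the centre.
home = 2
for x, v in zip(positions, velocities):
    rel_pos = x - positions[home]
    rel_vel = v - velocities[home]
    assert abs(rel_vel - H * rel_pos) < 1e-9
```

The algebra is one line--H*x - H*x_home = H*(x - x_home)--but seeing it hold for every choice of home galaxy is the whole point.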
I can only imagine you've never tried to code against [X11]. Kafka couldn't have done better.
I don't know. The client-server paradigm they use is pretty cool (even if they did swap the names around). I think if you really want Kafkaesque then you have to be an iPhone user. Your arms and legs may no longer be in the place you expect them to be and you're experiencing difficulty coordinating your extremities to perform what should be a mundane task, but still all you have in mind is asking Siri whether you can make the next train in time for work.
Newsworthy because it's a load of kernel and other code ...
Yep. I haven't looked yet, but it seems that they'll have fixes for two problems I ran across...
1. No native kernel driver/firmware for some quite popular (read: cheap) wireless dongles.
2. Fix for excessive interrupt rate (dwc_otg.fiq_fix_enable=1 now the default) as mentioned in last paragraph.
I managed to find the fixes for these myself thanks to the excellent forum and blog posts that people are making about the Pi, but it's certainly welcome to have them baked in for less technically skilled users. As for the overclocking, the rpi version of xbmc has been overclocking to 800MHz for quite a while (and the option's been there in /boot/config.txt in regular Pi distros too). Nice to see that there's room to push the envelope even further and still be safe.
Heh... if you were Huckleberry Finn you'd just nick the one off the neighbours' windowsill.
simultaneous events are ones which occur at the same time according to an observer
Hmm.. I was going to pounce on this and ask "yes, but relative to what observer?" My point being that simultaneity is a relative concept. Then I reread what you'd written and realised that you hadn't made the mistake I thought you had (when working in a relativistic framework).
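For completeness, the usual one-line statement of the relativity of simultaneity, from the Lorentz transformation for a frame moving at speed v along x:

```latex
\Delta t' = \gamma\left(\Delta t - \frac{v\,\Delta x}{c^2}\right),
\qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```

So two events with Delta t = 0 but Delta x != 0 (simultaneous but spatially separated for one observer) get Delta t' != 0 for any observer moving along the line joining them--which is why "according to an observer" is doing all the work in that quote.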
Still, at least I get to post a link that explains it a little bit better
at 25:25 explains it all
if man is still alive ...
Re: no SD on Nexus or Nexus 7
Yes, Google also fucks up
The way I see it, the lack of an upgrade slot was a deliberate design decision to achieve two things. First, since they're basically subsidising the hardware, they want to keep costs down. Second, they don't want to piss off the other Android suppliers by making a phone or tablet that's too good (again, particularly if they're subsidising the cost). I'm only surmising this, but I feel that they want to produce something that's a pretty good showcase for Android, but want to avoid being accused of "stealing" the market from other Android makers. So I see the lack of an SD slot here as being kind of a middle ground, with the assumption that if users want to upgrade, they'll check out the other Android manufacturers.
A few years ago I was using a basic Nokia 6310 phone (I think that's the model number). Anyway, it had pretty good standby time for the most part--probably about 2.5 days. The big problem was that when it went out of coverage it ramped up the power to the GSM radio, so if I forgot to turn it off or keep it on charge during the day (we had very bad coverage where I worked), I'd have a flat phone by the evening. That's the first reason people want a replaceable battery--as a backup in case they get caught with a flat battery and no easy way to recharge once they notice it.
The second reason is that batteries deteriorate over time. A three-year old phone won't last as long as the day you bought it. I agree totally with the article here--it does seem like a very cynical ploy by Apple to keep you on the upgrade cycle to the next shiny, when really all that's wrong with the 3-year old phone is that it needs a new battery.
On a similar (ahem) note, I think someone should do a Lisa Simpson on it. Bring along their saxomophone, do all the rehearsals and then on the night do the signature solo and swiftly exit off stage.
Well, probably not.
I guess they might be able to find a job in a different profession that's better paid.
Not meant to be taken seriously, of course. I just enjoyed the film!
Looks like the "write once, read never" approach. Makes me wonder why I bothered.
And yes, I did use the "send corrections" link.
Probably not very good. Ray tracing tends to exercise the I/O an awful lot, so even if you assign one Pi to a particular section of the screen it'll still end up accessing other parts of the scene in a pretty random access pattern (rays bounce). With only a 100Mbit connection (and the fact that the USB and Ethernet share a bus), it's easy to saturate the available data channels--a problem that only gets worse as you scale up (though a different network topology and more control nodes could definitely help, to a degree).
On the other hand, having the farm render a typical fractal image would be a perfect application for it since each screen section is typically independent of each other one.
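A sketch of why the fractal case parallelises so cleanly (toy escape-time Mandelbrot; each "node" here is just a function call, but the point is that no row depends on any other row, so there's nothing to ship between Pis except the finished pixels):

```python
def mandelbrot_row(y, width, height, max_iter=50):
    """Escape-time counts for one image row -- needs no data from other rows."""
    row = []
    for x in range(width):
        # Map the pixel to a point in the complex plane
        c = complex(-2.0 + 3.0 * x / width, -1.5 + 3.0 * y / height)
        z = 0j
        n = 0
        while abs(z) <= 2 and n < max_iter:
            z = z * z + c
            n += 1
        row.append(n)
    return row

width, height = 40, 30
# "Farm" version: hand each row to a different worker, in any order at all.
farmed = {y: mandelbrot_row(y, width, height) for y in reversed(range(height))}
# Single-node version, computed in order.
single = [mandelbrot_row(y, width, height) for y in range(height)]
assert [farmed[y] for y in range(height)] == single
```

Contrast with ray tracing, where a bounced ray from "your" rows can demand geometry from anywhere in the scene.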
Despite how impractical this thing is, I'd still love to have one. I'm sure it's also a great teaching resource in spite of (nay, even because of) its shortcomings, necessity being the mother of invention and all that.
"What is the point ... with Fanboi trolling?"
Maybe because it's easy to get a rise out of Apple fans with it? Low-hanging fruit, you might say.
Well think about ... with that level of bandwidth maybe some kind of immersive virtual reality setup would be possible. Maybe you wouldn't even need to know you're in Kansas?
I had very similar thoughts on reading this. At first I was thinking, what's wrong with adapting the Transport Stream protocol? But that's not exactly scalable to differing pipes or screen sizes; at best it lets you tune transmission for a poor quality connection.

My next thought was to use something like the "progressive" modes in JPEG, PNG, or (IIRC) Dirac (or a similar trick to the one used in FLAC audio, separating the stream into a lossy part and a set of deltas). This would be much easier if the encoding system were based on wavelets (again, I think Dirac does this), but an FFT-based system can work too. The problem there, though, is how to do flow control so that the sender can stop sending the high-detail part of the stream. Then it struck me... why not use UDP for the fine level of detail and TCP for the base image? You'd still have to do dynamic tuning on the encoder side (and a bit of buffering and stitching together at the decoder end), but at least the congestion part could be mostly handled by the network itself.
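The "lossy base plus deltas" idea in miniature (pure Python, a 1-D signal standing in for an image; the layer split and the transport mapping are my own made-up sketch, not any real codec):

```python
def split_layers(samples):
    """Split an even-length signal into a coarse base layer and a delta layer.

    The base layer alone gives a low-detail approximation (this is what
    you'd send over the reliable TCP channel); base + deltas reconstructs
    exactly (the deltas could go over UDP and simply be dropped under
    congestion, degrading detail rather than stalling the stream).
    """
    base = [(samples[i] + samples[i + 1]) // 2
            for i in range(0, len(samples), 2)]
    approx = [b for b in base for _ in (0, 1)]   # upsample by repetition
    deltas = [s - a for s, a in zip(samples, approx)]
    return base, deltas

def reconstruct(base, deltas):
    approx = [b for b in base for _ in (0, 1)]
    return [a + d for a, d in zip(approx, deltas)]

signal = [10, 12, 30, 29, 5, 9, 100, 98]
base, deltas = split_layers(signal)
assert reconstruct(base, deltas) == signal   # lossless once deltas arrive
```

Real wavelet codecs do something far cleverer per level, but the shape is the same: a small "must deliver" layer plus refinement layers that can be shed.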
I also don't like the way that DRM is being baked into HTML5, but it's also hardly surprising. Sad, though.
Haven't Amazon been outed as doing this for a while? And in this august organ, no less...
No-one comes out smelling of roses (surely it's all relative, but Einstein was the last great patent clerk).
We will control the horizontal. We will control the vertical. We will control the diagonal.
proves intelligent design? Really?
I've got this banana over here. It might help to clarify His Pungent Effulgent. (or maybe not)
The US has a black president and white rappers, and now Sony seems (if I'm not dreaming) to be actually supporting Linux.
A vibrating ring for haptic feedback might be handy, though. Stop sniggering at the back!
No sniggering here. I do think that memory wire would be a lot nicer than mere buzzing. For "handling" 3d objects, obviously.
There is nothing beyond the edge of the solar system, it's just a big black board with pictures of stars on it.
Reminds me of Omon Ra by Victor Pelevin. On what really happened with the CCCP's space programme.
Or they might just hear a loud *thunk* as it hits the edge, Truman Show stylie.
Or maybe it just wraps around, Misner-space stylee :)
the patissier sues the boulanger for using the same oven-based technique for cooking food.
Surely, since this is croissants we're talking about, they'd sue over the method of folding in the edges to make them nice and curved ;-)