quite a nice breakthrough, if it, eh, breaks through
I quite fancy the prospect of them adding deep-fried Mars bars to the in-flight menus.
The open source (usually libdvdcss2-based) solutions usually fail spectacularly when the DVD publisher has included some arsey extra copy protection measure that makes the movie appear to be 99 titles, or makes it so open source players break unless you skip the first megabyte or so of the disk.
Agreed. I could never figure out why the default players/libs on Linux don't disregard the TOC info if it disagrees with the physical information on the disk. I had a look at the source code myself and figured out what would need to be changed to defeat any of the standard copy protection schemes among the DVDs I own, so that I can transcode them and have them available on my DLNA server.

I think there are probably two reasons. First, these bits of software are generally written to comply with the standards, so they don't deal very well with the copy protection schemes, which deliberately throw in junk. This often leads to disks that work fine in a DVD player but won't play on a computer. Second, I suspect there might not be the collective will to get around the "copy protection" (in quotes because almost all of these schemes are laughable in how they operate--essentially, as I said, adding junk so that faithful implementations of the specs, as on computers, can't read or play the disk) because of fear of litigation. Even though it's trivial to get around the copy protection schemes I've seen (and I'm not even an expert), I'm sure the big media companies have patents on exactly how they fuck with the standards (and break them--I don't think they should get away with using the DVD mark on these), so any open source distro would get slapped with a patent infringement suit if it shipped the changes needed to ignore the copy protection mechanisms.
It's all a bit sad really, especially considering how technically stupid DVD copy protection is...
And full ACLs throughout from the ground up - not as an after thought - like for instance in UNIX type OSs.
I don't know what you have in mind when you say that Unix only has security as an afterthought. It was built from the ground up to be multi-user, with strict separation among those users (both for in-memory applications and on the file system). It also had the novel setuid mechanism and the associated su and chgrp functionality pretty much from the outset. I think the creators actually got a patent on the setuid mechanism, possibly combined with its use in the passwd program, which effectively allowed each user to change their own password in a single system file while not allowing them to change anything else there.
Almost anything that can be implemented using ACLs can also be implemented using the user/group and setuid/setgid mechanisms. About the only area that I can think of where Unix is perhaps more permissive than it should be (for a paranoid sysadmin) is in allowing network access for all users (*). But then again, Unix wouldn't have been such a resounding success without networking, I think. If the designers had wanted to include some sort of "access rights" for the network, then they'd basically end up with something like VMS's security model instead. But then, it obviously wouldn't be the Unix that we know and love :)
* Actually, I realise that this can be done in modern Linux using an iptables command to drop traffic based on userid. I don't actually know how early Unix implementations implemented network access. For all I know, all the network access functions might have actually used a device file at the lowest level. If so, then it actually would have been possible to restrict net access on a per-user basis using the standard user/group security mechanisms...
Whereas with ARM there is no opcode translation to do at all!
No, but there is still a decoding stage, so there's still something there to "get hot". As ARM CPUs have a fairly orthogonal instruction set, though, this stage is vastly less complex than the decode stage on CISC CPUs. This also allows, for example, reserving a few bits to encode conditional execution and a few more for whether and how to rotate one of the instruction operands. These features are available with most, if not all, instructions and effectively come "for free" from the programmer's perspective.
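Those "few bits" can be shown with a toy decoder. This is only a sketch, assuming the classic 32-bit ARM encoding where the top four bits of (almost) every instruction word select a condition code; the function name and the example instruction words are my own choices, not from any real disassembler.

```python
# Toy illustration (not a real disassembler): in classic 32-bit ARM,
# bits 31-28 of almost every instruction select a condition code,
# so any instruction can execute conditionally "for free".
COND = ["EQ", "NE", "CS", "CC", "MI", "PL", "VS", "VC",
        "HI", "LS", "GE", "LT", "GT", "LE", "AL", "NV"]

def condition_of(word: int) -> str:
    """Return the condition mnemonic encoded in an ARM instruction word."""
    return COND[(word >> 28) & 0xF]

# 0xE3A00001 should be MOV R0, #1 (condition AL = always execute);
# flipping the top nibble to 0x0 would turn it into MOVEQ R0, #1.
print(condition_of(0xE3A00001))  # AL
print(condition_of(0x03A00001))  # EQ
```

The point is that the decoder only has to peel off a fixed field in a fixed position, which is a big part of why ARM decode stays simple.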
As you might have guessed, I quite like the ARM architecture. It's one of the nicest CPUs I've coded for, though 68000 is really nice too, and I've also got a soft spot for the PS3's Cell architecture. All of these are a complete joy to write for compared to the abomination that is the x86 architecture!
Except they already do.... Comment is "Free"
I've been wondering when they're going to rebrand that "Talk is Cheap" ...
Maybe he was just channelling Nostradamus and typo'd "Hitlers country" for "Hister country", aka "the land of the Danube".
Bestes farouches de faim fluves tranner:
Plus part du camp encontre Hister sera,
En caige de fer le grand fera treisner,
Quand Rin enfant Germain observera.
Within 6 months it will be cracked wide open...
That's not a given. It's possible to have a DRM system with mathematically provable security. That doesn't mean such systems are easy to implement, though, and the state of the cryptographic art always gets better while your DRM has to stay static.
The biggest problem with DRM (from a technical standpoint) is key exchange and key management. You could theoretically make a perfectly secure DRM system, but it means that every user or device has to have their own personal key, and the hardware has to be resistant to tampering. In practice, those requirements make it unworkable.
The biggest problem with DRM in 3D printers (as in this case) is that it's impossible to prevent them from making a new printer that simply doesn't have any DRM in it :)
And Lo, on the eighth day, after his day of rest, Lord Jobs invents cartography.
Ah, but did he invent the Dymaxion projection?
They don't have any cell phones any more?
Oh no! Where is mein handi?
And we are RIGHT IN THE MIDDLE?
Unlikely. That's just an effect of isotropy. If everything is receding at an equal pace from everything else then every place seems to be the centre of the universe. No doubt many other planets have astronomers that are still struggling with heliocentricity, so I wouldn't feel too bad about the mistake you made :)
I can only imagine you've never tried to code against [X11] Kafka couldn't have done better.
I don't know. The client-server paradigm they use is pretty cool (even if they decided to swap the names around). I think if you really want Kafkaesque then you have to be an iPhone user. Your arms and legs may no longer be in the place you expect them to be and you're experiencing difficulty coordinating your extremities to perform what should be a mundane task, but still all you have in mind is asking Siri whether you can make the next train in time for work.
Newsworthy because its a load of kernel and other code ...
Yep. I haven't looked yet, but it seems that they'll have fixes for two problems I ran across...
1. No native kernel driver/firmware for some quite popular (read: cheap) wireless dongles.
2. Fix for excessive interrupt rate (dwc_otg.fiq_fix_enable=1 now the default) as mentioned in last paragraph.
I managed to find the fix for these myself thanks to the excellent forum and blog posts that people are making about the Pi, but it's certainly to be welcomed to have these baked in for less technically skilled users. As for the overclocking, the rpi version of xbmc has been overclocking to 800MHz for quite a while (and there's been the option to do it in the /boot/config.txt in regular pi distros too). Nice to see that there's room to push the envelope even further and still be safe.
Heh... if you were Huckleberry Finn you'd just nick the one off the neighbour's windowsill.
simultaneous events are ones which occur at the same time according to an observer
Hmm.. I was going to pounce on this and ask "yes, but relative to what observer?" My point being that simultaneity is a relative concept. Then I reread what you'd written and realised that you hadn't made the mistake I thought you had (when working in a relativistic framework).
Still, at least I get to post a link that explains it a little bit better
at 25:25 explains it all
if man is still alive ...
Re: no SD on Nexus or Nexus 7
Yes, Google also fucks up
The way I see it, the lack of an upgrade slot was a deliberate design decision to achieve two things. First, since they're basically subsidising the hardware, they want to keep costs down. Second, they don't want to piss off the other Android suppliers by making a phone or tablet that's too good (again, particularly if they're subsidising the cost). I'm only surmising this, but I feel that they want to produce something that's a pretty good showcase for Android, but want to avoid being accused of "stealing" the market from other Android makers. So I see the lack of an SD slot here as being kind of a middle ground, with the assumption that if users want to upgrade, they'll check out the other Android manufacturers.
A few years ago I was using a basic Nokia 6310 phone. I think that's the model number. Anyway, it had pretty good standby time for the most part--probably about 2.5 days. The big problem with it, though, was that if it went out of coverage it ramped up the power to the GSM radio, so if I forgot to turn it off or keep it on charge during the day (we had very bad coverage where I worked), I'd have a flat phone by the end of the day. That's the first reason people want a replaceable battery--as a backup in case they get caught with a flat battery and no easy way to recharge once they notice it.
The second reason is that batteries deteriorate over time. A three-year old phone won't last as long as the day you bought it. I agree totally with the article here--it does seem like a very cynical ploy by Apple to keep you on the upgrade cycle to the next shiny, when really all that's wrong with the 3-year old phone is that it needs a new battery.
On a similar (ahem) note, I think someone should do a Lisa Simpson on it. Bring along their saxomophone, do all the rehearsals and then on the night do the signature solo and swiftly exit off stage.
Well, probably not.
I guess they might be able to find a job in a different profession that's better paid.
Not meant to be taken seriously, of course. I just enjoyed the film!
Looks like the "write once, read never" approach. Makes me wonder why I bothered.
And yes, I did use the "send corrections" link.
Probably not very good. Ray tracing tends to exercise the I/O an awful lot so even if you assign one pi to a particular section of the screen it'll still end up accessing other parts of the scene in a pretty random access pattern (rays bounce). With only a 100MBit connection (and the fact that the USB and Ethernet share a bus) it's easy to saturate the available data channels--a problem that only gets worse as you scale up (though working with different net topology and having more control nodes could definitely help, to a degree).
On the other hand, having the farm render a typical fractal image would be a perfect application for it since each screen section is typically independent of each other one.
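To illustrate why the fractal case parallelises so cleanly, here's a minimal sketch: every row of a Mandelbrot image depends only on its own coordinates, so rows could be farmed out to separate Pi nodes with no cross-traffic at all. The function name and image dimensions are mine, purely for illustration.

```python
# Each row of the escape-time fractal is computed from nothing but
# its own coordinates -- no shared scene data, unlike a ray tracer.
def mandelbrot_row(y, width=40, height=20, max_iter=50):
    """Escape-time iteration counts for one row of the Mandelbrot set."""
    row = []
    for x in range(width):
        c = complex(-2.0 + 3.0 * x / width, -1.2 + 2.4 * y / height)
        z, n = 0j, 0
        while abs(z) <= 2 and n < max_iter:
            z = z * z + c
            n += 1
        row.append(n)
    return row

# Every call is independent: in a render farm, each node would compute
# its own slice and only ship the finished pixels back to the head node.
image = [mandelbrot_row(y) for y in range(20)]
```

Contrast that with ray tracing, where a bounced ray can demand geometry from anywhere in the scene, and the network traffic pattern of the two workloads looks completely different.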
Despite how impractical this thing is, I'd still love to have one. I'm sure it's also a great teaching resource in spite of (nay, even because of) its shortcomings, necessity being the mother of invention and all that.
"What is the point ... with Fanboi trolling?"
Maybe because it's easy to get a rise out of Apple fans with it? Low-hanging fruit, you might say.
Well think about ... with that level of bandwidth maybe some kind of immersive virtual reality setup would be possible. Maybe you wouldn't even need to know you're in Kansas?
I had very similar thoughts on reading this. At first I was thinking, what's wrong with adapting the Transport Stream protocol? But that's not exactly scalable to differing pipes or screen sizes; at best it'll let you tune transmission for a poor quality connection.

My next thought was to use something like the "progressive" modes in JPEG, PNG, or (IIRC) Dirac (or a similar trick to the one used in FLAC audio, separating the stream into a lossy part and a set of deltas). This would be much easier if the encoding system was based on wavelets (again, I think Dirac does this), but an FFT-based system can work too. The problem there, though, is how to do flow control so that the sender can stop sending the high-detail part of the stream.

Then it struck me... why not use UDP for the fine level of detail and TCP for the base image? You'd still have to do dynamic tuning on the encoder side (and a bit of buffering and stitching together at the decoder end), but at least the congestion part could be mostly handled by the network itself.
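The "base layer plus deltas" idea can be sketched with a one-level Haar-style split. This is only a toy of my own devising, not any codec's actual transform: the coarse averages would be the must-arrive layer (TCP), while the detail differences could go best-effort (UDP) and simply be dropped under congestion.

```python
# Toy one-level Haar-style decomposition: averages form a coarse base
# layer; differences form a detail layer that can be discarded.
def split(samples):
    """Split an even-length signal into coarse averages and details."""
    coarse = [(a + b) / 2 for a, b in zip(samples[0::2], samples[1::2])]
    detail = [(a - b) / 2 for a, b in zip(samples[0::2], samples[1::2])]
    return coarse, detail

def merge(coarse, detail):
    """Perfect reconstruction when the detail layer arrives intact."""
    out = []
    for c, d in zip(coarse, detail):
        out.extend([c + d, c - d])
    return out

signal = [4.0, 6.0, 10.0, 12.0, 8.0, 8.0]
coarse, detail = split(signal)
assert merge(coarse, detail) == signal          # lossless with detail
degraded = merge(coarse, [0.0] * len(detail))   # detail layer dropped
```

If the UDP detail packets never arrive, the receiver just merges with zero deltas and gets a blurrier but watchable picture, which is exactly the graceful degradation you'd want.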
I also don't like the way that DRM is being baked into HTML5, but it's also hardly surprising. Sad, though.
Haven't Amazon been outed as doing this for a while? And in this august organ, no less...
No-one comes out smelling of roses (surely it's all relative, but Einstein was the last great patent clerk).
We will control the horizontal. We will control the vertical. We will control the diagonal.
proves intelligent design? Really?
I've got this banana over here. It might help to clarify His Pungent Effulgence. (or maybe not)
The US has a black president and white rappers, and now Sony seems (if I'm not dreaming) to actually be supporting Linux.
A vibrating ring for haptic feedback might be handy, though. Stop sniggering at the back!
No sniggering here. I do think that memory wire would be a lot nicer than mere buzzing. For "handling" 3d objects, obviously.
There is nothing beyond the edge of the solar system, it's just a big black board with pictures of stars on it.
Reminds me of Omon Ra by Victor Pelevin. On what really happened with the CCCP's space programme.
Or they might just hear a loud *thunk* as it hits the edge, Truman Show stylie.
Or maybe it just wraps around, Misner-space stylee :)
the patissier sues the boulanger for using the same oven-based technique for cooking food.
Surely, since this is croissants we're talking about, they'd sue over the method of folding in the edges to make them nice and curved ;-)
So everybody is Everywhere Girl now? What a bunch of cheapskates.
Sounds like a job for an inanimate carbon rod to me.
My what now?
The current sense of entitlement in IT is shocking.
It has nothing to do with "our" sense of entitlement and everything to do with Oracle's moral responsibility. Think of Java as being like a teenager going out into the world and Oracle being its guardian. It's up to Oracle to ensure that their brat isn't going to become a public menace. A very large software ecosystem is built around Java and people need to be able to depend on it. At this rate Java is sure to end up hanging around with Flash, and that definitely won't end well.
Math is not patentable so don't try your logic on us.
Shouldn't be, but that didn't stop the patents on RSA encryption, Lempel-Ziv-Welch compression or arithmetic coding, not to mention the myriad other patents surrounding video and audio compression and even bloody container formats.
FOSS code is copyrighted. I think you're confusing it with "public domain" (as defined in the USA).
Why do they need microsecond timing on graphics generation when the viewable result changes only about every 20 milliseconds (roughly)?
Possibly because more complex games will involve multiple rendering passes, probably with tunable parameters for LOD and the like, and being able to budget accurately can make all the difference between hitting your window for the next frame update and missing it. Just a little drift and you end up losing frames. You might also have to account for vsync, and being off there will really screw things up.
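A back-of-envelope sketch of why microsecond accounting matters: at 60 Hz you get roughly 16.7 ms per frame, and a render pass mispredicted by even a fraction of a millisecond per frame soon slips past a deadline. The function and its slack parameter are my own toy model, not anything from a real engine.

```python
# At 60 Hz the whole frame budget is ~16667 microseconds.
FRAME_US = 1_000_000 / 60

def frames_until_missed(overrun_us, slack_us=500):
    """Frames until accumulated overrun exceeds the per-frame slack."""
    drift, frames = 0.0, 0
    while drift <= slack_us:
        drift += overrun_us
        frames += 1
    return frames

print(round(FRAME_US))           # 16667
# With only 100 us of unbudgeted work per frame and half a millisecond
# of slack, you blow through the deadline within six frames.
print(frames_until_missed(100))  # 6
```

In other words, millisecond-granularity timing hides exactly the kind of slow drift that turns into visible dropped frames.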
Shenanigans galore, smashing doors (via doire, for oak?) into smithereens with a shillelagh after drinking whiskey and blathering on in a thick Irish brogue about seeing banshees and leprechauns.
This is amazing that no real linguists have answered this thread.
Yeah, I was hoping for that, hence my tongue-in-cheek post about phrenology and so on. I'm still kind of curious about the names of the days of the week, and it would have been nice to have a linguist give an explanation. It's nice to know that many Europeans have a God/Sun day and a Moon day (along with other planetary namings), but that doesn't explain why Japan has (and apparently China had at one point) pretty much the same system. Is it actually a case of parallel evolution or did knowledge of the planets and the fashion of using them for naming the days spread via language?
Another coincidence I've noticed between east/west is "-bury" in the UK at the end of place names and "-buri" at the end of place names in Thailand. Is it just coincidence or does it denote a common root language (Sanskrit/Indic languages)? Again, I have no idea, but it would be nice to know...
Gotta chime in here too. It sounds suspiciously like they're using phrenology to back up their claims :-)
Seriously, though, it's all well and good pointing out similar linguistic constructs and then jumping to a conclusion, but a lot of this stuff might be coincidental or maybe a case of parallel development (why is the first day called "Sun" day and the second "Moon" day in so many languages, for example? I don't actually know--just throwing it out there). I'm all for clever theories but the problem with many linguistic theories is that they're not falsifiable. That being said, what's the point?
Beer, since they seem to have run out of jynnan tonnyx.
No, but as a fellow (geodesic) dome dweller I can totally sympathise with you on the exorbitant prices they charge for curved sofas.
I suppose that people eat dumplings nearly the world over. My favourite would have to be Japanese-style gyōza. For the filling, mix up minced pork, cabbage (finely chopped, lightly salted, then squeezed to remove moisture), spring onions, shrimp, ginger, garlic (all finely chopped or minced) and sesame oil, with just plain flour and water for the wrapping. There are as many ways to cook these as there are for pierogi, but I think the best is to fry them first in a very small amount of oil, then put a small amount of water in the pan and cover it so that the steam cooks everything. Remove from the pan when all the water has evaporated and serve with a mix of soy sauce and chilli oil.
Besides tasting delicious, they look great too if the edges are pleated properly (very fiddly to get exactly right, unfortunately).
I'm reminded of that joke in Trading Places. You know... the "look at that S car go" one...
pretends to be something useful in order to trick
Like a giant wooden horse, for example. Someone should surely be able to find a use for that.
re: change in ratings, perhaps it's because in this review it's stacked up against other sub-£100 phones. As an android phone it might get 80% overall, but 90% if you're buying on a budget. That's my take on it.
I might as well recommend looking up John Cooper Clarke's "Evidently Chickentown." It's probably just as relevant (hint: not).