I guess they might be able to find a job in a different profession that's better paid.
Not meant to be taken seriously, of course. I just enjoyed the film!
Looks like the "write once, read never" approach. Makes me wonder why I bothered.
And yes, I did use the "send corrections" link.
Probably not very good. Ray tracing tends to exercise the I/O an awful lot, so even if you assign one Pi to a particular section of the screen it'll still end up accessing other parts of the scene in a pretty random access pattern (rays bounce). With only a 100Mbit connection (and the fact that the USB and Ethernet share a bus) it's easy to saturate the available data channels--a problem that only gets worse as you scale up (though working with a different network topology and having more control nodes could definitely help, to a degree).
On the other hand, having the farm render a typical fractal image would be a perfect application for it, since each screen section is typically independent of the others.
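To illustrate why that's embarrassingly parallel, here's a toy sketch (plain Python, with made-up band sizes) where each "node" renders its own horizontal band of a Mandelbrot image using nothing but its row range--no shared scene data, so no cross-traffic:

```python
def render_band(y0, y1, width=80, height=40, max_iter=50):
    """Render rows y0..y1-1 of an ASCII Mandelbrot image.

    Each band depends only on its own row range, so every node in a
    cluster can work independently (unlike ray tracing, where rays
    bounce into other nodes' parts of the scene).
    """
    band = []
    for y in range(y0, y1):
        row = ""
        for x in range(width):
            c = complex(3.0 * x / width - 2.0, 2.0 * y / height - 1.0)
            z = 0j
            escaped = False
            for _ in range(max_iter):
                z = z * z + c
                if abs(z) > 2:
                    escaped = True
                    break
            row += " " if escaped else "*"
        band.append(row)
    return band

# four "nodes", one band each; just concatenate the results in order
image = [row for i in range(4) for row in render_band(i * 10, (i + 1) * 10)]
```

On a real farm each call to `render_band` would run on a different machine, and the only traffic would be the finished bands coming back.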
Despite how impractical this thing is, I'd still love to have one. I'm sure it's also a great teaching resource in spite of (nay, even because of) its shortcomings, necessity being the mother of invention and all that.
"What is the point ... with Fanboi trolling?"
Maybe because it's easy to get a rise out of Apple fans with it? Low-hanging fruit, you might say.
Well think about ... with that level of bandwidth maybe some kind of immersive virtual reality setup would be possible. Maybe you wouldn't even need to know you're in Kansas?
I had very similar thoughts on reading this. At first I was thinking, what's wrong with adapting the Transport Stream protocol? But that's not exactly scalable to differing pipes or screen sizes; at best it'll let you tune transmission for a poor-quality connection. My next thought was to use something like the "progressive" modes in JPEG, PNG, or (IIRC) Dirac (or a similar trick to the one used in FLAC audio, separating the stream into a lossy part and a set of deltas). This would be much easier if the encoding system were based on wavelets (again, I think Dirac does this), but an FFT-based system can work too. The problem there, though, is how to do flow control so that the sender can stop sending the high-detail part of the stream. Then it struck me... why not use UDP for the fine level of detail and TCP for the base image? You'd still have to do dynamic tuning on the encoder side (and a bit of buffering and stitching together at the decoder end), but at least the congestion part could be mostly handled by the network itself.
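A minimal sketch of that split, over loopback with placeholder payloads (ports picked by the OS, payloads entirely made up): the base layer goes over TCP, reliable and congestion-controlled, while the detail layer goes over UDP, and the decoder simply falls back to the base image if the detail never arrives:

```python
import socket, threading

# Receiver side: bind both sockets up front (port 0 = let the OS pick).
tcp_srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_srv.bind(("127.0.0.1", 0)); tcp_srv.listen(1)
udp_srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_srv.bind(("127.0.0.1", 0)); udp_srv.settimeout(2.0)
TCP_PORT = tcp_srv.getsockname()[1]
UDP_PORT = udp_srv.getsockname()[1]

results = {}
def receive():
    conn, _ = tcp_srv.accept()
    results["base"] = conn.recv(1024)                  # base layer: guaranteed
    try:
        results["detail"], _ = udp_srv.recvfrom(1024)  # detail: best effort
    except socket.timeout:
        results["detail"] = b""                        # lost? show base only
    conn.close()

t = threading.Thread(target=receive)
t.start()

# Sender side: base image over TCP, fine detail over UDP.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp.connect(("127.0.0.1", TCP_PORT))
tcp.sendall(b"base-layer-coefficients")
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"detail-layer-deltas", ("127.0.0.1", UDP_PORT))
t.join()
for s in (tcp, udp, tcp_srv, udp_srv):
    s.close()
```

Under congestion the routers just drop the UDP detail packets, which is exactly the flow control you wanted for free.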
I also don't like the way that DRM is being baked into HTML5, but it's also hardly surprising. Sad, though.
Haven't Amazon been outed as doing this for a while? And in this august organ, no less...
No-one comes out smelling of roses (surely it's all relative, but Einstein was the last great patent clerk).
We will control the horizontal. We will control the vertical. We control the diagonal.
proves intelligent design? Really?
I've got this banana over here. It might help to clarify His Pungent Effulgent. (or maybe not)
The US has a black president and white rappers, and now Sony seems (if I'm not dreaming) to actually be supporting Linux.
A vibrating ring for haptic feedback might be handy, though. Stop sniggering at the back!
No sniggering here. I do think that memory wire would be a lot nicer than mere buzzing. For "handling" 3d objects, obviously.
There is nothing beyond the edge of the solar system, it's just a big black board with pictures of stars on it.
Reminds me of Omon Ra by Victor Pelevin. On what really happened with the CCCP's space programme.
Or they might just hear a loud *thunk* as it hits the edge, Truman Show stylie.
Or maybe it just wraps around, Misner-space stylee :)
the patissier sues the boulanger for using the same oven-based technique for cooking food.
Surely, since this is croissants we're talking about, they'd sue over the method of folding in the edges to make them nice and curved ;-)
So everybody is Everywhere Girl now? What a bunch of cheapskates.
Sounds like a job for an inanimate carbon rod to me.
My what now?
The current sense of entitlement in IT is shocking.
It has nothing to do with "our" sense of entitlement and everything to do with Oracle's moral responsibility. Think of Java as being like a teenager going out into the world and Oracle being its guardian. It's up to Oracle to ensure that their brat isn't going to become a public menace. A very large software ecosystem is built around Java and people need to be able to depend on it. At this rate Java is sure to end up hanging around with Flash, and that definitely won't end well.
Math is not patentable so don't try your logic on us.
Shouldn't be, but that didn't stop the patents on RSA encryption, Lempel-Ziv-Welch compression or Arithmetic encoding, not to mention the myriad other patents surrounding video and audio compression and even bloody container formats.
FOSS code is copyrighted. I think you're confusing it with "public domain" (as defined in the USA).
Why do they need microsecond timing on graphics generation when the viewable result changes only about every 20 milliseconds (roughly)?
Possibly because more complex games will involve multiple rendering passes, probably with tunable parameters for LOD and the like, and being able to budget accurately can make all the difference between being able to hit your window for accurately syncing to the next frame update or not. Just a little drift and you end up losing frames. You might also have to account for vsync, and being off there will really screw things up.
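To put rough numbers on it (a hypothetical budget, in a quick Python sketch): at 60 Hz a whole frame is only ~16.7 ms, and once a few passes are accounted for, the remaining slack can be well under a millisecond--which is exactly where microsecond timing starts to matter:

```python
# A hypothetical frame budget at 60 Hz: several render passes must fit
# inside one refresh interval, with slack left over for the vsync window.
FRAME_US = 1_000_000 // 60            # ~16666 us per frame
passes_us = [6200, 4800, 3900, 1500]  # made-up per-pass timings, in us

used = sum(passes_us)
slack = FRAME_US - used
print(f"budget {FRAME_US} us, used {used} us, slack {slack} us")
# With only a few hundred us of slack, millisecond-granularity timing
# can't even tell you whether you'll make the vsync deadline or drop
# the frame.
```

All the pass timings are illustrative, but the arithmetic is the point: the margin is orders of magnitude smaller than the frame itself.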
Shenanigans galore smashing doors (via doire, for oak?) into smithereens with a shillelagh after drinking whisky and blathering on with a thick Irish brogue about seeing banshee and leprechauns.
It's amazing that no real linguists have answered this thread.
Yeah, I was hoping for that, hence my tongue-in-cheek post about phrenology and so on. I'm still kind of curious about the names of the days of the week, and it would have been nice to have a linguist give an explanation. It's nice to know that many Europeans have a God/Sun day and a Moon day (along with other planetary namings), but that doesn't explain why Japan has (and apparently China had at one point) pretty much the same system. Is it actually a case of parallel evolution or did knowledge of the planets and the fashion of using them for naming the days spread via language?
Another coincidence I've noticed between east/west is "-bury" in the UK at the end of place names and "-buri" at the end of place names in Thailand. Is it just coincidence or does it denote a common root language (Sanskrit/Indic languages)? Again, I have no idea, but it would be nice to know...
Gotta chime in here too. It sounds suspiciously like they're using phrenology to back up their claims :-)
Seriously, though, it's all well and good pointing out similar linguistic constructs and then jumping to a conclusion, but a lot of this stuff might be coincidental or maybe a case of parallel development (why is the first day called "Sun" day and the second "Moon" day in so many languages, for example? I don't actually know--just throwing it out there). I'm all for clever theories, but the problem with many linguistic theories is that they're not falsifiable--and if you can't falsify them, what's the point?
Beer, since they seem to have run out of jynnan tonnyx.
No, but as a fellow (geodesic) dome dweller I can totally sympathise with you on the exorbitant prices they charge for curved sofas.
I suppose that people eat dumplings nearly the world over. My favourite would have to be Japanese-style gyōza. Mix minced pork, cabbage (finely chopped, lightly salted, then squeezed to remove moisture), spring onions, shrimp, ginger, garlic (all finely chopped or minced) and sesame oil for the filling, with just plain flour and water for the wrapping. There are as many ways to cook these as pierogi, but I think the best is to fry them first in a very small amount of oil, then put a small amount of water in the pan and cover it so that the steam cooks everything. Remove from the pan when all the water has evaporated and serve with a mix of soy sauce and chilli oil.
Besides tasting delicious, they look great too if the edges are pleated properly (very fiddly to get exactly right, unfortunately).
I'm reminded of that joke in Trading Places. You know... the "look at that S car go" one...
pretends to be something useful in order to trick
Like a giant wooden horse, for example. Someone should surely be able to find a use for that.
re: change in ratings, perhaps it's because in this review it's stacked up against other sub-£100 phones. As an android phone it might get 80% overall, but 90% if you're buying on a budget. That's my take on it.
I might as well recommend looking up John Cooper Clarke's "Evidently Chickentown." It's probably just as relevant (hint: not).
Yes we're all different..
Eh, "individuals", surely? "Different" (like Apple?) might make some sort of vague sense, but let's not mess with the canon here.
OK, so I'm not doing this for a PhD, but there are some fairly obvious improvements possible.
The main one involves an anonymous broadcast protocol, eg some version of the Dining Cryptographers problem (hello, "suppernode"). It's a simple protocol where each diner flips a coin in secret with their immediate right-hand neighbour, then everyone announces "same" or "different" (there are as many coins as there are diners, and each diner can see two coins). If everyone is truthful then the parity calculation will always yield an even number of people saying "different", but a liar throws the parity off and so effectively broadcasts one bit anonymously.
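A minimal simulation of one round (plain Python; the `broadcaster` index is purely illustrative) shows the trick--the parity of all the announcements reveals the bit, but nothing about who flipped it:

```python
import secrets

def dc_net_round(n, broadcaster=None):
    """Simulate one round of the Dining Cryptographers protocol.

    Each of n diners shares a secret coin with their right-hand
    neighbour, so diner i sees coins[i] and coins[(i+1) % n].  Each
    announces whether their two coins match; the optional broadcaster
    lies, flipping the overall parity to transmit a single bit.
    """
    coins = [secrets.randbelow(2) for _ in range(n)]
    announcements = []
    for i in range(n):
        differ = coins[i] ^ coins[(i + 1) % n]  # 1 = "different", 0 = "same"
        if i == broadcaster:
            differ ^= 1  # lie, anonymously broadcasting a 1-bit
        announcements.append(differ)
    # Every coin is seen by exactly two diners, so honest announcements
    # always XOR to 0; a single lie makes the XOR come out as 1.
    return sum(announcements) % 2
```

Note that the result is deterministic regardless of the coin flips: honest rounds always yield 0, and exactly one liar always yields 1.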
That algorithm isn't practical for tor, but there are other algorithms for achieving the same results. Mostly they include a fixed hierarchy (think nested dinner tables and "suppernodes" again) but they'd be much better if there was some sort of protocol for assigning "seating" on a random, ad-hoc basis. All these protocols also need to be hardened against deliberate disruption. Search for "dining cryptographers with cheaters"...
Chaffing and winnowing is another sort of protocol that might make sense. If you can set up a secure data channel to your egress point and you have some method of guaranteeing that the channel goes through some number of nodes whose sole function is to introduce random noise into the conversation stream, then a shared, private (configurable) checksumming algorithm is enough to defeat eavesdroppers with any required degree of confidence (it then becomes a form of probabilistic encryption).
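A toy sketch of the idea using an HMAC as the "shared, private checksum" (the key handling and packet format here are entirely made up): the sender interleaves bogus packets carrying random MACs, and only the receiver, holding the key, can winnow the wheat from the chaff:

```python
import hmac, hashlib, secrets

KEY = secrets.token_bytes(16)  # shared in advance between sender and receiver

def chaff(message_bits):
    """Emit a shuffled stream where each real (seq, bit, MAC) packet is
    accompanied by a chaff packet: the opposite bit with a random MAC."""
    stream = []
    for seq, bit in enumerate(message_bits):
        for b in (bit, bit ^ 1):
            if b == bit:
                mac = hmac.new(KEY, f"{seq}:{b}".encode(),
                               hashlib.sha256).digest()
            else:
                mac = secrets.token_bytes(32)  # chaff: bogus MAC
            stream.append((seq, b, mac))
    secrets.SystemRandom().shuffle(stream)
    return stream

def winnow(stream):
    """Keep only packets whose MAC verifies, reassembled in sequence order."""
    kept = {}
    for seq, b, mac in stream:
        good = hmac.new(KEY, f"{seq}:{b}".encode(), hashlib.sha256).digest()
        if hmac.compare_digest(mac, good):
            kept[seq] = b
    return [kept[i] for i in sorted(kept)]
```

An eavesdropper without the key sees both bit values for every position and can't tell wheat from chaff; nothing in the stream is ever "encrypted" as such.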
The third alternative I can think of is to leverage the store-and-forward aspect of the network. Tor has some specialised DHT (distributed hash table) functionality, but as far as I can tell it's not used in the normal case of operation where you just want to connect to a particular site anonymously. In other words, tor is mostly just forwarding packets rather than acting as a storage network.

What I have in mind is changing things so that tor would act more like a short-lived storage network, with each chunk of data effectively having a "half life": peers would shunt parts of each chunk around among members of their ad-hoc peer groupings and randomly drop a fixed percentage of all chunks. I won't go into details (the algorithms I have in mind aren't too hard, though), but it should be possible to maintain a coherent DHT even in the face of all this deliberate data loss, and also in the presence of cheaters.

The point of this is to spread delivery of your requested data (web page) out in the time domain, so that even if attackers can snoop your incoming and outgoing traffic and have also subverted the exit point (so they know what web page was downloaded), they can't prove that you were the person who requested it. I suppose what I'm basically saying is that there are algorithms and protocols that would enable a "probabilistic delivery" model with tunable parameters for how often you want your download to succeed within a given time frame (the "half life") and what level of protection you want against eavesdroppers in the worst case. I guesstimate that the network would only have to store something between 6 and 10 times the peak traffic volume of the equivalent "forward-only" network for this "probabilistic delivery" method to be viable.
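A quick back-of-the-envelope simulation (all the numbers are illustrative guesses, not a real protocol) of how many replicas per chunk you'd need to keep delivery reliable despite deliberately dropping half the copies per "half-life" period:

```python
import random

def delivery_probability(chunks, replicas, drop_rate, trials=3000):
    """Monte Carlo estimate of the chance that every chunk of a download
    survives one 'half-life' period, given that each stored copy is
    dropped independently with probability drop_rate."""
    ok = 0
    for _ in range(trials):
        if all(any(random.random() > drop_rate for _ in range(replicas))
               for _ in range(chunks)):
            ok += 1
    return ok / trials

# e.g. a 10-chunk page with 50% of copies dropped per period:
# one replica per chunk is hopeless, eight is nearly certain.
```

The storage overhead in the forum post's "6 to 10 times" guesstimate corresponds to picking a replica count in roughly this range for the drop rate you've tuned.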
like "military intelligence" or "jumbo shrimp"?
Or maybe "King pawns?"
I'm probably not a typical user, but what I'm mainly using Dropbox for is to provide a synch mechanism between my real XP installation and the version that I run in a virtual machine under Linux. Even then, there are actually only two applications that I regularly run in the VM that I want to keep synched, and the amount of data isn't very much at all.
I wouldn't trust these kinds of services for backups unless (a) I had some sort of front-end encryption meaning they couldn't snoop on what I'm storing and (b) I already had better/more secure backups in place anyway (say, using something like Dropbox for daily backups, but making sure I do my own weekly one to my own backup box). They are kind of nice to have for "ad-hoc" backups, as you put it. I could see myself using the "selective sync" option in Dropbox to set up a spare machine as an occasional repository for (relatively small) backups. I like the way that you don't need to power up the machine straight away (just get to it when it's convenient and start Dropbox to sync into your local copy), but obviously it does mean double the data transfer burden on your net connection. That's not an issue if you're syncing between home and work machines, though, and one or the other is firewalled so you can't do push transfers.
So actually I think the main advantage of these things is that they make data transfers and file sync easier--not that they serve as a primary/backup store for your data. Also remember not to trust that they won't read your stuff or screw up your files irrevocably once in a while... otherwise, it's a pretty nifty tool.
So you call for citations from AC, in the very same sentence as you say "I think it was Apple and Microsoft spreading FUD about Vorbis infringing on unspecified patents"? Where are your citations?
There. Happy now?
And by the way, if you go back and read what I said again, maybe you'll understand it in the way I intended: there indeed have been rumours of Ogg violating patents. I [thought] it was Apple and Microsoft making those [foundless--hence "FUD"] allegations.
Have a nice day, Mr. AC.
Ogg Vorbis treads on a lot of patented stuff from MPEG from what I hear, for example
Citation definitely needed on that one. I think it was Apple and Microsoft spreading FUD about Vorbis infringing on unspecified patents simply because they wanted to ensure that H.264 got enshrined into the HTML5 spec rather than it becoming an open and royalty-free standard. The Ogg guys have always striven to avoid using any potentially patented technique no matter how stupid or obvious the patent might be.
I've got no problem with your comments about Android and FAT (evidently true), but you'll have to give us something more concrete than just rumours if you want to claim that Ogg Vorbis "treads on a lot of patented stuff from MPEG".
Finally an agreement then that Java requires special* hardware to run it with decent performance?
I can't tell if you're joking or not, but are you aware that some ARM devices have the capability of running Java bytecode (more or less) natively with Jazelle? That's not to say that Java needs special hardware, and it certainly wouldn't be the first time that hardware has been built with support for a particular high-level language in mind--most notably the Lisp Machines of the '80s. The wikipedia page for them (http://en.wikipedia.org/wiki/Lisp_machine) also mentions some other languages where special CPUs/computers were built: Prolog, Modula-2, Erlang and, yes, Java.
That certainly doesn't mean that any of these languages need special-purpose hardware. I think it's more a case of "if we can improve performance by building custom CPUs/machines, then why not try?"
While not specifically to do with GPUs as such, I did some research a while back into the possibility of getting a Java VM that could run on a PS3 so that it would be able to run code on the SPUs as well as on the main CPU (PPU). Besides the approach of actually adding new keywords to the language (as is mentioned in the article), I found two projects that actually got some way towards the goal. Both aimed to take unmodified Java code and have it run on an asymmetric CPU setup (ie, the PS3's Cell). The first (*) was based on the CACAO JIT compiler and hooked into the function call mechanism so that each method got executed on a separate core. I don't think the author got very far, but he did go over a lot of options and documented a lot of the design decisions very well. The second (**) was based on the Jikes Research VM and it used thread creation as the point to migrate control over to a new core. That project got a lot further, but I don't think they ever publicly released the code (although I'm sure an email to the authors would probably get you access). Again, the various papers and such that they produced give really good descriptions of the approach taken and all that.
Where I'm going with this is that it's hard enough to target the JIT code generation so that it can run on an asymmetric setup like that of the Cell processor. It's much more difficult when you try to target it to graphics hardware. OpenCL is a nice step in the right direction for doing GPGPU work, but graphics hardware design (except maybe for the really high end stuff--I'm not sure) still tends to be geared towards fixed ideas of the execution pipeline (eg, shader models, emphasis on textures and matrices) and there are generally fairly high penalties for such things as branching, sending data back out to the CPU (outside of the frame buffer mechanisms), context switching and inter-core communication (again, if it lies outside the standard shader/pipeline model). I would love to see GPU cores and the interconnects between them and the CPU moving more towards the Cell model, but OpenCL notwithstanding I don't think we're making much progress in that direction. Likewise, I'd love for these guys to succeed, though I think it's going to be a long hard climb.
Do any of these DLNA browsers support loading external subs? I've only tried out Media House, and it doesn't seem to have the option unless I download the files and try to play them locally, which I haven't tried. Are external subs even supported in DLNA, or do I have to mux them into a different container (mkv?). Obviously I'd prefer not to do any transcoding or on the fly conversion...
> Do any of these have F1-12?
Try Hacker's Keyboard. You have to press the Fn key to bring up the F keys, but if you only want them to start playing, that shouldn't be too inconvenient. It also includes a row of numbers above the letters and non-modal (multi-touch) shift/alt/ctrl keys making it ideal for using with a terminal emulator.
Thanks for pointing out that dosbox is available. I've got stacks of old shareware games archived somewhere. Might be an excuse to dig them up and give them a whirl. I'm surprised that phones and tablets these days have enough grunt to emulate x86. Goes to show how far we've come...
Would you like to change your Internet service providings?
It's 3 times faster than your current providings!
Heh. I had to smile reading that, given that Mr. Bontchev has been posting responses here atm. I'm thinking, of course, of his paper "Possible Virus Attacks Against Integrity Programs and How to Prevent Them":
http://www.people.frisk-software.com/~bontchev/papers/attacks.html (search for "Kuang").
As for the concept of multi-partite, oblivious agent-style viruses... super interesting. Even though the concept is very old, there are lots of fairly new techniques that could be applied. Chaffing and Winnowing (perhaps together with an all-or-nothing transform, and/or time-dependent hashing algorithms or cryptographic time servers) looks like one way of approaching it. Secret sharing schemes (Shamir, Rabin) with cryptographic accumulators (to validate a collection of parts as constituting a whole) is another. Then there's homomorphic encryption combined with polymorphic engines, but I don't think that's practical yet, despite recent advances.
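For the secret-sharing part, a minimal Shamir sketch (a toy field modulus, no input validation, purely illustrative) shows how any k of n shares reconstruct a key while fewer reveal nothing:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is in GF(PRIME)

def split(secret, n, k):
    """Split secret into n shares, any k of which suffice to rebuild it.

    Uses a random polynomial of degree k-1 with the secret as its
    constant term; each share is a point (x, f(x)) on that polynomial.
    """
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Recover the secret by Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # divide via Fermat's little theorem: den^(p-2) = den^-1 mod p
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

A swarm could stash shares like these across many agents, with a cryptographic accumulator (as mentioned above) validating which collection of parts constitutes a whole.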
It is very interesting to consider how a swarm of agents can combine to become greater than the sum of their parts and survive as a collection even when individual components are being teased apart and eradicated. Mathematically and architecturally, at least. It's equally important to remember, though, that these "perfect" (in some senses of the word) systems are being controlled by external agents, increasingly for nefarious purposes, as opposed to latter-day virus writers who did it purely for the technical challenge. That, in my opinion, is the weakest point. Sure, it would be nice to crack the key in this case, but wouldn't it be even nicer if we could trace the swarm back to its controllers?
This brings back fond memories of the days that Microsoft and Wang could regularly be heard in the same sentence.
Windows RT software will not be sold or distributed independent of a new Windows RT PC, just as you would expect from a consumer electronics device that relies on unique and integrated pairings of hardware and software
Just what are they trying to compare the devices with? Games consoles? Because I can download Android (or aftermarket replacements) independently of any device purchase.
I suppose, though, that the speaker is correct: platform lockdown/lock-in is exactly what I expected from Microsoft.
the correct usage would have been "you are boring". You need a noun ...
Don't be such a bore.
toasted ham and grated Emmental sandwich
Not sure about melty cheese (is Emmental a melty one?) but I guess Gruyere would be nice.
Putting goats' cheese on top and then grilling it definitely works. Put a bit of chutney or relish on the other slice and the whole thing is complete.
Getting back to bacon and avocado... I think it's nicer with chicken, but chicken, bacon and avocado work really nicely together. Only thing is that you probably want foreigner style bacon---thin, streaky and crisp (Black Forest, Serrano and Parma ham all fry up quite nicely on a dry pan, and are miles better than US style crispy bacon) rather than thick back rashers if you go down that route. I can't seem to get smoked paprika here, so I've never tried it. I generally use Tabasco, a good dose of black pepper and a little salt (avocado needs salt, even if the bacon is salty) and maybe lime juice. That's just me, though. I probably like guacamole (proper home-made stuff) a little bit too much so that's why I'd head in that direction.
Finally, not sure about the ducks. I think that's going a wee bit over the top...
This mention of Scotch whisky reminds me of a radio show I heard many years ago talking about the history of the liquor. The gist of it is that although part of the credit for the modern distilling process that enabled large-scale production goes to a Scotsman, Robert Stein, it was actually an Irishman, Aeneas Coffey, who improved on Stein's design to make the process cheaper and more efficient and thus economically viable. Coffey had been for many years the head of Customs and Excise in Dublin and, as the Crown's representative in Ireland and the person who was responsible for destroying illegal stills and collecting taxes, was a hated figure at the time. He also had a keen interest in whiskey, though, and had the technical skills to come up with his own improved still design. When he left his position as the number one excise man, he patented his invention and tried to get backing to apply it in his native Ireland. Due to his past, however, nobody would support his endeavour (apart from one brief, failed venture, it seems) and he was forced to travel abroad in order to further it. So it was that he came to have his invention put to practical use in Scotland where, through a combination of his inventiveness, happenstance (a disease affecting grapes used in the premier spirit of the time--Cognac--left an opening in the market, while the whisky produced appealed to the English taste) and venture capital, Scotch whisky effectively exploded onto the scene and changed the industry completely.
So, in short, the single malt whiskies that are today synonymous with Scotch actually owe a huge debt to the Irish--on the one hand thanks to Coffey's design, but also, on the other, thanks to the spite of his fellow Irishmen in turning their backs on his invention due to his past job and associations.
As the OP said, I might get punched in the face for recounting this story were I in Scotland. As we're online, though, I think that downvotes are a more likely outcome, so I've had to do a bit of searching to corroborate this account. I wasn't able to verify everything, but these two links seem to cover the basic outline:
In other words, yes, Wikiland still has a way to go before the label "reliable" can be applied.
Unlikely, since it's based on UDP*. Building a BGP** layer on top might make it a little better though.
* Unreliable Dictionary Protocol
** Byzantine General Protocol