use your images for promotion?
So everybody is Everywhere Girl now? What a bunch of cheapskates.
Sounds like a job for an inanimate carbon rod to me.
My what now?
The current sense of entitlement in IT is shocking.
It has nothing to do with "our" sense of entitlement and everything to do with Oracle's moral responsibility. Think of Java as being like a teenager going out into the world and Oracle being its guardian. It's up to Oracle to ensure that their brat isn't going to become a public menace. A very large software ecosystem is built around Java and people need to be able to depend on it. At this rate Java is sure to end up hanging around with Flash, and that definitely won't end well.
Math is not patentable so don't try your logic on us.
Shouldn't be, but that didn't stop the patents on RSA encryption, Lempel-Ziv-Welch compression or Arithmetic encoding, not to mention the myriad other patents surrounding video and audio compression and even bloody container formats.
FOSS code is copyrighted. I think you're confusing it with "public domain" (as defined in the USA).
Why do they need microsecond timing on graphics generation when the viewable result changes only about every 20 milliseconds (roughly)?
Possibly because more complex games will involve multiple rendering passes, probably with tunable parameters for LOD and the like, and being able to budget accurately can make all the difference between being able to hit your window for accurately syncing to the next frame update or not. Just a little drift and you end up losing frames. You might also have to account for vsync, and being off there will really screw things up.
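To put rough numbers on the budgeting point (every figure below is invented purely for illustration), the arithmetic looks something like this:

```python
# Toy numbers: a 50 Hz display gives a 20 ms frame budget.
FRAME_BUDGET_MS = 20.0

# Suppose three rendering passes whose measured costs sum to 19.9 ms,
# leaving only ~100 microseconds of headroom before the next vsync.
passes_ms = [12.4, 5.2, 2.3]
headroom_ms = FRAME_BUDGET_MS - sum(passes_ms)

# With millisecond-resolution timers that margin is invisible, so even
# a tiny per-frame drift quietly eats it and you start losing frames:
drift_ms_per_frame = 0.02
frames_until_missed_vsync = headroom_ms / drift_ms_per_frame

print(f"headroom: {headroom_ms * 1000:.0f} us")
print(f"frames until a missed vsync: {frames_until_missed_vsync:.0f}")
```

With only a 100 microsecond margin, a timer that can't resolve below a millisecond is useless for deciding whether you'll make the next frame or not.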
Shenanigans galore smashing doors (via doire, for oak?) into smithereens with a shillelagh after drinking whiskey and blathering on in a thick Irish brogue about seeing banshees and leprechauns.
It's amazing that no real linguists have answered this thread.
Yeah, I was hoping for that, hence my tongue-in-cheek post about phrenology and so on. I'm still kind of curious about the names of the days of the week, and it would have been nice to have a linguist give an explanation. It's nice to know that many Europeans have a God/Sun day and a Moon day (along with other planetary namings), but that doesn't explain why Japan has (and apparently China had at one point) pretty much the same system. Is it actually a case of parallel evolution or did knowledge of the planets and the fashion of using them for naming the days spread via language?
Another coincidence I've noticed between east/west is "-bury" in the UK at the end of place names and "-buri" at the end of place names in Thailand. Is it just coincidence or does it denote a common root language (Sanskrit/Indic languages)? Again, I have no idea, but it would be nice to know...
Gotta chime in here too. It sounds suspiciously like they're using phrenology to back up their claims :-)
Seriously, though, it's all well and good pointing out similar linguistic constructs and then jumping to a conclusion, but a lot of this stuff might be coincidental or maybe a case of parallel development (why is the first day called "Sun" day and the second "Moon" day in so many languages, for example? I don't actually know--just throwing it out there). I'm all for clever theories but the problem with many linguistic theories is that they're not falsifiable. That being said, what's the point?
Beer, since they seem to have run out of jynnan tonnyx.
No, but as a fellow (geodesic) dome dweller I can totally sympathise with you on the exorbitant prices they charge for curved sofas.
I suppose that people eat dumplings nearly the world over. My favourite would have to be Japanese-style gyōza. Mix up minced pork, cabbage (finely chopped, lightly salted, then squeezed to remove moisture), spring onions, shrimp, ginger, garlic (all finely chopped or minced) and sesame oil for the filling; the wrapping is just plain flour and water. There are as many ways to cook these as pierogi, but I think the best is to fry them first in a very small amount of oil, then put a small amount of water in the pan and cover it so that the steam cooks everything. Remove from the pan when all the water evaporates and serve with a mix of soy sauce and chilli oil.
Besides tasting delicious, they look great too if the edges are pleated properly (very fiddly to get exactly right, unfortunately).
I'm reminded of that joke in Trading Places. You know... the "look at that S car go" one...
pretends to be something useful in order to trick
Like a giant wooden horse, for example. Someone should surely be able to find a use for that.
re: change in ratings, perhaps it's because in this review it's stacked up against other sub-£100 phones. As an android phone it might get 80% overall, but 90% if you're buying on a budget. That's my take on it.
I might as well recommend looking up John Cooper Clarke's "Evidently Chickentown." It's probably just as relevant (hint: not).
Yes we're all different..
Eh, "individuals", surely? "Different" (like Apple?) might make some sort of vague sense, but let's not mess with the canon here.
OK, so I'm not doing this for a PhD, but there are some fairly obvious improvements possible.
The main one involves an anonymous broadcast protocol, eg some version of the Dining Cryptographers problem (hello, "suppernode"). It's a simple protocol where each diner flips a coin in secret, shared only with their immediate right-hand neighbour, and then announces "same" or "different" (there are as many coins as there are diners, and each diner can see two coins). If everyone is truthful then a parity calculation will always yield an even number of people saying "different", but if someone lies they throw the parity off and so they've effectively broadcast one bit anonymously.
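For the curious, the whole protocol fits in a few lines. This is a toy simulation of the classic version described above, not anything tor-shaped:

```python
import random

def dc_round(n_diners, liar=None):
    """One round of the dining cryptographers protocol.

    Each diner shares a coin flip with their right-hand neighbour, so
    diner i sees coins[i] and coins[(i+1) % n].  A truthful diner
    announces "different" iff the two coins they can see differ; a
    diner who wants to broadcast a 1-bit flips their announcement.
    Returns the list of announcements (True = "different").
    """
    coins = [random.randrange(2) for _ in range(n_diners)]
    announcements = []
    for i in range(n_diners):
        differs = coins[i] != coins[(i + 1) % n_diners]
        if i == liar:          # broadcasting one bit anonymously
            differs = not differs
        announcements.append(differs)
    return announcements

random.seed(1)

# Every coin is seen by exactly two diners, so each coin contributes to
# two announcements and an honest round always has even parity:
honest = dc_round(5)
assert sum(honest) % 2 == 0

# One liar flips the parity -- the odd parity *is* the broadcast bit,
# and nothing in the announcements reveals who flipped it:
lying = dc_round(5, liar=2)
assert sum(lying) % 2 == 1
```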
That algorithm isn't practical for tor, but there are other algorithms for achieving the same results. Mostly they include a fixed hierarchy (think nested dinner tables and "suppernodes" again) but they'd be much better if there was some sort of protocol for assigning "seating" on a random, ad-hoc basis. All these protocols also need to be hardened against deliberate disruption. Search for "dining cryptographers with cheaters"...
Chaffing and winnowing is another sort of protocol that might make sense. If you can set up a secure data channel to your egress point and you have some method of guaranteeing that the channel goes through some number of nodes whose sole function is to introduce random noise into the conversation stream, then a shared, private (configurable) checksumming algorithm is enough to defeat eavesdroppers with any required degree of confidence (it then becomes a form of probabilistic encryption).
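A minimal sketch of the chaffing/winnowing idea (my own toy illustration, using HMAC as the shared private checksumming algorithm; nothing here is from tor): the sender interleaves genuine packets carrying valid MACs with random chaff, and only someone holding the shared key can winnow the wheat back out.

```python
import hmac, hashlib, os

KEY = os.urandom(16)  # the shared secret between sender and receiver

def mac(payload: bytes) -> bytes:
    return hmac.new(KEY, payload, hashlib.sha256).digest()

def chaff(real_packets):
    """Interleave genuine packets (valid MAC) with random chaff."""
    stream = []
    for p in real_packets:
        stream.append((p, mac(p)))                           # wheat
        stream.append((os.urandom(len(p)), os.urandom(32)))  # chaff
    return stream

def winnow(stream):
    """Receiver keeps only packets whose MAC verifies."""
    return [p for p, tag in stream if hmac.compare_digest(mac(p), tag)]

msg = [b"attack", b"at", b"dawn"]
assert winnow(chaff(msg)) == msg
```

An eavesdropper without the key sees wheat and chaff as equally plausible, which is why Rivest described this as achieving confidentiality without, strictly speaking, encrypting anything.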
The third alternative I can think of is to leverage the store and forward aspect of the network. Tor has some specialised DHT (distributed hash table) functionality, but as far as I can tell it's not used in the normal case of operation where you just want to connect to a particular site anonymously. In other words, tor is mostly just forwarding packets rather than acting as a storage network.

What I have in mind is changing things so that tor would act more like a short-lived storage network, with each chunk of data effectively having a "half life": peers shunt parts of each chunk around among members of their ad-hoc peer groupings and randomly drop a fixed percentage of all chunks. I won't go into details (the algorithms I have in mind aren't too hard, though), but it should be possible to maintain a coherent DHT even in the face of all this deliberate data loss, and also in the presence of cheaters.

The point of this is to spread delivery of your requested data (web page) out in the time domain, so that even if attackers can snoop your incoming and outgoing traffic and have also subverted the exit point (so they know what web page was downloaded), they can't prove that you were the person who requested it.

I suppose what I'm basically saying is that there are algorithms and protocols that would enable a "probabilistic delivery" model with tunable parameters for how often you want your download to succeed within a given time frame (the "half life") and what level of protection you want against eavesdroppers in the worst case scenario. I guesstimate that the network would only have to store something between 6 and 10 times the volume of peak traffic of the equivalent "forward-only" network for this "probabilistic delivery" method to be viable.
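As a rough sanity check on that kind of overhead estimate, here's a toy model that considers nothing but independent random fragment loss (all parameters invented; real overhead would also have to cover churn, cover traffic and cheaters). Assume each chunk is erasure-coded into n fragments, any k of which suffice to reconstruct it:

```python
from math import comb

def delivery_prob(n, k, s):
    """P(at least k of n fragments survive), each independently
    surviving a "half-life" round with probability s."""
    return sum(comb(n, i) * s**i * (1 - s)**(n - i)
               for i in range(k, n + 1))

# Hypothetical numbers: 25% of fragments deliberately dropped per
# round (s = 0.75), k = 4 fragments needed to rebuild a chunk.
# Find the smallest n giving a 99% chance of delivery:
for n in range(4, 13):
    if delivery_prob(n, 4, 0.75) >= 0.99:
        print(f"n = {n} fragments -> storage overhead {n / 4:.1f}x")
        break
```

This only models a single round of loss; surviving several half-life rounds before delivery pushes the required redundancy up quickly, which is where multi-fold overhead figures start to look plausible.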
like "military intelligence" or "jumbo shrimp"?
Or maybe "King pawns?"
I'm probably not a typical user, but what I'm mainly using Dropbox for is to provide a synch mechanism between my real XP installation and the version that I run in a virtual machine under Linux. Even then, there are actually only two applications that I regularly run in the VM that I want to keep synched, and the amount of data isn't very much at all.
I wouldn't trust these kinds of services for backups unless (a) I had some sort of front-end encryption meaning they couldn't snoop on what I'm storing and (b) I already had better/more secure backups in place anyway (say using something like Dropbox for daily backups, but making sure I do my own weekly one to my own backup box). They are kind of nice to have for "ad-hoc" backups as you put it. I could see myself using the "selective sync" option in Dropbox to set up a spare machine as an occasional repository for (relatively small) backups. I like the way that you don't need to power up the machine straight away (just get to it when it's convenient and start Dropbox to sync into your local copy), but obviously it does mean double the data transfer burden on your net connection. That's not an issue if you're syncing between home and work machines, though, where one or the other is firewalled so you can't do push transfers anyway.
So actually I think the main advantage of these things is that they make data transfers and file sync easier--not that they serve as a primary/backup store for your data. Also remember not to trust that they won't read your stuff or screw up your files irrevocably once in a while.... Otherwise, it's a pretty nifty tool.
So you call for citations from AC, in the very same sentence as you say "I think it was Apple and Microsoft spreading FUD about Vorbis infringing on unspecified patents"? Where are your citations?
There. Happy now?
And by the way, if you go back and read what I said again, maybe you'll understand it in the way I intended: there indeed have been rumours of Ogg violating patents. I [thought] it was Apple and Microsoft making those [unfounded--hence "FUD"] allegations.
Have a nice day, Mr. AC.
Ogg Vorbis treads on a lot of patented stuff from MPEG from what I hear, for example
Citation definitely needed on that one. I think it was Apple and Microsoft spreading FUD about Vorbis infringing on unspecified patents simply because they wanted to ensure that H.264 got enshrined into the HTML5 spec rather than it becoming an open and royalty-free standard. The Ogg guys have always striven to avoid using any potentially patented technique no matter how stupid or obvious the patent might be.
I've got no problem with your comments about Android and FAT (evidently true), but you'll have to give us something more concrete than just rumours if you want to claim that Ogg Vorbis "treads on a lot of patented stuff from MPEG".
Finally an agreement then that Java requires special* hardware to run it with decent performance?
I can't tell if you're joking or not, but are you aware that some ARM devices have the capability of running Java bytecode (more or less) natively with Jazelle? That's not to say that Java needs special hardware, and it certainly wouldn't be the first time that hardware has been built with support for a particular high-level language in mind--most notably the Lisp Machines of the '80s. The Wikipedia page for them (http://en.wikipedia.org/wiki/Lisp_machine) also mentions some other languages where special CPUs/computers were built: Prolog, Modula-2, Erlang and, yes, Java.
That certainly doesn't mean that any of these languages need special-purpose hardware. I think it's more a case of "if we can improve performance by building custom CPUs/machines, then why not try?"
While not specifically to do with GPUs as such, I did some research a while back into the possibility of getting a Java VM that could run on a PS3 so that it would be able to run code on the SPUs as well as on the main CPU (PPU). Besides the approach of actually adding new keywords to the language (as is mentioned in the article), I found two projects that actually got some way towards the goal. Both aimed to take unmodified Java code and have it run on an asymmetric CPU setup (ie, the PS3's Cell). The first (*) was based on the CACAO JIT compiler and hooked into the function call mechanism so that each method got executed on a separate core. I don't think the author got very far, but he did go over a lot of options and documented a lot of the design decisions very well. The second (**) was based on the Jikes Research VM and it used thread creation as the point to migrate control over to a new core. That project got a lot further, but I don't think they ever publicly released the code (although I'm sure an email to the authors would probably get you access). Again, the various papers and such that they produced give really good descriptions of the approach taken and all that.
Where I'm going with this is that it's hard enough to target the JIT code generation so that it can run on an asymmetric setup like that of the Cell processor. It's much more difficult when you try to target it to graphics hardware. OpenCL is a nice step in the right direction for doing GPGPU work, but graphics hardware design (except maybe for the really high end stuff--I'm not sure) still tends to be geared towards fixed ideas of the execution pipeline (eg, shader models, emphasis on textures and matrices) and there are generally fairly high penalties for such things as branching, sending data back out to the CPU (outside of the frame buffer mechanisms), context switching and inter-core communication (again, if it lies outside the standard shader/pipeline model). I would love to see GPU cores and the interconnects between them and the CPU moving more towards the Cell model, but OpenCL notwithstanding I don't think we're making much progress in that direction. Likewise, I'd love for these guys to succeed, though I think it's going to be a long hard climb.
Do any of these DLNA browsers support loading external subs? I've only tried out Media House, and it doesn't seem to have the option unless I download the files and try to play them locally, which I haven't tried. Are external subs even supported in DLNA, or do I have to mux them into a different container (mkv?). Obviously I'd prefer not to do any transcoding or on the fly conversion...
> Do any of these have F1-12?
Try Hacker's Keyboard. You have to press the Fn key to bring up the F keys, but if you only want them to start playing, that shouldn't be too inconvenient. It also includes a row of numbers above the letters and non-modal (multi-touch) shift/alt/ctrl keys making it ideal for using with a terminal emulator.
Thanks for pointing out that dosbox is available. I've got stacks of old shareware games archived somewhere. Might be an excuse to dig them up and give them a whirl. I'm surprised that phones and tablets these days have enough grunt to emulate x86. Goes to show how far we've come...
Would you like to change your Internet service providings?
It's 3 times faster than your current providings!
Heh. I had to smile reading that, given that Mr. Bontchev has been posting responses here atm. I'm thinking, of course, of his paper "Possible Virus Attacks Against Integrity Programs and How to Prevent Them":
http://www.people.frisk-software.com/~bontchev/papers/attacks.html (search for "Kuang").
As for the concept of multi-partite, oblivious agent-style viruses... super interesting. Even though the concept is very old, there are lots of fairly new techniques that could be applied. Chaffing and Winnowing (perhaps together with an all-or-nothing transform, and/or time-dependent hashing algorithms or cryptographic time servers) looks like one way of approaching it. Secret sharing schemes (Shamir, Rabin) with cryptographic accumulators (to validate a collection of parts as constituting a whole) is another. Then there's homomorphic encryption combined with polymorphic engines, but I don't think that's practical yet, despite recent advances.
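Shamir's scheme in particular is tiny to implement: evaluate a random polynomial whose constant term is the secret, then reconstruct by Lagrange interpolation at zero. A toy version (illustrative only; parameters and the fixed seed are made up for the demo):

```python
import random

random.seed(0)          # deterministic for the demo
PRIME = 2**127 - 1      # a Mersenne prime; all arithmetic is mod PRIME

def make_shares(secret, k, n):
    """Split `secret` into n shares, any k of which reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(42, k=3, n=5)
assert reconstruct(shares[:3]) == 42   # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == 42  # fewer than 3 reveal nothing
```

For the virus scenario the interesting part is the threshold: k-1 captured agents give an analyst precisely zero information about the secret.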
It is very interesting to consider how a swarm of agents can combine to become greater than the sum of their parts and survive as a collection even when individual components are being teased apart and eradicated. Mathematically and architecturally, at least. It's equally important to remember, though, that these "perfect" (in some senses of the word) systems are being controlled by external agents, increasingly for nefarious purposes, as opposed to the old-school virus writers who did it purely for the technical challenge. That, in my opinion, is the weakest point. Sure, it would be nice to crack the key in this case, but wouldn't it be even nicer if we could trace the swarm back to its controllers?
This brings back fond memories of the days that Microsoft and Wang could regularly be heard in the same sentence.
Windows RT software will not be sold or distributed independent of a new Windows RT PC, just as you would expect from a consumer electronics device that relies on unique and integrated pairings of hardware and software
Just what are they trying to compare the devices with? Games consoles? Because I can download Android (or aftermarket replacements) independently of any device purchase.
I suppose, though, that the speaker is correct: platform lockdown/lock-in is exactly what I expected from Microsoft.
the correct usage would have been "you are boring". You need a noun ...
Don't be such a bore.
toasted ham and grated Emmental sandwich
Not sure about melty cheese (is Emmental a melty one?) but I guess Gruyere would be nice.
Putting goats' cheese on top and then grilling it definitely works. Put a bit of chutney or relish on the other slice and the whole thing is complete.
Getting back to bacon and avocado... I think it's nicer with chicken, but chicken, bacon and avocado work really nicely together. Only thing is that you probably want foreigner style bacon---thin, streaky and crisp (Black Forest, Serrano and Parma ham all fry up quite nicely on a dry pan, and are miles better than US style crispy bacon) rather than thick back rashers if you go down that route. I can't seem to get smoked paprika here, so I've never tried it. I generally use Tabasco, a good dose of black pepper and a little salt (avocado needs salt, even if the bacon is salty) and maybe lime juice. That's just me, though. I probably like guacamole (proper home-made stuff) a little bit too much so that's why I'd head in that direction.
Finally, not sure about the ducks. I think that's going a wee bit over the top...
This mention of Scotch whisky reminds me of a radio show I heard many years ago talking about the history of the liquor. The gist of it is that although part of the credit for the modern distilling process that enabled large-scale production goes to a Scotsman, Robert Stein, it was actually an Irishman, Aeneas Coffey, who improved on Stein's design to make the process cheaper and more efficient and thus economically viable. Coffey had been for many years the head of Customs and Excise in Dublin and, as the Crown's representative in Ireland and the person who was responsible for destroying illegal stills and collecting taxes, was a hated figure at the time. He also had a keen interest in whiskey, though, and had the technical skills to come up with his own improved still design. When he left his position as the number one excise man, he patented his invention and tried to get backing to apply it in his native Ireland. Due to his past, however, nobody would support his endeavour (apart from one brief, failed venture, it seems) and he was forced to travel abroad in order to further his venture. So it was that he came to have his invention put to practical use in Scotland where, through a combination of his inventiveness, happenstance (a disease affecting grapes used in the premier spirit of the time--Cognac--left an opening in the market, while the whisky produced appealed to the English taste) and venture capital, Scotch whisky effectively exploded onto the scene and changed the industry completely.
So in short, the whiskies that are today synonymous with Scotch actually owe a huge debt to the Irish--on the one hand thanks to Coffey's design, but also, on the other, thanks to the spite of his fellow Irishmen in turning their back on his invention due to his past job and associations.
As the OP said, I might get punched in the face for recounting this story were I in Scotland. As we're online, though, I think that downvotes are a more likely outcome, so I've had to do a bit of searching to corroborate this account. I wasn't able to verify everything, but these two links seem to cover the basic outline:
In other words, yes, Wikiland still has a way to go before the label "reliable" can be applied.
Unlikely, since it's based on UDP*. Building a BGP** layer on top might make it a little better though.
* Unreliable Dictionary Protocol
** Byzantine General Protocol
given what most Applytes I know are like ...
Jobs forbid that this trial opens their eyes and they become Applostates ...
Their ARM-based, colour screened, touch sensitive, mini computer with telephone, camera, sensor and internet capabilities looks like our ARM-based, colour screened, touch sensitive, mini computer with telephone, camera, sensor and internet capabilities. It's not fair!
If they did we'd all be running around munching pills in darkened rooms listening to repetitive beat music.
Having a choice of one does rather smack of a Stalinist ideology
A choice of one (aka Hobson's choice) strikes me more as being a Fordist ideology. Goes to show that a monopoly can form under political systems of either extreme, I guess.
Valve are blatantly using this as a starter to "STEAM OS"
I'll get this in early before any announcement: vaporware.
Another possibility is to allow for dynamic linking, but include the .so files as part of the steam infrastructure. You can have a totally sandboxed environment for playing the games (ie, only use those libs provided by steam rather than depending on whatever cruft you have installed on your system) by using LD_LIBRARY_PATH, or you can pick and mix between the game/distro-supplied libs by using LD_PRELOAD. All this can, of course, be hidden behind a graphical game config screen or launcher with simple toggles to choose between game or OS-supplied libs.
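Something like the following toy launcher is what I mean (all paths and names are invented for illustration; this is not Steam's actual layout):

```python
import os
import subprocess

# Hypothetical locations -- adjust to taste.
STEAM_LIBS = "/opt/steam/lib"             # steam-shipped .so files
GAME_BIN = "/opt/steam/games/foo/foo.bin"

def build_env(use_system_libs=False, preload=()):
    """Build the environment for launching a game.

    Sandboxed mode points LD_LIBRARY_PATH at steam's own libraries;
    `preload` lets you pick and mix by forcing named system libs in
    front of steam's copies via LD_PRELOAD.
    """
    env = dict(os.environ)
    if not use_system_libs:
        env["LD_LIBRARY_PATH"] = STEAM_LIBS
    if preload:
        env["LD_PRELOAD"] = ":".join(preload)
    return env

def launch(**kwargs):
    """Start the game with the chosen library setup."""
    return subprocess.Popen([GAME_BIN], env=build_env(**kwargs))
```

The config-screen toggles then just map onto `use_system_libs` and a list of preloads--no relinking or reinstalling required.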
Mayo on anything hot is just weird
I used to think the same, but a tuna melt without mayo just doesn't work. You may need to lightly toast the side of the bread that gets the tuna mixture to keep it from getting too soggy, though.
I know kaminari (雷) is Japanese for "thunder"(*), but what's with the 'o' at the end? Is it actually derived from some Spanish word?
* Jim Breen's JDIC also lists kaminarioyaji (雷親父) as an irascible old man (a Victor Meldrew type, no doubt) but that's surely not what the name is hinting at...
HTF can a one word answer be taken out of context? it IS the context
Read the article a bit more closely. To paraphrase it seems that the Gartner man meant something like "we usually write paid-for reviews, and in this case we weren't paid to say nice things about Metro without a touch interface. Summing up the experience as 'bad' was an obvious thinko on my part in the context of getting paid for further shill work".
Yes, I find it quite astonishing that he'd say something like that, but that's what I read the article as meaning.
The simplest explanation is that 101 in binary is 5 decimal, which is a tacit acceptance that windows "8" is actually two steps backward and in no way an advance on Windows 7.
(sure as every second Star Trek movie is shit, ...)
Reading this I was expecting them to announce a Judge Dredd style "Time Stretcher".
> OK, show us the small gains ... and we'll be pleased yo add them up.
Oh, I don't know. Here are a few productivity boosters I've come across...
^t to transpose the last two letters (if you've typed teh instead of the, for instance). Alt t does the same for words and ^x ^t does it for lines.
^x n n to narrow to a region so all your global (ish) search and replace and reformatting is restricted to just where you want it (^x n w to "widen" again after)
alt / to autocomplete (cycles through all previous words with the same prefix; also works after a word)
abbrev-mode to save typing on common expressions (like fwiw, atm, otoh, etc.). Put in common misspellings that you make to have them corrected (teh, embarrased, whatevar)
^x r m to mark a bookmark at the current point and file, ^x r b to jump to a saved bookmark
^x 2 to split screen into two horizontal panes, ^x 3 to split vertically, ^x 1 to go back to single pane view
^x o to jump to the other (next) pane in a multi-pane display (a good one to assign a keystroke like keypad-enter to)
^ space to set a mark (one end of a region), ^x ^x to jump back to it (the previous point then becomes the mark)
Alt c capitalise word at point (or the region, if set); alt -c capitalises last word (also u and l for all upper and all lower case)
alt $ spellcheck current word
alt a, alt e jump to the start/end of the current/next sentence
^x (, ^x ) define a keyboard macro, ^x e to execute it (or alt number ^x e to do it that many times)
^x tab indent region by default amount (or prefix with alt number for a particular delta)
I don't know about you, but all these features are great time-savers for me. I'm sure I could go on, but I think you get the point.
Indeed, respect to Weird Al. Loved his take on Subterranean Homesick Blues, with the Dylan/Ginsberg style video, where all the lines are palindromes*. Called, fittingly enough, "Bob".
* Doc, note: I dissent. A fast never prevents a fatness. I diet on cod.
It's definitely too late to get that nomenclature out of the minds of regular folk, and even techie sites like this still succumb to using it despite knowing how wrong it is. I think I may have a solution: wherever you'd write "God", simply write "God*" instead. You could put a footnote at the bottom of the article if you wanted (eg, "Not your God", "No relation" or "Yes, we know") but I think it would be even better without the footnote. The asterisk has a fine tradition as a way to let people say "fuck" to prudish audiences, so why can't God* stand in for "Goddamn"? Occasional hilarity from readers confusing God* with a completely irrelevant footnote could be seen as a bonus.