81 posts • joined Monday 11th February 2008 13:44 GMT
Then they should call it "Wave Function" because it's so unreliable it collapses when you just look at it. (*)
(*) Note; I haven't actually used Fedora 19- but I wanted to make that joke anyway. :-)
AC: "Only by Americans. And morons."
As Mark Twain (*) might say "...but you repeat yourself." (^_^)
(*) Yes, I'm aware of the irony of using a quote from an American to insult Americans. Not my fault that Twain was a moron. ;-P
Purple Ketchup? Been done...
Heinz got there years ago with their "funky purple" (*) EZ-Squirt ketchup. Anyone else remember that? They also did it in other colours such as lurid green. According to this article:-
...this was back in 2000.
(*) What the f*** is it with marketing tossers that anything brightly-coloured aimed at young people is described as "funky"?! To paraphrase Alexei Sayle, have you noticed that anyone who uses the word 'funky', who isn't involved in the music industry is a right twat?
"This was 1990, when most people at home would have been using all sorts of different machines. Amigas, STs, Archimedes and so on. These machines were all good but they all tended to lack something."
Since you mention the Amiga, it's worth pointing out that both its hardware *and* operating system were in many respects more advanced and modern than Windows, even in 1990, five years after its launch.
MS-DOS started life as QDOS ("Quick and Dirty Operating System"), a bought-in, early-1980s 16-bit knockoff, er... workalike of an 8-bit 1970s operating system called CP/M. It was nothing special even then. MS-DOS was upgraded piecemeal over the years with numerous kludges to work round the countless design and architecture limitations of the original PC and OS, which made it more complicated. (The PC itself was made from almost entirely off-the-shelf parts and sold mainly because it was an IBM.)
Windows at that time was just a graphical add-on plastered on top of this text-based OS- more clunkiness for all.
It really grates when people get nostalgic about messing about with DOS config files and say "that's just the way computers were back then". No, *that's* the way computers running a messily-upgraded OS with very dated origins (even by the standards of the time) were. Those config files were only required because of DOS's hackily-upgraded 8-bit-derived design. People who only used PCs back then have a blinkered view, and it's a shame that the Amiga only really enjoyed success as a games machine and niche use in multimedia and video. It was extremely ahead of its time when it first came out (4-channel sampled sound and up to 4096 colours on screen at once).
The Amiga had true pre-emptive multitasking in 1985, whereas Windows 3.0 (1990) still only supported co-operative multitasking (e.g. I remember Windows 3.1 telnet locked up the whole OS when the remote server didn't respond, and didn't relinquish control until the connection timed out after two or three minutes).
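To illustrate the difference (and why one badly-behaved app could freeze the whole of Windows 3.x), here's a toy cooperative scheduler in Python- purely illustrative, and nothing like the actual internals of either Windows or AmigaOS. Under co-operative multitasking a task only gives up control when it *chooses* to; one task that busy-waits (like that telnet client) stalls everything else:

```python
# Toy cooperative scheduler: tasks only hand back control when they
# explicitly yield, much as Windows 3.x apps only gave up control when
# they returned to the message loop. (Illustrative sketch only.)

def well_behaved(name, steps):
    for i in range(steps):
        yield f"{name}: step {i}"   # politely hands control back each step

def badly_behaved(name):
    # Simulates the Win 3.x telnet client waiting on a dead server:
    # it does a long "blocking" wait without yielding, so under
    # cooperative multitasking nothing else can run until it's done.
    total = sum(range(1_000_000))   # stand-in for the blocking wait
    yield f"{name}: finally done (total={total})"

def cooperative_run(tasks):
    log = []
    while tasks:
        task = tasks.pop(0)
        try:
            log.append(next(task))  # runs until the task *chooses* to yield
            tasks.append(task)      # round-robin: back of the queue
        except StopIteration:
            pass                    # task finished; drop it
    return log

log = cooperative_run([well_behaved("editor", 2),
                       badly_behaved("telnet"),
                       well_behaved("clock", 2)])
print(log)
```

Note that "clock" can't get a look-in until "telnet" has finished its entire wait- with pre-emptive multitasking, the scheduler would have interrupted it regardless.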
Of course, the problem with the Amiga is that Commodore sat on their laurels and made only minor changes to the Amiga OS and architecture until first the PC hardware, then the OS, caught up with and overtook it. I wouldn't suggest that it's a viable competitor today (even though it's still being updated as a niche product in order to milk diehard Amiga fanatics). But at the time of Windows 3.0, it *was* better.
Re: It takes a scammer to see another scammer
That's a matter of opinion- people can judge the value of the recreated C64 for themselves. It's not like they were misled, nor that they were charged silly money for the things.
Anyway, regarding the original story... It's true that from an investment point-of-view one has to err on the side of caution. If something gives off the warning signs of *potentially* being a scam, it should be treated as such until there is sufficient evidence otherwise.
At a personal level though, the guy's entitled to the benefit of the doubt. Since it isn't entirely clear that it *was* a would-be scam rather than a badly-thought-through scheme by someone with more enthusiasm than sense, he shouldn't be being flamed as the former. While Ellsworth was probably right to warn people off the scheme, she could perhaps have been less personal about it.
Re: Lazy Fat Americans.
"The reason for the use of HFCS is quite simple. Liquid HFCS is much easier for robotic food processing machinery to work with thatn granular [sucrose] sugar. The slightest humidity tends to clump granular sugar."
If that was true, HFCS would be much more popular outside the US than it actually is. (AFAIK, Japan is the only other major market where it's used that significantly- around a quarter of sweetener consumption there).
The reason for the massive use of HFCS in the US is simple. The corn it's made from is massively subsidised by the US government, meaning the HFCS itself is in effect subsidised and cheaper than it would otherwise be. Sugar tariffs on imports are high, increasing the differential.
Obviously the sugar tariffs will be different elsewhere and- while I'll admit to ignorance of the actual legal situation- I'm guessing that trade agreements would prevent HFCS being sold outside the US (or at least outside the NAFTA region) at the same artificially cheap price that makes it popular there.
Flash succeeded where the much-hyped Java Applets failed
As others said, it's more a case of "death by a thousand cuts", with the indifferent (rather than indignant) response to the move highlighting Flash's decreasing importance- good or bad.
As we all know, while Java is still around, Applets themselves never took off for a number of reasons (not least their slow speed and resource hungriness by the standards of the time).
However, we did end up with something that filled almost the same niche (at least from the end-user's point of view)... that "something" was, of course, Flash. Yep, the one-time animation-centric plugin.
This isn't to say that Flash was the primary cause of Java Applets' failure. Truth be told, the latter had already pretty much failed on their own merits by the time Flash started moving past its early presentational roots.
Remember the use of "Terrorism" Act against Labour Party protester?
"The problem is that no matter how much politicians promise not to use this against the "average" person it will end up being used that way."
This is correct. Regardless of whether claims as to legislation's *intended* use are made in good faith, experience has shown that this cannot- and must not- be relied upon.
One notorious example is the use of "anti-terrorism" legislation, specifically the Terrorism Act 2000 (introduced under Labour's watch), which was used against an 82-year-old German-Jewish émigré who had heckled Jack Straw at the 2005 Labour Party conference. Specifically, the law was (mis-)used to stop him getting back in:-
Regardless of whether or not one thinks he should have been allowed back in, the fact that a supposed "anti-terrorism" law was able to be used- and *was* used- against someone who clearly wasn't engaging in terrorist activity nor in a terrorist context shows that the law was badly designed (assuming it was designed in good faith) and that the very party who introduced it- and were still in power at the time(*)- couldn't be trusted to ensure that its usage was restricted only to the claimed targets. (**)
Similar arguments apply against the use of "anti-terrorist" legislation used to freeze Icelandic bank assets in the wake of the 2008 Icesave bankruptcy.
Whether or not one thinks action should have been taken against those respective parties, the fact that it was done using "terrorist" legislation is the concern, because neither were remotely "terrorist" and nothing like the targets the legislation was claimed to be aimed at.
A law that can be misused for something not remotely related to its claimed purpose (whether or not one thinks that a *specific* "misuse" is justified) is wide open to blatant abuse for a whole range of purposes, desirable or otherwise.
(This shouldn't be taken specifically as an anti-Labour rant; I despise the Tories, and didn't vote for them. However, many- myself included- assumed that they would (at least partially) stop and roll back Labour's egregious assault on civil liberties and pathological disregard for personal privacy. Instead, they're turning out to be just as bad in this respect).
(*) Whether or not it was the police's choice to misuse the legislation this way, the fact remains that Labour were the ones responsible for introducing legislation that could be misused in the first place.
(**) Of course, this assumes that the party that introduced the legislation remains in power to ensure its "correct" usage. Even if *they* can be trusted to act in good faith and ensure its correct use (and in the above cases, they obviously couldn't), this is irrelevant if and when they lose power.
30th anniversary of every man and his dog releasing a Spectrum-basher
There seem to be a *lot* of these "30th anniversary" look backs at microcomputers just now. That's not surprising though, because it was around this point that the home computer market exploded (due to their becoming cheap enough for the man on the street and not just the rich hobbyist). Everyone saw the money to be made and started jumping on the bandwagon.
There were a frankly ludicrous number of home microcomputers being released back then. I have a load of my Dad's old "Your Computer" magazines circa early 1982 to late 1984, and each month there's a review of at least one new computer, frequently two and sometimes three.
Almost all these machines were incompatible, and even then people cared about having a machine that had good software and peripheral support. It would have been obvious to anyone that the market couldn't and wouldn't support them all and that the vast majority would fail- and they did.
In the UK, the ZX Spectrum dominated mainly because Sinclair was the first to release a colour/sound/hi-res computer at that price point. The network effect made its success self-reinforcing and made it harder for the "me too" competitors like the Oric-1 (and countless lesser-known machines) to break its stranglehold, even after it was outspecced.
The C64 did well at a higher price point, and Amstrad's CPC was surprisingly successful for a late-era entry, but aside from a few lesser-supported and/or niche formats (like the Atari 8-bit and BBC), the vast majority of those other computers had disappeared without trace by the mid-80s, never having gone anywhere.
"Then fork'an do it yourself."
While the ability to fork *is* a major advantage, that doesn't automatically make "Don't like it, then fork it yourself" (or some variant) a reasonable response to any criticism of an open source project, and (IMHO) certainly not to this one. Else you could use that as a comeback to *anything*(!)
I mean, aside from the fact he might not have the skill to do this, are you seriously suggesting that he should fork it just to have his <blink>? Of course not! :-)
Just because something's open and/or free-as-in-beer doesn't negate people's right to criticise minor aspects of it. Obviously, if they start getting overly entitled (particularly if the complaining "user" is a large company), then, yeah... you can tell them to go fork themselves.
He made a reasonable point; one which I (as a generally happy user, actually typing this on Firefox) happen to agree with (though IMHO the default option for blink *should* be "disabled"!!)
Re: Real reason for its removal was a security issue...
*reads thread through properly*
*realises his quick text search (to double-check that no-one had already made this reference) had missed mIRCat's comment*
"Damn you! Damn you to hell!"
Real reason for its removal was a security issue...
Mozilla's support of <blink> made it vulnerable to attacks from Weeping Angels.
Re: Cunning plan
"That's not a very nice thing to say.."
Why? The guy was king of Denmark, England, Norway and Sweden, after all...
Re: Inner Mongolia Baotou Steel Rare Earth Hi-Tech Company
"Back in the days when I was at school, ["Inner Mongolia Baotou Steel Rare Earth Hi-Tech Company"] is what I would have called our band"
Yeah, but I'm willing to bet the record company would have forced you to shorten it to "Hi-Tech" to get a deal anyway. :-(
That's assuming they didn't force you to ditch it altogether and give you a bland, marketing exec chosen name like "The Noise". :-6
Mind you, this isn't always as bad as I made it sound- who would disagree that the band "Seymour" were done a favour by having to change their name to "Blur"?
Douglas Adams thought the idea ludicrous 30 years ago, and this version is no better
This sounds oddly reminiscent of a Douglas Adams interview I once read, that was given around *30* years ago:-
“[MIT] were showing me some research they were doing on video telephones. They reckoned that everybody has a number of people they regularly speak to on the telephone. At your telephone you could have a small computer, storing video pictures of those people. When somebody rang you, a phonetic program would find the right picture and move the mouth in time with the words.
“They were very pleased with this [but] if you look at that logically you’ll see that this is not increasing communication — it is actually decreasing it.
“If you talk to somebody on the telephone your attention is concentrated on what they are saying. When you talk to somebody face to face or even on a television screen you get the message partly from their gestures and the expression on their face. But if you are seeing a picture which is not giving you any additional information the two impressions are totally contradictory.
“If someone rings up to say ‘Oh God, I’ve just gone bankrupt’ or ‘My wife’s run off' and you have this bright, smiling picture with the lips moving in an utterly grotesque way, it is not actually helping you to understand what the person is saying."
“The whole project is ludicrous and self-defeating but I couldn’t get the researchers at MIT to understand that.”
Now, this present-day equivalent is- in theory- attempting to match the mood of the person to the head. But in practice, it's still closer to Adams' ludicrous example above. It's applying a generic, pre-packaged, pre-defined expression to the face that will convey none of the subtleties of *your* real expression and how that conveys your *actual* emotions. In short, it tells you nothing more than a single word or phrase describing your mood would.
In this way, despite its superficial improvement over MIT's early-80s example, it has *exactly* the same problem- it actually *decreases* communication by distracting from the content with misleading visual content.
We like to hear him swear on the TV, hee hee hee...
AceRimmer:"Is that Alexei Sayle the renowned metal and wood worker?"
Actually, to be fair to Alexei, the exact quote as I remembered it- and which Google seems to back up- was:-
"Anyone who uses the word 'workshop', who isn't involved in light engineering, (*) is a right twat."
This lends itself nicely to paraphrasing too, e.g. to anyone outside the music industry using the word "funky" as a cliched marketing attempt to make brightly-coloured soft furnishings sound cool (^_^)
(*) On reflection, one could be pedantic about use of the word "engineering" here, but the meaning (i.e. light industry) was obvious enough, and it was funny, so I don't care :-P
Somewhere between a "good faith" release and an ashcan copy?
"Only about 100 copies of the four-CD set were produced, with sparse packaging and an insert listing the details of the set’s 86 tracks, all previously unreleased studio outtakes and live recordings from 1962 and 1963. "
100 copies is still very low, even for a grossly overpriced Dylan-obsessive-fanboy-milking release (there are plenty of them out there).
FWIW, I wonder if each of the copies has (somehow) been given its own watermark so that they can identify where any leaked illegal copies might have come from? Whether it would be practical to trace ownership and assign blame of even 100 copies though, is questionable.
(Reason I ask is that if they produced and sold few enough, it might theoretically be possible to stop the recordings getting anywhere near the "general public" even if they had- legally- "released" them. Especially if they'd (say) agreed to buy them back from the pre-arranged buyers, i.e. de facto record company employees).
So, while this gives the impression of being something akin to an "ashcan release" (*), one wonders why- since they had to sell it anyway, and decided to sell 100- they didn't just manufacture and sell even more of them and take the money, even if that wasn't the reason for releasing it.
(*) i.e. Something released solely to fulfil a legal obligation to avoid losing rights, and not a "good faith" attempt to genuinely sell it. The definition might not apply precisely here, but the general principle is along the same lines:- http://en.wikipedia.org/wiki/Ashcan_copy
Amazon's "helpful" and "not helpful" rendered worthless by fanboy abuse
"Best to have a user based moderation of reviews to see if they helpful or not (e.g., Amazon)"
Amazon's "useful review" meta-rating system has been abused into worthlessness to the point that I'd rather they simply removed it altogether.
A critical review of almost anything- by a musician or author, or of a television series, etc.- with a notable fan following is likely to be modded as "not helpful" by fanboys/fangirls of the creator. It's entirely predictable, and it's also blatantly obvious this has nothing to do with how helpful or informative the review is (or isn't) and everything to do with partisan fans punishing dissenting views they don't like.
If Amazon actually cared about the usefulness of the "helpful"/"not helpful" meta-ratings, they'd have figured out a way around this by now. As it stands, they haven't, and it's worthless.
Even in cases where this isn't likely to apply, I never bother checking whether someone else considered a review "helpful" or not. I can judge that for myself.
Re: If i upload something that i don't own
"The terms leave them able to say "Bakunin licensed us to use it for commercial purposes, and we did so in good faith, if there was no permission to use the likeness of the model, Bakunin should not have granted us the license" "
Is there an explicit indication that the agreement includes a model release, or is this just implied (or perceived by you to be implied)?
Even if it was accepted that it did, and that it was valid, there's a problem.
It's obvious that, for a service aimed at the general public, a significant proportion of people either (a) won't have read the agreement and so won't be aware of what's in it, (b) might have read it, but won't have understood it and/or the *implications* of what they were agreeing to, and/or (c) won't care about the copyright status of any uploaded random crap anyway.
This is obvious to me, so it wouldn't be remotely plausible for Instagram (or Facebook)- billion dollar companies with presumably massive legal resources- to argue in court that it hadn't occurred to them.
Regardless of whether one could argue that the end users agreed to the terms and *should* have known what they were doing, it wouldn't change the fact that Instagram/Facebook were (I'm guessing) on the hook for any copyright violations or incorrect model releases when they would have known damn well in advance that it would happen.
IANAL, but I doubt Instagram/Facebook could simply wash their hands of responsibility if they weren't (at least) pre-screening and verifying material, regardless of that clause. (Anyone with an appropriate legal background care to confirm if this is correct or not?)
Re: If i upload something that i don't own
"The case that's specifically interesting is what happens if I take a picture of someone, which Instagram then use to sell advertising?"
My first thought was of this infamous case from a few years back:-
Having re-read that article, it's not clear that the issue wasn't also one of copyright (the uploader to Flickr who "granted" permission for reuse wasn't the photographer/owner). But the point in question is that even if copyright permission *had* been legitimately granted by the owner, the ad agency could probably still have been sued by the girl in the picture because they didn't have a model release.
"I assume the line "Instagram does not claim ownership of ... " is some kind of legalese that leaves you responsible for the image but they get to profit off it."
Trust me, I'm sure a competent lawyer will find some way to argue that, regardless of any attempt to weasel their way out of responsibility through pseudo-legalistic disclaimers, they're still on the hook. A particular motivator being the fact that Instagram/Facebook are the ones with all the money, not the random sod that uploaded it.
(Particularly as the affected person wasn't the one who agreed to such questionable terms and conditions?)
Re: Laserdisc *is* analogue
"Sidenote: Even the AC-3 signal on Laserdiscs is stored in a RF-signal and you need a demodulator to turn it into digital."
In that case, it's still a digital signal though; you don't "turn" it back to digital (which would imply that it had been converted to analogue then re-digitised). If the system is intentionally designed such that the digital source is modulated, and can later be recovered in its original form via demodulation, then it's still a digital signal.
Digital signals can be modulated, transmitted, whatever. They might be affected by noise, but that doesn't change that they're meant to be digital, e.g. with digital radio transmissions. That's not to say that they can't be corrupted (*), but that provided the damage isn't in excess of the system's operating limits and/or error recovery systems, the original "perfect" signal can still be retrieved.
This contrasts with Laserdisc's encoded video, which- despite the use of pits and lands to encode it- remained entirely analogue from start to finish, for the reasons given above.
(*) In the simplest digital encoding, we might represent a "1" with a 100% signal level and a "0" with 0%. (**) Some noise might get into that, but even with some background hiss we know that a 10% level is probably still meant to be a 0, and 90% or 110% is still probably a 1. On the other hand, if we get a signal at 50%, is that a corrupted "1" or a "0"? And a 0 might be pushed up to a 1 by a burst of high-volume interference. We can spot such errors using a checksum, and recover from some minor corruption using digital error correction.
(**) Most digital encoding and transmission schemes are *far* more complex than this though :-)
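To make the footnote above concrete, here's a minimal Python sketch of that simplest scheme (and nothing like any real transmission system): bits sent as 0%/100% levels, a bit of added noise, a 50% decision threshold at the receiver, and a trivial one-bit parity "checksum":

```python
import random

random.seed(42)  # repeatable "noise" for the demo

def transmit(bits, noise=0.1):
    """Send 0/1 as 0%/100% signal levels, with additive random noise."""
    return [b + random.uniform(-noise, noise) for b in bits]

def receive(levels):
    """Threshold at 50%: anything above reads as 1, anything below as 0."""
    return [1 if level > 0.5 else 0 for level in levels]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
parity = sum(bits) % 2                 # trivial one-bit checksum

recovered = receive(transmit(bits, noise=0.1))
print(recovered == bits)               # mild noise: perfect recovery
print(sum(recovered) % 2 == parity)    # ...and the parity check agrees
```

With noise of only +/-10%, a "1" always lands between 90% and 110% and a "0" between -10% and 10%, so the threshold always recovers the original bits perfectly- that's the "perfect signal" retrieval described above. Crank the noise past the 50% margin, though, and bits start flipping; the parity check can then flag (some of) the damage.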
Laserdisc *is* analogue
"This isn't really true. LaserDiscs were every bit as digital as the CD is. Unlike DVD or Video CDs though, they simply stored a digitised SVHS signal."
The main video signal on Laserdisc *is* an entirely analogue (i.e. non-digitised) representation of a composite video signal.
It's true that the Laserdisc video signal is stored via a series of discontinuous (on/off) pits and lands. That sounds fundamentally digital, doesn't it? (Especially as digital CDs and DVDs also have physically similar pits and lands.)
However, it's not. The signal is encoded via the *length* and *spacing* of those alternating pits and lands (using analogue "pulse width modulation"). Since that length/spacing is fully variable and non-quantised, that makes it entirely analogue, and not digital.
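The distinction can be sketched in a few lines of Python. On a CD, pit/land run lengths are quantised to whole multiples of one clock period T (legal runs are 3T to 11T), so a slightly mis-read length snaps back to a discrete symbol; on Laserdisc the length varies continuously with the video signal, so every tiny difference in length *is* signal. (The `laserdisc_level` mapping below is a made-up toy for illustration- the real disc frequency-modulates a composite video signal- but the quantised-vs-continuous contrast is the point.)

```python
T = 1.0  # one clock period (arbitrary units)

def cd_decode(run_length):
    """Digital: snap the measured run length to the nearest legal multiple of T."""
    n = round(run_length / T)
    return max(3, min(11, n))      # CD run lengths are quantised to 3T..11T

def laserdisc_level(run_length, min_len=3.0, max_len=11.0):
    """Analogue (toy mapping): the exact, unquantised length itself is the signal."""
    return (run_length - min_len) / (max_len - min_len)

# A tiny read error changes nothing for the CD...
print(cd_decode(5.0), cd_decode(5.04))               # both snap to 5

# ...but directly changes the recovered analogue value on Laserdisc:
print(laserdisc_level(5.0), laserdisc_level(5.04))   # two different levels
```

That's the whole argument in miniature: quantised lengths give you a finite symbol set you can recover exactly (digital); fully-variable lengths mean the readout tracks every perturbation (analogue).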
Re: Ah memories...
"I wonder if that boss is still around and now berating his staff because the company is not on Facebook? Same shit, different year....."
The difference is that people do at least use Facebook at a level matching the hype (much as I dislike it personally).
OTOH, even at its peak, the media obsession with Second Life far, *far* outweighed the number of people who ever actually used it.
Its prominence was probably because it fitted a cool-looking cyberpunk vision of the future and was suited to self-publicising navel-gazing from the few who actually did take part. This would explain its appeal to TV and media journalists who were worried about being seen as out of touch and missing the next big tech thing. They wanted to get there first.
But actual real-world even-my-Granny-is-on-it levels of popularity? It never even got close.
Re: An improvement
"At least with this remake you can tell the difference between the front & the arse."
You said pretty much what I was going to say.
It always struck me that, viewed side-on, the old "new" Beetle was near-symmetric, whereas the original definitely wasn't- at a basic level, they were actually quite different shapes. The "new(er)" Beetle succeeded in that people would look at it and know it was meant to be a Beetle, but put next to its namesake it was quite clearly different.
This new one at least more closely resembles the original in this respect, but it also has undeniable Porsche 911 vibes. Maybe not surprising, as both came out of the same (Porsche) design dynasty and are ultimately variations on the same basic theme, albeit taken in totally different directions.
Of course, it's still an overpriced pastiche, a total lifestyle-accessory sellout of the original "people's car" concept. Then again, one probably doesn't want to be *too* purist about respecting its origins- which were, let's not forget- as a Nazi-commissioned car for the "peoples" of the Master Race by a guy who also designed military hardware for them. :-O
Credit where credit's due
"Sensing the shift to digital, Telextext Ltd made an early leap onto the web."
Good grief.... it's not just that Teletext (the system itself, not the company (*)) was fully digital (**) anyway, it's that it was probably the *first* widespread digital service- or digital anything!- aimed at the home user.
Under normal circumstances, this might be considered pedantry, if one is the sort of mouth-breather that considers criticising misuse of the word "digital" to mean "online" instead of... er, "digital" to be pedantic.
But despite all the nostalgia, this is one thing that few- if any- people have given it credit for, despite it arguably being the most important thing about it. Digital in the mid-70s, aimed at the man on the street, years before CDs were launched and the closest thing to a home computer was the Altair 8800 hobbyist kit (screen display and keyboard optional). Information on demand- primitive and limited by modern standards, but still amazing for the mid-70s, and the first spark of the modern information-drenched age.
(*) What f***wit let them get away with launching a service with the same name as the technology itself? It's like calling a new TV station "TV station" or "Television".
(**) What *is* it about the word "digital" that people use it to mean online, or downloadable or whatever? CDs are digital copies. DVDs are digital copies. Blu-Rays are ******* digital copies.
Captcha: "Describe in single words only the good things that come into your mind about your mother."
Response: Server shot to pieces
Conclusion: We're guessing that this one was a bot. Though it might have been someone sick of trying to decipher illegible captchas.
Re: Element 119 and beyond? Islands of stability ...
"and then to find two isotopes which are available in sufficient quantities and at affordable price that can be combined to make it."
Given that this experiment involved firing zinc ions (element 30) at bismuth (element 83), am I safe in assuming that it was a simple case of 30 + 83 = 113 protons for the new element?
And that if we wanted to create element 115 by firing zinc ions in the same way, then the other element used would have to be element 85, i.e. Astatine? That is, an element that doesn't exist naturally, only via radioactive decay of other elements, is incredibly unstable in its own right- its longest-lived isotope has a half-life of 8.5 hours- and that has never been seen by the naked eye because (according to Wikipedia) "a mass large enough [for that] would be immediately vaporized by the heat generated by its own radioactivity".
Yes, I can see that this would make astatine *slightly* more difficult to work with in a similar setup than bismuth. :-)
Of course, I guess they could try other combinations of elements- they'd have to- but assuming I got that correct, I guess it illustrates your second point quite well. :-)
Re: ROI not part of UK.
Interestingly, I was going to say the same thing until I checked and realised that the ROI are switching *their* telly over on the 24th of October as well. (Can I assume that this isn't just coincidence and that the UK and Irish governments coordinated their switchovers for technical reasons?)
Anyway, that surprised me, as I'd thought digital terrestrial only launched in the ROI a couple of years back... and I was apparently right. They just haven't taken an eternity to make the switch (unlike the UK).
One major advantage- for them- of having switched later is that their system is *all* MPEG-4 (the newer and more efficient standard), meaning they don't need separate technologies and incompatible boxes for SD and HD, and shouldn't even require simulcasts of HD material (as the boxes should be able to downscale HD as required).
Whereas in the UK, our original SD digital terrestrial service dates back to the late 90s, and the established base of equipment doesn't support Freeview HD transmissions, so it's space-wasting simulcasts all the way.
Re: Color me unsurprised...
I only read Wired (the US edition) (*) a few times circa 2000-2001. I liked the idea of it- in part influenced by its reputation- but never really took to it.
Later I realised that- rather than the tech/science magazine it presented itself as- it was fundamentally a glossy business (**) and lifestyle-oriented magazine aimed at the sort of people who liked the *idea* of being into technology and science, but who really weren't when it came down to it.
From what I remember, some of their articles *were* quite long and gave the impression of being serious and in-depth. The problem was that- despite this- once you got to the end, you realised that you'd actually learned (or rather, been told) relatively little of substance about the tech or science from the article, or indeed about anything. It was a US-centric, startup-fetishising wank/bloat-fest.
Maybe this got worse after Conde Nast took it over in the late 90s, but I suspect it was always overhyped.
To be fair, I've read some interesting stuff on the Wired website- which I gather was under separate ownership for a long time- but I've still no desire to buy the mag again.
(*) Never read the UK edition, so can't judge that
(**) Specifically the tech business, granted, but it was still about that particular business, rather than the tech itself
Re: Torvalds is turning out to be
Plan 9 has been out for years- you're thinking of the GNU Hurd.
It's quite possible that if Linux had never existed, more developers might have spent time on the Hurd instead. So one could *possibly* make the case that Hurd was a victim of Linux's success and that if the latter had never existed, Hurd *might* have come further by now.
Whether it would still have come anywhere near as far as Linux, or represented the breakthrough of free software as Linux did is more open to question (ironically, given that it was Stallman who started the Free Software movement!)
What happened to Orabile and T-Mange?
Yeah, I would have voted for "Orabile" too. Either that, or the other suggestion from the same user (bluesxman), "T-Mange".
Re: They're finished
"Kodak's last hope was as a small niche player in the b/w film market (like Ilford"
Kodak was always a large, mass-market company, and a niche market like black-and-white film- even if profitable- won't be able to save them in their current form. It's like comparing a small paper shop turning a decent profit (*for its size*) versus a large but obsolete supermarket trying to keep itself open on the profits made by the small newspaper section at the front of the store. Those profits wouldn't even come close to covering the whole superstore's running costs.
Kodak will not survive in its current form. They are a large company built primarily around a mass consumer film market that no longer exists. For that reason, there would be no point restructuring them to continue their old business (as happened with General Motors).
Ironically, I'm guessing that Kodak had to reach the point of legal bankruptcy before it could make the changes necessary for the organisation to survive in *any* form- the amount of investment, the associated employees and the legal liabilities tied to their existing (mass-market, film-based) operations would otherwise probably have made this impossible.
And once this restructuring has been done, the question is whether there would be any point in trying to keep the "old" Kodak (or anything like it) together when the core reason for having all those (formerly) secondary parts in one company disappeared with the mass market for film.
I suspect the various parts will be split up and sold off. The name will certainly remain, possibly licensed to whoever buys out and continues the (professional and niche amateur) remains of Kodak's film business, and also quite probably whored out to random distributors who want to exploit the name recognition to shift generic electronic tat (a la "Polaroid" LCD televisions, and similar cases with countless other defunct company brands).
Effective Flat design? Harder than it looks to pull off...
In many respects, I'm a fan of the return to the "flat", clean look of graphic design, where it's done well. For example:-
The Warner Communications logo? Great design:-
The original Atari logo is one of my all-time favourites:-
However, the downside of this becoming a trend is that it's harder than it looks to come up with a logo that looks great when shorn of pretty shading effects and the like. In going for that back-to-basics look, it's all too easy for a less-skilled designer to end up with something that simply looks underdesigned, boring and/or amateur.
(FWIW, this goes for the "flat" design trend in general, and not just logos.)
For example, the new Office 2013 logo in white against red is (IMHO) quite effective. On the other hand, the new Microsoft logo seen in the video (with its four coloured squares) looks underdesigned and dull. Meanwhile, while I like the minimalist white-and-cyan colours, the Windows 8 logo itself is boring *and* borders on amateurish, because the chosen perspective renders the window's cross with (too-coincidentally perfect, naff-looking) 90-degree angles. I still can't believe they accepted something that bad. Saul Bass it ain't.
I actually thought the current Windows Phone logo- with its flattened and boxed version of the XP "wavy" Window- was much more effective:-
Re: This post complies with all ten rules.
Some people might view that as making a fool of the law and/or inciting law-breaking behaviour by implying that only things outside the law are worth doing. I'm sure that a skilled lawyer or f***witted jobsworth at Robin Hood airport can have you done for at least one of them.
Please remain where you are until the authorities come and arrest you.
Nope, Baby Einstein paid for the privilege
Rob Dobs:"You can trademark something that is in public domain, but you have to specify what goods you are selling under that Mark. This is how [..] Baby Einstein uses old Albert's name."
Actually Baby Einstein *didn't* (and presumably couldn't) do that. In fact, they pay a whole load of money for the privilege of giving their questionable learning aids a spurious association with Einstein's name. From the Wikipedia article:-
"The Baby Einstein Company pays a significant amount of money to Corbis, on behalf of the estate of renowned physicist Albert Einstein, for the use of the Einstein name, though the products have virtually nothing to do with Einstein or his work (however, Disney uses a disclaimer that Einstein is a trademark of The Hebrew University of Jerusalem)."
Re: The reason I bought a SNES
My brother had a SNES, and (aside from the occasional borrowed game) this was the *only* game he ever owned for it. He sold it after a year, but even if he hadn't, I'd have said he got his money's worth from the console- he played the damn thing to death!
Re: (c) teh interwebz
"It's a reminder that the state has more power than any industry lobby when it comes to nicking your rights."
Surely the whole point of industry lobbying is that it *is* a means to gain power by influencing the thinking of government (and hence ultimately policy and lawmaking) in their favour?
That's not to say that governments aren't quite capable of passing crap and/or unfair laws for other reasons, but the whole point of *lobbying* is to co-opt government for the lobbyists' benefit.
In this particular case, the decision blatantly smacks of Google's behind-the-scenes influence.
Today's shiny new Smart TV = Tomorrow's dated set you're already wanting to replace
"This means my [several weeks old] list is already out of date and whatever I buy now will be incompatible, non-upgradable tat by the time it gets delivered."
Implying that even a shiny new, (currently) up-to-date smart TV will be in the same boat very, very soon. Bearing in mind the author's apparent horror of tech that *isn't* bang up-to-date, why does he want something that'll be shiny and friend-impressing for a brief period, and then annoyingly out-of-date for the much longer remainder of its life?
I thought the geeks' take on Smart TVs was that most weren't all that "smart", didn't do what they did particularly well, were locked in to manufacturers' favoured proprietary services and peripherals, and (as the article implies) generally didn't get much in the way of manufacturer upgrades- rendering the "smart" parts cheesily dated tat within a couple of years, even if the display was still fine. This last bit suits the manufacturers, who would rather sell you a new TV anyway.
I've heard it said it makes a lot more sense to get a no-frills, high-quality TV and use (easily replaceable) external boxes for all the "smart" stuff- probably doing it better than the Smart TVs themselves would have.
Obviously not so good for margins-squeezed TV manufacturers who wanted to grab back some profit and differentiation via the Smart TV functionality, but that's business...
"Seems to be very little point in spending a lot of money on an SSD until they make them from memristors. Although it may be 5-10 years."
Are you serious? Storage technology (HDD or SSD) is generally rendered obsolete or out-of-date faster than that anyway, but it doesn't stop people buying it.
By the time these devices might- or might not- be available "5-10 years" from now, the SSD you "spent a lot of money on" today will likely *already* have been rendered unimpressive by much faster and cheaper developments of regular storage- and it'll probably have been replaced and discarded at least once, if not twice.
If this technology pans out and comes to fruition, then great! I wouldn't put off my own SSD purchase today though.
Feel free to put off *your* purchase though, and be prepared to wait further if- as often happens- a promising technology hits a roadblock. I do hope you didn't put off buying a floppy drive in the early-80s because bubble memory was on the horizon. :-)
Re: @Sony NEVER learn
No, it's because 50 Hz (i.e. fields per second) is the native rate of UK video, and it's impossible to convert to this from 60 Hz footage without introducing some judder. (See my comment above for more on this).
The light flicker can be an issue- I notice that at least one camcorder includes an algorithm to reduce it, presumably by compensating for the brightness variation per frame or field.
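To illustrate the judder point with a toy sketch (not how any real standards converter works- proper ones blend or motion-compensate fields rather than simply dropping them): if you pick the nearest 60 Hz source field for each 50 Hz output field, ten source fields per second get thrown away at regular intervals, and that periodic hiccup is exactly what you see as judder.

```python
# Toy illustration of 60 Hz -> 50 Hz field-rate conversion by
# nearest-field selection (real converters blend/interpolate instead).
def nearest_field_map(out_rate=50, in_rate=60):
    """For each output field, pick the index of the nearest source field."""
    return [round(i * in_rate / out_rate) for i in range(out_rate)]

mapping = nearest_field_map()
dropped = sorted(set(range(60)) - set(mapping))
print(f"{len(dropped)} of 60 source fields are dropped every second: {dropped}")
# -> 10 of 60 source fields are dropped every second: [3, 9, 15, 21, 27, 33, 39, 45, 51, 57]
```

Going the other way (50 Hz material on a 60 Hz display) has the mirror-image problem: fields have to be repeated instead of dropped.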
Re: Also, does it matter ...
"16oz=1pint oh, no, hang on, where are we ? Oh, yes, the UK, sorry. 20fl oz= 1 pint."
Of course, since you implicitly mentioned the UK versus the US fluid ounces, it's worth remembering that the "pints" they're based on are different anyway! A US pint is only 473 ml versus 568 ml... that's a major difference and point of confusion.
(Ironically, this means the two fluid ounces are actually closer in size than the pints are, since the US ounce is a larger fraction of a smaller pint!)
If non-metric units are intuitively "right" as some suggest, then how can this apply to both (differently-sized) pints? Surely one of them must "feel" wrong- but both the Americans and the British seem to be quite happy with their pints, suggesting that it's as much down to familiarity as anything. Both are either side of a half-litre anyway...
Also, the US pint is apparently based on an *older* version of the English pint, whereas the larger Imperial pint was based on a later (early 19th-century) standardisation, so the US pint should be the more historically-grounded, closer-to-its-roots "correct" one.
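The arithmetic behind the footnote is easy to check. Using the standard conversion figures (1 US pint = 473.176 ml over 16 fl oz; 1 Imperial pint = 568.261 ml over 20 fl oz)- these numbers come from the published definitions, not this thread:

```python
# Check the claim that the two fluid ounces are closer in size than the pints.
US_PINT_ML, UK_PINT_ML = 473.176, 568.261   # standard conversion figures

us_floz = US_PINT_ML / 16   # ~29.57 ml
uk_floz = UK_PINT_ML / 20   # ~28.41 ml

print(f"US fl oz: {us_floz:.2f} ml, Imperial fl oz: {uk_floz:.2f} ml")
print(f"Pints differ by {UK_PINT_ML / US_PINT_ML - 1:.1%}, "
      f"fluid ounces by only {us_floz / uk_floz - 1:.1%}")
# -> Pints differ by 20.1%, fluid ounces by only 4.1%
```

So the pints are a fifth apart, but the ounces only about four per cent- the smaller-pint/larger-fraction effect described above.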
Does it have a radio as well?
Someone ought to tell the cameraman he's been using daylight bulbs with tungsten film... Seriously, I know that Hollywood is in love with that overdone turquoise look, but *that* is just ridiculous.
As for the whole sideboard concept- yeah, it's a bit retro, but not in a way I think has any cachet at the minute. It just looks like a quaint relic from some middle class home of the 60s. And is the integration so important that it's worth going with some unknown noname brand?
Not poisoning my body with something so passé!
Er, soup came back into fashion a few years back- that means that it must surely be "out" again by now. Really, how inconsiderate can you get, expecting those poor hipsters to eat something that's *so* 2007.
Better get them some cupcakes instead- oh hang on, that was last year's fad too, wasn't it?
Oh, and make sure that's artisan bread from a small family bakery, not Tesco's mass-produced Chorleywood rubbish.
Re: Expect the scalpers and chancers to be out in force
Yeah, but remember what happened at the launch of the PS3 five years back? All the greedy little would-be scalpers got their fingers burned: they couldn't shift the large numbers of expensive units at the obscene markup they'd hoped for, and ended up making little or no money for their efforts.
And gamers all over the world cried in sympathy with them- oh wait, no, I meant that they laughed at them.
"I am not that AC but as I read it, the analogy never said every meal they served, did it? It said the hired a mediocre chef while implying they still charged premium rates.
No analogy is perfect but that is why they are analogies not the original thing."
The OP was talking about how in the (big-budget, mainstream) games development industry, a business's entire future can hinge on a *single* big product.
The analogy failed at a basic level because the example it gave (for another industry) was a situation where this *wasn't* the case, for reasons I already stated. It wasn't merely imperfect, it was fundamentally flawed!
Also, I'd argue that it wasn't meant as an analogy, but as a (randomly-picked) direct counter-example which meant to demonstrate that the same situation applied in another industry- except that it obviously didn't!
It's certainly true that computer game development isn't the only industry where that situation applies- but it's definitely not universal throughout all industries, nor even true in the majority of cases as was implied.
Big budget computer game development is *not* typical of all industries
"In every industry there is the risk that if you fuck up and deliver a mediocre or crap product you go bust."
Not always true, at least not to the same extent as in computer games development. For example...
"If an expensive restaurant hires a mediocre chef they lose customers. Why should software be any different?"
This isn't a good analogy- well, not for you- because it actually demonstrates the point I wanted to make.
Unless they piss off a very important or very influential customer (or mess up *extraordinarily* badly!) the future of a restaurant *isn't* normally at risk with every meal they serve.
Such a scenario might be bad for business, but shouldn't be fatal if managed correctly. After one, two or a few substandard meals (and evidently unhappy customers), they have the chance to correct their mistake (e.g. replace the chef, apologise profusely to their loyal customers and/or whatever).
Even in the notoriously fickle pop industry, artists can sometimes come back from a flop single or even album.
By contrast, moderately-sized developers who've spent literally years and millions of pounds on a single big-budget game are very often reliant on that game being a success for their continued survival. (One recent example was the 2010 demise of Scottish developer "Realtime Worlds" when APB flopped).
That's why it's an industry I'm glad I never had any interest in working in.
Satellites *are* inherently geolocation-locked... just fuzzy at the edges!
"Best thing about SKY is that you can point a dish into the ether from anywhere."
Of course- you can point it anywhere you please.
However, if you're in (say) Australia- or indeed anywhere too far outside the satellite's transmission beam- you're not going to have much luck picking up transmissions aimed at the UK.
BBC and C64 both had strengths, but in clearly different areas
It's undeniable that the C64 was far more graphically suited to games than the BBC (and the sound was definitely better), except- as Torben Mogensen said- for processor-heavy games like Elite.
(Plus, despite the C64's limited palette, its colours don't seem to limit the graphics as obviously as the garish primaries of the BBC or the Spectrum do- even if it *would* have been better with the Atari 800's 128/256 colours.)
However, the BBC had nice crisp RGB output and a 640 x 256 high-resolution mode (which AFAIK the C64 lacked), making it more suited to serious use.
It also had a far more impressive BASIC and a more powerful OS in general (cf. the crude C64 BASIC), a faster CPU (*), usable disk drives and better expandability- along with better overall design and potential for "serious" use. Shame about the lack of RAM on the BBC B, though. (**)
Basically, IMHO, both machines had clear strengths, but in distinctly different areas with little overlap.
(*) I find it ironic that C= bought the maker of the 6502 and 6510, yet the C64's 6510 was clocked slower than the 6502s in the BBC and the Atari 8-bits (roughly 1 MHz, versus 2 MHz and 1.8 MHz respectively). I don't know how far the minor differences between the 6510 and the 6502 compensate, but AFAICT it still wasn't anywhere near enough to make up the gap.
(**) I appreciate that RAM was expensive back when the BBC B- and even the C64!- were launched, but that 32K was notoriously limiting when hi-res graphics were in use, and a major constraint on the machine's power. Given that the BBC was already an expensive machine, the cost of the extra RAM would have been proportionately smaller and less of a big deal than on (e.g.) the Spectrum- and even *that* was available in a 48K version! They should have upgraded the BBC B to 64K early on. (Apparently a 64K "B+" was briefly released shortly before the 128K Master line replaced it, but that was years later.)
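For a rough sense of what the clock difference in footnote (*) means in practice, here's a back-of-envelope cycle budget per 50 Hz PAL frame. The clock figures are approximate nominal rates (the PAL C64 actually ran at about 0.985 MHz), and this ignores effects like the VIC-II stealing cycles from the C64's CPU, so treat it as ballpark only:

```python
# Ballpark CPU cycles available per 50 Hz PAL frame for the machines discussed.
# Clocks are approximate nominal rates; real throughput varies (e.g. video DMA).
clocks_mhz = {
    "C64 (6510)": 0.985,        # PAL C64 clock, roughly the "1 MHz" in the text
    "Atari 800 (6502)": 1.79,
    "BBC Micro (6502)": 2.0,
}
for machine, mhz in clocks_mhz.items():
    cycles = mhz * 1_000_000 / 50
    print(f"{machine}: ~{cycles:,.0f} cycles per frame")
```

So the BBC had roughly double the raw per-frame cycle budget of the C64- which goes some way towards explaining why processor-heavy games like Elite ran better on it.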