Sinclair Research still around, sort of...
"The year its brand and products were bought by Amstrad and it was shut down for good."
This is closer to the truth than many sources which simply claim Amstrad bought Sinclair Research (and which, to be fair, I had previously thought to be the case).
However, Sinclair Research still existed after 1986- albeit (according to Wikipedia) mainly as an R&D and holding company for the likes of Cambridge Computer, the brand under which the aforementioned Z88 was sold.
FWIW, it's still technically in existence today, though with Clive Sinclair as its only employee and (from what I can tell) operating sporadically whenever Sinclair has a new invention to release.
Re: Anti-Sinclair stitch-up?
Oh, I believe most of the events happened (allowing for the dramatic "compression" of events, which I can understand and accept provided it still broadly reflects the spirit of what actually happened). It was the difference in the very comic way Sinclair was portrayed versus the relatively straight portrayal of almost everyone else (especially Chris Curry) that I felt was unreasonable and gave the programme as a whole a somewhat inconsistent tone.
Still, if Sinclair saw the final product and didn't object to it, I guess he's entitled to his opinion if anyone is!
The only other minor quibble I had was- although the programme wasn't *meant* to be about the market as a whole, but the dynamic and conflict between Sinclair and Acorn/Curry- someone who didn't already know much about the early-80s computer market might be forgiven for thinking Sinclair and Acorn were the only major players. Still, that's a minor issue and it's open to question whether it was the programme-makers' responsibility.
I do appreciate the fact that they did get the major events and facts correct, which isn't something one would always expect(!)
The major problem I had with it was the portrayal of Clive Sinclair.
The programme was (IIRC) essentially billed as a comedy-drama, but that wasn't evenly spread. Most of the "comedy" aspects related to Alexander Armstrong's portrayal of Sinclair. This bordered on an outright *comic* character portrayal- one which would have been at home in an Armstrong and Miller sketch- whereas Chris Curry was portrayed in an essentially straight, dramatic (and dignified) fashion. Which, of course, made "Sinclair" look even sillier, to a point bordering on character assassination.
Now, regardless of whether Sinclair was/is a d**k or not (and not everything I've heard about him has been flattering), I don't honestly believe that he was as comically foaming-at-the-mouth as that. I didn't expect him to be portrayed as a saint if he wasn't- the problem was that portraying him in a totally different manner to everyone else didn't give him a fair crack of the whip.
Maybe this was a deliberate decision, maybe it reflected Armstrong's lack of straight acting skill (and ludicrous bald wig). Regardless of the cause, it was still a major shortcoming- not just in terms of fairness, but in the jarringly inconsistent tone it gave the programme.
Re: Found the CPC a bit of a mixed bag
"That was certainly behind the ludicrously petty way the joystick ports on the Amstrad-designed Spectrums was re-wired."
Yes- the port (an industry standard "Atari" DB9, but with the pins rewired) was trivially simple to convert to Atari-compatible- IIRC- via a dirt-cheap adaptor that simply re-re-wired the connections back to their original positions, allowing the use of almost any joystick on the market at that point.
Given that the Amstrad joysticks my friend got with his +2 were atrociously cheap and nasty, they can't seriously have expected this "ludicrously petty" roadblock to work.
@Mr C Hill
"Between a million Spectrum +3's"
Were there really a million +3s sold? According to Wikipedia, there were 5 million Spectrums in total. I don't recall the +3 being that successful (being piggy-in-the-middle between the cheaper Spectrums and the Atari ST), nor that much software being released on disc for it.
"No Amstrad PC compatible used 3 inch disks."
I think the OP was- understandably- getting confused with the PCW, which despite its name wasn't an "(IBM) PC-compatible" but Amstrad's Z80-based word processing system.
@AC and @Mr C Hill
@AC; "His PCs bombed because he was naive about testing things. Which destroyed his reputation when a hard disk they fitted wasn't compatible with his PCs."
Yes, the Seagate drives fiasco irretrievably damaged Amstrad's reputation in the PC market, but FWIW (a) Amstrad sued Seagate and won, which suggests it wasn't just a testing and compatibility issue and (b) Amstrad had already enjoyed massive success with their original mid-80s PC-1512 and its successors by that point. (Sugar claimed that they had been the European PC market leaders at one point).
In the UK, those were the first PC-compatibles truly affordable enough to be targeted at the home market and- despite later criticism of their nonstandard aspects- arguably established the PC as a mass-market format over here.
@Mr C Hill; "And just how much did the Amiga cost at that time? It was north of 700 quid wasn't it IF you could get hold of one."
Worse than that- the original 1985 Amiga 1000 was US $1300 (sans monitor or HDD) when it launched, so probably translated to a lot *more* than £700 once UK VAT and the usual UK market padding were factored in. (This is probably why the rather more generic but also much more affordable Atari ST was more popular in the early days). The ubiquitous Amiga 500 didn't arrive until 1987, and even that was £500 at launch without a monitor.
This mirrors the situation with the Amiga's spiritual predecessor, the Atari 800 (custom-chip heavy with many of the same design team). That was brilliant and state-of-the-art at the time of its late-70s launch, but it was also bloody expensive.
At any rate, the Amiga was an amazing machine by the standards of the mid-80s, but pricewise wasn't even in the same ballpark as the CPC-464 and friends at the time.
Unfortunately, Sugar also kept the cost of the CPC-464 system down by having it manufactured in the Far East instead of the UK, where many computers were still being made until the mid-80s. He also later transferred Spectrum manufacturing to Taiwan (IIRC) and then China. To be fair, other UK and US manufacturers started doing the same in the mid-80s.
Amstrad was never about state-of-the-art, but to be fair, they built some solid computers at an affordable price using off-the-shelf design.
Re: Nobody remembers Bill Gates saved Apple
Not speaking as an Apple fan- because believe me, I certainly am not- but Gates' "saving" of Apple wasn't for purely altruistic reasons. Rather, it was for a more self-serving and pragmatic one- it suited Microsoft to have Apple around as a not-too-strong "competitor" they could point to when accusations of them being a monopoly came up.
No, that's *not* how Amiga gradients were generated
DaneB: "...amazing gradient textures - on sprites and backgrounds"
Vociferous: "Good artists, good programmers, lots of dithering and the blurring effect of CRT screens [..] various tricks were used to make the ECS chipsets 32 colors seem a lot more."
Sorry, but as far as the "amazing gradient textures" go, this is wrong.
While the tricks you describe *were* used on the Amiga to get the most out of 32-colour palettes in general use, the aforementioned background gradients were achieved by having the graphics co-processor update one or more of the colour registers (i.e. changing the palette itself) every few lines while the picture was being displayed.
Here's an example of that technique applied to a 1-bit (i.e. ostensibly single colour) background:-
AFAIK the Atari ST could do something similar as well, but it only had 512 colours (except the later STe model) so the gradients couldn't be as smooth.
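For the curious, this is roughly what such a copper list looks like. A minimal sketch in C (real-world versions were usually assembler; the band spacing and black-to-blue ramp here are made up, and in actual use the list has to sit in chip RAM and be installed via the graphics library):

```c
#include <stdint.h>

#define COLOR00 0x0180  /* custom-chip offset of palette register 0 (the background) */

/* 16 bands x (WAIT + MOVE instructions, two words each), plus the end marker */
uint16_t copperlist[16 * 4 + 2];

void build_gradient(void) {
    int i = 0;
    for (int band = 0; band < 16; band++) {
        uint16_t vpos = 0x2C + band * 10;      /* PAL display starts around line 0x2C */
        copperlist[i++] = (vpos << 8) | 0x01;  /* WAIT until the beam reaches vpos */
        copperlist[i++] = 0xFFFE;              /* compare against all position bits */
        copperlist[i++] = COLOR00;             /* MOVE the next word into COLOR00... */
        copperlist[i++] = band;                /* ...values 0x000-0x00F: black to blue */
    }
    copperlist[i++] = 0xFFFF;                  /* impossible WAIT position = end of list */
    copperlist[i++] = 0xFFFE;
}
```

Once installed, the copper re-runs the list every frame with no CPU involvement at all, which is why these gradients were effectively free.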
Re: How the Mighty Have Fallen
"Just over 18 months later the TV screen went kack. 6 or so months after that the surround sound went kack."
I bet your daughter wasn't pleased that Samsung "kacked" all over her bedroom, then. :-O
"On the other hand, I know a number of Sony Trinitrons (and some monitors) still going strong."
Yeah, my 1993 Sony Trinitron portable is over 20 years old, has been in regular use for almost all that time (still gets used daily), and has *never* needed to be repaired. Apart from some (very minor at worst) colour fringing only visible on text, it looks pretty much as good as the day I bought it.
I paid quite a bit more for that set (£200) than the cheaper models I was looking at that day, but in retrospect it was worth every damn penny.
Sadly, what I've heard about Sony since the early noughties seems to reflect a significant decline in quality. Even from personal experience: my Dad, for example, had two Sony Walkmans fail when still relatively new (*), and the HDD Sony camcorder he bought a few years back had its screen fail- despite being rarely used- due to bad ribbon-cable design, which turned out to be a common fault. On top of that, I was never impressed with its picture quality even when it *was* working.
To be fair, my Sony "tranny" radio lasted several years, did a good job, and was replaced with one of the same model when it did fail. However, I wouldn't use that good experience as the sole basis to spend £2000 on a Sony TV, or whatever.
(*) The third failed as well, but that's because it was dropped on the ground.
Only requires *20* of the satanically-possessed bears, apparently
Who said that the, er... "effect" required eating the whole bag?
The top-listed review (by Christine E. Torok) on the Amazon page linked in the article (via "carnage") states "Not long after eating about *20* [my emphasis] of these all hell broke loose."
I doubt 20 would be considered "eating to excess" by many people.
Re: Ditched the floppy without supplying a practical replacement
@Kirk Northrop; "I agree that it could be considered jumping the gun somewhat, and that at the time it seemed a very silly move. But it's also fair to say that someone had to do it"
There was no point doing it until a workable alternative was in place (e.g. a CD burner might have been practical two or three years later).
Yes, anyone could see that the floppy was out-of-date and needed replacing, but there were no alternatives at a comparable "base" price point at the time. The only thing that the iMac really encouraged was the adoption of USB.
I don't believe that the iMac forced the decline of the floppy; as I mentioned, everyone rushed out and bought external drives because there was no real, universally-accepted alternative. If the optical drive had been a burner, it might have been a different kettle of fish.
The floppy later declined partly because disc burners got very cheap, but mainly because USB flash memory pen drives did everything people used floppies for but without the ludicrously small capacity.
Re: So that makes three computer businesses flushed...
@AC (9 Jan 23:16); "Sinclair was troubled, which is why he bought it."
Amstrad didn't buy Sinclair Research itself, just the "Sinclair" brand and the existing computer lines (including the Spectrum and QL, the latter of which was discontinued soon after the takeover anyway).
Sinclair Research continued to exist, though mainly as an R&D and holding company (e.g. for "Cambridge Computers", the brand Sinclair's late-80s Z88 portable was sold under). It's still technically in existence, though apparently Clive Sinclair himself is the only employee these days.
Given how famous the "Sinclair" brand was at the time, it's somewhat surprising that Amstrad didn't exploit it more. The only new product they used it on (other than updated versions of the Spectrum and peripherals) was a failed all-in-one home PC in an Atari 520ST / Amiga 500 style case:-
I only vaguely remember hearing about this at the time, but apparently it flopped because it was massively underspecced against the ST and Amiga (and most other PCs).
"hey what wrong with the SID chip!!!"
Think it was a specific reference to the quality of *sampled* sound (which most sound chips on 8-bit home computers- SID included- were never designed for, and it's clever that they got it to work at all).
The technique apparently used was virtually the same as the one I used on my Atari 800XL, and in both cases it only gave 4-bit audio. I can tell you now that unfiltered 4-bit samples on my Atari were *very* coarse, grungey and noisy, even compared to 8-bit Amiga samples, and more so compared to 16-bit CD audio!
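For anyone curious, the guts of the trick look something like this. A hypothetical sketch in cc65-style C (a real player would be hand-timed 6502 assembler, and the pacing is hand-waved here):

```c
#include <stdint.h>

/* POKEY registers on the Atari 8-bit. Setting the "volume only" bit (0x10)
   in a channel's control register turns its 4-bit volume field into a crude DAC. */
#define AUDCTL (*(volatile uint8_t *)0xD208)
#define AUDC1  (*(volatile uint8_t *)0xD201)

void play_4bit(const uint8_t *samples, uint16_t n) {
    uint16_t i;
    AUDCTL = 0;                            /* default channel clocking */
    for (i = 0; i < n; i++) {
        AUDC1 = 0x10 | (samples[i] >> 4);  /* write the top 4 bits of each 8-bit sample */
        /* ...delay here (busy-wait or timer interrupt) to pace the writes at the
           sample rate; only 16 output levels, hence the coarse, grungey sound */
    }
}
```

The C64 equivalent banged the SID's 4-bit master volume register in the same way- same idea, same 4-bit ceiling.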
Re: Duracell vs Energiser
If Wikipedia is to be believed, Duracell lost the 'Bunny' case because they'd failed to renew their trademark.
"The ASA is about as useless as an organisation can get. [..] If the government wants to cut 'whitehall waste' then the ASA is the best place to start."
The ASA is the advertising industry's self-regulatory organisation. On the plus side, this means that there's no "whitehall waste" going to fund it.
On the downside it also explains why it has absolutely no legislative power and is "about as useless as an organisation can get". In theory they can pass things on to the Office of Fair Trading or OFCOM. Whether this simply doesn't happen, or it does and the latter bodies do nothing, is unclear to me.
tl;dr - ST:TNG's crappy analogue source makes digital compression harder
"ST:TNG suffers from a sort of active dithering effect on the backdrops that distracts the eye from the foreground"
The problem with ST:TNG is that- like many US dramas from the late 80s until the digital era- it was edited and mastered on NTSC video. (*) The result is that the picture is disgustingly soft- not just by modern standards, but even to my eyes on a run-of-the-mill UK TV in the early 1990s. (**) It probably didn't matter to the US networks, as it was intended to be shown over the same low-grade NTSC system.
To get to the point, it's horrible, soft video like this that seems to disagree with digital compression the most. I find this surprising, as I'd have thought that the softness would translate to less high frequency information, making it more compressible, but no. It appears that you get the crappiness of the original analogue video *plus* the double whammy of the digital compression barfing on it. Maybe it's because the tapes were noisy. (***)
(*) Albeit with most of the footage originally shot on film- presumably to retain that "filmic" look rather than the clinical feel of analogue video-sourced material. Earlier dramas of this calibre were- AFAIK- shot *and* edited on film, which means that they can be transferred to DVD in much higher quality via the film masters, but newer ones like ST:TNG would require entirely re-editing (and redoing some video-sourced effects).
(**) I appreciate that some quality may have been lost in the NTSC -> PAL transfer of the time, but I've seen enough since to suggest that most of the poor quality was inherent in the original NTSC masters.
(***) As far as digital compression is concerned, noise in general is just high frequency detail- lots of it. Ironically, it wastes lots of bandwidth on this noisy "detail", leaving less for everything else and resulting in blocking. So, as mentioned, you get the double whammy of lots of noise from the original *and* the ill-effects of the digital bandwidth wasted on preserving that noise to the detriment of everything else.
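You can actually demonstrate (***) to yourself with a trivial experiment. This sketch uses zlib's lossless compress() rather than a lossy video codec, and made-up "smooth" and "noisy" buffers, but the underlying point- noise doesn't compress- carries over:

```c
#include <stdio.h>
#include <stdlib.h>
#include <zlib.h>   /* build with -lz */

int main(void) {
    enum { N = 64 * 1024 };
    static unsigned char smooth[N], noisy[N], out[N * 2];
    uLongf outlen;
    int i;

    for (i = 0; i < N; i++) {
        smooth[i] = (unsigned char)(i / 256);     /* a gentle gradient: very compressible */
        noisy[i]  = smooth[i] ^ (rand() & 0x0F);  /* the same signal plus low-level noise */
    }

    outlen = sizeof out;
    compress(out, &outlen, smooth, N);
    printf("smooth: %d -> %lu bytes\n", N, (unsigned long)outlen);

    outlen = sizeof out;
    compress(out, &outlen, noisy, N);
    printf("noisy:  %d -> %lu bytes\n", N, (unsigned long)outlen);
    return 0;
}
```

The noisy buffer comes out vastly bigger despite carrying the "same" picture- bandwidth squandered on preserving the noise, exactly as described above.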
Add your Dr Who and Dalek knockoffs here
Two 8-bit games I once owned featuring somewhat familiar-looking characters. Probably not coincidentally, both are British:-
Escape from Doomworld (Atari 800):-
The Lone Raider (Atari 800):-
Dr Who Adventure (Atari 800):-
(I never played this one myself (only came across it just now while doing a quick search), but I like how even though it's a magazine type-in listing and obviously of that standard- i.e. not great- someone gave it 9/10!)
Muphry's Law strikes again!
"You have 3 minutes to fix the typo"
They'd be better shifting to DAB+
DAB is already old tech anyway. And since DAB hasn't taken off as they wanted, there's not *that* big an established base of DAB radios- actually a blessing in disguise (*) if those in power have the guts to risk offending a few early adopters and (worse) being seen to change their minds. Why? DAB+, that's why.
Since AFAIK a lot of countries are adopting the newer- and more advanced- DAB+ instead, economies of scale will probably result in most new "DAB" radios sold here actually being DAB+ models anyway (i.e. backward compatible) if this isn't the case already.
Thus, they may as well shift to DAB+ as the standard (requiring new sets to support it), but retaining standard DAB transmissions in the short term for existing sets. By the time they switch over, many of the early DAB-only sets will have been replaced by newer models that are DAB+ compatible anyway, and we avoid getting stuck with DAB's early-90s tech.
Unfortunately, I suspect that a number of jobsworth bureaucrats, officials and politicians- having already publicly committed to ye olde DAB and sung its praises as the "latest" tech- won't want to risk looking bad (or upsetting a few disproportionately vocal punters) and will put the kibosh on DAB+ solely to avoid losing face.
(*) Just like the original Freeview (i.e. Standard Definition DVB-T technology) being a success is the opposite- the government spent years persuading everyone to move to SD Freeview, and by the time the switchover formally took place, people were wanting HD, so we had Freeview HD. Except that existing Freeview boxes can't accept the newer HD DVB-T2 transmissions, so the already cramped spectrum is wasted by having to broadcast both SD and HD versions of the same content. OTOH, Ireland went digital later, so all boxes sold for use with *their* "Saorview" service were HD-compatible anyway, no separate SD transmission needed.
Re: LET IT DIE
"My best & still favourite is a old Amiga 2000, souped up, that is imperverious to the bugs that roam"
Er, seriously? No-one's writing exploits for the Amiga 2000 because only about 3 people are likely to be trying to browse the web on one!
Seriously, they were bloody outstanding and powerful machines when they were new (far superior to the contemporary PCs in both hardware and OS terms), but that was the mid to late 80s. The original 68000 based Amigas would already have been underpowered for browsing even almost 20 years ago when the two-pages-of-text-and-a-GIF-or-JPEG-if-you're-lucky web started becoming prominent. I doubt they'd even load anything more than the most basic modern pages.
I'm sure that people are still running Amiga 2000s, but not for serious web browsing! You might be able to target the 27 or so diehard Amiga fanatics running the allegedly "modern" models like the "Amiga One", but those are nothing like the Amiga 2000 or the classic Amigas in general.
Just one question about that headline... What's "GOLD" actually got to do with it?!
Is it because he wants...
Always believe there are holes
He had the power to know
That they are vulnerable
Always believin', he wants...
- Copywrong 1893 Spandex Bollocks
The Kray Twins are currently appearing in "Run for your Wife" at Her Majesty's Theatre. Other 80s new romantic turned white boy soul bands are available.
Did Not Read The Article Properly.
As soon as I reached the end of the article, I *knew* straight away there would be a load of comments from people who didn't read it through properly and missed the point being made.
I was right.
From the article:- "In fact, if Jobs were still around – he died two years ago last Saturday – Robertson would forgive his faults and stay invested in the iDevice maker. "I think if he were still there, I'd still be in it," Robertson said of his investment in Apple. "I think he's one of the great geniuses of the world." "
In short, the reason (he claims) he's selling his stake isn't because Jobs was a dick, per se, it's because he thinks "it was unlikely that a man as really awful as I think that Steve Jobs was could possibly create a great company for the long term."
You might agree or disagree with that, but it's not the hypocritical moralistic reasoning that the article skimmers assumed it was.
Looks like this was meant to be Rover's big comeback...
Sad to say he was clearly too old to make it all the way up. Poor thing hasn't been the same since Patrick McGoohan died... if it had still been 1967, he'd have chased you menacingly all the way along that beach then squashed you, just like the old days.
Er, seriously... impressive stuff, though, look forward to the video!
Re: Affordability my arse
"The iPhone 5C, he said, is "highly recyclable," featuring arsenic-free glass,..."
That's nothing. My Android phone's screen is entirely free of Plutonium 238!
And the back panel contains entirely no Murdertanium 666, a substance that leaches out of plastics and stabs random people in your house in the face at night.
Now, to be fair, Murdertanium 666 doesn't actually exist... but that's how I can be sure that my phone contains none of this highly dangerous and alarming substance. Still, it's a good selling point to Daily Mail readers.
Re: Most of us didn't use CDs until the 90s either...
(Additional note relating to my comment on cassette sales figures above; bear in mind that the US used 8-track carts in the 70s, which died and were replaced by cassettes in the 80s. This didn't apply in the UK, where we never really took 8-track to our hearts, and had already been using cassettes for longer. So the sales figures might be slightly different. Still doesn't affect what I said in relation to CDs, though).
"Who ever bought pre-recorded cassettes?"
I did, when I was a kid.
"Everyone I know listened to tapes made from LPs, which cost the same amount anyway (apart from the extra cost of half a blank C-90). And you still had the LP to make yourself another copy when, not if, the tape got eaten."
I looked after my tapes and over 15 years can count on one hand the number of times the tape came out of the machine. And none of them actually destroyed the tape- worst case was that my TDK D60 of "Queen Greatest Hits" sounded a bit crinkled at the very start.
(I *did* actually do that with the first- and almost last- LP I bought (after I got a Midi system), but my parents bought a CD system a month after that, so I bought CDs *and* copied to tape instead, eventually intending to have my own CD player.)
Most of us didn't use CDs until the 90s either...
"Mullet people did not discovered the CD before the 90's..."
*Most* people didn't start using CDs until around the 90s, full stop. CD was still a relatively expensive high-end format in 1986. Harvey Yuppie and Phil Audiophile probably owned one (along with a copy of the aforementioned "Brothers in Arms") but Joe Public typically didn't.
It would be another couple of years or so before they started to become truly mass-market. One telling sign: the first Now That's What I Call Music album to get a full (*) CD release was Now 10, in late 1987.
It wasn't until 1992 or so that CD sales finally started overtaking the then-dominant prerecorded cassette format. (**)
It's interesting to consider then, that while the technology was available in 1986, it would probably be "cheating" to use it if you wanted to represent the life of a typical family then. Just like how (e.g.) people see 80s phones and think "OMG, those were the bricks we were using in the 80s" when in fact only a few yuppies and well-paid professionals were.
(*) Now 4, 8 and 9 were apparently released in cut-down single disc format.
(**) Interestingly many people- myself included- thought of CDs as having replaced LPs as the "main" format. Yet it turns out (***) that prerecorded cassettes had already overtaken the LP by 1983, and the latter was already in steep decline- long *before* CD sales had become significant. In short, the cassette probably killed off the LP as much as the CD did. This was apparently bolstered by the industry actively trying to kill off the LP format. So while some may note that CD sales overtook the LP by 1988, that would be misleading in terms of establishing the point of market dominance for CDs.
(***) In the US at least, based on these figures:- http://stopmusictheft.wdfiles.com/local--files/music-sales-analysis/100Index600.png
"Who is busy doing R&D on cassette recorders these days? If you’re thinking of Ion Audio, these products are aimed at format transfer convenience rather a serious effort for archivists"
To be fair, I suspect that most El Reg readers (and certainly those interested enough to be reading this article) know it's unlikely that Ion are doing anything that could be described as "R&D" beyond sticking a USB ADC or digitiser onto a low-end generic mechanism and bundling it with Audacity or whatever.
Nor that they generally have a lot going for them beyond convenience- even I (a non-audiophile) have heard enough that I'd go for something better if I still had anything left to digitise!
Re: Fascinating article
I agree that loading software from tape was horribly slow and frustrating and something I'll never be nostalgic for. (*)
But to be fair, the format was never designed for that. It was adopted in the mid-70s as a much more affordable alternative to disk drives and the like (which were *expensive* for home users at that time).
I wonder why the systems that required dedicated decks anyway (e.g. the Atari 8-bit and Commodore 8-bit formats) didn't run the tape at two or three times speed to allow improved frequency response. (**) It'd still have been compatible with standard commercial cassette duplication facilities. The Atari 8-bit had a "stereo" system that could play audio from one channel while data loaded from the other, but IMHO it would have been a good idea to allow both channels to be used for data (i.e. increased throughput).
(*) My Atari 800XL was *horribly* slow when it came to loading from cassette, probably because the original version- the Atari 400 and 800- came out in the era of much smaller memories (i.e. 8 or 16K) and it didn't matter as much for short programs. Excruciating when you were trying to load something that used 48K or 64K though. It was 100 times worse when you got the infamous "LOAD ERROR - Try Other Side".
(**) I appreciate that cassettes weren't designed to be run at high speeds, so there would have been limits. But apparently the late-80s "Pixelvision" camcorder- designed for kids, and based on standard Compact Cassettes- got away with running the cassette *8* times faster to get the necessary bandwidth!
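To put some rough numbers on the slowness (assuming the oft-quoted 600 baud for the standard Atari cassette routines and 10 bits per byte of framing- both from memory, so treat as ballpark):

```c
#include <stdio.h>

int main(void) {
    const double baud = 600.0;          /* assumed Atari 8-bit cassette data rate */
    const double bits_per_byte = 10.0;  /* 8 data bits plus start/stop framing */
    const double bytes = 48 * 1024;     /* a 48K program */
    printf("~%.1f minutes\n", bytes * bits_per_byte / baud / 60.0);
    return 0;
}
```

That comes out at around 13-14 minutes for a 48K load, which squares with my memory. Even doubling the tape speed would have roughly halved it.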
Re: 32K- the BBC Micro's most annoying limitation
Interesting feedback, explains some things- thanks.
I do vaguely remember the sideways RAM thing; we used COMAL (*) for programming at school and IIRC had to load it from the networked hard drive into the sideways RAM.
I have to say that, for all the cleverness of the design, having to limit the main RAM to 32K *maximum* as a consequence was still a major price to pay, essentially crippling the usability of the high-res modes.
The 64K "overshadowing" on the C64 was a limitation, I agree- my 64K Atari 800XL had the same problem. However, the space *was* usable for machine code games in both cases. Not ideal, but not quite a swizz either.
(*) COMAL was supposedly a hybrid of BASIC and Pascal. However, since BBC Basic already incorporated many of the "structured" features of Pascal, COMAL- on the BBC at least- ended up being not significantly different in practice anyway.
32K- the BBC Micro's most annoying limitation
That 32KB RAM limit on the BBC B quickly became its Achilles heel, a pointless limitation on an otherwise powerful (and expensive) machine. If you tried exploiting its rather nice high-resolution modes you were left with sod all RAM. Even having 48K would have made a major difference.
I appreciate that RAM was still expensive when the BBC launched in 1981, but they must have known that prices were falling fast (*) and should- at the very least- have made provision for existing users to upgrade cheaply, and replaced the Model B with a 48K version as soon as it became cost-effective to do so.
I realise that they eventually released a short-lived 64K Model B+, but that was *much* later (only shortly before the 128K BBC Master came out) so by that point the established base was of 32K machines and software had to be written around that.
The Electron lacking the memory-saving Teletext mode shouldn't have been an issue, since for the price it should have had 48K by that point. If you take away the rather nice OS and BASIC- and the marketing glow cast by its association with the high-end BBC Micro- the Electron looks a bit overpriced against its intended competition. (In fact, I came across a contemporary magazine that described the Electron as an underspecced machine that gave the impression of being designed to milk the user via expansion units.)
Despite this, my understanding is that Acorn probably *would* still have sold a lot of them if it had launched in quantity in time for Christmas 1983. (**) However, I've heard it commented that the Electron's inherent limitations would probably have been an issue in the long run.
(*) Apparently Jack Tramiel delayed the release of the Commodore 64 until RAM prices had fallen enough to make its 64KB RAM affordable.
(**) i.e. before the Electron's non-availability led people to buy competitors' machines instead- which of course increased *their* user bases and not Acorn's- rather than later on, when most people already had a computer and Electrons were finally being produced in large quantities just as the market was slumping.
Re: instant fail
I've never seen the show, but they're using hydroFLUORIC acid?
I'd have thought HCl was a much safer bet for their *own* sakes- HF is not only more dangerous in the way that it burns, but it's also very poisonous. You don't want to spill HCl on yourself, but you really, really, *really* don't want to spill HF on yourself.
(Disclaimer: I Am Not A Chemist)
Re: Pumping the price, are they?
"Well, I know a few eBay resellers that are going to have a very merry Christmas with this."
You mean like all the PS3 scalpers who tried exactly that when it launched, thinking they were going to make lots of money, and ended up with piles of unsold (and very expensive) PS3s?
What a pity ;-)
Re: What happened to you, Big Blue?
According to this article, IBM was quite blinkered in this way even in the late 80s, around the time OS/2 was gearing up to be stabbed in the back by MS:-
From page 2:- "The emphasis on cost saving over producing a superior product was scary to watch. Another mantra was “every dollar saved on the cost of a PC is worth millions”, quite literally on posters all over the building."
IIRC, according to Robert X. Cringely's "Accidental Empires", even in the old (pre-90s) days, IBM was a management-centric company, with everyone- or at least a significant proportion of employees- wanting to get into management. This resulted in many unwieldy layers of management and many managers, which was part of the cause of their near-death experience.
Anyway, what happened to IBM? IIRC after almost going bankrupt in the aforementioned 90s, they reinvented themselves as a far more services-and-consultancy-oriented company that seems to be quite good at extracting far more money than they're worth from their victims^w customers.
Re: They'll get used to it
I'm in favour of the Visual Editor in principle; the principle being that people shouldn't be reliant on geek-biased tech skills to edit things in an advanced manner.
The problem I anticipated was that WP uses a *lot* of templates and markup, and often in ways that might not occur to the writers of a visual editor. These will never all be usable via the GUI (*), so the next best thing is to hide them in a way that won't be broken by the visual editor. Unfortunately, this is one of those things that sounds easier than it would be, and I was willing to bet that things would get inadvertently messed up via the (or rather "a") GUI editor.
Lo and behold, when I tested it out, I managed to "accidentally" delete an anchor within a subsection heading (**). The anchor wasn't shown, but it was still there invisibly, and deleted with an extra click of the delete button. The unwary newbie certainly couldn't be blamed as he/she wouldn't even have seen the markup they'd accidentally deleted. That doesn't solve the problem, though.
(*) Realistically, there are a lot of things that are never going to be doable through the visual editor, purely because there are way too many templates in use, and it'd take way more work to create a true GUI representation of each one than it would be worth (far more time than writing the original template took). One *could* have a popup for every unhandled template that showed the list of fields to be populated, but that's not really a true graphical representation, just the old text-centric way with GUI textboxes.
(**) Typically added via the {{anchor}} template, so that if the subsection heading changes, any existing links to that subsection don't break
So the Skinner box maker that's notorious for ripping off other companies' games- to the point of near carbon-copying- is getting pissy about its *own* supposed intellectual property?
Cows are large and scary
Article: "Being killed on purpose is more likely in town"
Do you go to the country? It isn't very far. There's people there who'll hurt you 'cos of who you are.
Re: caused obvious distress and upset to customers trying to eat
Shagbag: "A colon is still "100% beef" as long as it comes from a cow.".
Pink slime (i.e. low-quality connective tissue that's been highly processed and antibacterially treated) isn't legal for human consumption (*)- let alone allowed to be called "beef" (**)- within the EU. That includes the UK, where this incident took place.
(For that reason the US disclaimers are also irrelevant. Aside from the fact that policies often vary in different markets and wouldn't necessarily apply to the EU, they would never have been able to (legally) use pink slime here in the first place).
(*) Whereas in the US they're allowed to include up to 15% pink slime and still call it "ground beef" (their name for "minced beef").
(**) Before American law was changed to make it explicitly legal, one microbiologist in the US Department of Agriculture had apparently stated that "I do not consider the stuff to be ground beef, and I consider allowing it in ground beef to be a form of fraudulent labeling" and that that such connective tissue is not even "meat".
Then they should call it "Wave Function" because it's so unreliable it collapses when you just look at it. (*)
(*) Note; I haven't actually used Fedora 19- but I wanted to make that joke anyway. :-)
AC: "Only by Americans. And morons."
As Mark Twain (*) might say "...but you repeat yourself." (^_^)
(*) Yes, I'm aware of the irony of using a quote from an American to insult Americans. Not my fault that Twain was a moron. ;-P
Purple Ketchup? Been done...
Heinz got there years ago with their "funky purple" (*) EZ-Squirt ketchup. Anyone else remember that? They also did it in other colours such as lurid green. According to this article:-
...this was back in 2000.
(*) What the f*** is it with marketing tossers that anything brightly-coloured aimed at young people is described as "funky"?! To paraphrase Alexei Sayle, have you noticed that anyone who uses the word 'funky', who isn't involved in the music industry is a right twat?
"This was 1990, when most people at home would have been using all sorts of different machines. Amigas, STs, Archimedes and so on. These machines were all good but they all tended to lack something."
Since you mention the Amiga, it's worth pointing out that both its hardware *and* its operating system were in many respects more advanced and modern than Windows, even in 1990, five years after the Amiga's launch.
MS-DOS started life as QDOS ("Quick and Dirty Operating System"), a bought-in, early-1980s 16-bit knockoff, er... workalike of an 8-bit 1970s operating system called CP/M. It was nothing special even then. MS-DOS was upgraded piecemeal over the years with numerous kludges to work round the countless design and architecture limitations of the original PC and OS, which made it more complicated. (The PC itself was made from almost entirely off-the-shelf parts and sold mainly because it was an IBM.)
Windows at that time was just a graphical add-on plastered on top of this text-based OS- more clunkiness for all.
It really grates when people get nostalgic about messing about with DOS config files and say "that's just the way computers were back then". No, *that's* the way computers running a messily-upgraded OS with very dated origins (even by the standards of the time) were. Those config files were only required because of DOS's hackily-upgraded 8-bit-derived design. People who only used PCs back then have a blinkered view, and it's a shame that the Amiga only really enjoyed success as a games machine and niche use in multimedia and video. It was extremely ahead of its time when it first came out (4-channel sampled sound and up to 4096 colours on screen at once).
The Amiga had true pre-emptive multitasking in 1985, whereas Windows 3.0 (1990) still only supported co-operative multitasking (e.g. I remember Windows 3.1 telnet locked up the whole OS when the remote server didn't respond, and didn't relinquish control until the connection timed out after two or three minutes).
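For those who never suffered it, the reason is structural. A sketch in Win16-style C (the function and the telnet scenario are illustrative; GetMessage/DispatchMessage are the real API):

```c
#include <windows.h>

/* Under co-operative multitasking, the OS only regains control when the
   application voluntarily pumps its message queue. */
void message_loop(void) {
    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0)) {  /* Windows 3.x yields to other apps here */
        TranslateMessage(&msg);
        DispatchMessage(&msg);  /* if a handler blocks- say, a telnet client stuck
                                   waiting on a dead server- GetMessage is never
                                   reached again and EVERY app stops responding */
    }
}
```

A pre-emptive kernel like the Amiga's (or any modern OS) simply suspends the stuck task on a timer tick and carries on scheduling everything else.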
Of course, the problem with the Amiga is that Commodore rested on their laurels and only made minor changes to the Amiga OS and architecture until first the PC hardware and then the OS caught up with and overtook it. I wouldn't suggest that it's a viable competitor today (even though it's still being updated as a niche product in order to milk diehard Amiga fanatics). But at the time of Windows 3.0, it *was* better.
Re: It takes a scammer to see another scammer
That's a matter of opinion- people can judge the value of the recreated C64 for themselves. It's not like they were misled, nor that they were charged silly money for the things.
Anyway, regarding the original story... It's true that from an investment point-of-view one has to err on the side of caution. If something gives off the warning signs of *potentially* being a scam, it should be treated as such until there is sufficient evidence otherwise.
At a personal level though, the guy's entitled to the benefit of the doubt. Since it isn't entirely clear that it *was* a would-be scam rather than a badly-thought-through scheme by someone with more enthusiasm than sense, he shouldn't be being flamed as the former. While Ellsworth was probably right to warn people off the scheme, she could perhaps have been less personal about it.
Re: Lazy Fat Americans.
"The reason for the use of HFCS is quite simple. Liquid HFCS is much easier for robotic food processing machinery to work with thatn granular [sucrose] sugar. The slightest humidity tends to clump granular sugar."
If that was true, HFCS would be much more popular outside the US than it actually is. (AFAIK, Japan is the only other major market where it's used that significantly- around a quarter of sweetener consumption there).
The reason for the massive use of HFCS in the US is simple. The corn it's made from is massively subsidised by the US government, meaning the HFCS itself is in effect subsidised and cheaper than it would otherwise be. Sugar tariffs on imports are high, increasing the differential.
Obviously the sugar tariffs will be different elsewhere, and- while I'll admit to ignorance of the actual legal situation- I'm guessing that trade agreements would prohibit HFCS being sold outside the US (or at least outside the NAFTA region) at the same artificially cheap price that makes it popular there.
Flash succeeded where the much-hyped Java Applets failed
As others have said, it's more that this is another of the "thousand cuts"- with the indifferent (rather than indignant) response to the move highlighting Flash's decreasing importance, for good or ill.
As we all know, while Java is still around, Applets themselves never took off for a number of reasons (not least their slow speed and resource hungriness by the standards of the time).
However, we did end up with something that filled almost the same niche (at least from the end-user's point of view)... that "something" was, of course, Flash. Yep, the one-time animation-centric plugin.
This wouldn't be to say that Flash was the primary cause of Java Applets' failure. Truth be told, the latter had already pretty much failed on their own merits by the time Flash had started moving past its early presentational roots.
Remember the use of "Terrorism" Act against Labour Party protester?
"The problem is that no matter how much politicians promise not to use this against the "average" person it will end up being used that way."
This is correct. Regardless of whether or not claims as to legislation's *intended* use are made in good faith or not, experience has shown that this cannot- and must not- be relied upon.
One notorious example is the use of "anti-terrorism" legislation, specifically the Terrorism Act 2000 (introduced under Labour's watch), which was used against an 82-year-old German-Jewish émigré who had heckled Jack Straw at the 2005 Labour Party conference. Specifically, the law was (mis-)used to stop him getting back in:-
Regardless of whether or not one thinks he should have been allowed back in, the fact that a supposed "anti-terrorism" law was able to be used- and *was* used- against someone who clearly wasn't engaging in terrorist activity nor in a terrorist context shows that the law was badly designed (assuming it was designed in good faith) and that the very party who introduced it- and were still in power at the time(*)- couldn't be trusted to ensure that its usage was restricted only to the claimed targets. (**)
Similar arguments apply against the use of "anti-terrorist" legislation used to freeze Icelandic bank assets in the wake of the 2008 Icesave bankruptcy.
Whether or not one thinks action should have been taken against those respective parties, the fact that it was done using "terrorist" legislation is the concern, because neither were remotely "terrorist" and nothing like the targets the legislation was claimed to be aimed at.
A law that can be misused for something not remotely related to its claimed purpose (whether or not one thinks that a *specific* "misuse" is justified) is wide open to blatant abuse for a whole range of purposes, desirable or otherwise.
(This shouldn't be taken specifically as an anti-Labour rant; I despise the Tories, and didn't vote for them. However, many- myself included- assumed that they would (at least partially) stop and roll back Labour's egregious assault on civil liberties and pathological disregard for personal privacy. Instead, they're turning out to be just as bad in this respect).
(*) Whether or not it was the police's choice to misuse the legislation this way, the fact remains that Labour were the ones responsible for introducing legislation that could be misused in the first place.
(**) Of course, this assumes that the party that introduced the legislation remains in power to ensure its "correct" usage. Even if *they* can be trusted to act in good faith and ensure its correct use (and in the above cases, they obviously couldn't), this is irrelevant if and when they lose power.
30th anniversary of every man and his dog releasing a Spectrum-basher
There seem to be a *lot* of these "30th anniversary" look backs at microcomputers just now. That's not surprising though, because it was around this point that the home computer market exploded (due to their becoming cheap enough for the man on the street and not just the rich hobbyist). Everyone saw the money to be made and started jumping on the bandwagon.
There were a frankly ludicrous number of home microcomputers being released back then. I have a load of my Dad's old "Your Computer" magazines circa early 1982 to late 1984, and each month there's a review of at least one new computer, frequently two and sometimes three.
Almost all these machines were incompatible, and even then people cared about having a machine that had good software and peripheral support. It would have been obvious to anyone that the market couldn't and wouldn't support them all and that the vast majority would fail- and they did.
In the UK, the ZX Spectrum dominated mainly because Sinclair was the first to release a colour/sound/hi-res computer at that price point. The network effect made its success self-reinforcing and made it harder for the "me too" competitors like the Oric-1 (and countless lesser-known machines) to break its stranglehold, even after it was outspecced.
The C64 did well at a higher price point, and Amstrad's CPC was surprisingly successful for a late-era entry, but aside from a few lesser-supported and/or niche formats (like the Atari 8-bit and BBC), the vast majority of those other computers had disappeared without trace by the mid-80s, never having gone anywhere.
"Then fork'an do it yourself."
While the ability to fork *is* a major advantage, that doesn't automatically make "Don't like it, then fork it yourself" (or some variant) a reasonable response to any criticism of an open source project- and (IMHO) certainly not to this one. Else you could use that as a comeback to *anything*(!)
I mean, aside from the fact he might not have the skill to do this, are you seriously suggesting that he should fork it just to have his <blink>? Of course not! :-)
Just because something's open and/or free-as-in-beer doesn't negate people's right to criticise minor aspects of it. Obviously, if they start getting overly entitled (particularly if the complaining "user" is a large company), then, yeah... you can tell them to go fork themselves.
He made a reasonable point; one which I (as a generally happy user, actually typing this on Firefox) happen to agree with (though IMHO the default option for blink *should* be "disabled"!!)