Quite why Americans would feel the need to pre-cook back bacon in particular beats me.
I've noticed that some (photo paper) prints from the 1970s seem to tend towards red and magenta.
I'm aware that movie film "prints" (i.e. projectable positive film made from movie film negatives) produced from the early 1950s onwards turned out to have bad problems with fading over time- caused by the instability of the colour dyes used. This started coming to light around the late 1970s.
So I've no idea if the problems with still photo prints are caused by the same issue, and when they fixed them. (#) At any rate, I'm saying that it possibly wasn't the fault of that specific processing lab.
(#) My own photo prints- from 1983 onwards- seemed fine the last time I checked aside from some slight warming/browning, which I think might have been a property of the base paper- rather than the dyes- and was correctable via simple colour balance adjustment.
@Smoking Man; That's not surprising- printers need Inc.
Didn't the Amiga's blitter (not included in the regular ST, IIRC) give it a slight advantage when drawing that offset the marginal difference in raw speed?
The STe? Nice idea on paper, but the two biggest (and most obviously Amiga-baiting) sales points- the improved palette and sampled sound- were hobbled by the retention of the existing 16-colour on screen limit in normal use (#), and by the restriction on playback rates which meant that- unlike the Amiga- you couldn't generate all the notes in an octave from a single sample.
I mentioned more about the STe here.
(#) Compared to 32 regular colours in normal use on the Amiga, plus 64 colour "extra half-brite" and 4096 colour HAM modes.
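To put rough numbers on the playback-rate point, here's a sketch in Python. The four STe DMA replay rates and Paula's PAL clock constant are the commonly documented figures; the helper functions and the 8 kHz example sample are my own illustration:

```python
# Rough sketch: why a handful of fixed playback rates can't cover a
# chromatic scale, while a period-register design can. The STe's DMA
# sound offers four fixed replay rates (commonly documented as roughly
# 6258, 12517, 25033 and 50066 Hz); the Amiga's Paula chip derives each
# channel's rate from an integer period register (PAL: rate = 3546895 /
# period), so almost any rate is reachable.

STE_RATES = [6258, 12517, 25033, 50066]
PAL_CLOCK = 3546895  # Amiga PAL clock constant for Paula audio

def semitone_rates(base_rate, octave_steps=12):
    """Playback rates needed to pitch one sample across an octave."""
    return [base_rate * 2 ** (n / 12) for n in range(octave_steps + 1)]

def nearest_amiga_rate(target):
    """Paula only takes integer periods; find the closest real rate."""
    period = max(1, round(PAL_CLOCK / target))
    return PAL_CLOCK / period

wanted = semitone_rates(8000)  # e.g. a sample recorded at 8 kHz
amiga_error = max(abs(nearest_amiga_rate(r) - r) / r for r in wanted)
ste_error = max(min(abs(s - r) / r for s in STE_RATES) for r in wanted)

print(f"worst-case Amiga pitch error: {amiga_error:.4%}")  # well under 1%
print(f"worst-case STe pitch error:   {ste_error:.4%}")    # around 30%
```

In other words, Paula can land within a fraction of a percent of every note in the octave from one sample, while the STe's nearest fixed rate can be the best part of a third of an octave out.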
@I ain't Spartacus; "Says the man who had an Amstrad PCW9256"
That's not worthy of contempt, just sympathy... ;-)
(Joking aside, as far as I'm aware, the PCWs were only ever intended- and sold as- specialised systems designed for word processing and light office work at the lowest possible price, so it's probably neither fair nor meaningful to compare them against general-purpose computers like the Amiga. The fact that some people got them to play games- albeit in glorious green-screen- is more a bonus than anything).
Either way, we all know that people are only going to be buying the ST because they can't afford an Amiga.
(Why yes, I *am* trying to reignite a 25+ year old playground battle!)
@Martin Gregorie; "'The Right Stuff' is a much better book than you'd expect if you've only see the rather pathetic film."
I've only heard the song, and I was even less impressed with *that*.
"the moon flew off into space. And whos fault is it?"
The person who decided it was okay to leave all that dangerously explosive nuclear waste on the far side.
@Timmy B; Oh, a *sarcasmometer*? That sounds like a *really* good idea, I'm sure.
"While the VC debt model is clearly flawed, the biggest issue is that it makes a company unable to address changing conditions"
Funny you should say that, as that exact reason has been pinpointed as a likely contributor to Toys R Us' failure. Yes, they probably *did* suffer due to their failure to move with the times (which lazy mainstream media reports parroted as the cause of their demise), but as others noted, this is quite likely because it's hard to invest in- and concentrate on- required changes like that when you're trying to keep your head above water servicing the owner-imposed debt in the first place.
(That said, I also read somewhere that Toys R Us' UK operation wasn't doing all *that* badly by market standards and its demise- shortly before the US stores also went under- was due to the American parent sucking cash out of it.)
Sorry, that should read "shamelessly overpriced".
Their prices certainly weren't "shamless". :-O
I was in about five years ago, and I saw the cost of a switch- something that would have cost around a quid on mail order. IIRC it was something between £4 and £6, and the words I remember quite literally thinking weren't "overpriced" but "shamlessly overpriced".
From what I overheard of the staff there, I got the impression they were genuinely enthusiastic and helpful, but the business itself had obviously forced itself into becoming mass-market and uncompetitive.
@Joe W; So, you know for a fact that the people who complained to the ASA about "pseudo-cheese" (#) were the same people who buy margarine because it's "more healthier" (##), or did you just assume that?
And you're also entirely sure that they complained about it for health reasons and not- say- because they didn't appreciate being sold a low-quality substitute as the real thing? Nor possibly that they objected on principle to something being described as something it wasn't, even if they didn't have any strong objections to imitation-cheese-flavour-food-product per se?
(#) I'm assuming this is who "the same people" referred to.
(##) "More healthier", ha ha, those idiotic people can't even speak right! Especially not when you're putting your own, intentionally stupid choice of words into their mouths.
> "***Computer trying to dial a number***"
> "***Random Computer dialup noises***"
@Captain Scarlet; A *real* geek would have been able to accurately imitate the sound of the modem at the other end and fool the computer into thinking everything had gone well. Though you might have had to drop to 14,400bps. (I'm a reasonable man and don't expect miracles.)
You'd also be able to cunningly implement a man-in-the-middle attack, so long as you could work out how to interpret and modify the data in your head. In real time.
Whatever happened to the "3D printers are going to change the world and everyone is going to have one!!!111" hype?
Even before I spotted references to the initial dates (2012-13) my initial reaction was "hmm... 3D guns, wasn't all that fuss a few years ago now?" followed by "come to think of it, you don't hear nearly as much about 3D printers these days, do you?"
Perhaps someone realised that a machine that can make a few plastic cogs very slowly *wasn't* going to let Jo Average 3D-print a new car engine from the comfort of her own home after all?
The most stupid example of 3D printer hype I read- in a "reputable" news site IIRC- was that we wouldn't have to bother with the environmentally-unfriendly manufacture, transport and packaging of food products like apple pies, because you'd be able to do it at home with the appropriate ingredient packets.
Aside from being a tedious PITA doing it that way, where the #### did they think the ingredients (which will have to be already mostly pre-manufactured) were going to come from?!!!
> "That's fine for them, but what about all the other contributors to the software who also don't receive any revenue from its use?"
Unless they're complete mugs, I'd assume they're not likely to contribute to Redis' subsequent non-Free (#) versions unless they're compensated appropriately. (That's assuming they're prepared to work with Redis at all).
Meanwhile, they'll also have learned a lesson about the risks of contributing to a project where the owner could do something like this- either because they required signing over copyright (which would be necessary under the GPL, but I'm not sure about Apache) or because the license permitted it.
They still have the option of forking the current (pre-proprietary) version, and I'm guessing this is what most of the existing non-Redis contributors will work on in future.
(#) You say that 'technically they're not "closing source"', but- assuming that's purely the opposite of "open sourcing"- it depends on how you define the latter in the first place. AFAIK the *intended* definition was broadly similar to that of "free software", albeit with different ideological connotations.
The fact that the term "open source" has been subsequently interpreted more literally by some- merely "source is openly available"- is what's led us into the position where we're discussing whether a system that is no longer by any reasonable measure "free"- i.e. effectively proprietary- is still somehow "open source". It's not, in the intended sense, but the ambiguity is one I'm sure the proponents of the term "free software" would argue is a problem.
> "Kudos to whoever it was who rebranded Pepper's Ghost as a 'hologram'."
The one good thing about this embarrassing attempt by the Tories to get down with the kids is that it provides good ammo to point out that they don't have a clue what a "hologram" actually is and that their idea of the latest technology is a 150-year-old relic from the Victorian era masquerading as something modern.
Much like the Tories themselves.
@Herring`; I didn't mean to imply that (strictly speaking) it was a structural engineering matter. However, it was close enough to make the point that even in a life-or-death matter of construction, then yes- you *can* still get such worthless, ill-informed, self-serving people weighing in on the decision. With even more serious consequences than you get with the typical IT fuck-up.
"I'm not sure that we'll ever get to the bottom of who should actually carry the can for that one."
If we don't, then that's the most damning thing of all; that a decision on the fundamental safety of building materials can be made with no-one knowing who's responsible- or no-one *having* to be responsible- for approving the decision and its safety. (The aforementioned Tory councillors made the decision- and have blood on their hands for the results- but they should *not* have been in the position of being able to get it through without that happening).
@ Herring`; I wouldn't be so sure about that. What about the decision to swap fireproof panels for flammable ones purely to save money made by councillors on the Tory-run Kensington and Chelsea council for Grenfell Tower?
(Speaking of which, ever notice that it's conveniently slipping by unnoticed that *still* no-one has been held properly accountable for that decision? You'd have thought that in any fit-for-the-twenty-first-century building code there would be a requirement for someone who knew what they were doing to legally "sign off"- and be responsible for- a decision like that. (#) Then again, it turns out that the Tories were pushing to *loosen* building codes shortly before the Grenfell fire.)
(#) Regardless of what a bunch of odious Tories (whose only interest was the purely cosmetic improvement of an aesthetically-unpleasing sign of The Poors in the middle of their well-off burgh) thought of it.
In their defence, they probably hadn't been planning on selling it on the early-17th-century Swedish market anyway.
@Boothy; I agree. I've said more than once that if the PDA market hadn't been in decline for several years before the iPhone came out, it's possible that something like the post-Apple smartphone would have evolved from that direction instead- or at least it would have been marketed as such, rather than as a "phone".
> Everything is analogue if you go down far enough
...and if you go down further than that, it's all digital/quantised again! (^_^)
(Disclaimer; yes, I know some physicist will probably come along and point out that this is misleading, inaccurate or oversimplified).
> The Raspberry Pi is supposed to be the modern take on BBC Micro (6502).
In the sense that it's educationally-oriented, possibly. On the other hand, the Raspberry Pi is very cheap, which- for all that I liked them- was *never* something you could say about the BBC Micro.
The 1981 launch prices of £235 and £335 for the Model A and B respectively are equivalent to £940 and £1340 in today's money. And that was *without* disk drives or the obligatory Microvitec Cub monitor...!
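The arithmetic behind those figures is straightforward- a quick sketch in Python, using the roughly 4x 1981-to-present inflation multiplier the numbers above imply (the multiplier is an approximation, not an official index value):

```python
# Back-of-the-envelope check of the inflation-adjusted prices above,
# assuming the roughly 4x 1981-to-present UK inflation multiplier
# implied by the quoted figures.

INFLATION_1981_TO_NOW = 4.0  # approximate multiplier, not an official value

def adjust(launch_price):
    """Convert a 1981 launch price to present-day pounds (nearest £10)."""
    return round(launch_price * INFLATION_1981_TO_NOW / 10) * 10

print(adjust(235))  # Model A -> 940
print(adjust(335))  # Model B -> 1340
```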
> cream wobbly; "The C64 also benefited from the SID audio synth-on-a-chip. Unlike "chip music" from other 8-bit micros which typically waggled a DAC around to make PWM noise"
(Edit:- @ThomH; If I'd refreshed the page before posting this, I'd have seen that you'd already made much the same point in your reply!)
That's true as far as the original (pre-128K) Spectrum goes- along with some other machines (IIRC the Apple II and Dragon 32). However, it's far from accurate to imply that DAC waggling was "typical" of the majority of 8-bit computers.
Many had separate sound chips:-
- The Atari 800 (which came out in 1979) had a four-channel custom chip called POKEY.
- Several 8-bit computers used the General Instrument AY-3-8912 sound chip, including the Amstrad CPC, the Spectrum 128 (though admittedly that came later on), the Oric-1 and Atmos, and MSX.
- Several more used the Texas Instruments SN76489, including their own TI-99/4A, the BBC Micro and the Coleco Adam (and the ColecoVision console it was based on)
Others had sound generation integrated into multi-function custom chips:-
- Commodore's own VIC 20 (i.e. the direct predecessor to the C64) already included tone generation facilities as part of the VIC chip
- Similarly, the Commodore 16 and Plus/4 included tone generation within the TED chip
- Even the relatively primitive Atari VCS (admittedly not a personal computer) had two-channel audio generation as part of the TIA chip.
The point here isn't whether or not these were up to the standard of SID. It's that they were separate sound generation facilities that- like the C64's- freed the CPU to do other things.
> "Obviously emacs is better than vi."
Well, it's certainly more fully-featured. About the only thing it lacks is a decent text editor...
> I'm really looking forward to the Vega QL+
I'd buy that for a dollar!
No, really, that's all I'd risk on it at this point.
I should also add that- although I'm far less familiar with the C64 than the Atari 800- as far as I'm aware, the former also benefits from custom hardware scrolling, hardware sprites and character-based graphics that allow it to outperform (e.g.) the Spectrum on most games despite its slower CPU.
Unless it's a CPU intensive game that doesn't benefit from such features (e.g. 3D games) in which case it'll suffer.
I should also add that the 6809 was apparently superior to both... but unfortunately let down by being paired with uninspiring supporting hardware in its most famous applications. (The Tandy CoCo and its near-clone, the Dragon 32, both featured the same dated graphics chip as the Acorn Atom, and the sound was similarly limited.)
As far as I'm aware, for a 6502 of a given clock speed, you could expect broadly equivalent performance from a Z80 clocked at double that speed or slightly more.
In other words, the Atari 800's 1.79 MHz 6502 would have been roughly equivalent to the Spectrum's 3.5 MHz Z80 (#), the BBC Micro's 2 MHz 6502 a little faster... and the C64's 1 MHz 6502 was still on the slow side.
(#) Though the Atari's custom graphics hardware (including hardware scrolling, hardware sprites and multicolour character-based modes) meant that it didn't have to inefficiently use CPU cycles on certain things that the simpler Spectrum would have needed to.
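The rule of thumb above, as arithmetic- a sketch in Python, where the 2x factor is the rough estimate from the text rather than a measured benchmark:

```python
# Rule of thumb: treat a 6502 as broadly equivalent to a Z80 clocked at
# about twice the speed. The 2.0 factor is a rough estimate, not a
# measured cycles-per-instruction figure.

Z80_PER_6502 = 2.0

def z80_equivalent_mhz(mhz_6502):
    """Z80 clock that would give broadly similar throughput."""
    return mhz_6502 * Z80_PER_6502

machines = {
    "Atari 800 (1.79 MHz 6502)": 1.79,
    "BBC Micro (2.0 MHz 6502)": 2.0,
    "C64 (~1 MHz 6502)": 1.0,
}
for name, mhz in machines.items():
    print(f"{name} ~= {z80_equivalent_mhz(mhz):.2f} MHz Z80")
# For comparison, the Spectrum's Z80 ran at 3.5 MHz.
```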
Thank you, Richard Speed(!)
You inadvertently killed 72 techies by improperly shutting down their VMs?! That's one badly-designed system...
Did the consoles explode in a shower of electrical sparks a la Star Trek?
> If you eat a single banana, you'd get more radiation exposure
It's true that the level of radiation being discussed in this story is tiny, and nothing to worry about.
That said, since we're discussing the banana equivalent dose, I'd point out that it's misleading. It rests upon the fact that bananas contain potassium, of which a very small percentage (in nature) is the radioactive isotope potassium-40.
However, your body doesn't retain potassium much beyond the amount it needs; anything in excess will be excreted via the usual channels. (#) Thus, unless you were deficient to begin with, eating a banana isn't going to noticeably increase the amount of potassium- and hence radioactive potassium-40- in your body, which will remain fairly constant. (Hence, in turn, the (incredibly low) level of radiation that it exposes you to should also remain constant.)
In short, the radioactive potassium from bananas doesn't "build up" inside your body if you eat more of them, in contrast to other radioactive substances that can accumulate in your bones et al.
(#) It doesn't really matter whether the potassium it got rid of is the existing or "new" stuff, as it has a half-life of just over a billion years.
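To put rough numbers on the above- a sketch in Python. The physical constants (K-40 half-life and abundance) are standard; the ~140g of body potassium for an adult and ~0.45g per banana are typical textbook figures, not measurements:

```python
import math

# Back-of-the-envelope potassium-40 activity. Constants are standard
# physical values; the body-potassium and per-banana masses are typical
# textbook figures used for illustration.

AVOGADRO = 6.022e23
K40_HALF_LIFE_S = 1.25e9 * 3.156e7   # ~1.25 billion years, in seconds
K40_ABUNDANCE = 1.17e-4              # fraction of natural potassium
K_MOLAR_MASS = 39.1                  # g/mol

def k40_activity_bq(grams_potassium):
    """Decays per second from a given mass of natural potassium."""
    atoms_k40 = grams_potassium / K_MOLAR_MASS * AVOGADRO * K40_ABUNDANCE
    decay_const = math.log(2) / K40_HALF_LIFE_S
    return atoms_k40 * decay_const

body = k40_activity_bq(140)     # ~70 kg adult; potassium held constant
banana = k40_activity_bq(0.45)  # transient: the excess is excreted

print(f"steady body activity: {body:.0f} Bq")    # roughly 4400 Bq
print(f"one banana's worth:   {banana:.0f} Bq")  # roughly 14 Bq
```

The point being that the ~14 Bq a banana briefly adds is a rounding error on the ~4,400 Bq your body ticks along at anyway- and it doesn't accumulate.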
It's a bug bounty, which I assume is one of the poorer-selling variants.
Personally, I prefer the milk chocolate one.
(Ironically, your comment reminded me I had half a Bounty beside my desk, which I ate as I typed that out...)
"Because 18:9 sounds much better than 16:9"
That's because it *is* better. But it's still not as good as the 180:9 screen I have on my upcoming smartphone which is obviously *ten* times better than even that. I call this a "Ludicrously Widescreen (TM)" display.
It comes in 7" and 10" versions. But remember, that's 7" and 10" Ludicrously Widescreen (TM), so it's much better than Apple's and Samsung's.
(I know there are people out there who probably *would* buy this "impressive" sounding device on spec... and I'd like to see their faces when they realised what "180:9" actually implied in the context of a 10" display. They'd soon understand why I'd chosen to name my phone... the Sumshite Stykk.)
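For anyone who can't be bothered to do the trigonometry themselves, here's the arithmetic behind the gag- what "180:9" (i.e. 20:1) actually means for a screen quoted by diagonal size:

```python
import math

# What a 20:1 (a.k.a. "180:9") aspect ratio implies for a screen sold
# by its diagonal measurement.

def screen_dimensions(diagonal_inches, ratio_w, ratio_h):
    """Width and height of a screen given diagonal and aspect ratio."""
    scale = diagonal_inches / math.hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

w, h = screen_dimensions(10, 180, 9)
print(f'a 10" 180:9 display is {w:.1f}" wide but only {h:.2f}" tall')
```

Nearly ten inches wide, half an inch tall. A stick, in other words.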
The standalone pocket/handheld TV market is all but dead, and as far as I can tell that's as much to do with the switch to digital terrestrial as anything.
There's a video on YouTube, "Whatever happened to handheld TVs?" that looks at a handheld television made by some no-name company designed for regular DVB-T (the terrestrial TV standard in Europe and some other parts of the world) and it's... less than impressive. It might simply be that the device is shite, but given my experience with other portable aerials, it's more likely that DVB-T just isn't suited to portable use.
That- along possibly with the need for decoding by low-powered devices- would explain why they created the separate DVB-H standard for handheld devices. However, that was around a decade ago and as far as I can tell, it never got past the trial stage in the UK, and flopped elsewhere.
Even if handheld digital TV had taken off, I suspect it's something that would have been integrated into smartphones by now, rather than being sold as a standalone device. As things are, most people are just going to watch "TV" over their phone's standard TCP/IP connection.
@Anonymous Coward; "You can hardly go wrong"
Well, not unless you take into account the fact that any entity large enough to invade the US- even after an incident like that- is going to have to (a) be a nation state and (b) be obviously identifiable.
And that even if that nation wasn't identifiable as the source of the attack beforehand, it's going to be treated as such as soon as it tries to invade and likely subject to retaliatory action from the US's own nuclear weapons.
But, yeah. Apart from that "you can hardly go wrong".
Of course, we can get into nitpicking wankery and it's true that- as you possibly intended- it's theoretically possible that the invading country had nothing to do with the original nuclear attack and were merely taking advantage of it.
But then, given that they'd need to "just happen" to have been in a position to mount a full-scale invasion of the US in the wake of the attack, how much benefit of the doubt do you really think a freshly-nuked US would cut them?
And frankly, do you think the US would care? Even if the invaders *did* provably have nothing to do with the original nuclear attack, how likely do you think it is that a completely vulnerable US would let them take advantage of it without responding against the invading country with their own nuclear weapons anyway? Particularly as it's likely to be the only option left to them.
All assuming people would be calmly sitting down and considering such academic smartasseries under such circumstances. How likely do you think that is? (Spoiler; not remotely).
That aside, it's a great idea whose flaws were only spotted by the massed ranks^w rank of one other random guy on the Internet spending a minute thinking about it.
I'm pretty sure the makers of Alphabetti Spaghetti never intended anyone to do *that* with it.
@PhilipN; What a bizarre non sequitur, but well spotted anyway!
Looks like he's performing at The Cresset, which is apparently a venue in Peterborough (which ties in with this video coming from Cambridgeshire Police). (#) Unfortunately, you've missed the show on Feb 17th, 2017, but he's apparently doing another in October this year.
If you'd have told me thirty years ago I'd have been able to look up this sort of thing online from a barely legible poster I'd also seen online, I'd have been utterly gobsmacked!
(Also, I was surprised to find out that Wilde is almost 80, but then, he was famous in the late 1950s which is sixty(!) years ago).
@TechnicalBen; I rephrased my comment slightly after posting, so I apologise if (as I assume) you were replying to what I'd originally said.
And yes, you're correct- the Slate article (the second link) makes clear that indeed the moon *would* be tidally broken (which I should have added myself), "So we wouldn’t even have a Moon; we’d have a thick debris ring composed of ex-Moon. That would be cool to see, too, except for the whole everyone being dead thing."
Which is also nice.
@DougS; This video is an artist's impression of what it would look like even closer than *that*- specifically, if the surface of the moon was at the distance from earth of the International Space Station (circa 400km, which would put the moon's centre around 2137km from the surface, given its roughly 1737km radius).
Except that- as you already realised- in reality, it wouldn't because- leaving aside the fact the video is slightly speeded up (the moon would take more like five minutes to cross the sky)- the tsunamis generated by the tidal forces at *that* distance would have waves literally kilometres high and running for your life probably wouldn't do much good.
Not to mention that- as also spotted by Brewster's Angle Grinder- the earth would be stretched, leading to huge earthquakes and increased heating resulting in volcanism that would probably boil the oceans away anyway, so you'd probably die due to lava rather than flood.
Which is nice.
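For anyone who wants to check the "tidally broken" and "kilometres-high tsunamis" claims, here's a back-of-the-envelope sketch in Python. The radii, densities and mean lunar distance are standard textbook figures, and the rigid-body Roche coefficient of ~1.26 is itself an idealisation:

```python
# Back-of-the-envelope check on a moon whose surface sits at ISS
# altitude. All figures are standard textbook values; the rigid-body
# Roche coefficient (~1.26) is an idealisation.

EARTH_RADIUS_KM = 6371
MOON_RADIUS_KM = 1737
MOON_DISTANCE_KM = 384_400  # actual mean centre-to-centre distance
EARTH_DENSITY = 5.51        # g/cm^3
MOON_DENSITY = 3.34

# Moon's surface at ISS altitude (~400 km) puts its centre here,
# measured from Earth's centre:
close_distance = EARTH_RADIUS_KM + 400 + MOON_RADIUS_KM

# Tidal force falls off as 1/d^3, so moving the moon in scales it by:
tidal_multiplier = (MOON_DISTANCE_KM / close_distance) ** 3

# Rigid-body Roche limit: d = 1.26 * R_earth * (rho_earth/rho_moon)^(1/3)
roche_km = 1.26 * EARTH_RADIUS_KM * (EARTH_DENSITY / MOON_DENSITY) ** (1 / 3)

print(f"tidal forces ~{tidal_multiplier:,.0f}x today's")
print(f"Roche limit ~{roche_km:,.0f} km; moon's centre at {close_distance:,} km")
```

Tidal forces somewhere around ninety thousand times stronger than today's, with the moon's centre sitting *inside* even the rigid-body Roche limit- hence the debris ring rather than a moon.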
@RancidRodent; Yes, I heard that the Microdrives were quite good once they got the flaws ironed out, but I assume that by then their reputation had already been sealed.
Unfortunately, Sinclair has no-one to blame for that but themselves for rushing the QL out way, *way* before it was ready. (Prompted, no doubt, by the fact that when it was launched and they started taking orders- for claimed delivery within "28 days"- they apparently didn't even have a complete working prototype(!)).
@Teiwaz; "I wasn't even aware Bebo was still going?"
Years ago- it must have been before the 2013 bankruptcy- someone I worked with used Bebo and even *then* I was like "are people still using that?!"
But the original Bebo has been dead and gone for several years now. After the company went bankrupt it was sold back to the original founders (#) who shut down the original site and relaunched the company as a designer of social apps that doesn't even call itself a social network any more (##).
Regardless, it's obviously irrelevant nowadays. Involving them in this would be like parliament in 1991 demanding the remnants of Kajagoogoo answer questions about those newfangled illegal acid house raves.
(#) $1m, compared to the $850m (of which $595m was theirs) they apparently sold it for in the first place.
(##) Wikipedia states that "Bebo [..] now describes itself as "a company that dreams up ideas for fun social apps;" Grant Denholm, the man behind the Bebo relaunch, has confirmed that the site will not be returning as a social network but as a company that makes social apps."
@martinusher; It wasn't *even* going to be an FPGA-based recreation anyway! (#) All that was ever intended was an emulator running on some arbitrary device- i.e. a long-solved problem. See here and here.
But- as you correctly note- this is all irrelevant as the problems are blatantly the result of managerial, legal and interpersonal issues within the company, not technical.
(#) If that's what you're after, the far more interesting-looking Spectrum Next- from a totally unrelated team/company- aims to do just that.
I fought the law and the lawyers won.
@Mycho; "Meat Loaf approves of their efforts".
"Liberté, égalité, fraternité".... two out of three ain't bad?
@jim parker; You haven't seen 2001 if you haven't seen it in the original Klingon.
"Are you being ironic, given what happened to Jupiter and Europa in 2010 or did you not read 2010 ?"
Yeah, but that's 2010- which was written around fifteen years later- and for various reasons it's open to question how legitimately one can back-read continuity and explanations between that and the film of 2001.
tl;dr - (i) 2010 was written years later, (ii) Kubrick wasn't involved in 2010 at all, (iii) both book and movie of 2010 reflected Clarke's more "literal" vision seen in the original novel which perhaps was never the intended spirit or interpretation of the film ending and (iv) the discrepancies between versions and sequels mean we can't assume one applies to the other.
The novel of 2010 was written by Arthur C. Clarke alone and follows the far more literal style of his original novel of 2001. *That* was written alongside the original film- rather than being a direct novelisation of it- and- along with the different "approach" and feel- varies somewhat in its depiction of specific events (e.g. the action takes place around Saturn, whose rings were deemed too difficult to accurately depict for the film).
While it's often implied that the novel "explains" the post-Stargate ending of the film of 2001, the differences in what comes before means it can't be taken for granted that this is the case, or even what was intended. Given the aforementioned differences in approach, it's quite possible that- unlike the novel- the ending of the film was always *meant* to be open to interpretation and viewed as such, and that trying to shoehorn it into the excellent-but-different literal viewpoint of the novel both does it a disservice and misses the point.
Back to 2010... the original novel- which came out a couple of years before the 1984 film- follows very much the approach of Clarke's 2001 novel. (I first read them one after the other- before I'd seen either film- and enjoyed both very much- 2010 was a great sequel).
The film 2010 is based on the aforementioned Clarke sequel novel, and Kubrick was not involved at all. (#) That's why it's so different in feel and approach to Kubrick's original, and why I don't consider it the latter's direct spiritual successor. Yes, they've included elements from Kubrick's 2001- and even Clarke's 2010 novel altered the continuity to fit the original film rather than the original novel better- but the film is still essentially "Hollywood's movie version of Clarke's sequel to *his* original novel" and reflects the approach and style of the latter. (##)
There's also the question of whether one can apply the events of 2010 to 2001, since the latter were Clarke's alone and he possibly- indeed quite probably- hadn't thought them up when writing the original story.
(#) Indeed, when he saw it, he apparently complained that they'd "explained everything". Which might back up my view on trying to shoehorn the novel's "explanation" onto the ending of the original film.
(##) The film even "recaps" the line "My god, it's full of stars" from just before Bowman enters the Stargate in 2001. Except that was *never* in the original film- only Clarke's novel.
Biting the hand that feeds IT © 1998–2018