143 posts • joined Tuesday 29th May 2007 13:56 GMT
How not to build a 32-bit CPU
Ah, the 386. I remember when the first Compaq 386 machines came out (I believe at 20MHz, not 33MHz, though Wikipedia tells me that slower 386s were available), and my inner fanboy disliking the fact that it took the performance crown for desktops back from the 8MHz ARM2 Archimedes machines (probably as measured by dhrystone, although BASIC may have been involved). I'm sure my copy of Structured Computer Organization contains some comment about Intel "finally making a decent CPU", but unfortunately I'm a few thousand miles away and can't check - anyone got the red edition?
The 486 was a bit nicer as designs go, excluding the slight problem of getting everyone to optimize code in a way that was pessimal for Pentiums. I still wish IBM hadn't decided to use the chip from their printers and had gone with the 68000 series from the start (and if we were going to end up with thirty years of compatible machines foisted on us by Windows, I'd rather we'd ended up with anything less crufty than x86), but at least it meant that near and far pointers weren't always obligatory...
How did we get from Samsung and LG being fined 1.47 billion Euros to them having a total of about 345 million Euros, of which Samsung's was only 151 million (and smaller than Philips - who are also fined more than LG - and Panasonic)? Even collecting the four mentioned companies together, there's half a billion Euros unaccounted for. Or did I hopelessly mis-read it? Not that Samsung are unused to being fined a billion, but it'd be nice to know what's going on. (Disclaimer: I'm employed by Samsung, this may hit my salary...)
Ah, that takes me back. Not that I ever completed it as a kid. I really must go back to it. I still tend to think "ope do" when opening doors, which is a bit worrying now I come to think about it.
After this, the Lord of the Rings game was a big disappointment, mostly because it barely worked (maybe I had an iffy tape). Although that's how I got my copy of Fellowship of the Ring, so it turned out all right in the end.
People who don't read up...
It's very publicly been known as having a soft G since its instigation, at the request of the owners. You may have been pronouncing it with a hard G for all that time, and it's more recently become explicitly accepted that this is okay, but historically the hard G was plain wrong, if common. These days I don't bite anyone's head off over it (though I still twitch whenever my colleagues say it with a hard G), but I'm not going to accept people claiming that a soft G is wrong.
For the record, Linux historically came with an indication that Linus preferred it either to be pronounced as he did (sort of Leenuss, as I recall), with a secondary preference that one should pronounce it as one would pronounce his name (in my case, natively, Lie-nuss). He didn't like people attempting to pronounce it like his name and getting it wrong, as in "linnux". I believe he has since changed his mind on this, not least because "Linnucks" is so common. I still say "Lie-nucks".
Oh, and Risk-Oh-Ess, for what it's worth.
"All this hardware oddness begs one question, of course: why would low-power ARM-based chips such as Apple's A series, fabricated in increasingly smaller and therefore increasingly less power-hungry manufacturing processes, need to resort to such a complex, space-wasting scheme as fan-based cooling?"
Well, my elderly HTC sensation gets nice and toasty even playing Angry Birds. Run something with actual 3D requirements and there's no way to avoid heat being generated. ARMs and embedded graphics cores are very efficient, but we're still talking multiple >1GHz processor cores and a lot of graphics, usually in a fingernail-sized chip, and they're not magic. Unless Apple are getting their performance by fabbing with superconductors and Josephson Junctions, of course.
Digital babelfish, how I miss you
Those who remember the earlier days of the internet may recall one of the first popular public digital translation services, babelfish.altavista.com, since moved to babelfish.yahoo.com, and now redirecting to Bing translate. Altavista, of course, started out as altavista.digital.com, set up by Digital/DEC, and some of us remember when it was the search engine of choice.
So, no - digital babel fish is not closer. Microsoft have moved Babelfish several steps away from Digital.
That said, I'm often astonished when Google Translate produces something that's not even a coherent sentence, let alone a correct translation. I always assumed that something in the implementation of these things must understand some rules of grammar, which ought to make that kind of failure tricky.
In this sense?
"Instead of capturing the output of each individual pixel separately – as sensors normally do – the trick is to combine the output of groups of individual pixels into a larger pixel."
Which is in fact exactly what happens on most cameras if you save a lower resolution image than the default provided by the sensor. And, indeed, it normally improves the noise handling. I've no idea why everyone is making a big deal about this, other than that Nokia can justifiably claim that the sensor resolution is not detrimental. (Now, arguing about whether Nikon should implement a "small raw" mode on their high-end DSLRs is another matter.)
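For what it's worth, the binning trick the article describes is easy to sketch: combine each block of sensor pixels into one output pixel, and uncorrelated per-pixel noise averages down by roughly the square root of the number of pixels combined. A toy illustration in plain Python (no real sensor data, 2x2 blocks for simplicity):

```python
# Toy illustration of 2x2 pixel binning: each output pixel is the
# average of a 2x2 block of input pixels. Averaging N samples of
# uncorrelated noise reduces its standard deviation by about sqrt(N),
# which is why binned (or downsampled) images look cleaner per pixel.

def bin_2x2(image):
    """Average non-overlapping 2x2 blocks of a 2D list of values."""
    h, w = len(image), len(image[0])
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return [
        [
            (image[y][x] + image[y][x + 1] +
             image[y + 1][x] + image[y + 1][x + 1]) / 4.0
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A 4x4 "sensor readout": a flat grey field of 100 with +/-4 read noise.
frame = [
    [104,  96, 100, 104],
    [ 96, 104,  96, 100],
    [100, 100, 104,  96],
    [100, 100,  96, 104],
]
binned = bin_2x2(frame)
print(binned)  # [[100.0, 100.0], [100.0, 100.0]] - the noise averages away
```

The same thing happens implicitly whenever you downsample a full-resolution image, which is why the per-pixel noise argument against small sensor sites doesn't hold once you normalize to the whole image.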
"This technology allows the 808’s sensor to capture as much light information as much larger pixels and sensors would."
The fact that the 808 has, for a phone (and most compacts), a chuffing enormous sensor and a fast lens means that it can capture a lot of light information. The amount hitting the sensor is the amount hitting the sensor. The amount per pixel is small, but DxO do their noise tests normalized by total image area, and exactly the same concept applies. So "[fewer] larger pixels", yes - "larger sensors", no. So much FUD about downsampling...
Anyway, nice bit of kit. I might pick one up as a compact camera once their price drops to clearance levels. Not so tempting as a phone, though.
Re: Well, once I ascertained that Nokia was true to its word
Oh yes, they're completely lying. It's *only* 38MP. Which isn't interpolated (except in a Bayer sense), that's actual sensor sites.
Let's pick our fights?
And still down-playing themselves?
So, assuming this is actually done with image processing (and yes, it's normal to capture a larger frame and calculate a shift within it), why have they decided to go with "optical" image stabilization? Optical solutions fix camera shake, but do nothing to handle subject movement, like the bouncing Nordic woman, which is why professional sports photographers still have large aperture lenses. In extreme cases, as here, keeping the subject static in the frame would result in the background bouncing around, due to the change in perspective from the moving camera position (something Canon have tried to fix in a stabilized macro lens, but not for riding a bike).
There *is* a lot of research into stabilizing/removing blur independently from separate bits of the image - some was presented at SIGGRAPH this year, and Adobe explicitly stated that their work on this was the reason that they'd not yet released their camera-shake-removal technology (demoed recently) in Photoshop. But "optical" it's not.
If it's image processed, you may as well own up to it. Of course, if there really is a stabilization element in there, I take it all back - but it's quite possibly not the best solution.
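The capture-a-larger-frame-and-shift approach mentioned above is easy to sketch. A toy version in Python (the function, frame layout and shake estimate are all hypothetical - a real system would derive the shake vector from gyro data or inter-frame matching):

```python
# Toy electronic image stabilisation: capture a frame larger than the
# output, estimate how much the camera shook, then slide the crop
# window by that amount so the scene stays put in the output while
# the sensor moves underneath it.

def stabilised_crop(frame, out_w, out_h, shake_x, shake_y):
    """Return an out_h x out_w crop, shifted to compensate for shake."""
    h, w = len(frame), len(frame[0])
    # Centre the crop, then move it with the measured shake so the
    # subject (displaced the other way in the capture) is recovered.
    x0 = (w - out_w) // 2 + shake_x
    y0 = (h - out_h) // 2 + shake_y
    # Clamp so we never read outside the captured frame - this is the
    # limit of the technique: big shakes run out of spare border.
    x0 = max(0, min(x0, w - out_w))
    y0 = max(0, min(y0, h - out_h))
    return [row[x0:x0 + out_w] for row in frame[y0:y0 + out_h]]

# A 4x6 capture with a bright "subject" (9) pushed one pixel right
# and one pixel down from centre by camera shake.
frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 9, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
crop = stabilised_crop(frame, 2, 2, 1, 1)
```

Note that, as with optical stabilization, this only corrects camera motion - it does nothing for a moving subject.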
These things always look good in demos. I'll reserve judgement until a real world test, although I don't think my DSLR is going anywhere.
Re: So they think that the market
I agree that it seems unlikely, at a time when everyone's moving to Red and other digital imagers and shooting stereo at 60fps. I do think that ditching the consumer film industry is a mistake, in that they have a small but loyal base and, like Ilford, someone continuing to make Kodak film stock will always have a market, especially in the formats not supplanted by digital. But since Kodak have been mis-judging the film market since APS and Disc Film, I'm not sure that they have a concept of sticking to what works (even if, admittedly, the market has shrunk a lot).
In as much as anyone's printing anything at all these days (the vast majority of images stay in digital form), a lot of people are either using commercial print services or local shops - because they're plenty good enough and more convenient. Of those who print a lot at home, as far as I know the big names in photo ink jets are Epson, HP and Canon, with companies like Lexmark and Kyocera sniffing around. I'm vaguely aware that Kodak make printers, but I've never been under the impression that they had a significant market segment. If nobody's buying your printers, you can't make money on the ink; if you make budget printers rather than the market leader, it's more likely that your customers will buy cheap off-brand ink.
Kodak have never been a leader in (consumer) digital imaging - they're just not an electronics company, and they're not going to compete with Canon, Panasonic and Sony. They don't have the optical and ergonomic background of Nikon or Pentax that allowed those companies to get a foot-hold in the digital market - Kodak haven't been a halo brand name for cameras in a very long time.
They're a film company. They've been a film company for a very long time. That the market for film has drastically shrunk is unfortunate for them, but trying to reinvent themselves into other sectors where they've not been successful isn't going to make them great. From a customer's perspective, I'd like them to continue doing the things that only Kodak do (making some proprietary emulsions), cut their costs, and try to find some new area in which to invest. Throwing out their only unique products while attempting to become a profitable box shifter for consumer goods doesn't seem like a viable long-term strategy.
My only hope is that someone like Fujifilm decide to pick up the film plant and keep making the emulsions, but given that they, too, are discontinuing some films, I don't have much hope. Maybe the Impossible Project will pick it up.
So they're keeping their "commercial film" (is this motion film?) but getting rid of the consumer stuff? (I assume they don't mean commercial vs consumer photographic film.) And they're keeping their ink jets; really?
I hope someone picks up the slack. Otherwise, I'll be off to stock up on Portra - to go with my stocking up on Velvia after Fuji discontinued that. They're not making it easy to buy into a 5x4 camera system...
And the rest?
Interesting review. To me, a lot of the sample images are focussed in odd places (not on the eyes, at least); I don't know whether that's the AF system or the photographer. I've mostly heard bad things about the video on the D4, apparently because the downsampling algorithm introduces unnecessary softness (the 1:1 crop mode is okay, but then why use a D4 for that?) The D800, which probably just throws lines away, is apparently much sharper. Interesting that there's no mention of the new AF switch position, which is already annoying me on my D800E. I'm not sure that mapping the AE-lock to pressing the joystick is an improvement, either. I'm sure grid lines were available in the finder on the D3s - I've always used them on my D700 (and they help me keep my horizons straight).
This is obviously a camera with a purpose: it's a work horse for high speed, low-light shooting. Journalists and sports shooters will love it. It's not a consumer camera, or a megapixel monster, so comparing it to a 'blad (or a D800) is pointless, as is talking about the price - the right image can sell enough newspapers to justify it, and the price is still lower than many pro lenses (and almost exactly the same as the 200-400 f/4 shown in the review). Consumers shouldn't feel they're missing out if they can't afford one and have to settle for a D3200 instead; each tool to its own place. So no more "how much?" comments, please.
And film? It does have its place, but its place isn't in low-light, high frame-rate, fast turn-around shooting. I have, and use (mostly for flowers that look better on Velvia than digital), an F5, but I'd be mad to try to take on a D4 (or D3) with it for the tasks it was originally designed for. Pros do still use film - but more often in a cheap camera for portability and travel, in a Leica for subtlety (those who haven't spent D4-money on an M9), in medium format for quality or in large format for the ability to fill a wall with a sharp image. Even my Pentax 645 isn't going to get much use now I have a D800, though a Mamiya 7 or a 5x4 (on my shopping list) would be another matter; for what they're good at, they'll smoke a D4 - but then so will a D800. Photojournalist use of film died with the D1 (and especially the 1D), which is why the F6 is so different from the F5 and aimed at prosumers, and nobody's updated a film camera with a modern autofocus system. Shooting through rose-tinted spectacles results in poor images. Someone was using large format at the Olympics to get some interesting images, but I'd be astonished if any pros were using 135 film, at least in an SLR.
Dragging to external devices
For what it's worth, I don't really know *what* to expect when it comes to dragging files around. I established in the 90s that Windows had different behaviour depending on the source and destination location, decided that this was one of Microsoft's usability nightmares (I have USB drives that don't behave like removable devices and systems that come up with their drives in a different order; worrying about the default behaviour is the last thing I want), and I've been right-mouse dragging files (and selecting copy or move from the menu) ever since. Oh, and occasionally I use Ctrl-X/Ctrl-C and Ctrl-V. But then I was brought up on Acorn systems, which let you choose whether you wanted a copy or move by which mouse button you were doing it with, so trying to second-guess based on the device type was never a problem.
Not that this makes the Chromebox any more usable from the sounds of it, but "not like Windows" doesn't mean "broken" - quite often the opposite, in fact.
Re: No chance. NO. CHANCE.
Ah, I wondered about the "press a button to tell the time" problem (I've owned an OLED watch before). Of course, in theory, with OLED you could make the screen mostly black and it *ought* to use very little power. I still think the Pebble eInk solution looks better, if only they'd get it shippable.
I'm vaguely tempted
My latest version of a DataLink has got to the stage where I'm probably never going to program it again, the strap is mangled and the battery is dying. I was on for a Pebble (eInk appeals, although if the previous product was still made I might have been tempted by that too), but they're not out yet, and I'm struggling to find other smart watches that are actually being made - at least, any smarter than a DataLink. I'm less interested in syncing with the phone than running my own apps.
I'm not quite clear with this whether users can write their own stuff for it (need to do more reading). Knowing the resolution of the screen would be nice, too.
Good old British weather
I was up at 4:15 (after watching first contact on a web cam before going to bed - good work Hawaii, not so impressed by the guys from the continental US who were filming each other and not the actual sun), ready to go out and meet the sunrise. But I saw the weather and stayed in bed.
Then I got up around 4:45 just in case, and watched some more live streaming from Oz and from Hawaii. And it stayed cloudy, but with the occasional thinner bit.
About 5:45, after Hawaii had reported third contact and while Norway were showing parallax, I realised the sun might actually hit the house. So I ran upstairs, ignoring all the exciting telescope stuff I had with me, and pointed a (stopped down) lens right at the sun. Lo, the sun had a tiny bite out of it (I was slightly nearer third contact than fourth), and I have photographic evidence (and hopefully not a knackered camera) proving I was looking. It turns out that clouds work as an extremely dangerous alternative to a solar filter. Of course, if it had been sunny, I would have had more time to project an image onto paper.
I've heard the "refractors good. reflectors bad" argument before, but I'm very confused as to why. I'd have thought (enclosed) refractors are more likely to overheat than an open reflector (incidentally, *all* Dobsonians are reflectors, by definition). The only reasoning I can think of is that reflectors tend to be faster (shorter focal length per aperture), but that's not strictly a reflector vs refractor difference. Anyone care to educate me?
Re: Still have a mint-condition A5000 Alpha (33MHz!) in its box.
Yup, studied the textbooks, own "The Design of Everyday Things", did HCI as part of my CompSci course, was a member of SIGHCI for a while. The Mac/Amiga solution is better than trying to aim for the top of a window - as you say, there's a hard stop. However, it doesn't scale well to large or multiple monitors. The Acorn solution of popping up a menu in the same position relative to the mouse meant that muscle memory for menu access worked very well - compared with flinging the mouse at the top of the screen between each interaction, at least; you're incorrect about claiming the need for "greater aim" because the menu was already under the mouse when you start. It's true that context-specific menus (changing the mouse pointer, there's an idea for Microsoft...) needed aim, but no more than pressing a button.
There were plenty of keyboard shortcuts available for common operations on RISC OS, but they were much less necessary than on other systems - claiming power users weren't what the mouse was designed for doesn't mean that making the mouse interface as powerful and usable as possible was a bad thing. Sure, Impression Publisher (which had its own hot keys should you want them) isn't as powerful as InDesign (although it can occasionally give Word a run for its money), but InDesign is unusable if you're using one hand to hold the reference document that's the source of your layout. I'm a little confused as to which keyboard you've been using that has only "Alt" and not "Alt Gr", let alone separate Windows and Menu keys, but - much though I love Emacs, ctrl-alt-meta-cokebottle-x is not a user-friendly short-cut. Acorns had a "copy" key that, who knew, copied things.
I like Macs, but my HCI lecturer was a bit prone to claiming that their interface was perfect - notably "don't make nested menus too deep because you have to click every time to expand them" (not on RISC OS you don't). Acorn's interface guidelines would still do a lot of modern developers good - particularly "never write a large amount of text in a dialogue box and put OK and CANCEL at the bottom - name the buttons for what they do". Acorn never asked you to eject a floppy disk by dragging it to the recycle bin or popped up a "disk not recognized - format? [ok]" box, nor did it expect you to shut down the system by clicking "start" or decide whether it was going to copy or move a file according to where you were putting it. There were some really nice touches - expanding a window off the bottom/right causing the top/left to grow springs to mind.
Not that everything was perfect. It's nice to be able to resize windows from more than one corner (I had a plug-in, although I still think twm had the nicest solution). There was still the odd UI clanger ("Please insert RISC OS 3 ROMs and press any key to continue"), as a co-operative system it could still get locked up by a misbehaving app (although app killers helped), and it wasn't as dynamic or secure as a modern OS. But some stuff really was done right, and still isn't by almost anything else.
Re: Lander in BASIC?
[And, while I'm eating my words, the reset button on the Archimedes was, of course, on the back of the keyboard, where it was useful, if slightly prone to getting poked by the coiled keyboard cable - although it meant the keyboard was nonstandard. The RiscPC's "normal" PS2 keyboard meant the reset button was, as I said, at the back of the machine, where you'd hit it plugging in headphones. Clearly I'm going senile.]
Re: Lander in BASIC?
[Okay, I've found an article that claimed that Lander was in BASIC, although I suspect it was only the boot code. Unfortunately, because everyone had a copy, I'm having a little more trouble finding a binary to check. I'm prepared to eat my words, though.]
Re: Lander in BASIC?
I'll believe you, it's just that this thread is the first I've heard of it. There can't have been much logic on the BASIC side, and even a CALL statement to thunk between the two had quite an overhead, so it just seemed unlikely to me. If there's a reference to this, I'll be interested. (Or I may be able to find my old disks and have a look.)
I'm prepared to believe that it might have used BASIC to set itself up, but that seems less likely than doing any BASIC when the program itself was running. I'll go and google this now, but I would have thought that I would have remembered...
Re: Still have a mint-condition A5000 Alpha (33MHz!) in its box.
I absolutely agree about the mouse. Even after Microsoft eventually worked out that their mice had a second button, mousing on Windows still seems stupid compared with the Acorn approach. (Actually, menus at the edges of windows are the worst of all possible worlds - not near the mouse, not where you can get them quickly. I take the Amiga's scheme - like the Mac but invisible until you hold down a mouse button - as second best.)
Add in MouseAxess (we don't need no stinkin' window borders to move things...) and you got a system which was far more usable without a keyboard than most modern PCs.
Which brings me to the things you can do with the three button mouse. None of this "shift-click" to multiple select, that's what Adjust was for. Drag a window without bringing it to the front? Use Adjust. In the file manager, decide whether you want a new window for the directory you're entering or to re-use the current one? Select or Adjust again. I want to say the same thing about the difference between a copy and a move, but it's been too long for me to remember. And, of course, Acorn had the most sensible file save mechanism I've seen (why on earth does every application in Windows need its own way of viewing the file system?)
Ah, rose-tinted goggles. Shame about the lack of pre-emptive multitasking...
I'm not sure about "problems" as such. It wasn't until RISC OS that there was a proper multi-tasking interface, but the Draw module is an epic piece of useful coding, vector font rendering (as of later versions) was way ahead of its time (although so was the bitmapped antialiasing of Arthur), and in the newest versions the SpriteExtend module (dynamically expanding JPEGs to render them stretched and dynamically mapping the output to the screen palette...) makes me wonder why a lot of modern systems struggle so much.
Hands up if you remember the dark blue/light blue version of Arthur?
Re: Lander in BASIC?
(Since I've been sad enough to look it up, Virus also didn't adaptively shade the spacecraft according to surface angle like Zarch did, possibly more because it didn't run in enough colours to do so rather than anything to do with calculations.)
Lander in BASIC?
I'm not sure why people are saying that Lander was written in BASIC. To the best of my knowledge, it was pure ARM code, and I'd be astonished if it went anywhere near Acorn's triangle drawing routines. BBC BASIC on an Archimedes is fast (and the Arthur, although not RISC OS, desktop was written in it), but it's not *that* fast. Minotaur, one of the first commercial games launched for the Archimedes (alongside Zarch), *was* written in BASIC, however.
(Speaking of BASIC, I'm not sure about this "press reset twice and you get the program back" thing. Not in my memory you didn't. I also remember the reset switch being on the back, right next to the headphone socket, where it was easy to reset the device when plugging headphones in. It was still on the back on the RiscPC, even though the power switch was on the front. Never understood that...)
As for comparisons to Virus, I believe Virus had some more enemies - it got harder faster than Zarch - but it was also noticeably less pretty; for example, there was no depth cueing of the background (in Zarch, everything got darker towards the back of the screen). I'd assume that the Amiga version used the blitter, although since there was an ST port I can't guarantee that. I've never played the Spectrum version, but it's high on my "most preposterous port" list. Lander, of course, didn't have all the enemies, let you blow up on the launch pad, and didn't clip the front edge of the landscape properly - but it was very good at training people to play Zarch! (I still maintain that I ought to be able to fly a helicopter, should I ever need to, because of this game.)
Part of the exciting colour scheme of Zarch was that it could use the 256 colour mode, back when the best PCs had original VGA graphics. The 256 colour mode had 16 palette entries (that most people didn't touch) and the rest of the values derived from them; the default mapping was an effective perceptual HSV scheme, accessed from BASIC by 64 base colours (setting the top two bits of each channel) and four levels of "tint" (setting the bottom two bits of all three channels at once), giving you fine-grained luma control. It might not have had all the colours of HAM6, but it was pretty effective. Despite a brief foray into VU-3D on the Spectrum, it was probably Render Bender that taught me to think in 3D graphics (and now I work in graphics professionally).
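If it helps, a rough sketch of that colour/tint split as I've described it - the exact VIDC bit ordering is from memory, so treat the channel assignments as purely illustrative:

```python
# Sketch of the Archimedes 256-colour BASIC colour model as described:
# a 6-bit base colour supplies the top two bits of each of R, G and B,
# and a 2-bit "tint" supplies the bottom two bits of all three channels
# at once - coarse hue from the colour, fine-grained luma from the tint.
# Which channel sits in which bit pair of the colour number is an
# assumption here, not checked against VIDC documentation.

def colour_tint_to_rgb(colour, tint):
    """Map (0-63 colour, 0-3 tint) to a 4-bit-per-channel (r, g, b)."""
    assert 0 <= colour < 64 and 0 <= tint < 4
    r_top = (colour >> 0) & 3
    g_top = (colour >> 2) & 3
    b_top = (colour >> 4) & 3
    return tuple((top << 2) | tint for top in (r_top, g_top, b_top))

# 64 colours x 4 tints = the full 256 combinations, all distinct.
palette = {(c, t): colour_tint_to_rgb(c, t)
           for c in range(64) for t in range(4)}
```

The nice property is that bumping the tint raises all three channels together, so it behaves like a brightness control within each of the 64 hues.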
Re: 1 colour?
Actually the high res mode (which needed a special monitor) was 1152x896, using one bit per pixel. Sometimes it's worth using Wikipedia for fact checking. All the Arabella-based systems could do 1, 2, 4 or 8 bits per pixel. There was also a bash at resolution independence - coordinates were downsampled according to the mode, so in 1152x896 one coordinate step mapped to one pixel, but in mode 12 (640x256, 16 colours) pixels were two coordinates apart horizontally and four coordinates apart vertically (the next pixel above and to the right of 0,0 was 2,4). Mode 13 (320x256, 256 colour) was downsampled by 2 in each direction.
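That resolution-independence scheme amounts to a per-axis power-of-two scale from shared OS coordinates down to mode pixels. A little sketch, with the scale factors inferred from the mode examples above (the mode names are just labels, not real mode numbers):

```python
# Resolution independence, Archimedes-style: all screen modes share one
# coordinate space, and each mode defines how many coordinate units
# apart its pixels sit on each axis. Factors below are inferred from
# the modes described in the text, not from official documentation.

MODE_SCALE = {          # mode: (x units per pixel, y units per pixel)
    "hires_1152x896": (1, 1),   # one coordinate step per pixel
    "mode12_640x256": (2, 4),   # pixels 2 apart horizontally, 4 vertically
    "mode13_320x256": (4, 4),   # half mode 12's horizontal resolution
}

def os_to_pixel(mode, x, y):
    """Convert shared OS coordinates to a pixel position in a mode."""
    sx, sy = MODE_SCALE[mode]
    return (x // sx, y // sy)

# The pixel above and to the right of (0,0) in "mode 12" is at
# coordinates (2,4), which lands on pixel (1,1):
print(os_to_pixel("mode12_640x256", 2, 4))  # (1, 1)
```

The upshot being that a program could draw at the same coordinates in any mode and get roughly the same picture, rectangular pixels and all.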
You *could* do 640x512 in 256 colours (mode 21) with a MultiSync monitor, but before the ARM3 turned up with a cache, it didn't leave much bandwidth for the CPU to do anything. Later systems like the A5000 with faster RAM added mode 31 (800x600, 256 colours) etc. And the VIDC20 in the RiscPC added 16- and 32-bit modes and a more programmable video clock.
Re: Some factual errors
Jason: Thank you for picking up most of my ranting! Although I'd point out that Virus was not *exactly* the same as Zarch (and not just because it ran slower). I've clocked Zarch (I still have the disk), but I struggled with Virus on an Amiga.
This has reminded me that my wife made me get rid of my Archimedes (A310, upgraded to 4MB) a couple of years ago. I cried, even though I still have my Spectrum and my RiscPC is still in the family. I may get one from eBay and hide it somewhere, although it obviously won't be the same.
Re: Also other pre-internet sources.
From the Jargon File:
"Hackers, as a rule, love wordplay and are very conscious and inventive in their use of language. These traits seem to be common in young children, but the conformity-enforcing machine we are pleased to call an educational system bludgeons them out of most of us before adolescence."
Hence the profusion of this kind of stuff in early usenet (it's not like "ROTFLMAAOBPO" is quicker than typing "ha ha") and the tendency for each generation, as with sex, to think they invented it.
I've occasionally been known to resort to it in order to make a couple of characters' difference between different numbers of text messages (especially when texting a broad). It might have been an option on the phones I had with qwerty keyboards, but since my first phone had T9 and the most recent ones use Swype, entering anything other than real words is always more of a hindrance than a help.
Re: Makes sense for that resolution
Canon have recently announced some stuff that can shoot at 4K (although not 8K). I don't believe they're shipping, but it's relatively consumer-spec stuff.
Seems a little unwieldy - the last screen of this resolution that I used was 44" diagonal (plus some bezels) - but I'll take the pixels where I can get them.
What does the resolution have to do with plasmas flickering? Pretty much every plasma screen I've ever seen flickers visibly to me (which is why I don't own one). Kudos to Panasonic if they've got the flicker fixed, though.
I, too, still have all my INPUT magazines. Their series on 3D graphics was what started me on 3D; my initial graphics programming happened on the Spectrum (also yay to the orange manual, which is where I learnt the "x is a cross, so wise-up" mnemonic). I remember an adventure game in INPUT that used a partial predictive matcher for compression - pretty good for the time.
The higher level constructs in BBC BASIC (especially on later machines) were a pretty good stepping stone to more powerful languages. I still list BASIC on my CV (so I don't have the hacker's test point for denying that I know it). I never really picked up 6502/Z80 assembler (although I could probably work them out in retrospect now), but I learnt ARM assembly using the inline assembler in BBC BASIC on the Archimedes - a bit of a step up from my Spectrum.
I still have my Speccy. My wife made me get rid of the Archimedes. I cried.
I don't know that I'd be where I am now if I'd been starting out with a 1990s PC instead of a 1980s micro that let you write simple animated graphics in an afternoon. I remember writing BASIC to draw a car. With racing stripes. And speed "woosh" lines. And the text redraw was probably faster than the virtual remote machine I'm having to use at the moment.
Thanks to all the pioneers. Sent from a Khronos conference in Dublin.
Re: I think you mean...
I'd assumed it was a typo for 40Mbit/sec video streams, 40Mbit/sec being the upper limit for Blu-Ray content. But in retrospect, at least one other version of this report says "64,000 streams at 40Gbit/sec" meaning *total*. So I buy your version.
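The arithmetic behind preferring the "total" reading, using only the figures already quoted, is straightforward:

```python
# Sanity-checking the two readings of "64,000 streams at 40Gbit/sec",
# using only the figures quoted above.

STREAMS = 64_000

# Reading 1: 40Gbit/s is the *total*, shared across all streams:
per_stream = 40e9 / STREAMS          # 625,000 bit/s = 625 kbit/s each
# ...comfortably under the ~40Mbit/s ceiling quoted for Blu-Ray video,
# and plausible for heavily compressed streams.

# Reading 2: every stream individually runs at 40Gbit/s (or the
# 40Mbit/s I'd first assumed the typo meant):
total_if_each_40g = STREAMS * 40e9   # 2.56e15 bit/s = 2.56 Pbit/s
total_if_each_40m = STREAMS * 40e6   # 2.56e12 bit/s = 2.56 Tbit/s

print(per_stream, total_if_each_40g, total_if_each_40m)
```

A 2.56Pbit/s aggregate is absurd, and even the 40Mbit/s-per-stream reading needs 2.56Tbit/s, so 40Gbit/s total is the only figure that adds up.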
Oh dear, spoilers
I'm only half way through the second book, but I second the concerns that things seem to be slowing down a bit - sad if it gets worse in later books. One of the things that hit me about the first book was that quite a lot happens in it; I'm less surprised that the TV adaptation worked than that they managed to fit most of it in. On the other hand, at least some of the characters come across without the "comedy dwarf" tweaks that Jackson felt the need to make to LoTR.
The thing that really put me off the first series was that some of the early acting - even by relatively experienced actors - seemed incredibly stilted. Some of it was very good, but Littlefinger's need to orate everything really grated; even Lena Headey (who's been good in plenty of other things) didn't seem very comfortable. It's possible it got better as the series progressed, or maybe I just got used to it.
I'm also not all that impressed with the need to age the entire cast (because it stops some of them acting their age and the shock of what they go through is lost a bit), but I guess they'd have had problems broadcasting it if there was really the requisite amount of child nudity.
Here's hoping that the pay cheque persuades the author to find a way to finish the story.
Optical track pad
"With the same rounded corners, chrome vanity band and optical track pad you would be hard pressed to tell the two apart."
What optical track pad?
Otherwise... meh, SD resolution, not for me - but that doesn't mean it doesn't have a market.
Just for clarity...
...and to make sure I'm bidding against more people the next time a DCS-14n appears on eBay:
Kodak deserve a bit of credit for getting a DSLR to 14MP sometime before most of the competition, but there's no doubt that it doesn't handle especially well and it's no low-light camera. My interest in it is only as a back-up to my D700, and since the current back-up is an F5 it's not actually going to be worse. The alternative is, obviously, to go crop sensor (or be able to afford another D700 derivative), but that means carrying more lenses to make sure I cover the field of view range I've decided I want on any given shoot.
Kodak obviously suffered from basing their DSLR strategy on adapted film bodies from (mostly) Nikon - the moment Nikon brought out the D1, limited though it was, there was always going to be a conflict of interest.
Actually, a little harsh
I'm grateful that Kodak kept making film so long; I've still got some in my fridge, although I admit that my film shooting these days is more usually with Fuji or Ilford products; I may regret their troubles more when I eventually get a 5x4.
There's no doubt that the film market shrank radically the moment cameras went digital; despite my first paragraph, I do most of my shooting with a DSLR, and that's not going to change (except, and probably not any time soon, if mirror boxes go the way of the dodo). I actually started with DSLRs, unless a Polaroid camera I had in primary school counts, and added film to my repertoire.
I can't imagine Kodak not seeing it coming, but the question is what they could do about it. According to dpreview, Kodak have made 144 digital camera models, some of which are still current. For a company which had more to do with chemicals than optics, electronics or consumer goods (at least in recent years), that's not a bad effort - but it's not surprising that Canon is the company most visible in the desirable compact and DSLR space and that Sony's electronics combined with Nikon and Pentax's camera design is taking much of the rest of the spotlight.
With the premium products made by big names, Kodak - whose cameras have never exactly been the M3 or F-series of their generation - could only really try to compete at the cheap end, and I suspect they didn't have the manufacturing capacity to create small plastic boxes as cheaply as the bigger companies in China. Even if they did, that market must be feeling the squeeze now everyone owns camera phones. Other than "something else", I don't know what they should have done.
Still, maybe I should get my hands on the DCS-14n that I want as a back-up to my D700 before the collectors start putting the prices through the roof.
Oh, and to add to the history lesson in digital photography, Bryce Bayer (of the pattern) worked at Kodak. I wonder which name will live on longer?
Managing, yes. Enjoying, no.
One of the biggest negatives about the first LCD monitors was that their resolution was awful. An LCD running 1280x1024 looks sharper than a CRT running 1280x1024, but my 19" CRT is extremely sharp at 1600x1200 and pretty good at 2048x1536. With CRTs, the user had the opportunity to trade off text size against sharpness. That flexibility went with LCDs - and the market position that should have been occupied by successors to the T210 and T221 got filled with LCDs that were the same resolution, but different physical sizes. Given the appearance of the 17" SXGA LCD, why the "upgrade path" was a 19" SXGA LCD is lost on me (I, like several colleagues, kept our 17" panels and rejected the 19" ones when given a 1920x1080 upgrade to our second monitors). The rant I had, when it became clear that 1920x1200 needed at least a 24" screen and that (prior to Apple) 27-28" screens were still only 1920x1200, was considerable. Thanks, but I need that space for the other monitor that I have to use because you won't sell me one with more pixels...
If you've spent a life being able to have multiple documents on the screen at once, or see a reasonable overview of a document while still examining detail, any small screen is seriously constrictive. If you've been managing at 1280x1024, maybe you don't know what you're missing (but try running at 1024x768 for a while and maybe you will).
All I can guarantee is that *I* find it useful
To be honest, I don't scale my desktop to 200% when I use my T221 - I prefer the extra real estate, and move closer when I want to - but I was answering a question about the icons being too small.
A lot of software these days will scale. Looking at a PDF (or doing DTP) on a T221 is visibly improved over a "conventional" monitor. Using Photoshop is much better in showing detail (although since the T221 is ancient, the colours aren't quite up to modern standards). I deliberately got a 960x540 phone in order to make the experience of reading PDFs and web pages without scrolling more comfortable. And, of course, you can fit more code on the screen. I've gone from a five-monitor set-up in my previous job to a 1920x1080 + 1280x1024 combination with Windows running in a 1440x1050 remote desktop in my new place; it's unbelievable how much more constrained I feel.
Most GPUs these days have resources to spare - except possibly in the mobile space - although I admit that composited desktops get hit hard when the resolution quadruples. I don't claim that everyone should care, and that everyone should buy a monitor with a higher resolution - but it's been a source of frustration for years that very few high resolution panels have been available. Thank goodness for Apple, first with the Cinema Display and then the Retina Display, which dragged other companies along. Given Apple's history of "100ppi is perfect", I wouldn't have expected it of them.
As for portable devices, where you're actually constrained in the physical size of the screen and it's easier to get close than with a desktop, the more pixels the better (although 2880x1800 is a bit of a weird choice compared with, say, 2560x1600 or 3072x1920 - at least it's not 16:9). To me, for a tablet or netbook that's a content viewing rather than creation device and spends a lot of time displaying relatively static information, I'm completely behind the idea of a premium offering giving you more pixels even if it means that games can't run as fast at the full resolution. That doesn't stop the cheap end of the market getting pixels the size of bricks, but it's genuinely a useful differentiator. Although I'd prefer not to have to buy Apple to do it (if a 960x640 Android phone had been available in the UK when the iPhone4 came out, I'd not have suffered another year of Windows Mobile).
How closely can you focus?
I'm curious what monitor has pixels you can't see. If you can't see the pixels with your nose to the screen, I suggest that might be because you're too close to focus. My getting-on-for-forty-year-old eyes are a bit mangled, so admittedly close is better for me than for most, but I can see the pixels on an iPhone4 and the PenTile grid on a Galaxy Nexus, let alone pixels on my T221 (204ppi, 3840x2400), although they're still all clearly better than the norm. The pixel grid on the 1920x1080 24" panel I've got at work is clearly visible even from a normal working distance.
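For anyone wanting to check their own panel, pixel density is just the diagonal pixel count over the diagonal size in inches. A quick sketch (the diagonal sizes here are my assumptions where the post doesn't state them):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

# Diagonal sizes below are assumptions for illustration.
print(round(ppi(3840, 2400, 22.2)))  # IBM T221 (~22.2"): 204 ppi
print(round(ppi(1920, 1080, 24.0)))  # 24" work panel: 92 ppi
```

At arm's length the 92ppi grid is easily resolvable; the 204ppi T221 takes getting your nose in, which matches my experience above.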
You pay for the image quality in GPU requirements (although, for a plain desktop, this wasn't an issue in 2004 when I got my T221, and probably isn't now except in the mobile space) and of course it makes the panel more expensive, but getting above WUXGA makes a big difference to a lot of workloads. I've regretted that it's been nearly impossible to get a 15" laptop with a decent screen for most of the last decade.
I claimed for a long time that the CEOs of monitor companies have failing eyesight, which is why the panels get bigger without resolution increases. Facetiousness aside, I'll be very happy to see the trend reversed - I'd actively have preferred a 22" 1920x1080 panel to the 24" one I got, and I'd certainly have preferred WUXGA or higher. Here's hoping they finally start selling them and the prices can come down.
I can't deny that it'll increase the graphics requirements - not that most devices are all that pushed when just showing the desktop - but regarding small icons, could I suggest changing the dpi settings on your OS? Admittedly it might be nice if all software handled this a bit better, but allegedly as of Vista (I've not checked, Vista broke spanning and I've not tried my Windows 7 box with a decent monitor yet) the multi-resolution capabilities of the OS got much better. At least for web sites you could always, I don't know, hold down the control key and move the mouse wheel. Rocket science, I know...
Discussing Buffy is more interesting than the phone
I tend to think of it as three times, depending on events in Seeing Red/Villains, but it's a little hard to be definitive about this kind of thing. Definitely only twice by Once More With Feeling, though.
Sadly, even my love of the show and wish that anything as consistently good was still on (let's not talk about Beer Bad) probably won't talk me into associating myself with Facebook.
Shame there's no RIP icon any more.
Call me a pedant, but...
"along came Quake, the first OpenGL application"
Er. No. I couldn't actually tell you what the first OpenGL application was (probably a demo included in someone's implementation of the OpenGL 1.0 specification, which dates to 1992), but I'm damn sure that something was written before Quake (1996), if only because there were a load of SGI IRIS GL applications to port.
It's true that Quake was the application that popularized OpenGL support on consumer devices, though.
Z88 - so near...
I used a Z88 when they were new, and bought myself one a few years back with a view to having something I could use for typing and notes. Long battery life, famously quiet keyboard to type on, lovely bit of kit. I used it about twice, because it was huge in comparison with my Libretto (whose hinge never, thankfully, broke) and therefore never got taken anywhere to justify the inconvenience of transferring content.
One of the tablet/keyboard combos might be a better modern equivalent, although Swype is almost decent enough on a mobile that you really need to be writing something big to justify a full keyboard. I've yet to try the Bluetooth keyboard route (they were horrible under Windows Mobile), but I'm certainly going to.
It doesn't solve the "I want a battery that'll last me going on a writers' retreat in the woods" problem, but you could get a lot of spare battery (or solar charger) alongside a mobile without getting to the bulk of a Z88.
There was a time when I was involved in a long chat about the "Psion FX" - something in the Psion 5 form factor, but with an Atom in it. I think this got killed by the (x86) netbook fad, which the discussion predated, although a netbook is really not the same thing.
Cheap enough and small enough (but how do you make a keyboard and screen small without having a hinge and ending up with a Vaio P?) and I'd still find the idea tempting today.
"A lot of people have talked about it being bad for innovation but I think the ability to enforce patents is a core aspect of what drives people to invest the huge sums of money that they do in R&D,"
Something that's making me seriously consider getting out of the software industry (preferably in the direction of academia) is how much nonsense this is. Patents were invented so that the few people who had enough spare time on their hands to solve problems would share those solutions, furthering society. We now have an industry of getting on for millions of people who are employed solely to solve a limited set of problems. The problem will have needed solving by someone else long before a patent is granted, so the disclosure advantage is no longer outweighed by the exclusivity.
Patents were meant to allow the better mouse trap to be available to more people. Now everyone is making mouse traps, one person invents the best one, and everyone else spends their time realising that they couldn't do it the right (and often obvious) way and have to invent an inferior workaround instead. This is immensely frustrating if you want to put your heart and soul into making the best possible mouse trap, and if you're building a machine out of thousands of mouse traps, it's immensely frustrating to have to worry about this at every turn - patents rarely cover something that took a long time to invent, and software engineers spend their time building large projects out of lots of these small problems.
Has anyone worked out how much bandwidth and storage space would have been saved if IBM hadn't had the arithmetic coding patent that meant everyone's JPEG implementation uses Huffman? (Admittedly, only a little per image, but there's a big multiplier there.)
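A very rough sketch of the aggregate. The ~7% saving is a commonly quoted ballpark for arithmetic versus Huffman entropy coding in JPEG; the image volume and average size below are pure assumptions for illustration, not measured figures:

```python
# Rough aggregate-savings estimate. All three inputs are assumptions:
# ~7% is a commonly cited ballpark for arithmetic vs Huffman coding in
# JPEG; the volume and average size are illustrative guesses only.
saving_fraction = 0.07
images_per_year = 100e9      # assumed: 100 billion JPEGs a year
avg_size_bytes = 2e6         # assumed: 2 MB average file

saved_per_year = images_per_year * avg_size_bytes * saving_fraction
print(round(saved_per_year / 1e15))  # petabytes saved per year: 14
```

Only a little per image, as I said, but the multiplier makes it petabytes a year even under conservative guesses.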
There was an interesting article on Ars Technica a month or two back discussing a report that had noted the effect of patent disputes on company market values. It put the cost of patents to the software industry over the last twenty years as $500bn. I'm not surprised.
By all means waste some rock, and while there's a finite amount of gold out there it's at least recyclable, but I really hope the T-Rex bone in question was dust before it got incorporated. Being rich and buying something rare is one thing, buying something that the scientific community could have used is just being a dick.
Another Acorn user
Yes, I was going to say that the article missed the Acorn port. I've forgotten whether it was my A310, my A5000 or my RiscPC on which I played it, but it was a fine game. A little more obscure than I expected to see mentioned in this list, but I'm not complaining!
Hello new phone... oh wait.
I've been waiting for a 720p phone for a very long time (partly for text quality, partly because I was spoiled by a 3" WVGA screen in the past, and partly because video is less mangled by it). I've no problem with the size - my Sensation isn't that much smaller, and easily fits in a pocket/my hands. PenTile is a big disappointment, though - it may put me off.