Nice, but I hope things improve from this...
Particularly, I like how the shutterspecs have a non-laughable framerate now - but as one who was close to offering celebratory sacrifices to whichever small god was responsible when flickeriffic CRTs were washed away by rock-steady LCDs, I hope we can redouble that. Anything much below 85Hz makes my eyes and head hurt after a while, and sub-72Hz is nasty. My only, thankfully brief, encounters with 56Hz original SVGA were battles against near-unreadable text (monitor phosphors have FAR shorter persistence times than 50Hz TVs), and the 60Hz default was a bane.
No, it's not one of those stupid "powerline fields cook my brain" claims. Set me up in front of a CRT and I'll have a fairly good chance of guessing the refresh rate if it's between 37i and 75p. The flickering is visible and eventually causes irritation, much like the also-detectable 15.6kHz whistle from a TV tube. However, there is an upper limit to what even the most hyperactive rods & cones will detect before their nerve impulses reach 100% duty cycle and the output is perceived as steady. Even with the screen right up against the eye (making the whole world flicker!) instead of only being a relatively small, distant rectangle, 120Hz per eye should do the trick. We have 200Hz TVs now, allegedly, for whatever good they're supposed to give over a 100Hz set (or a 50Hz 2D LCD), so it can't be impossible.
By the way, what causes the framerate to drop so much? Is it because it's having to render two separate scenes but not flip the buffer for either of them (except alternating between L & R, of course) until both are updated, to prevent mind-warping 3-dimensional "tearing" - a sort of 3D vblank? If they're not doing this, then I see no reason why simply jittering the POV position left and right by a few inches for each drawn frame, and dropping the result into a different buffer, should be difficult.
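To make the "jitter the POV" idea concrete, here's a toy Python sketch of it - purely illustrative, assuming a made-up `render_scene` callback rather than any real engine's API, with a typical ~65mm interocular distance:

```python
# Illustrative sketch: render the same scene from two camera positions
# offset by half the eye separation, into separate buffers, and only
# present once BOTH are ready - the "3D vblank" idea, so neither eye
# ever sees a half-updated (torn) stereo pair.
# EYE_SEPARATION and render_scene are assumptions, not a real API.

EYE_SEPARATION = 0.065  # metres; roughly typical human interocular distance


def eye_positions(pov, right_vector):
    """Jitter the point of view half the eye separation along the
    camera's right vector, once each way, giving left/right cameras."""
    half = EYE_SEPARATION / 2.0
    left = tuple(p - half * r for p, r in zip(pov, right_vector))
    right = tuple(p + half * r for p, r in zip(pov, right_vector))
    return left, right


def render_stereo_frame(pov, right_vector, render_scene):
    """Render both eyes before presenting either; the pair of buffers
    is returned together, to be flipped as one unit."""
    left_eye, right_eye = eye_positions(pov, right_vector)
    left_buffer = render_scene(left_eye)    # back buffer, left eye
    right_buffer = render_scene(right_eye)  # back buffer, right eye
    return left_buffer, right_buffer        # flip both together here
```

The cost is obvious from the sketch: `render_scene` runs twice per presented frame, so a naive implementation roughly halves the framerate even before any synchronisation overhead.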
I suppose what we need is some kind of SLI-type setup that can offer a reasonable guarantee of maintaining 60fps for each eye to keep up the illusion... well, so long as you keep the detail levels down ;)
Finally, why do we need to wait for special software to support this concept? Descent, Terra Nova and a few others have supported shutterspecs and other true-3D rendering methods since the mid-90s (ever since fully shaded & textured polys became a practical prospect), and the guys selling the devices allegedly had go-between drivers for a variety of other titles to retroactively enable it. Can't we do similar now? And where, goddammit, is my 3D, HD-movie-capturing digital camera?