Headset + Move controllers =
Lawnmower Man!!!
Whatever happened to AR? The tech we have now could do real justice to what at the time looked awesome but in reality was a bit crap. So where is it now?
Sony's roaring IFA presence this week continues with an update to its 3D headset and further developments in the imaging domain. The Sony HMZ-T2 is more compact and lighter than its predecessor, weighing in at 330g, and comes with a new array of fitting options for greater comfort. There is also the ability to add one's own …
The old Virtuality Technology that ran on Amigas?
The problem with the headsets was that they made people feel dizzy and sick (me included). Nintendo found the same when they tried a similar system in the mid-'90s.
I think the software went on to be used for training applications. But for home entertainment they gave up.
Unless they can invent a 3D headset that doesn't have people throwing up within 5 minutes, they're onto a loser. I had a go on the VR machines they had in the Trocadero in the early '90s. Great fun, but I felt awful afterwards and it was very disorientating.
Anyone know if there have been any widespread reports of dizziness or nausea with the Sony headsets? Maybe it's a lag thing: if the old VR stuff had a lag between recognising your head movement and turning that into a change to the visible projection, then that would cause nausea, but does the modern kit suffer the same way? Will Google Glass's AR elements take so long to catch up with your head movement that the default setting will be 'off' unless you've locked your own head in a vice...
The Sony headsets don't have motion feedback systems built in - they simply show a video feed into each eye. The promotional demos I've seen suggest watching from a chaise longue or the like.
That's not to say I don't think they could be adapted with sensors to make them motion compatible, but I don't know what the lag is like - if these are like the v1, the video input has to be run through a processing box before it gets sent to the headset. I'm no expert, but if this adds a noticeable delay between action and reaction in the screens, then it will be straight to barf central.
I saw some video where Carmack was going on about why previous headsets sucked so much and his major point was that the delay between moving your head and the screen updating was way too long.
Now correct me if I am wrong, but your standard Vuzix or whatever glasses are just 3D over HDMI and don't add excessive delay? So the issue is with the head tracking, which is a nut that has already been pretty much solved (TrackIR / FaceTrackNoIR).
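To make the lag argument concrete: motion-to-photon latency is just the sum of the pipeline stages, and at 60 Hz a frame or two of buffering alone eats most of the ~20 ms figure people usually quote as the comfort threshold. A minimal sketch with illustrative, made-up stage timings (not measurements of any real headset):

```python
# Hypothetical pipeline stages in milliseconds -- illustrative numbers only.
stages = {
    "sensor_sample": 2.0,      # reading the head tracker
    "usb_transfer": 1.0,       # getting the sample to the host
    "game_update": 16.7,       # one frame of simulation at 60 Hz
    "render": 16.7,            # one frame of rendering at 60 Hz
    "display_scanout": 8.0,    # pushing pixels to the panel
}

motion_to_photon = sum(stages.values())
print(f"motion-to-photon latency: {motion_to_photon:.1f} ms")
```

Even with generous guesses for the sensor and transfer stages, the two whole frames of update and render dominate, which is why the fix is usually in the rendering pipeline rather than the tracker.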
For the Oculus, they have gone with gyros, so they have sensors for rotation only, i.e. 3DoF instead of full 6DoF. (Carmack said in the same video that he may be able to bodge something, but there are no sensors that actually detect translation.)
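Rotation-only tracking of this kind boils down to integrating the gyro's angular rates over time, which is also why it "gets confused": any small sensor bias integrates into drift that never goes away on its own. A toy sketch, yaw axis only, with made-up numbers:

```python
def integrate_yaw(yaw_rates, dt, bias=0.0):
    """Integrate gyro yaw-rate samples (rad/s) into a yaw angle (rad).

    The bias term models a constant sensor offset -- the classic
    reason pure gyro integration drifts over time.
    """
    yaw = 0.0
    for rate in yaw_rates:
        yaw += (rate + bias) * dt
    return yaw

# The head holds perfectly still for one second at 100 Hz, yet a tiny
# 0.01 rad/s bias still accumulates into visible drift:
samples = [0.0] * 100
drift = integrate_yaw(samples, dt=0.01, bias=0.01)
print(drift)  # ~0.01 rad of drift per second of stillness
```

That accumulating error is what fusion with another sensor (magnetometer, camera, etc.) is meant to correct.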
IMHO they should have gone with triangulation (ie TrackIR or FaceTrackNoIR style) which would have kept the weight down on the head unit, plus would plug into existing TrackIR / FaceTrackNoIR APIs for instant compatibility with lots of games, while allowing full 6DoF head tracking.
I would say with a triangulation-style system you would probably want markers / IR LEDs on the sides and top etc. With normal triangulation systems you keep your eyes on the screen and motion is amplified (so you can always just about see the front of your head), but with a VR headset you probably want 1:1 tracking, so letting the software track the sides and top of the head should allow this. If FaceTrackNoIR can be made to track a face, I would imagine crash-test-dummy-style symbols would be easy to track.
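For what it's worth, the translation part of optical tracking is simple geometry: with two cameras a known baseline apart, a marker's depth falls straight out of its pixel disparity as z = f * B / d. A toy sketch with assumed camera numbers (the helper name `led_depth` is made up, and real systems like TrackIR do full pose estimation, not just this):

```python
def led_depth(focal_px, baseline_m, disparity_px):
    """Depth of an IR marker via two-camera stereo triangulation.

    z = f * B / d: focal length in pixels, camera baseline in metres,
    disparity in pixels. Larger disparity means the marker is closer.
    """
    if disparity_px <= 0:
        raise ValueError("marker must be seen by both cameras")
    return focal_px * baseline_m / disparity_px

# Assumed: 800 px focal length, 10 cm baseline, 100 px disparity.
z = led_depth(800, 0.10, disparity_px=100)
print(z)  # ~0.8 m from the cameras
```

With three or more such markers on the headset you can recover full 6DoF pose, which is the point being made about sides-and-top LED placement.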
Plus with the gyros, Carmack admitted that sometimes if you look around a lot they get confused (like inner ear sloshing), and apparently they don't handle looking straight up or down very well.
Nah, I think keeping the visual display and head tracking as two systems makes sense. Then you can improve each independently.