The gift that keeps on giving
Microsoft - you never fail to disappoint.
XBOX One owners across Europe are grumpy with Microsoft, which seems to have shoved the new console out the door with settings that make British and European tellies “judder”. The problem is generating a bit of chat in forums like this one at Xbox.com or the independent forum at n4g.com, where some say they see frames freeze or …
Microsoft - you never fail to disappoint.
I expect it's Sky boxes made by Amstrad that are flakey.
I expect it's the console made by Microsoft that's flakey. :)
Could it not be the cheap Technika TVs from Tesco that are the flakey ones?
I expect it's Sky boxes made by Amstrad that are flakey.
All Sky boxes are now made by Amstrad, BSkyB bought Amstrad in 2007 specifically for that purpose.
"I expect it's Sky boxes made by Amstrad that are flakey."
Probably Sky use non-standard timing or something. Works flawlessly for me with a Virgin TiVo...
An Amstrad receiver going through a Microsoft console onto a Tesco Technika TV?
Talk about stacking elephants
And the word on the street is that it's not patchable in software...
It's not an Xbox launch if there isn't a serious widespread screwup to accompany it.
Sounds like it can output at 50Hz already - so it shouldn't be a hardware issue:
There are some reports that this fixes the issues with TV, but then games run at 50Hz too and they look odd instead. In theory a patch could switch output to 50Hz when TV is full screen (but it'd be preferable to keep 60Hz when snapped, since everything else is designed with it in mind).
"And not patchable in software is the word on the street.."
Rubbish - Xbox One has custom hardware features for video scaling, overlay and encoding / decoding in the APU - they can be adjusted by software to pretty much any standard setting...
This is more of an el-cheapo TV screwup than a problem with the Xbox.
Everyone knows that TVs support PAL60, even though PAL/NTSC colour encoding isn't used any more thanks to HDMI being digital.
A great phrase from an old work colleague (about his own code).
Obviously, the solution is to not test/use it - works perfectly!
Every dollar spent on testing is a dollar spent on bad news!
(Popularly attributed to a US General involved in the Strategic Defense Initiative program.)
It's a very old sort of problem, the different TV standards, and, a decade ago, you could expect to feed an NTSC signal from a video into a SCART socket and get a picture out of your TV. The old analogue electronics were able to handle the different standards, once you got past the tuner modules.
DVD changed that a little, especially with the region locks, but it was still analogue between the player and the TV.
Want to bet that there's not some sort of region lock entangled with this one?
Sorry, I don't want to be rude but you have never worked as a TV engineer, have you?
A decade ago, if you plugged an NTSC device into a PAL telly, you ended up with a monochrome picture or a rolling picture like the vertical hold was off sync. In fact, VCRs had to proudly display NTSC playback, or else a lot of VHS tapes would fail, as they were recorded in NTSC.
NTSC was called Never Twice Same Colour for good reason....
@Cornz 1 - Twenty or twenty-five years ago, maybe, but by ten years ago all TVs supported PAL/SECAM/NTSC and usually in 50 and 60 Hz. My last analogue telly was a bottom of the range job purchased in 1995 and it supported all three and at both frequencies.
NTSC differs from PAL in two main ways.
1. The way the colour is modulated on to the composite video signal. This means that you can't just feed a PAL TV with an NTSC signal and expect it to work. The TV has to specifically support NTSC or you will get black and white.
2. 50 vs 60Hz. Again the TV has to be able to specifically cope with 60Hz to be able to play NTSC. Also in the case of VHS the image is recorded on to the tape in diagonal stripes by a rotating head. With NTSC VHS the head moves faster so again the VHS player has to specifically support NTSC otherwise the playback head won't synchronise with the recording on the tape.
With modern TVs the situation is very different: modern LCD TVs will quite happily cope with all kinds of input signals. The problem here is most likely that the TV signal is 50Hz and the Xbox then converts this up to 60Hz for playback on the TV. Some TVs will cope with this better than others depending on their motion processing software, but the fact remains that 10 extra frames have to be inserted each second, so it's never going to be perfect.
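To make that arithmetic concrete, here's a minimal sketch of the naive cadence (an illustration of the maths, not necessarily how the Xbox actually does it): repeating every fifth frame turns 50fps into 60fps, at the cost of 10 duplicated frames per second.

```python
def convert_50_to_60(frames_50):
    """Naive 50fps -> 60fps conversion: repeat every 5th frame.

    Every 5 input frames become 6 output frames, i.e. 10 duplicated
    frames per second -- the repeats are what show up as judder on
    smooth motion such as scrolling text.
    """
    out = []
    for i, frame in enumerate(frames_50):
        out.append(frame)
        if i % 5 == 4:          # after every 5th input frame...
            out.append(frame)   # ...hold it for one extra output frame
    return out

# One second of input: 50 frames in, 60 frames out.
assert len(convert_50_to_60(list(range(50)))) == 60
```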
I cannot fathom why you would want to pass your satellite/cable signal through the XBox in the first place! It just means that you will have two boxes turned on instead of one. Why not just go the normal route and connect it directly to the TV? Why feed it through the XBox? There can be no real advantage except to eat up more electricity. If you want to play the XBox, put your cable/satellite box into standby. If you want to watch the TV, put the XBox into standby (or turn it off).
Can someone explain? What advantage is there to this?
> I cannot fathom why you would want to pass through your satellite/cable signal through the XBox in the first place!
"XBox, turn on"
"XBox, watch Doctor Who"
No need for a remote, nor sonic screwdriver.
You're assuming the TV has enough HDMI ports for everything. Even some expensive, otherwise high-end, TVs are limited to one or two HDMI ports. If you have a Blu-ray player, games console and Sky HD box, then at least one probably isn't going to be plugging directly into things. Even receivers can have quite low numbers of HDMI ports.
A fee of about, ohh, a tenner will get you a self-powered 3-into-1 auto/manual HDMI switch. Like a SCART doubler from way back.
The Xbox has a picture-in-picture mode, so you can have the TV playing in the corner of the screen while you play your game.
Because it makes the Xbox One the "center of your living room" and they can analyse the streams and work out what you are watching and how to target adverts on the Xbox One dashboard at you better.
The Xboner is basically an advertising platform that's pretending to be a media platform, and that sucks as a gaming platform.
"Can someone explain? What advantage is there to this?"
The Xbox One can fully control your TV, your Receiver / amplifier and your STB - via Infrared or via HDMI control. So you now have a voice and gesture controlled TV - and don't need any extra HDMI ports...It can also do Video overlay and scaling in hardware - so for instance you can watch TV at one side while playing a game...
My gawd can't you people do anything right?
The K-Mart special I got on sale at Worst Buy four years ago has 3 HDMI ports, VGA, DVI, and whatever that really old 5-RCA cable standard (2 actually, I think) was. And I'm probably forgetting something. Oh yes, S-Video, should I ever have need of it.
It's down to some kind of DRM snafu. They will never admit it, but their code to prevent free movement of media is at fault.
Then again, if you pay peanuts for something you can expect to have some kind of lock on it, oh hang on!!
Agree,"DRM snafu" exactly my first thoughts. M$ made DRM to lock you in so that even your display has to be authorized and decrypt the input otherwise your 50" monster displays in VGA.
However, I don't believe this glitch has anything to do with the mains frequency 'cos TV's are computers and the refresh rate can be anything the graphics chip is capable of and is decoupled from the input.
So the DRM monster has turned round and bitten its creator. GOOD.
"...they see frames freeze or judder. Others report oddly-hued backgrounds"
Because of a DRM snafu over HDMI? I would have thought that would simply kill the link/playback altogether. The symptoms reported sound like a copy protection "issue" from the days of analogue cable telly.
"even your display has to be authorized and decrypt the input otherwise your 50" monster displays in VGA."
Yep - that is more a Sony mandated thing though - HDCP - much like Sony are also responsible for infecting all BluRay players with Cinavia....
You do realise that all Blu-Ray players have to have the HDCP DRM and it has nothing to do with this issue.
The Xbox One only enables HDCP on movies, which is why it can support external video capture out of the box.
On the other hand the entire PS4 output is under HDCP, which is why it doesn't support external game capture out of the box, and why one of the resolutions to the blue light error is to update your TV's firmware.
Absolutely no reason why anyone in the UK would pass their box through their Xbox (unless they only have a single HDMI input on their telly, I s'pose), but I gave it a go with a Virgin TiVo box and a newish Samsung telly and there was no juddering on either HD or SD channels. So it would seem to be a bit of a picky issue.
"Absolutely no reason why anyone in the UK would pass their box through their Xbox in the UK"
Rubbish - you can select between TV and Xbox - or snap it to one side with voice control - and it can power on / off your TV and receiver too...and you are ready for the full channel guide in early 2014...
"Rubbish - you can select between TV and Xbox - or snap it to one side with voice control - and it can power on / off your TV and receiver too...and you are ready for the full channel guide in early 2014..."
Let's see - switch between inputs with a button on my remote, or via an unreliable voice command, which even if it works means anyone in the room can switch inputs and power things on and off (yeah, the kids will find that *hilarious*). I'll keep the remote, thanks.
And why the hell would I want to watch TV at the same time, and on the same screen, as playing a game? That's just fucking stupid.
...there were games that needed the TV to be set at PAL-60. I only had one such game (The Outfit??) but it caused issues. Haven't seen any of those since 2006.
The vibe I'm getting from both Sony and MS is that no one likes proper product testing. And certainly not testing with anything that's more than 6 months old or that isn't just lying around campus.
After all, 60Hz is what the PS3 and 360 have been feeding UK tellies for years over HDMI. It seems to be more the case that the Xbox isn't handling the frame conversion from the 50Hz it's getting from the TV box to the 60Hz it outputs very well. Simple solution is to not use it as a leccy-guzzling passthrough, and get a HDMI hub if you've not got enough HDMI sockets on your telly (the well-made ones will switch signals automatically with devices being turned on and off too).
Yep, for years my AppleTVs 1 & 2 have been outputting 60Hz HDMI to my Samsung 40LE796, no problems whatsoever.
The ATV just has Ethernet/802.11 input and doesn't try to do video transcoding. (Any transcoding is handled nicely by offline batch processing with the free cross-platform utility from http://handbrake.fr/)
That's my understanding too. (I don't have an XBox to check.) Apparently the juddering is most noticeable when the TV is outputting smoothly scrolling text, and when the 50Hz TV input is snapped alongside a 60Hz game. The XBox repeats every 5th frame, which some people notice and others don't.
I have a copy of Sid Meier's Pirates on my Xbox 360 profile which, in order to play, I have to remove the HDMI and plug in my old SCART connector, because it only plays at 50Hz!
Well, that's it. The Xbox is trying to harmonise the frame rates for two different video sources. It's not really a surprise that one or the other will be affected.
In order to be able to simultaneously display a 50 fps and a 60 fps picture perfectly, you would need to output from the Xbox to the TV at 300 fps (so the 50 fps image would appear on 6 consecutive frames, and the 60 fps image would appear on 5 consecutive frames).
This would be beyond most TVs, even modern ones.
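A quick back-of-the-envelope check of that arithmetic (just the maths, nothing Xbox-specific): the 300fps figure is the least common multiple of the two rates.

```python
from math import gcd

def common_refresh(rate_a, rate_b):
    """Lowest output rate at which each input frame occupies an exact
    integer number of output frames (the least common multiple)."""
    return rate_a * rate_b // gcd(rate_a, rate_b)

rate = common_refresh(50, 60)
print(rate)          # 300
print(rate // 50)    # 6 -- each 50fps frame held for 6 output frames
print(rate // 60)    # 5 -- each 60fps frame held for 5 output frames
```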
@ Tom 260
<< get a HDMI hub if you've not got enough HDMI sockets on your telly (the well-made ones will switch signals automatically with devices being turned on and off too). >>
My experience is that some devices (like SkyHD and AppleTV for instance) are so keen to make sure that they are the device of choice that they don't release the HDMI properly and you have to manually switch away from them even when they're in standby. I don't know if they're not following the standard, but it seems to me that HDMI is a pretty sh*tty standard that should have been based on Ethernet cable in the first place, with options for copper or fibre. But then retailers would have lost the opportunity to sell naive buyers a 2 quid cable for 50 quid.
That could neuter any frame rate adaptation the TV is performing. Instead of being able to detect duplicates every 6 frames and deduce it's really 50Hz, the 60Hz elements make every frame unique. Sets are going to be optimised to handle 50<->60Hz conversion well, but only if they can detect the need to convert.
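For illustration, here's roughly how such cadence detection might work in principle (a sketch, not any particular TV's algorithm): a 50Hz source carried at 60Hz by frame repetition produces a duplicate every sixth frame, and compositing a 60Hz game alongside destroys that pattern.

```python
def looks_like_50_in_60(frames):
    """Heuristic cadence detection.  In a 50fps signal carried at
    60fps by repeating every 5th frame, every 6th output frame
    duplicates the one before it.  Once a 60Hz game is composited
    alongside, no two output frames match and detection fails."""
    dups = [i for i in range(1, len(frames)) if frames[i] == frames[i - 1]]
    # Duplicates at a regular 6-frame spacing suggest a 50Hz source.
    return len(dups) >= 2 and all(b - a == 6 for a, b in zip(dups, dups[1:]))

# 50 unique frames with every 5th repeated, as in the earlier sketch:
carried = []
for i in range(50):
    carried.append(i)
    if i % 5 == 4:
        carried.append(i)
assert looks_like_50_in_60(carried)
assert not looks_like_50_in_60(list(range(60)))  # native 60Hz: no duplicates
```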
Yes, but even PERFECT 50Hz to 60Hz frame conversion is RUBBISH
Especially for movement.
Almost all broadcast in Europe is 25fps interlaced at 50Hz. There is NO WAY AT ALL to display that as 60fps/60Hz progressive without degradation (blur or judder) of movement.
For any picture-in-picture with TV, the game needs to be running 50Hz/50fps progressive.
Also, ANY display of an interlaced broadcast source on a progressive-only panel, even at the same frame rate, is degraded compared to a CRT or a panel doing native interlace (with strobed backlight).
The very concept of the Xbox One is flawed in a world where we have 24fps, 25fps, 30fps, 48fps, 50fps and 60fps progressive, plus 25fps/50Hz and 30fps/60Hz interlaced. Not to mention all the different resolutions (more on broadcast than disc).
It's a pity with HD they didn't go for a single 48fps progressive 2136 x 1154 standard instead of the 1/2 dozen or so incompatible standards.
The original 30fps/60Hz and 25fps/50Hz rates had to be what they were because:
1) Interlace halves the transmission bandwidth - an analogue compression system, if you like (rough numbers sketched after this list). Progressive wouldn't have been possible for 40 years.
2) The field frequency had to match the lighting (or else you'd get strobe bars) and initially the local mains (so any hum bar on the picture would be stationary), hence 24fps/48Hz interlace wasn't possible. Video cameras are more sensitive to this effect than film cameras. Later, in the 1950s, TV power supplies and cameras got better, so the TV frame was no longer locked to the mains.
Modern TV was developed initially from 1934 to 1936, hence the frame rate limitation lasted nearly 20 years, and the interlace limitation until MPEG and digital compression were available.
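To put rough numbers on the bandwidth-halving point (using the 625-line/50Hz system as an assumed example):

```python
# Rough line-rate arithmetic for the 625-line/50Hz system.
lines_per_frame = 625
fields_per_second = 50

# Interlace sends half the lines (one field) per vertical sweep.
interlaced_lines_per_sec = fields_per_second * lines_per_frame // 2   # 15,625
progressive_lines_per_sec = fields_per_second * lines_per_frame       # 31,250

# Same 50Hz motion rate, half the line rate -- and analogue video
# bandwidth scales with line rate, so interlace halves the bandwidth.
print(interlaced_lines_per_sec, progressive_lines_per_sec)
```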
I'm assuming that it was you who down-voted me.
I was not suggesting frame conversion. I was suggesting using a frame rate between the Xbox and the TV that allows exact timing of both frame rates, to prevent the need to re-sample. This is why I chose 300 fps, as that is an integer multiple of both 50 and 60. It allows an EXACT number of frames for each of the different video sources. For the 50 fps source, you would leave each image up for six 300Hz frames, and for the 60 fps source, you would leave each image up for five 300Hz frames. A perfect fit, with no resampling, allowing both videos to be side by side at their native frame rates.
Of course it's completely impractical as well, and would only work for these two frame rates (or other divisors thereof).
If you assume both videos are interlaced, you could probably take that down to 150 fps, but that is a big assumption.
"In order to be able to simultaneously display a 50 fps and a 60 fps picture perfectly, you would need to output from the Xbox to the TV at 300 fps (so the 50 fps image would appear on 6 consecutive frames, and the 60 fps image would appear on 5 consecutive frames)."
No you don't. You reduce a series of frames to linear light, and then render them out at the new framerate.
I understand what you are saying; you've not understood what I said. But anyway, using linear light introduces an additional motion blur component, as you've effectively got to interpolate intermediate frames that do not exist at the re-sample point, and they will always be, in one way or another, a guess. Also, doing it in near real-time may require more compute power for HD video than is in the Xbox.
What I said would still work, although, as I also said, it is impractical.
I recall in the early days of Ethernet auto-negotiation you'd sometimes get bad handshaking that resulted in mismatched connections. If you manually set one end of the connection, things worked great. Want to bet it's something along those lines?
And while I'm on a rant, why the hell can't the HDMI video signal encode something to tell the display device what the proper aspect ratio ought to be? I hate flipping through 4 modes trying to figure out which one is the right one on something I've never seen before.
"I understand what you are saying, you've not understood what I've said. But anyway, using linear light is introducing an additional motion blur component, as you've effectively got to interpolate intermediate frames that do not exist at the re-sample point"
No you don't. Temporal aliasing can be controlled through use of motion-predicted shuttering.
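For what it's worth, here's a minimal sketch of the linear-light blending idea being debated (assuming a simple gamma of 2.2 and plain temporal blending; motion-predicted approaches, as mentioned above, are far more involved):

```python
import numpy as np

GAMMA = 2.2  # assumed display gamma; real pipelines use proper transfer functions

def resample_linear_light(frames, src_fps, dst_fps):
    """Blend-based frame-rate conversion done in linear light.

    For each output instant, blend the two nearest source frames by
    temporal distance.  Blending gamma-encoded values directly would
    darken the mix; converting to linear light first keeps the blend
    photometrically sensible.  The blend is still a guess at motion,
    hence the blur component argued about above.
    """
    frames = np.asarray(frames, dtype=np.float64)   # values in [0, 1]
    linear = frames ** GAMMA                        # decode to linear light
    n_out = int(len(frames) * dst_fps / src_fps)
    out = []
    for j in range(n_out):
        # Position on the source timeline, clamped to the last frame.
        t = min(j * src_fps / dst_fps, len(frames) - 1.0)
        i = min(int(t), len(frames) - 2)
        a = t - i                                   # weight for the later frame
        mixed = (1 - a) * linear[i] + a * linear[i + 1]
        out.append(mixed ** (1 / GAMMA))            # re-encode to gamma
    return out

# 50 greyscale 'frames' resampled to 60 -- shapes only, for illustration.
ramp = [np.full((4, 4), v / 49.0) for v in range(50)]
assert len(resample_linear_light(ramp, 50, 60)) == 60
```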
I was worried until I read that Studbucket 33 is on it. Microsoft have put their best man on the job, so I am somewhat reassured, but let's see how things pan out.