"The more you tighten your grasp, the more systems will slip through your fingers." -- Princess Leia
One more reason for users to cut the cable and go to streaming services. The broadcasters are just hastening their own irrelevance.
Jessica Rosenworcel, a commissioner at America's broadcast watchdog the FCC, has criticized a proposed set of TV standards as a "household tax," due to its lack of backwards compatibility. Addressing a conference of Catholic Bishops in Washington DC this week (we have no idea why either), Rosenworcel complained [PDF] that the …
One more reason for users to cut the cable and go to streaming services. The broadcasters are just hastening their own irrelevance.
How does that make any difference? You still pay the same content companies; the only possible saving is avoiding renting the cable boxes (which you can do already if you buy a TiVo and rent a CableCARD).
People who think streaming is a panacea are going to be in for a rude awakening in a few years. Already the streaming market is being fragmented, with Disney removing all their content from Netflix in a couple of years, for example. They'll start their own streaming service. CBS is trying to leverage Star Trek to push theirs, and if they succeed no doubt the other networks will do the same. Before long you'll end up paying more if you want to watch all the same stuff because of all the different subscriptions that will be required. The only people streaming will help is those who don't really care what they watch, they can subscribe to Netflix and just watch something and be happy, and won't care about all the stuff they can't get on it.
"The only people streaming will help is those who don't really care what they watch, they can subscribe to Netflix and just watch something and be happy, and won't care about all the stuff they can't get on it."
That's me. All the stuff I really want to watch, I already own (or will buy) on blu-ray. If Disney remove their stuff from Netflix, I won't miss them.
And don't forget about the data caps. Not many of us have generous data allowances, and some charge exorbitant overage fees.
Don't know about you, but here in Blighty it is very common indeed to have no data caps at all.
Which is nice.
Seriously? As in you can tear through 2TB of downloads in a single month and get no repercussions at all? I'd love to read the T&Cs of these contracts, then.
Yep they are definitely serious.
We are on Sky Fibre unlimited in the UK - no usage caps or traffic management. I can download at full speed 24/7 for a month with no drop in service if I needed to.
Check: http://www.sky.com/shop/terms-conditions/broadband/network-management-policy/ and scroll to the section for "Sky Broadband Unlimited, Sky Broadband Unlimited Pro, Sky Fibre Unlimited, Sky Fibre Max, Sky Fibre Unlimited Pro, Sky Ultra Fibre Optic products*"
The only listed drop in service is for external network issues or faults.
It's not even expensive really.
"The broadcasters are just hastening their own irrelevance."
by behaving like MICRO-SHAFT???
I already have to 'rent' set top boxes, even for a 3 year old LCD TV. It has HDMI output (and composite for older TVs). But, the remote control is a piece of CRAP, and 3 or 4 buttons practically don't work after only a couple of years of usage. TV remotes last 10 times as long. But "the cable" went to DIGITAL a while back. So I _have_ to have it.
So I expect "more of the same" then: a special box to decode whatever the hell they broadcast at you. Nothing different except the details, and the increased monthly bill. And not wanting to see anything on 3/4 of the available channels. And too many FEELING commercials injected into the content.
[I had to edit the topic, it was too long with the 'Re:' prepended]
What the heck are you downloading?
to hold Auntie Beeb up as a paragon of backwards compatibility?
If you're referring to the transition from analogue to Freeview, I think that was done quite well.
Freeview was around for a long time before they finally switched off the analogue signal. And a basic Freeview box was pretty cheap (I think there were even some help to buy schemes for the disadvantaged). Plus undeniably it was a big improvement.
There are some people using 1930s TVs with a Freeview box. Not bad for backward compatibility (OK, they're using a scan converter too...).
The trouble with having done that is that the reasons to upgrade beyond that become significantly less compelling to the end users. Freeview is still Freeview, which is excellent, plus they've managed to sneak in a couple of HD channels. That's all been handled reasonably well.
And of course what they're doing in America is the equivalent of turning off Freeview altogether and starting from scratch. Doing that here would result in the Daily Mail exploding in indignation...
"If you're referring to the transition from analogue to Freeview, I think that was done quite well."
Yup. My reaction was to wonder if they're still on analogue over there. We've actually gone through two transitions in post-war TV broadcasting: VHF/405 lines to UHF/625 and analogue to digital. Both were handled with sufficiently long transition periods so that anyone actually forced to buy a new TV must have been using a pretty ancient one on its last legs: my last analogue CRT set used with an STB had one colour channel die.
"Doing that here would result in the Daily Mail exploding in indignation..."
An immigrant breathing in does that.
No, the US turned off the analogue signals with the original ATSC transition last decade. Like your plan, there was a subsidy program to encourage people to buy ATSC tuner boxes for those who didn't want to give up their TVs. That program has since ended, as ATSC-capable TVs are now ubiquitous and inexpensive (think a little over $100 inexpensive).
The same issue would arise here if Ofcom mandated that all DVB-T and DAB broadcasts were switched off in less than 20 years from analogue changeover date (DVB-T2 was already out before then and analogue radio isn't going away any time soon)
Sales of DVB-T (only) or DAB (only) receivers have been nonexistent for a long time, but there are so many installed and in use that there would be consumer uproar.
In this case ATSC 3 isn't even released, but the proposed changes would render existing ATSC sets useless in a short period - and given the dearth of USA terrestrial broadcast channels(*), it gives cablecos who already have a de facto monopoly an opportunity to further lock in customers by forcing yet another expensive decoder box to be installed and rented.
(*) A large chunk of the few remaining NTSC terrestrial broadcasters simply switched off their transmitters entirely on ATSC changeover day, because the vast majority of viewers are on cable thanks to NTSC's wild colour shifts when any kind of signal multipathing (ghosting) happens(**)

(**) PAL fixes this by alternating the phase of the colour signal carrier on alternate lines, resulting in colour shifts on any line cancelling out the ones above/below and the overall picture displaying correctly at normal viewing distances. You can see the shifting in each line if you look closely enough. The other changes between NTSC and PAL are extremely minor(***)

(***) The basic colour tech is identical apart from the phase inversion. Using subcarrier suppression is a transmitter power-saving tweak, and the change in subcarrier frequency from 3.58 to 4.43MHz is a direct result of the difference in line numbers and frame rate of NTSC and CCIR mono broadcasts(****). Both derive the subcarrier from a multiple of the line rate.

(****) Some countries (France, Russia, a few others) messed around with CCIR sync pulse polarity, and in some extreme cases inverted the luminance levels (some of Eastern Europe) or used non-standard audio carrier separation (UK), to either keep out foreign broadcasters(*****) or lock in local manufacturing, but these changes are all relatively easy to dynamically accommodate even in 1960s-70s era sets. There are even a few wild-ass countries which used PAL-M (PAL-encoded 525-line 60Hz) or NTSC-4.43 (NTSC-encoded 625-line 50Hz)

(*****) And then there's SECAM, which is also CCIR-based, with the same tweaks as above, also used by protectionist countries (France and Russia in particular), and subject to even more region-lockin variants aimed at keeping foreign broadcasts out (which even in repressive states resulted in consumers 'acquiring' decoders for the other systems). Aren't you glad most of the world has settled on DVB-T/T2/S/S2 and that ATSC is really only used by a couple of markets? (USA and some of its puppet states^W^Wneighbours/allies) (Yes, there's ISDB, but that market is tiny and fragmented into islands of incompatibility. And you thought TV was TV was TV... :)
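The 3.58/4.43MHz figures mentioned above fall straight out of the line rates, as a quick back-of-envelope check shows (the multipliers come from the published NTSC/PAL specs; the variable names are mine):

```python
# Sketch of how the two colour subcarrier frequencies derive from line rates.

# NTSC: line rate is tied to the 4.5 MHz sound carrier (4.5 MHz / 286 lines)
ntsc_line_rate = 4_500_000 / 286                 # ~15 734.27 Hz
ntsc_fsc = (455 / 2) * ntsc_line_rate            # half-line offset interleaves chroma with luma

# PAL: 625 lines x 25 frames/s gives a 15 625 Hz line rate
pal_line_rate = 625 * 25                         # 15 625 Hz
pal_fsc = (1135 / 4 + 1 / 625) * pal_line_rate   # quarter-line plus 25 Hz offset

print(round(ntsc_fsc))   # 3579545 -> the familiar 3.58 MHz
print(round(pal_fsc))    # 4433619 -> the familiar 4.43 MHz
```

The fractional multipliers differ (455/2 vs 1135/4) precisely because the line counts and frame rates differ, which is the point the comment above is making.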
> If you're referring to the transition from analogue to Freeview, I think that was done quite well.
Some aspects were rather poor, IMHO. Starting in the North of the country meant that the (on average) poorer towns paid the expensive 'early adopter' prices for the kit. By the time the rollout reached the (on average) more affluent South, 4 years later, the kit was half the price or less.
For me personally, I'm on the Hannington transmitter but this was on 1/4 power for 4 years because of "the risk of interference to reception in Guildford". Quite why one of the richest towns in the country needed mollycoddling I'm not sure. In the end I went to Freesat because it was simpler than dicking about trying to understand whether it was my aerial or the transmitter.
"The same issue would arise here if Ofcom mandated that all DVB-T and DAB broadcasts were switched off in less than 20 years "
Well, in fact some changes did make some boxes useless well before that timeframe! The wedge shaped Pace DTVA-T box was one such thing I had to throw out, and there were many more modules consigned to the dustbin or hopefully recycled!
"Well, in fact some changes did make some boxes useless well before that timeframe! "
Yes, I'm well aware of that, having had one such box go titsup - but that failure was a direct result of the manufacturer making assumptions that a field size would never change despite there being provision in the standard for it to do just that, not a change in the technology.
A bunch of other boxes (SetPal based) went titsup when the size of the program index pages increased and this was down to the exact same cause.
Reality says that we replace our set-top boxes around once a decade, if not more often. The number of 1980s-era TV sets still in use is minuscule, let alone anything older, and even my late-2000s LCD "HD-ready" set is on its last legs already (the built-in DVD player packed up years ago and the CCFL lamps are going pink)
> Some aspects were rather poor, IMHO. Starting in the North of the country meant that the (on average) poorer towns paid the expensive 'early adopter' prices for the kit. By the time the rollout reached the (on average) more affluent South, 4 years later, the kit was half the price or less.
I live about as far south (south west) as it is possible to get. My local transmitter - Caradon Hill - was switched over to digital in Aug/Sept 2009 and was the last transmitter in the SW to be switched. This was a couple of months before Winter Hill (Manchester and area) and about 2 years before Emley Moor (Yorkshire). The North East of England was one of the last areas to undergo switchover, after London, with Northern Ireland being the very last.
Not exactly much of a north/south divide at work. Much of the south west isn't known for its affluence...
To be honest, by the time digital switch over actually rolled around, Freeview kit was already cheap and had most of the bugs ironed out. It was a full 11 years after it first launched, after all.
I suspect that "new" video display boxen (aka TVs) will attempt to be compatible with new standards, but you never know. In my house, I still have (count 'em) 5 NTSC-only TVs. They still work quite nicely with the TiVo box that emits proper signals. Yes, I do have a bunch of "adapters" (with enough $40 coupons you can get quite a few) and a single W I D E screen video display box for watching sporting events at times (it also makes a great display for a Raspberry Pi).
Someone should have designed the ATSC standard to last a bit longer. NTSC lasted over 60 years in one form or another, and served us quite well. One thing I learned is that we humans can interpolate quite a bit in the visual field, and while some things need lots of resolution (computer monitors seem to be high on the list), entertainment TV got by quite well at 480p resolution for quite a while!
So, life goes on and another standard goes obsolete. (*SIGH*)
Someone should have designed the ATSC standard to last a bit longer.
I think the problem here (and it's not specific to ATSC, I could also mention DAB and DVB-T) is that until the mid 1980s, analogue was all we had (or at least all that was practical) and there was very little you could do to improve analogue TV without also consuming oodles more bandwidth. Japan had Hi-Vision while Europe had PALplus and D2 MAC, none of which really took hold.
From the 1980s onward, the mandatory SCART socket on televisions began to make people realise that their ordinary TVs were capable of extremely good pictures - some home computers could send RGB to a SCART socket, as could some video games consoles, and of course, eventually, DVD players. I have a theory that one reason DVD took so long to get going was that left-pondians didn't have the advantage of an RGB connection via SCART. With US TVs mostly only having composite or s-video connections, the picture quality improvement of DVD over VHS and particularly Laserdisc wasn't as apparent as it was to us Europeans. I know some US TVs had "component" inputs, which would have done the trick, but few DVD players had component outputs I think.
Where was I?
Oh yes, the difference now is that since digital processing of video has become relatively trivial, it's also trivial to keep making it better. A few years after one standard is set (say, MPEG1 layer 2 for audio as used by DAB) another one comes along which offers either higher quality for the same bitrate or the same quality in fewer bits, or a lower decoding burden meaning it runs better on low-power devices, or all three at once. The same is true of transmission standards, as exemplified by the differences between DVB-T and DVB-T2.
Somebody pointed out the well-managed transition from analogue terrestrial broadcasting in the UK to digital, but they failed to point out that there is a digital-to-digital transition under way as we speak. In some ways this is similar to the ATSC to ATSC-3 transition, but the difference is that DVB-T forces broadcasters to work together (effectively, many producers share one transmitter and thus one method of transmission) while ATSC was set up specifically to allow individual broadcasters to maintain sole control of their own transmissions.
In the last very few years, streaming has become a practical delivery method too, and this also alters the landscape. If traditional broadcasters are not to wither, they need to adapt, and adopting new transmission methods, particularly if they enable easier integration with net-connected services, could be useful.
Alongside the improvements in technology of course has come a vast reduction in the cost of receiving equipment. Even back in the early 1980s, a normal (for the UK) size colour TV probably cost in the region of a week's wages for most middle-class people. These days, when you can buy a connected, full HD TV for under £200 - even a newly qualified teacher can earn that in a couple of days - the TV has turned from a "consumer durable" expected to last perhaps 10 years alongside the 'fridge and the oven into a commodity item, and manufacturers are able to produce them at such low prices partly because they expect repeat business every 3 to 5 years.
That's my 2p anyway, sorry if I'm late into this argument!
Oh, you also said
entertainment TV got by quite well at 480p resolution for quite a while
Firstly, it was 480i - there is a big difference between interlaced and progressive scanning and secondly, those of us in 50Hz countries actually had a few more lines of resolution (for home-grown programming anyway) at 576i.
"I know some US TVs had "component" inputs, which would have done the trick, but few DVD players had component outputs I think."
I think the problem was the other way around. Component outputs became ubiquitous first (easier to do on the player end with the right board and chip designs), but as people held on to their old CRT TVs that may have only had composite input (or nothing but the RF antenna input if they were really old or cheap), adoption was pretty slow until the PlayStation 2 provided another way in (come for the games, which didn't need ultra-high-quality TVs, stay for the movies).
"Firstly, it was 480i - there is a big difference between interlaced and progressive scanning and secondly, those of us in 50Hz countries actually had a few more lines of resolution (for home-grown programming anyway) at 576i."
NTSC was lower resolution, higher rate. PAL was the opposite.
NTSC - Never The Same Colour (color)
PAL - Picture Always Lousy
That's what I was taught.
ATSC version 3 is better than NTSC
Almost Twice the Same Colour
That's mainly because the standard was way ahead of video technology of the day; it wasn't until the late '80s that televisions could even show off the full fidelity of the standards. Admittedly, for their time, both NTSC and PAL were good technology that used an enormous amount of bandwidth to make up for their simplicity. Raw NTSC is about 50-100 MB/s, depending on how accurate you want color to be, meaning that you could store a whole 1.5-3 minutes of raw video on a DVD-9. It took a LONG time to outgrow that, but once HD showed up, that was that.
On the other hand, there's now lots of investment in continually improving the state of the art, and where ATSC could meet the needs of HD easily, it's again not going to work for 4K or HDR/deep color. This changeover is as much consumer-driven as industry-driven.
It's not like ATSC 1 barely came into being and now it's time to toss it, it's over 20 years old as well (though the H.264 extension is only 10 years old). By the time the new standard is ratified and anyone starts broadcasting with it, we're probably looking at another decade at least. There's only so much future-proofing you can put into digital technology with fancy algorithms, since it still has to be cheap enough to purchase early on.
"those of us in 50Hz countries actually had a few more lines of resolution (for home-grown programming anyway) at 576i."
Except we didn't, because PAL (and SECAM) has half the vertical colour resolution of NTSC, so what we saw was effectively 288i-and-a-bit despite the nominal 625-line frame (576 visible).
The startling improvement in RGB video quality from computers and DVD players over PAL was due to the change from that effective resolution to a true 576i.
SECAM = something essentially contrary to the american method
PAL = people are lavender (also peace at last)
"It's not like ATSC 1 barely came into being and now it's time to toss it, it's over 20 years old as well "
The issue is not the age of the existing digital standard, it's the time taken since the last time that people were forced to upgrade their sets or settop boxes on pain of them no longer working.
SECAM - System Entirely Contrary to the American Method
> I have a theory that one reason DVD took so long to get going was that left-pondians didn't have the advantage of an RGB connection via SCART.
Y-C component (S-Video) input was fairly common on North American televisions by the mid-90s. While not as good as RGB signaling over SCART, it was good enough for the televisions of the era when viewing DVD movies. I'd argue that cost was the initial barrier to adoption of DVD.
Where S-Video was noticeably inferior was with game consoles and home computers. The colorspace and chroma bandwidth limitations were more of a hindrance with true RGB/I sources.
Re Alan Brown:
"Except we didn't, because PAL (and SECAM) has half the vertical colour resolution of NTSC, so what we saw was effectively 238i-and-a-bit despite the nominal 625 line frame (576 visible)."
Actually even that's not quite true - that did not apply to the luminance (brightness) signal at all, which is what gives those systems their higher resolution. The colour itself is literally smeared on top of that, at a very low, smudge-like resolution.

The horizontal chrominance bandwidth of analogue colour television was also greatly restricted, to about 1 MHz, due to the colour difference signals being placed on a 4.43 MHz subcarrier (PAL colour); the luminance channel alone provided the fine detail. The vertical reduction in colour resolution for PAL-D and SECAM wasn't a problem - the notion being that if colour resolution could be reduced horizontally (as it always was), it could be similarly reduced vertically too.

It should be noted that cheap, simple PAL receivers with no phase errors and a correctly adjusted decoder wouldn't suffer from decreased vertical colour resolution at all compared with NTSC!

It was just delay-line averaging of the chrominance signal, after alternate-line phase inversion of R-Y, which reduced the resolution on more advanced decoders; phase errors then translated into changes in colour saturation rather than hue. Perhaps modern techniques could even correct any phase errors without such delay-line averaging.

It should also be remembered that digital colour television ALSO reduces the colour resolution in order to preserve bandwidth, and the only thing running at (almost, because digital TV is lossy compression) full bandwidth is the luminance content. All this simply takes into account the eye's inability to discriminate high-resolution colour, horizontally or vertically. I'm sure one could easily sum a luminance signal from an RGB source and use it to re-create the R-Y, B-Y and G-Y colour difference signals, to see what kind of colour definition is transmitted on digital TV - in my opinion, standard-definition colour resolution on DVB TV often seems worse than PAL ever was!
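For what it's worth, the colour reduction in digital TV described above is typically 4:2:0 sampling: luma kept at full resolution, each colour-difference plane stored at half resolution both horizontally and vertically. A minimal sketch of the arithmetic (4:2:0 assumed; the function name is mine):

```python
# Sample counts for a 4:2:0 frame: full-resolution luma (Y),
# quarter-resolution chroma planes (Cb and Cr).

def plane_sizes(width, height):
    """Return (luma_samples, samples_per_chroma_plane) for 4:2:0 sampling."""
    return width * height, (width // 2) * (height // 2)

y, c = plane_sizes(720, 576)   # a PAL-resolution SD frame
print(y)   # 414720 luma samples
print(c)   # 103680 samples per chroma plane - a quarter of the luma count
```

So each colour-difference plane carries only a quarter as many samples as the brightness plane, much as the analogue systems traded away chroma detail.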
> The issue is not the age of the existing digital standard, it's the time taken since the last time that people were forced to upgrade their sets or settop boxes on pain of them no longer working.
Like I said, what's the point? By the time the standard is hashed out, ratified, implemented, and finally cut over, you're looking at a minimum of another decade, maybe even two. But thanks for ignoring that.
This change may lead me to drop my TV subscription, depending on the cost of replacing the TVs. I have one aging 42" LCD TV and a new(ish) 55" cheapo. It is entirely possible that when those fail I won't replace them. I'll be Internet-only then. I don't expect the savings will be huge though. As more people go Internet-only, the price will go up.
>> it will allow 4K and 3D shows
Is 3D really still regarded as a selling point? I don't personally know anyone who wants it and most of my friends & acquaintances are TV/film/AV buffs.
It's just a subsidy for the electronics industry. It rewards South Korea for letting us further militarize the area. As you point out, it's not like any large number of people care about 3D. While 4K isn't quite DOA, there's not a lot of interest, either.
They got a bump eliminating analog TV. Guess they're looking for another.
Are there are any manufacturers still producing 3D TVs for *any* market? I thought it had reached its useless-gimmick date and had been abandoned by all.
4K is far from DOA. Soon you won't even be able to find anything below 4K in TVs over 40 inches, except for the very cheapest bargain brands
3D is likely DOA for most as it generally does very little for most broadcasts. Also, it is known to make some people sick.
4K strikes me as not terribly useful for most people even if they have equipment and a 4K signal. Part of the problem is the physiology of the human eye and one's ability to focus. Also, I suspect the higher resolution would be pointless at the distances many are from the boob tube when they are watching.
In fact, manufacturers were seeing flat sales after the big switch to ATSC, mandated by the government. The industry tried to goose sales back then with 3DTV; it failed, and they're trying again now with 4K. I guess it isn't working, as they've got the government to mandate the initiatives in such a way as to make all of our TVs obsolete.
Yep, 4K sure makes all of our 1080i cable video look really awesome!
I wonder if they've thought of putting out content that's actually worth watching.
There are lots of questions around 4K, but whether you actually get better quality can be resolved quite easily.
If you want to see where 4K works, then I suggest you view the opening credits to Stranger Things season 1 in full HD and in 4K.
Once you have done that, you will see the difference.
"I suggest you view the opening credits"
Opening credits? Aren't those the braking zone for fast forwarding through the adverts that were recorded before the programme started?
Just about every Samsung TV I've looked at lately supports 3D. But while it's listed in the specs, it's not a big bullet on the features list in the store or online short description. Don't know if that's true generally.
Is 3D really still regarded as a selling point?
3D was a "nice to have" when it didn't add more than a few quid to the cost of a TV. Passive systems had their problems but they worked really well and in particular the glasses were cheap (and compatible with RealD cinemas).
We have such a TV at home, and a reasonable selection of films. The main problem we find is that you have to "watch" a 3D programme - it's impossible to have it on and do something else at the same time.
As far as I'm aware there isn't a single manufacturer offering a 3D TV in the UK domestic market at the moment, so I really don't know what we'll do when our TV dies. Perhaps by then it'll be back in fashion.
3D seems to be hanging on in cinemas, the problem there being that they charge too much extra. People might be willing to spend it for a big action movie, but 3D adds relatively little to a RomCom.
At work we show occasional films to the public. We have a licence which allows us to do so, so long as we don't charge. Some of these are 3D and while people don't seem to be put off by a 3D film, unless it's a special event they don't seem to go out of their way to attend our 3D screenings.
We are in the middle of a system upgrade at the moment. Our existing passive 2-projector system is being replaced by a 1-projector system. The polarising filter for this system retails at around £4,000 ex VAT.
"but 3D adds relatively little to a RomCom"
It depends just how "romantic" the action gets...
4K DOA? Haha I actually intentionally downscale 4K content. I don't want to look at people under a microscope. 4K is great for car chases, but it's horrible when you see how bad your favorite actress's skin looks when displayed as a close up on a 65" screen from 2 meters. 4K is absolutely horrible.
And I saw a few 3D movies and I actually stopped going to the movie theater because of them. I'd rather watch a film on Oculus Rift if I want it huge. In fact, it costs about the same to Oculus as it does to go to the movies and have snacks a few times a year.
4K DOA? Haha I actually intentionally downscale 4K content. I don't want to look at people under a microscope. 4K is great for car chases
Personally, I'd rather spend the bandwidth on true 100Hz progressive scanning than on upping the resolution. High frame rate video takes a little getting used to, but it makes a much bigger difference to fast-action sequences than a few more pixels do.
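A rough sample-rate comparison backs up that trade-off: doubling the frame rate of 1080p costs half as many raw samples per second as stepping the same 50Hz signal up to 4K (progressive scan assumed throughout; the names are mine):

```python
# Raw pixel throughput of 1080p at 100 Hz vs 4K at 50 Hz, progressive scan.

def pixel_rate(width, height, fps):
    """Raw samples per second for a progressive-scan signal."""
    return width * height * fps

hd_100 = pixel_rate(1920, 1080, 100)   # 1080p at 100 Hz
uhd_50 = pixel_rate(3840, 2160, 50)    # 4K UHD at 50 Hz
print(uhd_50 // hd_100)   # 2 -> 4K50 needs twice the raw samples of 1080p100
```

So high-frame-rate HD is the cheaper of the two upgrades in raw bandwidth terms, before compression even enters the picture.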
Biting the hand that feeds IT © 1998–2017