With the price of electricity always on the rise, the juice consumed by kit kept on standby can nibble away at the pennies. Now Toshiba has addressed this issue with a new chip said to create standby modes that require no power whatsoever. Zilch. Zip. Nada. The first telly to feature the new chip is the 32in Regza 32BE3, which …
really ? On El Reg ??
'zero watts' because it has already effectively stored up the power it would need beforehand.
i.e. saves FA.
Juuuust a sec...
"The telly's power management chip works with a high-capacity capacitor, which stores enough electricity to keep the set's infrared sensor active, while disposing of the need for constant power draw."
Where exactly does the energy for that capacitor to power the sensor come from, then? Unless they've made a whole new free-energy discovery they're not telling us about, it'll have been stored from the mains when the TV *was* turned on.
So while they might be drawing no mains power while the TV's turned off they're not actually using less energy overall; just drawing more off a bit earlier to use when needed.
The amount of energy required to charge the capacitor will be far less than the amount required to keep the transformer and associated circuitry working all the time.
Because the capacitor won't be losing charge and has a 100% efficient charge/discharge cycle...
My telly's been on "zero watt" standby for about ten years.
It hasn't even been plugged in ... No real point, nothing worth watching.
how do you know?
Do you check the schedules just in case?
How do I know, AC 12:47?
When out & about, I am unavoidably exposed to television.
It is, from my perspective, a vast wasteland.
>Then again, a saving of roughly £2 a year was never
>going to be a major selling point anyway
I don't know. Given the propaganda that has demonised standby mode, and the apparent innumeracy of the public it could be a major advertisable feature!
The point isn't the saving (of either money or power). The point is that standby mode is one of the witches-of-the-moment, one of those things that get unreasonable hatred and incredible misinformation, where people end up sincerely believing that shutting down a LED and an infrared sensor is going to actually have a measurable effect on anything. It is, in short, a very nice marketing target.
If it used that Nokia tech to gather power from the airwaves, then it would be zero-to-negative standby use. But this isn't that. It still takes energy to charge the capacitor, like any similar storage device. Conservation of energy. Great laws should not be broken.
You are totally ignoring the large inefficiencies involved when running a switchmode PSU with low drain.
An electrical engineer you are not.
It's about time manufacturers woke up and started building in switches to these devices that could be used to actually turn them off. I reckon that would be a very popular feature, currently missing from 99% of devices (AFAICS)
All you naysayers are jumping the gun...
What if it has a hand crank on the side to charge up the cap? ;-)
Now if the screen could act like a solar cell...
>Now if the screen could act like a solar cell...
We would not be able to see the X factor through it...
Didn't someone devise a means of burying a camera into the screen as well? I seem to recall it was one of Apple's patent crop.
Oh, and thumbs up for the X-factor comment.
Erm... why has standby mode EVER taken more than a few milliwatts anyway? 38kHz infrared sensor, tiny chip to handle processing it (literally in the 10p throw-away chip category), and a relay. When the sensor sees the right signal (and ONLY has to worry about the power button signal) the chip activates the relay. You can even get "all-in-one" infrared modules that do just that (and Maplin sells no end of kits that do exactly the same).
Those circuits? Milliwatts of power. Literally. Maybe you lose a little more for things like power-conversion from the mains to the circuit but easily solved with things like this - i.e. put a small rechargeable battery in your TV to run that circuit that charges from the mains only when necessary and you could run it for years so long as it powered the milliwatt circuit long enough to cope with your longest standby time (and it could be argued that if your TV goes to standby for more than a couple of weeks, it should really turn itself off anyway). We're talking about a pound's worth of the most cheap, easily available, reliable, replaceable, simple technology known to man - the sort of thing that even the most amateur of hobbyists could knock up for you in a day with bits from their junk boxes.
*THAT'S* what I don't understand about standby, not why it has to be "zero watts" standby at all (and if you want to be like that, my system could be just as "zero watts" as their capacitor system). If you can run it off a capacitor for that length of time, then why the hell is it pulling any significant amount of power from the mains socket at all? It's lazy engineering, maybe even deliberately "wearing" so you have to replace your TV, and there's never been any excuse.
Do you see people moaning that their car key-fob systems are so powerful they wear down the car battery? My Mondeo has IR-controlled remote central locking. What, precisely, is the difference between a remote IR-signal activating a door-lock/ignition that has to keep total offline car power usage WAY below 100mA at 12V (i.e. 1.2W at absolute maximum, and most of that is things like radio memory, LEDs, alarm, etc. and STILL doesn't hit that actual figure, that's just the maximum the manufacturer says you can trickle-feed from the battery at all times) and a TV standby mode? And my car is likely to sit for two-weeks unattended in an airport carpark, but my TV isn't (and I don't care if it has to switch off in that circumstance, whereas my car battery dying is a bit more critical).
Standby modes that take anywhere near 1W were always nothing more than incredibly poorly designed and built. That's why people got into the habit of leaving their TVs on standby back when people could build reliable electronic components that didn't suck power for no reason.
You simply don't understand the problem. It's easy to step down from 12V to 3.3V at 100mW or so (google TPS62120 for a high efficiency buck regulator example).
To make a 100mW *ISOLATED* supply from 230VAC involves running the switcher directly from the supply (at least for startup) through a resistor... and for a modern chip needing ~50µA startup current, that's more than 10mW wasted continuously. We haven't even mentioned magnetisation losses in the transformer yet.
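To put a rough number on that claim, here's a quick back-of-envelope check, assuming (as above) a ~50µA startup current fed through a resistor hanging off 230V mains:

```python
# Continuous loss in a startup (bleed) resistor feeding a switcher's
# startup pin from rectified 230 V mains. Assumed figure from the
# comment above: the chip wants ~50 uA of startup current at all times.
V_MAINS = 230.0      # volts (RMS); the resistor drops roughly all of it
I_STARTUP = 50e-6    # amps, drawn continuously through the resistor

p_loss_w = V_MAINS * I_STARTUP
print(f"Startup resistor loss: {p_loss_w * 1000:.1f} mW")  # ~11.5 mW
```

So yes: the startup path alone burns more than 10mW before the supply has done any useful work.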
auto light level
Another economy mode so annoying everyone just turns it off. Even on mobiles, where it makes a noticeable improvement to battery life, there are endless complaints if it can't be disabled. Still, that's another tick that lets them pass efficiency certifications that mean nothing in the real world.
The capacitor wheeze may be more sensible than it seems. Yes, they're lying about zero energy, but switching PSUs aren't very efficient at low drains. Completely turning the PSU off might save more than it costs to charge the capacitor at higher efficiency. Might.
Look at the time frames.
A couple hours each day of operation vs. 24-(a couple hours) each day stand-by. They probably needed to cut down the listening circuit a bit more so they wouldn't need too big a cap. I think the odds are a bit better than "might", though I'd like to see how they implemented the switch-on circuitry. Also what happens if you leave the thing off for a week. Will it switch on momentarily, or just fizzle out? Something for el reg to test in the upcoming review.
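For what it's worth, a rough sizing sketch suggests the cap needn't be enormous. All figures below are my own guesses (a 50µW listening circuit, a supercap used between 5V and 3V, two weeks of standby), not anything Toshiba has published:

```python
# Back-of-envelope sizing for the standby capacitor. Illustrative
# assumptions only: 50 uW IR-listening circuit, supercap charged to
# 5 V and usable down to 3 V, two weeks of standby to bridge.
P_SENSOR = 50e-6                # watts drawn by the listening circuit
V_FULL, V_MIN = 5.0, 3.0        # usable voltage window on the cap
T_STANDBY = 14 * 24 * 3600      # two weeks, in seconds

energy_needed = P_SENSOR * T_STANDBY            # joules
# Usable energy between two voltages: E = (C/2) * (V1^2 - V2^2)
cap_farads = 2 * energy_needed / (V_FULL**2 - V_MIN**2)
print(f"Standby energy: {energy_needed:.1f} J, "
      f"capacitor needed: ~{cap_farads:.1f} F")  # ~60 J, ~7.6 F
```

A single-digit-farad supercap is an off-the-shelf part, so the two-week figure is at least plausible.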
This is just complete marketing spherical objects.
As has already been mentioned, the system will still use exactly the same amount of power as the existing systems (and besides, I'm sure my current TV does this. Periodically a relay operates for just under a second, and I'm sure it does it to charge a capacitor to operate the standby function).
As for the "with two backlight modes that cut consumption by 50 and 75 per cent, respectively, by auto-adjusting the light intensity in response to what's being shown (or not) on the screen" Isn't this the same thing that everyone has been doing for years to claim 10000000000000000:1 contrast ratio?
In defence of this feature.
Quite right to point out that a capacitor only gives back what it was given beforehand. That was exactly my first reaction too. And quite right that drawing anything more while on standby is poor design, though probably inevitable given that the PSU must have both enough capacity and high efficiency, both in operation and in standby. That lower-end problem is now gone, freeing PSUs to get more efficient at their normal service levels again.
But the mere fact that they have finally gotten around to working out how to do it, then did it, is good news. It's only a little per set, but there are quite a few of them around, and now that they've done it once they can do it again, in other kit. Other manufacturers will follow, given enough demand. It's part of a drive that's long overdue. For my part, I'm cautiously happy that this start is finally happening. As such, it should be encouraged.
You're all missing the point...
...which is that it is actually very hard to design practical low power (microwatt) 50/60 Hz mains ac power converters. There are inherent losses in the rectification, smoothing, and switch mode (voltage change) circuits. At microwatt levels, the losses can very easily exceed the power actually needed by the downstream electronics by 10, 100 or 1000 times. Practical power converters in consumer electronics rarely consume less than 0.3W, the best ones about 0.1W.
The Toshiba innovation is very important. I suspect it will win awards. Parasitic mains power load is all over the place. It is bad in domestic environments but even worse in commercial environments. It is in desk phones, mobile phone chargers, network switches, emergency lighting chargers, environmental controls and commercial light fittings (i.e. with addressable DALI or DSI controls). Everywhere you look there is parasitic load burning 24 hours/day, even when the equipment is not being used or the premises are unoccupied. Most of them are not using the most efficient power converters and will burn ~1W per device. It doesn't sound very much per device, but it adds up. For example, a commercial office building with 2000 lights switched off will be using more power (on that one system) than a house with its lights, PC, refrigerator and TV switched on.
The reason why this is not an issue in automotive electronics is that the system is driven by a low voltage DC source. There is no power converter leaking energy, just the functional electronics of the device itself.
The saving, which in % terms is significant, is that Toshiba have put in a capacitor which captures energy when the appliance is running, and can provide the micropower very efficiently for the relatively large number of hours when it is in standby. So the standby losses go away (not the IR sensor circuit load, which is not the issue). If this is replicated in other consumer electronics and commercial equipment there will be substantial environmental and resource saving benefits.
In conclusion, it's an essential next step.
A cheap NiCd battery, like some of us used to have on our motherboards, that activates charging only when it's nearing zero battery life, and that the MILLIWATTS of power required to detect an IR remote signal and act on it can run off for months at a time without ever needing to charge (and can charge all the time when the TV is actually ON). Or Li-Ion. Or NiMH. Or lead-acid. Or, hell, just a couple of alkalines.
All standby NEEDS is a way to receive an IR signal and turn on. Everything else is wasteful, and we don't have the 100-second warm-ups that ancient CRTs could have (and they never had standby anyway) any more.
I had one PC with a NiCd battery running its CMOS (and charging whenever the computer was on) run for nearly a decade before it needed replacing (and then it was a two-second job). There's no reason for fancy capacitors (that WILL blow and be an unreliable and potentially dangerous component) when you could just stick a battery with an intelligent charging circuit in there.
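A quick sanity check on the "months at a time" claim, using assumed figures (one AA alkaline cell of ~2500mAh at 1.5V nominal, a 1mW listening circuit, and ignoring converter losses and self-discharge):

```python
# Runtime of a milliwatt-class standby circuit on one alkaline cell.
# Assumed numbers, not from the article: ~2500 mAh AA cell at 1.5 V
# nominal, 1 mW continuous load, losses and self-discharge ignored.
CAPACITY_WH = 2.5 * 1.5   # ~3.75 Wh in one AA cell
LOAD_W = 1e-3             # milliwatt-class IR-listening circuit

hours = CAPACITY_WH / LOAD_W
print(f"Runtime: {hours:.0f} h (~{hours / 24 / 30:.1f} months)")
```

That's roughly five months on a single cell before any top-up charging, so the basic arithmetic holds up.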
Why would you waste a chemical power cell on this
when a capacitor can do the job cheaper and better?
A 'battery' in this case would effectively be an over-powered chemical capacitor with very specialised recharge-circuit needs, shorter useful life and less efficiency so you are just throwing extra complexity into a solution that doesn't need it.
You mean two sheets of metal with some plastic between? Even electrolytics aren't exactly complex.
Nothing against Toshiba, but...
...I think you'll find Samsung did the same thing about 18 months ago on a whole bunch of their sets.
Their capacitor holds enough power to last about two weeks in standby, after which you have to press the power button on the set to start it.
As a few others have pointed out, the power saving is that it can completely turn off the PSU, which is far from efficient when providing tiny amounts of power.
This is why I think it's about time we had a new low voltage/current standard to go along with the 240V/13amp in homes. Say a smaller version of the UK 13Amp plug, it would even be good in cars, and better than those fag lighter plugs.
If every new home had a decent transformer (not a switched-mode job, but a real transformer) to provide say 12 volts and an amp or two, that could provide power to nearly all modern electrical devices (phones, routers, DVD players, LCD monitors/TVs, laptops, etc...). These days the only things that need 240V/13A are the big devices like vacuums, ovens, fridge-freezers and old TVs (like my 10-year-old 36" Toshiba CRT).
Think of the power saving from getting rid of the dozens of cheap, nasty power converters that waste more power as heat than anything else. It would even save money in manufacturing, as they wouldn't have to supply those nasty cheap unreliable power adaptors.
If you're going that route
wouldn't an integrated power/data network be a better idea? I suspect, though, that line loss would eat up any savings on the power side.
PoE for the standby mode. Also useful for remote wake-on-lan. Possibly integrated camera (Skype) remote security as well. I think Apple are onto something with their new TV project.
You're missing the point even more...
Ours is a high-energy society. We use orders of magnitude more energy than the generation which went before us, and, if humanity is to progress, we are likely to use even more by the next generation.
That means that there needs to be adequate provision made for generation and distribution of energy. But there are activists in our midst who hate the concept of continuous advance, and are trying to restrain human endeavour. The idea that minute amounts of energy should be saved is one of their proposals - as is the idea that we should all make do with continually reducing services of all kinds.
No engineer would complain about efficiency per se - and there will be occasions (particularly in the commercial field) where capacitor storage makes sense. But to push domestic TV sets with this feature is in danger of pandering to these activists, and accepting that ALL savings must be undertaken, no matter what they cost. This insidiously distorts the marketplace, and makes green activism the new norm.
You see it at work in many ways - water saving, for instance. We don't need every drop of water saved at vast expense - we need more reservoirs and better pipework. But so long as the activists can push the 'save it' meme, we won't get those.
You see it with CO2 as well. Even though the 'science' of global warming is well and truly disproven, people are still talking about cutting back on CO2 production, 'because it's pollution'. No it's not! It's essential plant food, and the level it has been for the last few thousand years has been pretty meagre for plants....
"the 'science' of global warming is well and truly disproven"
While I agree with some of your other points, the one above is... a stretch, shall we say? There's another science that conservatives tend to put in apostrophes and claim has been well and truly disproven: 'evolution'.
I don't know enough about the details of climate science to offer a rigorous opinion, but the hypotheses pass the sniff test, and when the same people who violently oppose the idea of evolution being a rational science also violently oppose the idea of climate change being a rational science, I tend to go on the side of the scientists.
Funny thing is
that by not having to supply milliwatts to the standby circuit at moderate efficiency, the PSUs in devices can then be optimised for providing the operating load as efficiently as possible. The considerable power savings this will provide aside, it also makes PSU design simpler, hence cheaper. In a competitive market, this will lead to your stuff being cheaper to buy.
But some people are against the idea of having more money, apparently.
I have been working at saving energy, water, etc. since decades before the greenwash started and it has absolutely nothing to do with the environment.
I agree that climate change, and specifically the extent of human influence upon it, is not 'well and truly disproven'. Perhaps that day will come.
However, to liken the situation with that of the 'debate' on evolution is a step too far. Opposing views to evolution, such as Intelligent Design, are driven exclusively by religious doctrine and the desire to permit Christian preaching to gain a foothold in the US education system.
In contrast, many people have rational concerns over the validity and impartiality of current climate research efforts. Whilst it is beyond reasonable doubt that CO2 has some impact on global heat retention, the apocalyptic predictions upon which trillions of tax dollars are already being committed appear to rely upon assumed yet ill-understood interactions of forcing processes as yet unseen in the real world.
It is fashionable to denigrate those who disagree with the mainstream view on AGW; however, I wonder to what extent that is an attempt to own the media, rather than the science.
Why don't they
just use the huge PSU caps that are already in there and charged, or is that too obvious
"just use the huge PSU caps"
They self-discharge far too quickly
Deception of the highest order
Ah, excuse me... 'The Problem' (saving the planet, etc.) is to be found at The Big End of the spectrum. Fiddling with The Tiny End (yes, even phantom power leaks with 24 hour duty cycle) is *extremely* misleading (utter deception). Exactly and precisely ZERO coal-fired power plants will be shut-down by such means. Believing that reducing a TV set's standby power from 1w to less than 1w is an actual accomplishment with any real-world impact is a *serious* cognitive disorder. It's the equivalent of an 8-inch diameter fart-can muffler on a clapped-out $500 Honda. It's a "False Finishing Touch" ™.
RE: Deception of the highest order
The power police will be around to your house checking that all of these potential power vampires are unplugged. They'll be the ones driving that big van.
It's always helpful to use some real numbers to see what difference it makes. For a typical scenario where the TV is on for 2 hours per day, and on standby for 22 hours per day, an 'on' consumption of 150W and a standby consumption of 1W, the TV energy will be 300Wh (on) plus 22Wh (standby). This means that the standby energy is 22/322 = 6.8% of the total. That is statistically significant and makes the effort to improve the design worthwhile.
It should also be noted that the standby circuits of set-top boxes, VCR (if you still have one) and hard drive recorder/DVD are often worse than 1W and their standby energy will be a bigger proportion of their overall energy consumption.
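The arithmetic above is easy to reproduce and fiddle with:

```python
# Reproducing the scenario from the comment above: TV on for 2 h/day
# at 150 W, on standby for 22 h/day at 1 W.
ON_HOURS, ON_WATTS = 2, 150
STANDBY_HOURS, STANDBY_WATTS = 22, 1

on_wh = ON_HOURS * ON_WATTS              # 300 Wh per day while on
standby_wh = STANDBY_HOURS * STANDBY_WATTS  # 22 Wh per day on standby
share = standby_wh / (on_wh + standby_wh)
print(f"Standby share of daily energy: {share:.1%}")  # 6.8%
```

Swap in your own set's figures (or a set-top box's worse-than-1W standby) and the share only grows.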
"...a typical scenario where the TV is on for 2 hours per day, and on standby for 22 hours per day..."
Well, here's the problem - you have 'off' and 'on' inverted. Bad initial assumptions another victim take...
@AdamJ "...That is statistically significant..."
6.8% of which total? Certainly not 6.8% of 'The Total Problem' ™.
And yet... There are some actual double-digit % of *all* global warming gas emissions that are low hanging fruit, and yet strangely untouched...
How about we institute carbon capture *just* (nowhere else) at all concrete factories? Such factories are almost always right next to railroad tracks, so the CO2 could be carried away by rail and fed into greenhouses for food, or algae farms to make fuel, or pumped underground for long term storage. Such an approach would result in concrete production changing to a net sucker-in of atmospheric CO2 as it ages.
How about we stop burning off methane from oil wells?
How about we ask Saudi Arabia not to run a 4-foot diameter oil pipeline over to Yemen or Oman to smelt Aluminium using fossil fuel? How about we mandate that all Aluminium smelting shall occur ONLY in Iceland?
How about we run some power lines from Iceland to somewhere that people actually live?
How about we fix cow farts with dietary change?
How about someone fix the smoke-belching bunker-C powered ships that emit oxides of nitrogen that are about 'a quadrillion' (not really) times worse than CO2?
We humans need to switch from making promises we can't keep (Kyoto) and making trivial window-dressing greenwash changes that don't matter at all... ...to taking actual easy, simple, cheap and effective changes that could knock 20-30% off 'The Actual Total' ™ with essentially no down-sides.