One of the problems with using LED-based lamps to replace incandescent or fluorescent lamps is that they're expensive: not only do they need more electronics than the alternatives, LED efficiency is capped by a fall in light output at higher current. To answer the so-what question: getting rid of the "droop effect" would let manufacturers design LEDs that minimise non-radiative recombination and run them at higher currents. Less heat will be welcome too, and a smaller heat sink will also cut costs.
No, heat is not light. Thermal radiation is light (mostly infrared), but heat can also be transferred in other ways, e.g. conduction or convection.
And you don't have to transfer "heat" at all: a single hydrogen molecule in a vacuum can be vibrationally excited or not, so it carries more or less thermal energy, without any photons being involved.
> Heat _is_ light .. just at a lower frequency.
Um, not exactly. Heat is energy transferred from one body to another due to the two bodies being at different temperatures. The mechanism of the energy transfer -- whether it is infrared light, visible light, kinetic energy, etc. -- is not relevant to the question of whether or not it is heat.
Manufacturing is driving costs too. An LED's layers are grown at very high temperature, so a big chip would warp and crack when cooled; this limits production to small dies on synthetic sapphire. I suspect one reason the US DOE is looking specifically at higher intensities is to use LEDs to pump lasers.
You need to conserve both energy and momentum. When an electron and hole combine, the energy is usually emitted as a photon (light) in an LED.
However, a photon has very little momentum compared to what the electron and hole had. That's why only SOME semiconductors are good at emitting light--the ones with what is called a "direct bandgap" where it's possible for the electrons and holes to combine with a small net momentum.
An "indirect bandgap" material, such as silicon, has the low point of the conduction band (where the electron is) offset in momentum space from the high point of the valence band (where the hole is). If their recombination energy is given up as a photon, momentum can't be conserved. So the energy is usually given off as a PHONON, which is a vibration of the crystal lattice (i.e. heat) which can conserve both energy and momentum.
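The momentum mismatch above can be put in rough numbers. A quick sketch, assuming a ~2 eV photon and an electron at the Brillouin-zone edge of silicon (lattice constant ≈ 0.543 nm); the constants are textbook values, not from the article:

```python
# Rough comparison: photon momentum vs. electron crystal momentum
# at the Brillouin-zone edge of silicon. Textbook constants.
HBAR = 1.0546e-34      # reduced Planck constant, J*s
C_LIGHT = 2.998e8      # speed of light, m/s
EV = 1.602e-19         # joules per electron-volt

photon_energy_J = 2.0 * EV              # ~red photon, typical LED bandgap energy
p_photon = photon_energy_J / C_LIGHT    # photon momentum p = E/c

a_si = 0.543e-9                         # silicon lattice constant, m
p_electron = HBAR * 3.14159265 / a_si   # zone-edge crystal momentum, hbar*pi/a

print(f"photon:   {p_photon:.2e} kg m/s")
print(f"electron: {p_electron:.2e} kg m/s")
print(f"ratio:    {p_electron / p_photon:.0f}x")
```

The zone-edge momentum comes out hundreds of times larger than the photon's, which is why an indirect-gap recombination needs a phonon to balance the books.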
If, instead of exciting the lattice of atoms or emitting light, the energy is used to excite a free electron to carry away the energy and momentum, that's an Auger recombination. In general "Auger" refers to events involving multiple electrons. At high power, LEDs have plenty of excess electrons in the depletion region (the junction of the diode where recombination takes place) so it's more likely one of them will be nearby to accept the energy given up.
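The three recombination channels in this thread are often summarised in the empirical "ABC" model, where efficiency droops once the Auger (Cn³) term overtakes radiative (Bn²) recombination. A minimal sketch with illustrative coefficient values (typical orders of magnitude for nitride LEDs, not measured figures from the article):

```python
# ABC model of LED internal quantum efficiency (IQE) vs carrier density n:
#   A*n   : non-radiative defect (Shockley-Read-Hall) recombination
#   B*n^2 : radiative recombination (the light we want)
#   C*n^3 : Auger recombination (dominates at high n -> "droop")
# Coefficients are illustrative orders of magnitude only.
A = 1e7      # 1/s
B = 1e-11    # cm^3/s
C = 1e-30    # cm^6/s

def iqe(n):
    """Fraction of recombination events that emit a photon, at carrier density n (cm^-3)."""
    radiative = B * n**2
    total = A * n + B * n**2 + C * n**3
    return radiative / total

for n in (1e17, 1e18, 3.2e18, 1e19, 1e20):
    print(f"n = {n:.0e} cm^-3  ->  IQE = {iqe(n):.2f}")
```

With these numbers IQE peaks near n = sqrt(A/C) and then falls as the cubic Auger term wins: that falling tail is the droop effect the article describes.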
QUOTE: "The result is that some of the electrons' energy arriving in the LED's drive current is given off as heat instead of light."
Kind of obvious, since my add-on headlight LEDs require a copper bar to conduct the heat away from the LED arrays, and my ambient is often around 40°C.
Now, if I could figure out how to boil water with the dissipated energy on my motorscooter I could make cafe sua da on the go.
I know the reasoning is cost, but do you need to light your whole room with one LED? Kind of the same mentality that puts a ridiculous amount of medication in one pill instead of just taking two?
Maybe efforts to reduce the cost of current high-brightness LEDs instead of making existing ones brighter at higher current would be better? They're already bright enough to make your eyes water as it is...
"[...] but do you need to light your whole room with one LED?"
If you are building a projector, you want your light source to be as close to a point source as possible (think movie theaters, video projectors in your office, etc.)
If you are building a light source designed to illuminate things far away, you want your light source to be as close to a point source as possible (think flashlights, car headlights, spotlights.)
If you are building an architectural wonder, you want your lights small so you can better hide them.
If you are pumping a military weapons-grade laser's gain medium, you want your sources as small as possible.
If you are driving an optical fiber, you want your light sources as small as possible.
If you are making an LED video display, you want as much light as possible out of each tiny pixel (ESPECIALLY if you are going for sunlight readability).
I learned a principle of lighting efficiency first hand decades ago. My first house had a bathroom lit by a 100 watt bare bulb. We replaced it with a strip of five 25 watt frosted candelabra bulbs: too dull. Ended up with five 40 watt clears, 200 watts to produce the same brightness as a single 100 watt bulb, and that's in a small reflective room. In a very large space, on a stage, replacing strip lights ("x-rays") of about 16x300 watt reflector bulbs with two 1000 watt scoops produced more light, in this case 2000 watts vs. 4800.
There's a principle at play here that will undoubtedly apply to LEDs as it does to existing lighting: fewer, higher-output light sources will be more efficient than numerous smaller ones.
But it's harder than having a single coherent/point source to start with.
For standard lighting, spreading more LEDs over a larger surface area isn't generally a problem (look up "corn cob" lamps to get an idea of how some makers get around the issue; the same principle is used in fluorescent batten replacements and 600x600mm ceiling tiles). Point sources in domestic environments are more of a nuisance than anything else.