The hot/cold factor and LED performance

Datasheets can’t tell you everything you need to know about LED performance, and under real operating conditions, sometimes fewer LEDs deliver more usable light

BY MUHAMAD MOUSSA
Future Lighting Solutions
Pointe Claire, Quebec, Canada
http://www.futurelightingsolutions.com

Power saving is one of the key selling points associated with LED technology. Compared to conventional light bulbs, LEDs provide a significant reduction in the power consumption of lighting and an increase in the lighting system’s efficacy, which is specified in lumens/watt (lm/W).

Although the LEDs’ advantages are significant, one limitation is that, for the same drive current, an LED’s light output decreases as its junction temperature increases. This change results in a drop in light output and efficacy.

To compensate for this phenomenon, designers often resort to using more LEDs driven at lower currents to maintain a reasonable junction temperature. Use of multiple LEDs potentially consumes additional power and increases system cost. However, improvements in a performance specification known as the hot/cold factor can minimize this impact and enhance the system’s performance.

What is the hot/cold factor?

The term that describes the amount of light-output degradation as a function of junction temperature is known as the hot/cold factor. The industry has no standard for the temperatures at which the hot/cold factor must be defined. The lower temperature is always 25°C (nominal room temperature), but the higher temperature can be any value within the limits of the LED. For the purpose of this article, we define the hot/cold factor as the ratio of the light output at 100°C to that at 25°C.

Figure 1 shows a typical plot of normalized luminous flux versus thermal pad temperature; thermal pad temperature is equivalent to the LED’s junction temperature under very short pulse test conditions.


Fig. 1. In the hot/cold factor curve shown, normalized luminous flux is plotted against the thermal-pad temperature, which is equivalent to the junction temperature for very short pulses.

At 25°C, the normalized luminous flux is 1, and at 100°C it is 0.84, yielding a hot/cold factor of 0.84. This means the LED loses 16% of its nominal light output when it operates at a thermal pad temperature of 100°C.
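The calculation is simple enough to sketch in a few lines of Python; the 1.0 and 0.84 normalized-flux values are read off the curve in Fig. 1:

```python
# Hot/cold factor: ratio of the light output at the hot reference
# temperature (here 100 C) to the output at 25 C.
def hot_cold_factor(flux_hot: float, flux_cold: float) -> float:
    return flux_hot / flux_cold

factor = hot_cold_factor(flux_hot=0.84, flux_cold=1.0)  # values from Fig. 1
loss_pct = (1.0 - factor) * 100

print(f"hot/cold factor: {factor:.2f}")         # 0.84
print(f"light lost at 100 C: {loss_pct:.0f}%")  # 16%
```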

The hot/cold factor’s impact

At first glance, the impact of a 16% reduction in light output on one LED may not appear significant. However, a more severe impact can be observed when considering a light fixture consisting of multiple LEDs. The impact of the hot/cold factor can be seen by comparing its effect on a flashlight using one LED versus a recessed downlight that uses 10 LEDs (see Table 1).


To a typical user, a 16-lm reduction in the light output of the flashlight doesn’t have a severe effect on its performance. However, a reduction of 160 lm in the light output of the recessed downlight is readily apparent, and so requires adding one or more LEDs to compensate for the light loss.
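A quick sketch makes the scaling effect concrete. The 100-lm-per-LED rating below is an assumption, chosen to match the 16-lm and 160-lm losses quoted above:

```python
HOT_COLD_FACTOR = 0.84  # typical value from Fig. 1
FLUX_PER_LED = 100.0    # lm at 25 C per LED (assumed rating)

fixtures = {"flashlight": 1, "recessed downlight": 10}

for name, n_leds in fixtures.items():
    cold = n_leds * FLUX_PER_LED            # total flux at 25 C
    hot = cold * HOT_COLD_FACTOR            # total flux when hot
    print(f"{name}: {cold:.0f} lm at 25 C, {hot:.0f} lm hot, "
          f"loss {cold - hot:.0f} lm")
```

The same 16% loss costs the flashlight only 16 lm but the 10-LED downlight 160 lm.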

As a result, the overall power consumption and cost of the recessed downlight would increase. Energy Star imposes strict efficacy requirements on LED luminaires, and such a reduction in light output can make those requirements harder to meet.

Improved hot/cold factor

Recent developments in LED technology at the epitaxial level, in phosphors, and in die-attach methods have resulted in a significant improvement in the hot/cold factor.

Some high-power LEDs currently in the market have a hot/cold factor of approximately 0.94. This means that the LED will lose only 6% of its nominal light output when it operates at a thermal pad temperature of 100°C. Figure 2 compares the light-output degradation as a function of the thermal-pad temperature for typical and improved hot/cold factors.
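The gap between the two factors can be sketched as a linear interpolation between the 25°C and 100°C reference points. The real curves in Fig. 2 are not perfectly linear, so this is only an approximation:

```python
def relative_flux(tj_c: float, hc_factor: float,
                  t_cold: float = 25.0, t_hot: float = 100.0) -> float:
    """Normalized flux at junction temperature tj_c, assuming a linear
    fall-off between the 25 C and 100 C reference points."""
    slope = (hc_factor - 1.0) / (t_hot - t_cold)
    return 1.0 + slope * (tj_c - t_cold)

for tj in (25, 55, 85, 100):
    typ = relative_flux(tj, 0.84)  # typical hot/cold factor
    imp = relative_flux(tj, 0.94)  # improved hot/cold factor
    print(f"Tj = {tj:3d} C   typical: {typ:.3f}   improved: {imp:.3f}")
```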


Fig. 2. The newer high-brightness LEDs have an improved hot/cold factor that lets them perform significantly better at higher temperatures.

The improvement in the hot/cold factor extends over a wide temperature range, giving lighting designers the opportunity to operate at any junction temperature, within the limits of the LED, while obtaining a higher hot/cold factor.

Performance comparison

Various LED vendors may appear to offer high light-output ratings in their datasheets. Lighting designers may prematurely conclude that the LED with the higher datasheet rating will provide more light output in real-world settings.

This may be a false conclusion, since all datasheet values are specified at an LED junction temperature of 25°C. The performance of an LED in a lighting system must be evaluated at a higher junction temperature. Once that’s done, a comprehensive comparison can be performed to choose the better performing part under realistic operating conditions.
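That evaluation amounts to de-rating each datasheet flux by its hot/cold factor. The flux ratings below are hypothetical, chosen only to illustrate how the ranking can flip:

```python
def hot_flux(datasheet_flux_lm: float, hc_factor: float) -> float:
    """Estimated flux at the hot reference temperature (e.g. 100 C)."""
    return datasheet_flux_lm * hc_factor

# Hypothetical datasheet ratings at Tj = 25 C, If = 350 mA:
led1 = hot_flux(100.0, 0.94)  # improved hot/cold factor
led2 = hot_flux(105.0, 0.84)  # higher cold rating, typical factor

print(f"LED 1 hot: {led1:.1f} lm")  # 94.0 lm
print(f"LED 2 hot: {led2:.1f} lm")  # 88.2 lm
```

LED 2 wins on the datasheet but loses once both parts are evaluated hot.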

In an example comparison (see Table 2), two warm-white LEDs are analyzed: LED 1, which has an improved hot/cold factor, and LED 2, which has a typical hot/cold factor.


LED 2 clearly performs better than LED 1 at the datasheet operating conditions of 25°C junction temperature (Tj) and 350-mA forward current (If). But a more realistic comparison would be at higher junction temperatures (see Table 3).


Because of LED 1’s higher hot/cold factor, the total light output of 9 LED 1 units is over 50 lm higher than that of 10 units of LED 2. So despite having a lower flux rating at 25°C, LED 1 outperforms LED 2 when driven at the same 350-mA current in practical applications. Fig. 3, left, shows that 9 units of LED 1 driven at any forward current above 100 mA outperform 10 units of LED 2 driven at the same current. Fig. 3, right, shows that the efficacy of LED 1 is always higher than that of LED 2 when both are driven at the same forward current above 100 mA.
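The practical payoff is a smaller LED count for the same hot lumen target. A minimal sketch, assuming a 100-lm cold rating per LED and an 800-lm fixture target (both hypothetical):

```python
import math

def leds_needed(target_lm: float, flux_cold_lm: float,
                hc_factor: float) -> int:
    """LED count required to reach target_lm at the hot temperature."""
    return math.ceil(target_lm / (flux_cold_lm * hc_factor))

print(leds_needed(800, 100, 0.84))  # 10 LEDs with the typical factor
print(leds_needed(800, 100, 0.94))  # 9 LEDs with the improved factor
```

One fewer LED means less drive power and lower bill-of-materials cost for the same delivered light.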


Fig. 3. The charts above compare the usable flux (left) and efficacy (right) as a function of forward current for a design that uses 9 LED 1 devices and one that uses 10 LED 2 devices.

Thus, improving the hot/cold factor significantly improves LED performance at higher junction temperatures. As a result, fewer LEDs can be used to achieve the same target light output, reducing power consumption and overall system cost. To reach accurate conclusions when selecting an LED for an application, it is important to evaluate the LED under realistic operating conditions and not just at datasheet values. ■
