3
\$\begingroup\$

I am designing an LED matrix for an LED lamp. To choose an appropriate heat sink, I need to calculate the total heat dissipation of the matrix, which is the dissipation of one LED times the number of LEDs.

The problem I am facing is that I couldn't find an efficiency parameter, expressed as a percentage, in the datasheet.

After some searching, I found that I apparently need to use the relative luminous intensity vs. wavelength curve to calculate the power converted to light, and then divide that by the total input power.

I need some help with the correct way to calculate the power converted to light, so that I know how much is converted to heat. I would also like to understand what value the relative luminous intensity curve is relative to.

Datasheet of the LED:

Datasheet Link

The part that I suspect I should be using is:

relative luminous intensity
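To show what I am trying to compute, here is a rough sketch of the heat budget. The optical efficiency is the unknown I am asking about, and all the other numbers are placeholders, not values from the datasheet:

```python
# Sketch of the heat budget I am trying to work out.
# The optical efficiency (eta_optical) is the unknown; the rest are placeholders.

n_leds = 20            # number of LEDs in the matrix (placeholder)
v_f = 3.0              # forward voltage per LED, V (placeholder)
i_f = 0.150            # drive current per LED, A (placeholder)
eta_optical = 0.25     # fraction of input power emitted as light (the unknown)

p_electrical = n_leds * v_f * i_f          # total electrical input, W
p_light = p_electrical * eta_optical       # power leaving as light, W
p_heat = p_electrical - p_light            # power the heat sink must handle, W

print(f"Electrical input: {p_electrical:.1f} W")
print(f"Heat to dissipate: {p_heat:.1f} W")
```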

\$\endgroup\$
3
  • \$\begingroup\$ Without knowing the luminous efficiency it's guesswork, so maybe contact the supplier to see what they say. \$\endgroup\$ Commented 11 hours ago
  • \$\begingroup\$ It'll depend on the colour, the semiconductor material, and the size. Blue LEDs are better in energy terms. Human-visible light output is another measure, which is different still. \$\endgroup\$ Commented 10 hours ago
  • \$\begingroup\$ @Andyaka The datasheet says the LED outputs about 60 lumens at 150 mA, and it also says that at 150 mA the forward voltage is about 3.0 V. Therefore, efficiency in lumens per watt can be calculated. I don't see how that helps in calculating what percentage of the input power is wasted as heat and what percentage is converted to light. While most answers below say to assume 100% is converted to heat, they don't answer the actual question. \$\endgroup\$ Commented 8 hours ago

3 Answers

7
\$\begingroup\$

For cheap LEDs like these, which turn the vast majority of their energy into heat, assuming 0% efficient power-to-light conversion is a very reasonable and safe assumption for thermal calculations. Some percentage will actually become light, of course, but that amount is small compared even to the spread of power consumption listed in the datasheet, and so it can be ignored.

It's only for relatively high-efficiency devices that the fraction of energy converted into light has a meaningful impact on cooling requirements, and such devices will provide specs.

\$\endgroup\$
1
  • \$\begingroup\$ +1 from me, but I wanted to give more background than would fit in a comment \$\endgroup\$ Commented 9 hours ago
3
\$\begingroup\$

A conservative estimate is that 100% of your electrical input is converted into heat, and needs to be dissipated by the heatsink.

The error in that assumption is smaller than the uncertainty in predicting exactly what °C/W you're going to get from your heatsink, given the uncertainties in airflow, ambient temperature, etc.

Any heat power you lose to light output will be a bonus, reducing the LED temperatures and improving their life.
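As an illustration, here is a rough heatsink sizing sketch under that 100%-to-heat assumption. The forward voltage, current, LED count and temperature limits below are made-up example values, not figures from your datasheet:

```python
# Rough heatsink sizing sketch, assuming 100% of the electrical
# input becomes heat. All values below are example placeholders.

n_leds = 20          # number of LEDs in the matrix (example)
v_f = 3.0            # forward voltage per LED, V (example)
i_f = 0.150          # drive current per LED, A (example)

p_heat = n_leds * v_f * i_f        # worst case: all input power as heat, W

t_ambient = 40.0     # worst-case ambient temperature, °C (example)
t_sink_max = 80.0    # maximum allowed heatsink temperature, °C (example)

# Required heatsink thermal resistance, °C per W
r_th_sink = (t_sink_max - t_ambient) / p_heat

print(f"Heat to dissipate: {p_heat:.1f} W")
print(f"Required heatsink thermal resistance: {r_th_sink:.2f} °C/W")
```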

\$\endgroup\$
2
\$\begingroup\$

You might find this 2016 paper interesting: Model Identification and Wall-Plug Efficiency Measurement of White LED Modules. It's not how I'd do it, but then I have access to calibrated optical measuring equipment. Instead, they look at the heat transport, something rather prone to systematic errors but useful enough here. Their method may be adaptable to your case (taking into account light emitted by the LED and absorbed by your fixture), but that's not what I suggest.

The best-case figure they show is around 40% efficiency at converting electrical to optical power; the worst case is under 20%, and that worst case occurs at elevated temperatures. So you should design on the assumption that essentially all of the electrical power is turned into heat.
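To put numbers on that margin, here is a quick sketch comparing the heat load at the paper's best-case and worst-case efficiencies against the all-heat assumption; the 10 W electrical input is an arbitrary example figure, not from your datasheet:

```python
# Compare heat load for different wall-plug efficiencies.
# The 10 W electrical input is an arbitrary example figure.

p_electrical = 10.0   # total electrical input power, W (example)

for eta in (0.0, 0.20, 0.40):   # all-heat assumption, worst case, best case
    p_heat = p_electrical * (1.0 - eta)
    print(f"efficiency {eta:.0%}: heat to dissipate = {p_heat:.1f} W")
```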

\$\endgroup\$
