Question : Why does an LED burn out?


Say I have two components within a simple circuit:

Battery: 3.6V, 350mAh
1 LED: 2.5V, 10mA

I know that you need a resistor in series to limit the current, but what I'm trying to work out is why. Using the above components, there is an extra 1.1V across the LED beyond what it can take, but I'm struggling to see how this translates into more current than the LED can handle.

Would somebody mind going through this with me and showing exactly why the LED burns out without a resistor?


Answer : Why does an LED burn out?

"Forward Voltage Drop"
"Electricity uses up a little energy pushing its way through the diode, rather like a person pushing through a door with a spring. This means that there is a small voltage across a conducting diode, it is called the forward voltage drop and is about 0.7V for all normal diodes which are made from silicon. The forward voltage drop of a diode is almost constant whatever the current passing through the diode so they have a very steep characteristic (current-voltage graph)."

Beyond the forward voltage, the dynamic resistance of a diode is almost zero, so the current rises extremely steeply with voltage. An LED's forward voltage is higher than a silicon diode's (typically 2V to 3V), but the same behaviour applies: even a fraction of a volt beyond the forward drop can cause a current large enough to burn out the LED. So you need a limiting resistor.
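To see why a small excess voltage is catastrophic, the steep I-V curve can be sketched with the Shockley diode equation. The parameter values below are illustrative assumptions, fitted so the curve passes roughly through the question's 2.5V / 10mA operating point; they are not datasheet values for any real LED:

```python
import math

# Minimal sketch of the exponential diode I-V relationship (Shockley equation).
# I_S (saturation current) and N_VT (ideality factor times thermal voltage)
# are assumed values chosen to put ~10mA at 2.5V forward voltage.
I_S = 1.4e-13   # saturation current in amps (assumption)
N_VT = 0.1      # n * V_T in volts (assumption)

def led_current(v):
    """Approximate LED current (A) at forward voltage v."""
    return I_S * (math.exp(v / N_VT) - 1)

print(led_current(2.5))  # ~0.01 A: the rated 10mA
print(led_current(3.6))  # hundreds of amps in this model: the LED burns out
```

In reality the battery's internal resistance and the LED destroying itself limit the current long before that figure is reached, but the point stands: with the full 3.6V across the LED, nothing in the circuit holds the current anywhere near the safe 10mA.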


"An LED must have a resistor connected in series to limit the current
through the LED, otherwise it will burn out almost instantly."

"The resistor value, R is given by:"
   R = ( VS - VL ) / I

"VS = supply voltage
"VL = LED voltage (usually 2V, but 4V for blue and white LEDs)
"I = LED current (e.g. 10mA = 0.01A, or 20mA = 0.02A)

"Make sure the LED current you choose is less than the maximum permitted and convert the current to amps (A) so the calculation will give the resistor value in ohms (ohm)."