All the calculators in step 2 are just doing simple math that you can do at home. Ohm's law gives R = V/I, or, more relevant to what we're doing: (Source Volts - LED Volts) / (Current in mA / 1000) = Resistance. So if we have a 12V battery powering a 3.5V, 25mA LED, our formula becomes: (12 - 3.5) / (25 / 1000) = 340 Ohms.
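The calculation above can be sketched as a few lines of Python; the function name and values are illustrative, taken from the example in the text rather than from any particular calculator:

```python
# Minimal sketch of the step-2 resistor calculation:
# R = (source volts - LED volts) / (current in amps)
def led_resistor(source_v, led_v, led_ma):
    """Series resistance in ohms for a given supply, LED drop, and current in mA."""
    return (source_v - led_v) / (led_ma / 1000)

# 12 V battery, 3.5 V 25 mA LED
print(led_resistor(12, 3.5, 25))  # -> 340.0
```

In practice you would round this up to the nearest standard resistor value you have on hand.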
An LED's forward voltage drop depends on its colour (roughly 1.8 - 4.0V); to make LEDs 12V compatible, they need a series resistor, which is what is built into "12V LEDs". There's no such thing as a true "12V LED": anything labelled and/or sold as such is really a normal LED with a series resistor built in.
A current-limiting resistor is a protective resistor connected in series to keep excessive current from burning out the device. The principle is to reduce the current by increasing the total resistance seen by the load; it can also act as a partial voltage divider. Usually, in a local circuit, a resistor that serves no other purpose is there to limit current.
However, if our voltage were 12V, we would have to rework our calculations to keep the same amount of current flowing through the LED. Our duty cycle would need to drop to about 14.17% (1.7V divided by 12V), and our minimum PWM frequency would decrease to about 14.17kHz (the inverse of [10us divided by 14.17%]). However, this is cause for concern.
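The duty-cycle rework can be checked with a short script. This is a sketch assuming the values from the text (a 1.7V LED drop, a 12V supply, and a 10us pulse width); the variable names are illustrative:

```python
# Rework the PWM duty cycle for a 12 V supply, keeping the same
# average LED current as before.
led_v = 1.7          # LED forward voltage drop (V)
supply_v = 12.0      # new supply voltage (V)
pulse_width = 10e-6  # fixed on-time pulse width (s)

duty = led_v / supply_v        # ~0.1417, i.e. ~14.17 %
period = pulse_width / duty    # ~70.6 us period to achieve that duty
freq = 1 / period              # ~14.17 kHz

print(f"duty = {duty:.2%}, frequency = {freq / 1000:.2f} kHz")
```

This reproduces the figures in the paragraph: the frequency works out to roughly 14.17 kHz, not 14.29 kHz.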
As per the datasheet of the 5mm white LED, the forward voltage of the LED is 3.6V and the forward current is 30mA. Therefore, V_S = 12V, V_LED = 3.6V and I_LED = 30mA. Substituting these values in the above equation, we can calculate the value of the series resistance as: R_SERIES = (12 - 3.6) / 0.03 = 280Ω.
Red LEDs usually have about a 1.7V drop, so three wired in series will drop about 5.1V. 9V - 5.1V leaves about 3.9V across the current-limiting resistor. I'm going to further assume that you want to run the LEDs at 20mA max. So: 3.9V / 0.02A = 195 Ohms. The closest standard (E12) resistors are 180 or 220 Ohms; I'd choose 220 Ohms.
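That whole answer, including the rounding to a standard E12 part, can be sketched in a few lines. This assumes the values from the example (9V supply, three 1.7V red LEDs, 20mA); the helper name is hypothetical:

```python
# Standard E12 resistor values for one decade (ohms)
E12 = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820]

def series_resistor(supply_v, led_v, n_leds, current_a):
    """Return (exact ohms, nearest E12 value at or above the exact value)."""
    exact = (supply_v - n_leds * led_v) / current_a
    # Rounding UP keeps the current at or below the chosen limit.
    standard = min(v for v in E12 if v >= exact)
    return exact, standard

exact, chosen = series_resistor(9.0, 1.7, 3, 0.02)
print(exact, chosen)  # roughly 195 ohms exact -> 220 ohm standard part
```

Rounding up to 220 Ohms trades a little brightness for a safety margin on the LED current, which is why the answer picks it over 180 Ohms.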