
quick question about watts.

Discussion in 'Electronic Basics' started by [email protected], Jan 10, 2005.

  1. Guest

    If I plugged your average 100 watt bulb into an average outlet, it would
    draw just under 1 amp, right? Well, if I plugged this same bulb into a
    10 volt 10 amp power supply (100 watts), would I get the same result as
    the regular outlet?
     
  2. No. The lamp allows 1 amp to pass through when 100 volts is impressed
    across it, because it has approximately 100 ohms of resistance (at
    normal operating temperature). If you lower the voltage, the current
    will go down, and the total power consumed by the lamp will go down,
    too. To complicate things a bit, the resistance also drops as the
    filament temperature drops, so the current does not stay proportional
    to the voltage. But the current still is lower with a lower voltage,
    just not as low as it would have been if the filament held a constant
    resistance. Remember that ohms is just a word that means volts per
    ampere.
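
    To make that concrete, here is a minimal Python sketch of the
    constant-resistance case (a simplification, since as noted a real
    filament's resistance falls as it cools):

    # Hypothetical bulb treated as a fixed resistor, per the figures above:
    # 100 volts across it, 1 amp through it, so about 100 ohms.
    def current(volts, resistance_ohms):
        # Ohm's law: amperes = volts / ohms ("ohms means volts per ampere")
        return volts / resistance_ohms

    print(current(100, 100))  # 1.0 amp at 100 volts
    print(current(10, 100))   # 0.1 amp at 10 volts, IF the resistance stayed constant
    # A real lamp's resistance drops at the lower filament temperature, so the
    # actual current at 10 volts would be somewhat higher than this.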
     
  3. No.

    The voltage applied to the lightbulb would be 10 volts. Ideally, that
    is. An ideal power supply delivers its rated voltage to the load even
    if the load, when supplied this voltage, draws an amount of current
    different from what the power supply is rated for.

    A load drawing less current than the power supply is rated for is OK,
    while drawing more current than the supply is rated for can result in
    the voltage being less than the rating of the power supply and/or
    overheating of the power supply.

    Power supplies that lack regulators will produce slightly to somewhat
    higher voltage when you draw less than rated current. Possibly
    significantly more, but I have yet to see or hear about twice as much.

    So what happens when you apply 10 volts to a 100 watt lightbulb:

    If the lightbulb is an ideal resistor, it would draw 10/120 as much
    current at 10 volts as it would at 120 volts.

    At 120 volts, 100 watts means .833 amp. 10/120 of that is .0694 amp.

    But incandescent lamps are famously not linear resistors, since the
    resistance of most metals varies roughly proportionately with temperature
    in degrees K. (Very roughly - tungsten at 2800 Kelvin has about 15 times
    as much resistance as it does at 300 K, not all that close to proportional!)

    Very roughly, and there are variations, but the resistance of an
    incandescent lamp is usually roughly proportional to the square root of
    the applied voltage (within the range of voltages at which the lamp
    glows). Likewise, the current is roughly proportional to the square root
    of the applied voltage. This means that a 100 watt 120V lightbulb at 10
    volts would draw about .24 amp.

    (I just tried this and got .24 amp at 9.3 volts and about .255 amp at
    11.2 volts, interpolating to about .245-.246 amp at 10 volts - plus or
    minus meter reading tolerances. BEWARE - some lightbulbs will not follow
    my above "rule of thumb" that closely, with deviations becoming more
    severe with greater degree of undervoltage.)
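
    For anyone who wants to try the arithmetic, here is a short Python sketch
    of that square-root rule of thumb (only an approximation, and as noted it
    gets worse the further below rated voltage you go):

    import math

    def lamp_current_estimate(volts, rated_volts=120.0, rated_watts=100.0):
        # Rule of thumb: lamp current scales roughly with the square root
        # of the applied voltage, relative to rated conditions.
        rated_amps = rated_watts / rated_volts      # 0.833 amp for a 120V 100W bulb
        return rated_amps * math.sqrt(volts / rated_volts)

    print(lamp_current_estimate(10))   # about 0.24 amp, close to the measured value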

    Go ahead and connect your 120V 100 watt lightbulb to your 10 volt 10 amp
    power supply. Expect the bulb to glow a dim orange or reddish orange,
    possibly not visible in normal room light, and quite possibly not visible
    if the bulb is frosted or "soft white". Even at 12 volts, a 120V 100 watt
    lightbulb is a little dim for use as a night light.

    - Don Klipstein ()
     
  4. Jamie


    nope
     
  5. Guest

    Think of the current rating as meaning the supply can produce up to 10
    amps.
     
  6. Guest

    Let me see if I got it: as the voltage goes down so does the heat, and
    the cooler it is the less resistance it has, so it pulls less amps.
    This decrease in power makes for dimmer light. Do I have it figured
    out?
     
  7. Almost.

    As the voltage goes down, it pushes less current through the
    resistance of the filament. Watts (heat) is the product of volts and
    current, so with both the volts and current going down, the heat goes
    down.

    But the lower temperature lowers the filament's resistance, so that
    the lower voltage pushes more current through the now lower resistance
    than it would have, if the resistance had remained constant, but still
    less than it did at the higher voltage and temperature.

    So the current falls as the voltage falls, but not in proportion to
    the voltage. (cut the voltage in half and the current is something
    like 3/4ths as large) This keeps the filament warmer than if the
    resistance had remained the same.

    Figure 1 on this page shows that for a +-20% change in lamp voltage,
    the current varies by about +-10%:
    http://www.gilway.com/html/appl-tungsten.html

    The difficulty in capturing such situations accurately in words is why
    differential equations were invented.
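
    For what it's worth, the same behavior can be sketched numerically with
    the rough square-root rule of thumb mentioned elsewhere in this thread
    (an approximation, not exact filament physics):

    import math

    # Relative lamp current for a given fraction of rated voltage,
    # assuming current ~ sqrt(voltage) as a rough rule of thumb.
    def relative_current(voltage_fraction):
        return math.sqrt(voltage_fraction)

    print(relative_current(0.5))   # ~0.71: half the voltage, roughly 3/4 the current
    print(relative_current(0.8))   # ~0.89: 20% less voltage, roughly 10% less current
    print(relative_current(1.2))   # ~1.10: 20% more voltage, roughly 10% more current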
     
  8. Actually, pulling less amps is from less voltage.

    If resistance is constant, then amps pulled is volts divided by the
    resistance.

    But with a lightbulb, resistance decreases as voltage decreases, so as
    voltage decreases the amps go down less than they would if the resistance
    did not change.

    A 100 watt 120V lightbulb has a resistance of 144 ohms at 120V.
    According to Ohm's Law:

    I=E/R, I=120/144, which is .833 amp

    But apply various voltages to a hypothetical ideal 144 ohm resistor and
    an actual 100 watt lightbulb:

    (The figures for the lightbulb at voltages less than 120V are my
    predictions based on what I know about these lamps; they are mostly
    interpolations, along with two extrapolations, and are not actual
    measurements except for the 9.3V and 11.2V figures, which are measured.)
    (ALSO please note that a 144 ohm 100 watt resistor whose resistance is 144
    ohms both cold and at full power operating temperature is a hypothetical
    item - I have yet to see this in any electronics parts supplier's
    catalog! Please also note that 144 ohms is not even a standard resistor
    value but 100, 120 and 150 ohms are standard resistor values, and
    resistors of wattage rating more than 10 watts are less common and less
    standardized than ones rated 10 watts or less!)


    VOLTAGE    RESISTANCE of       RESISTANCE of      CURRENT thru        CURRENT thru
               144 ohm resistor    100W 120V lamp     144 ohm resistor    120V 100W lamp
    -----------------------------------------------------------------------------------
    120   V    144 ohms            144   ohms         .833    amp         .833 amp
    100   V    144 ohms            131.5 ohms         .6944   amp         .76  amp
     80   V    144 ohms            117.6 ohms         .5556   amp         .68  amp
     60   V    144 ohms            102   ohms         .41667  amp         .59  amp
     50   V    144 ohms             93   ohms         .347    amp         .54  amp
     40   V    144 ohms             83   ohms         .2778   amp         .481 amp
     30   V    144 ohms             71.9 ohms         .20833  amp         .417 amp
     24   V    144 ohms             64.2 ohms         .1667   amp         .374 amp
     20   V    144 ohms             58.5 ohms         .1389   amp         .342 amp
     15   V    144 ohms             50.5 ohms         .10417  amp         .297 amp
     12   V    144 ohms             45.3 ohms         .0833   amp         .265 amp
     11.2 V    144 ohms             43.9 ohms         .0777   amp         .255 amp
     10   V    144 ohms             40.8 ohms         .069444 amp         .245 amp
      9.3 V    144 ohms             38.8 ohms         .0646   amp         .24  amp
      6   V    144 ohms             31   ohms         .041667 amp         .193 amp
      0.1 V    144 ohms              9.7 ohms         .6944   mA          10.3 mA
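
    If you want to play with these numbers yourself, here is a rough Python
    sketch that reproduces the general shape of the table, using the
    square-root rule of thumb for the lamp column (the table itself mixes
    measurements and estimates, so expect the code's values to differ a little):

    import math

    RATED_VOLTS, RATED_WATTS = 120.0, 100.0
    R_FIXED = RATED_VOLTS ** 2 / RATED_WATTS    # 144 ohms, the hypothetical fixed resistor

    def resistor_current(volts):
        return volts / R_FIXED                  # Ohm's law with constant resistance

    def lamp_current(volts):
        # Rough rule of thumb: current ~ rated current * sqrt(V / rated V)
        return (RATED_WATTS / RATED_VOLTS) * math.sqrt(volts / RATED_VOLTS)

    for v in (120, 100, 80, 60, 50, 40, 30, 24, 20, 15, 12, 10, 6):
        print(f"{v:>5} V   144 ohm resistor: {resistor_current(v):.4f} A"
              f"   lamp (estimated): {lamp_current(v):.3f} A")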

    Hope this helps,

    - Don Klipstein ()
     