Zener minimum current

Discussion in 'General Electronics Discussion' started by rdl, Oct 5, 2013.

  1. rdl

     Oct 5, 2013
    I have a remote control that puts out a 12 V PWM signal. I want to use it to control an LED driver that expects a 10 V PWM signal for dimming. From the reading I've done, I think a 10 V Zener diode regulator circuit could clip the 12 V signal down to 10 V.

    I haven't selected a particular diode yet, but I've been looking at many spec sheets. I can't find, or work out, the minimum current required through the diode to keep it in its regulating range. Some tutorials suggest ~5 mA is common, but I'd rather get it right than assume.

    What is the secret to finding this minimum (knee) current?

    As you might have guessed, I'm a newbie at electronics. Any suggestions would be welcome. TIA

    Regards
    rdl
     
  2. (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator

     Jan 21, 2010
    If you have a high impedance input that you are driving, a simple resistive divider would probably suffice.

    If you can describe more fully the source of this PWM signal and the device which you are trying to interface it with, we may be able to assist.

    In this application the zener current is probably not an issue, though you would still need a series resistor to ensure you don't damage the signal source. It is mainly low-voltage zeners that suffer badly from soft knees, where substantial current is needed before they reach their nominal voltage.
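    A quick sanity check of the unloaded divider idea (the resistor values here are only illustrative, and this assumes the dimming input draws negligible current):

```python
# Unloaded resistive divider: drop a 12 V PWM peak to ~10 V.
# R1/R2 values are illustrative, not a recommendation.
V_in = 12.0
R1 = 2_000.0    # series (top) resistor, ohms
R2 = 10_000.0   # shunt (bottom) resistor, ohms

V_out = V_in * R2 / (R1 + R2)
print(f"{V_out:.2f} V")  # prints: 10.00 V
```

    Any 1:5 ratio gives the same 10 V; the absolute values set how much the divider loads the source and how much the input loads the divider.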
     
  3. rdl

     Oct 5, 2013
    The device I'm trying to control is a MeanWell LPF-40D-12 LED driver in a lighting fixture. The spec sheet does not specify the input impedance or current draw of the 10 VDC dimming input. So, before going much further, I will apply 10 V to the dimming control terminals and measure the current - probably only a few mA, but better to know than guess.
    EDIT: I should have mentioned that the dimming control apparently accepts either a 0-10 VDC signal or a 10 VDC PWM signal, so 9 or 10 VDC from a battery should let me work out the current draw.
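    Once that bench measurement is done, the same numbers tell you how much the input loads a divider. A sketch of the arithmetic (the measured current here is hypothetical, and the 2 k/10 k divider is just an example):

```python
# Estimate the dimming input's equivalent resistance from a bench
# measurement (apply ~10 V, read the current), then check how that
# load pulls down an example 2 kOhm / 10 kOhm divider fed from 12 V.
V_test = 10.0
I_meas = 0.5e-3            # hypothetical: 0.5 mA measured at 10 V
R_load = V_test / I_meas   # equivalent input resistance, ohms

R1, R2 = 2_000.0, 10_000.0
R2_loaded = R2 * R_load / (R2 + R_load)   # shunt leg in parallel with the input
V_out = 12.0 * R2_loaded / (R1 + R2_loaded)
print(f"R_load ~ {R_load:.0f} ohm, loaded V_out ~ {V_out:.2f} V")
# prints: R_load ~ 20000 ohm, loaded V_out ~ 9.23 V
```

    If the loaded output sags too far, scale both resistors down (or re-solve the ratio with the load included).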

    The source of the signal will be an RF remote-controlled 12 V LED driver which uses PWM for dimming. I plan to use its output only to control the LPF-40D-12; it will not be used as an LED driver per se. I have no detailed specs and it's not even in hand yet - in transit. But SWMBO will want everything operating about 10 minutes after arrival :D

    The reason for all this complexity is that the MeanWell driver needs 5 wires for full control: three for AC line/neutral/ground and two for the 10 V dimming control circuit. I've got only 3 wires from the wall switch to the ceiling, and adding two more would require major surgery to the walls and ceiling.

    I had a forehead-slapping moment at your suggestion of a simple voltage divider. It seems I latched onto the complicated solution rather than the simple one. As I said, I'm a newbie to electronics - can't see the forest for the trees. Thanks for pointing me in this direction.

    Regards
    RDL
     
    Last edited: Oct 5, 2013