
Converting RS-232 voltage to current in my circuit design

Discussion in 'General Electronics Discussion' started by robismyname, Jan 19, 2013.

  1. robismyname

    robismyname

    5
    0
    May 22, 2012
    I'm using a TI transimpedance amplifier (OPA138) in a design. The amp converts an input current into an output voltage.

    For quick testing I want to take RS-232 voltage signals (from my laptop) and feed them into the amp. But, as I said, the amp expects a current input (10 mA max).

    So my question is: since the serial signal is voltage-based, can I just place a series resistor between the RS-232 signal source and my amp to provide the current the amp is looking for?

    I measured about 10 V on the serial line and calculated a 5 kΩ resistor for a 2 mA input current.
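    The resistor sizing above is just Ohm's law. A minimal sketch of the arithmetic, using the values assumed in the post (10 V measured line level, 2 mA target current, and the transimpedance amp's input treated as a virtual ground near 0 V):

    ```python
    def series_resistor(v_source, i_target, v_input=0.0):
        """Series resistor needed to limit current into a low-impedance input.

        v_input is the voltage at the amplifier input node; a transimpedance
        amp's inverting input sits close to 0 V (virtual ground), so the
        full source voltage appears across the resistor.
        """
        return (v_source - v_input) / i_target

    r = series_resistor(10.0, 2e-3)
    print(f"R = {r:.0f} ohms")  # 5000 ohms, matching the 5 k estimate
    ```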
     
  2. (*steve*)

    (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd Moderator

    25,178
    2,690
    Jan 21, 2010
    A lot depends on the "standard" used for the RS232.

    In the dim dark past, you could expect +/-12V (or even more). These days you might get as little as 0-3.3V.

    But yeah, a resistor will do it. RS232 is (was) specified to allow any pin to be shorted to any other pin, so in theory you can't damage anything if you draw too much current. However that does not necessarily apply to RS232 that's emulated with logic levels from who-knows-where.
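    Since the actual drive voltage varies so much between RS-232 implementations, it's worth checking what the same 5 kΩ resistor delivers at each level. A quick sketch (the voltage values are illustrative, taken from the ranges mentioned above):

    ```python
    # Current through a fixed 5 kOhm series resistor for various RS-232
    # drive levels; the amp input is treated as a virtual ground (0 V).
    R = 5_000.0
    levels = [
        ("legacy +/-12 V driver", 12.0),
        ("measured +/-10 V driver", 10.0),
        ("3.3 V logic-level 'RS-232'", 3.3),
    ]
    for label, v in levels:
        i_ma = v / R * 1_000
        print(f"{label}: {i_ma:.2f} mA")
    ```

    All three cases stay comfortably under the 10 mA maximum, though the 3.3 V case delivers only about a third of the intended 2 mA, which may matter if the amp needs a minimum signal level.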
     