# voltage divider network

Discussion in 'General Electronics Discussion' started by juh, Mar 20, 2013.

1. ### juh

27
0
Feb 21, 2013
I am designing a voltage divider network. My input is 1000 V, 0.2 A, and I want my output to be 500 V. I will use two pairs of parallel 10 kΩ resistors connected in series. Is that right? How about the wattage? How should I account for that? Thank you so much. This forum has been a great help for me!

2. ### Harald Kapp (Moderator)

11,449
2,629
Nov 17, 2011
That's 1000 V × 0.2 A = 200 W. Are you sure?

I assume that 1000 V, 0.2 A is the rating of your source, not what you actually want to dissipate.
In that case, forget about the 0.2 A. Using 2 × 10 kΩ resistors you can build a divider that outputs 500 V. The power lost in the divider will be (1000 V)² / 20 kΩ = 50 W, so going for >100 kΩ per resistor is a much better option. Using e.g. 2 × 1 MΩ, the power lost will be only 0.5 W, which is much more manageable.
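The arithmetic above can be checked with a short sketch. Nothing here is from the original posts beyond the resistor values discussed; the function names are just for illustration:

```python
# Unloaded two-resistor divider: Vout = Vin * R2 / (R1 + R2)
def divider_out(vin, r1, r2):
    return vin * r2 / (r1 + r2)

# Total power dissipated in the divider string: Vin^2 / (R1 + R2)
def divider_power(vin, r1, r2):
    return vin ** 2 / (r1 + r2)

vin = 1000.0
# 2 x 10 kOhm: correct 500 V output, but 50 W wasted as heat
print(divider_out(vin, 10e3, 10e3))    # 500.0
print(divider_power(vin, 10e3, 10e3))  # 50.0
# 2 x 1 MOhm: same ratio, only 0.5 W
print(divider_power(vin, 1e6, 1e6))    # 0.5
```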

You should also be aware that you cannot draw any considerable amount of current from the output of the divider. The voltage will drop as soon as you draw current, due to the high internal resistance of the divider.
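The loading effect can be illustrated with the same resistor values. The 10 MΩ meter input resistance below is an assumption (a typical figure for a digital multimeter, not stated anywhere in this thread):

```python
# Divider output with a load resistance across R2:
# the load appears in parallel with R2, shifting the ratio.
def loaded_out(vin, r1, r2, r_load):
    r2_eff = r2 * r_load / (r2 + r_load)  # R2 || R_load
    return vin * r2_eff / (r1 + r2_eff)

vin = 1000.0
# 2 x 1 MOhm divider read by a hypothetical 10 MOhm meter:
# the reading sags below the ideal 500 V.
print(loaded_out(vin, 1e6, 1e6, 10e6))  # about 476.2
```

So with high-value resistors the divider runs cool, but whatever you connect to its output must have a much higher resistance still, or you must correct for the loading.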

It might help to know what you are trying to do.

3. ### juh

We have a power supply that outputs that rating (1000 V, 0.2 A). We want to test it with our multitester, but it can only handle up to 500 V, so we decided to build a divider network. What else should I consider?

4. ### Harald Kapp (Moderator)

Why? Everything you need is in my answer.

5. ### juh

OK. Thank you.