# impedance problem

Discussion in 'General Electronics Discussion' started by ingram010, Oct 31, 2012.

1. ### ingram010

3
0
Oct 31, 2012
Hi, I need to read the voltage from a power supply. The datasheet for the power supply states that any digital measuring device needs an input impedance of at least 10 MOhm. Unfortunately, the digital measuring device I have states that its input impedance is 1 MOhm maximum. What options do I have? Is there a circuit that can increase the input impedance of my measuring device?

Regards

John


2. ### Harald KappModeratorModerator

11,268
2,577
Nov 17, 2011
Your options are: get another multimeter, or a 10:1 probe with 9 Meg input resistance (making a total of 10 Meg including the multimeter).

What kind of power supply is this if it requires a load of at least 10 Meg? What useful things can you do with such a supply?
Could there be a misunderstanding of the datasheet on your side? Can we see the datasheet for verification?

Harald

3. ### ingram010


Here is the extract from the datasheet; it is a 1 kV Glassman MK series.

Voltage monitor, J1-4:
A 0-10 V signal, positive with respect to common and in direct proportion to the output voltage, is available at this pin. A 10 kOhm limiting impedance protects the internal circuitry, so a digital voltmeter with greater than 10 Megohms input impedance should be used to monitor this output. It is also acceptable to use a 1 mA DC full-scale instrument (i.e. an analog meter) for monitoring purposes.

I am not using a multimeter to monitor the voltage; I am using a LabJack U3-HV.

I have a very tight budget so I am stuck with what I have already got.

Regards

John

4. ### Harald KappModeratorModerator

The 10 Meg limit is there to achieve greater accuracy. The internal 10 kOhm resistor of the monitor output and the input resistance of the meter form a resistive voltage divider.
Using 10 Meg, the divider ratio is 10 Meg/(10 Meg + 10 k) = 0.9990.
Using 1 Meg, the divider ratio is 1 Meg/(1 Meg + 10 k) = 0.9901.
The error using a 1 Meg instrument compared to a 10 Meg instrument is therefore 0.009, or approx. 1%. If you can live with that, use the 1 Meg instrument as it is.
If you want better accuracy, you can compensate for the error of the 1 Meg instrument by multiplying the measured value by a factor of 0.9990/0.9901 = 1.0090. But that is probably unnecessary.

Alternatively, you could use a 100 Ohm resistor to model an ammeter: the roughly 1 mA monitor current will develop a voltage of about 0.1 V across the resistor. You can measure that voltage with the 1 Meg instrument with high accuracy.
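If you want to double-check the arithmetic above, here is a quick sketch of the same numbers (the 10 kOhm source impedance and the 0-10 V full-scale value are taken from the datasheet extract; the resistor values are the ones discussed in this thread):

```python
# Voltage divider formed by the supply's 10 kOhm limiting resistor
# and the input resistance of the measuring instrument.
R_SOURCE = 10e3   # internal limiting impedance (from the datasheet)
V_MON = 10.0      # full-scale monitor voltage (from the datasheet)

def divider_ratio(r_meter):
    """Fraction of the monitor voltage that appears at the meter input."""
    return r_meter / (r_meter + R_SOURCE)

ratio_10m = divider_ratio(10e6)   # ~0.9990 for a 10 Meg instrument
ratio_1m = divider_ratio(1e6)     # ~0.9901 for a 1 Meg instrument

error = ratio_10m - ratio_1m      # ~0.009, i.e. about 1 %
correction = ratio_10m / ratio_1m # ~1.0090, scales a 1 Meg reading up

# Alternative approach: a 100 Ohm shunt turns the ~1 mA monitor
# current into a small voltage that a 1 Meg input barely loads.
R_SHUNT = 100.0
v_shunt = V_MON * R_SHUNT / (R_SOURCE + R_SHUNT)  # ~0.099 V

print(round(ratio_10m, 4), round(ratio_1m, 4),
      round(correction, 4), round(v_shunt, 3))
```

With a 1 Meg input you therefore read about 1% low, and multiplying by ~1.0090 recovers what a 10 Meg instrument would show.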
Harald

5. ### ingram010

I am very grateful for your explanation. I was concerned that I might damage the LabJack if I connected it directly to the voltage monitor of the power supply.
As you can probably tell, I need some electronics tuition.

Many thanks
John