Hi all!
Attached is a circuit I have designed for measuring temperature with a P1K0.232.6W.A.010 RTD sensor (a Pt1000), connected in the 4-wire configuration. The sensor sits on a custom PCB about 3 m away from the rest of the circuit.
U2, U3 and R3 form a constant-current source for the RTD. Theoretically, U2 produces 1.25 V, so the current should be about 1.25 V / 3.16 kΩ ≈ 396 µA. In reality, the current I measure is 401 µA.
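As a quick sanity check (a sketch with the values quoted above, not my actual firmware), the nominal excitation current works out as:

```python
# Nominal constant-current calculation for the RTD excitation.
V_REF = 1.25   # V, output of the U2 voltage reference
R3 = 3.16e3    # ohm, current-setting resistor

i_nominal = V_REF / R3  # expected excitation current
print(f"{i_nominal * 1e6:.1f} uA")  # prints "395.6 uA", vs. the 401 uA actually measured
```

The ~1.4% difference between the nominal and the measured current is exactly why I use the measured 401 µA in the conversion below.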
The voltage developed across the RTD is first amplified by U5, an AD623 (with a theoretical gain of [R_gain/100k] + 1 = 2; in reality R_gain measures 99.5 kΩ, so the gain is about 1.995), and then measured by the 12-bit ADC integrated in the EM250 ZigBee SoC. The API that Ember supplies with the EM250 gives me the measured value directly in mV.
To convert the ADC result into temperature, I do the following:
1) The measured voltage is divided by 1.995 (the actual gain) to find the real voltage across the RTD.
2) I then calculate the RTD's resistance by dividing that voltage by the measured excitation current.
3) To find the temperature, I use the defining equation of the temperature coefficient, α = (R_T − R0)/(R0·T) = 0.00385, solved for T: T = (R_T − R0)/(R0·α).
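The three steps above can be sketched as follows (a minimal illustration using my measured gain and current; the function name and variables are mine, not part of the EM250 firmware or API):

```python
ALPHA = 0.00385   # 1/degC, temperature coefficient of the Pt1000
R0 = 1000.0       # ohm, RTD resistance at 0 degC
GAIN = 1.995      # measured gain of the AD623 stage
I_MEAS = 401e-6   # A, measured excitation current

def adc_mv_to_temperature(v_adc_mv):
    """Convert the ADC reading (mV, at the AD623 output) to degrees Celsius."""
    v_rtd = (v_adc_mv / 1000.0) / GAIN   # step 1: undo the amplifier gain
    r_rtd = v_rtd / I_MEAS               # step 2: Ohm's law with the excitation current
    return (r_rtd - R0) / (R0 * ALPHA)   # step 3: the alpha equation solved for T
```

For example, a raw ADC reading of about 874 mV (i.e. 438 mV at the RTD times the 1.995 gain) comes out to roughly 24 °C with these constants.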
The problem I face is that the measured temperature is quite different from the real one: the error is around 6 °C, and I cannot find the reason for it.
I took some more specific measurements and found the following: measuring the voltage directly at the two legs of the Pt1000 (at the remote PCB) gives around 448 mV, which corresponds to about 29 °C. The ADC, on the other hand, yields 438 mV (after dividing the raw reading by the AD623's gain), which corresponds to about 23 °C.
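For scale, a quick back-of-the-envelope sketch (using my measured current) shows that this 10 mV discrepancy alone accounts for essentially the whole temperature error I see:

```python
I_MEAS = 401e-6   # A, measured excitation current
ALPHA = 0.00385   # 1/degC, Pt1000 temperature coefficient
R0 = 1000.0       # ohm, RTD resistance at 0 degC

dv = 10e-3               # V, discrepancy between the two voltage measurements
dr = dv / I_MEAS         # apparent resistance error caused by the 10 mV offset
dt = dr / (R0 * ALPHA)   # corresponding temperature error
print(f"{dr:.1f} ohm -> {dt:.1f} degC")  # prints "24.9 ohm -> 6.5 degC"
```

So whatever drops or offsets those 10 mV is the full story of my ~6 °C error.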
Do you have any idea where this error comes from and how I could eliminate it? Is it possible that this 10 mV error is caused by the input bias currents of the AD623? It seems strange, though, that a current of just 20 nA could produce such a large error on its own.
Is it possible that the input filter formed by R1, R2, C1 and C2 causes a problem?
My feeling is that the error comes from resistor tolerances, but I have measured the actual values of the components and used those real values in the computations, and the problem is still there.
Can anyone please help me with this? Do you think it is possible to reach an accuracy of around ±1 °C with this circuit? That was my initial goal. Is it possible to improve the accuracy by calibration? I don't have any previous experience with that, so I don't know how to do it.
Thanks,
Nikos