
any way to calibrate digital thermometer?

Discussion in 'Electronic Repair' started by Johnny Appleseed, May 29, 2012.

  1. I have a digital thermometer like the one here:

    The temperature the outdoor sensor reads is at least 3 degrees off from what
    it should be. Is there any way to adjust the temperature reading in these
    units? The instructions don't say anything about this issue.

  2. Make a calibration chart.
  3. mike


    There's an old saying...
    A man with one watch always knows what time it is. A man
    with two watches, not so much.
    Same goes for thermometers.

    Publish what you find out. I've got the same problem.

    For me, the only temp that really matters is the temp at which
    the pipes freeze.
  4. Charles


  5. Phil Allison


    "Johnny Appleseed"

    ** How do the two readings compare if the whole caboodle is indoors ?

Much depends on the siting of the outdoor sensor - it needs to be in a shady
spot and get a bit of breeze.

** Have a look inside yours.

    The non radio linked kind generally have no adjustments.

    ... Phil
  6. gregz


    Outdoor remote sensors. Something I asked for a couple years before they
    existed for home use.

Mine vary. I have two, and they read 2-3 degrees apart. I have not tinkered
with the circuitry. I also have a third I have not compared. One is under my
porch roof. As the patio heats up from the sun in the afternoon, its reading
rises above true ambient.

    It's very tricky to calibrate ordinary probe devices, something I did
    frequently. Probes need to be on top of each other. Even in water, you must
    circulate it very fast for accuracy.

  7. I remain amazed. I gave a simple answer, which was implicitly endorsed by
    another person, but which has been ignored.

    "The Lady from Philadelphia" recommends the following...

    Place the conventional thermometer you trust and the transmitter in the same
    shaded spot. Make a chart with the transmitter's readings in the left
    column, the thermometer's readings in the right column. Check the digital
    thermometer's reading whenever you care to. When there's a change, walk out
    of the house and read the conventional thermometer. (Such an exhausting
    trek! Bring plenty of food and water, along with sunscreen and plenty of
    books to read. Hire a sherpa to carry it all.) Add both values to the chart.

    Is there something wrong with a simple solution? Tell me, I want to know.

    We are talking about a (presumably) cheap digital/remote thermometer, which
    likely has //no// calibration controls. (If it has any, it's probably just
    one, for a temperature around 75F.) What is it with this hacker mentality
    that demands wasting time on something that is just not that important?
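The chart method described above lends itself to a numeric correction once a handful of pairs have been collected. A minimal sketch in Python, using made-up sample readings (not measured data): fit a straight line mapping the cheap unit's display to the trusted thermometer, then apply it.

```python
# Least-squares straight-line fit: trusted = a * displayed + b.
# The (displayed, trusted) pairs below are invented examples, not real data.
pairs = [(30.0, 33.2), (50.0, 53.1), (70.0, 72.9), (90.0, 93.0)]

n = len(pairs)
sx = sum(d for d, _ in pairs)
sy = sum(t for _, t in pairs)
sxx = sum(d * d for d, _ in pairs)
sxy = sum(d * t for d, t in pairs)

a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

def corrected(displayed):
    """Translate the cheap unit's reading through the chart fit."""
    return a * displayed + b

print(f"trusted ~ {a:.3f} * displayed + {b:.2f}")
print(f"display 60.0 -> corrected {corrected(60.0):.1f}")
```

With enough entries this is just the calibration chart in formula form; if the error turns out to be a constant offset, the fit collapses to a slope near 1 and the chart is a single number.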
  8. Which is exactly the point. How much time has been wasted on looking for
    that solution -- with no results?

    I have a wireless thermometer that's part of an atomic clock. Without even
    opening it, I'd be willing to bet that the temperature sensing elements
    comprise one resistor in series with one thermistor. You were, perhaps,
    expecting multiple resistors and thermistors, with two or three pots to get
    everything "just right"?

This "knob" is all too aware from many years of experience that virtually
all products are built to meet a price point, and that attempts to improve
or customize them //almost// always result in failure.

    To give an example... I once owned the Pioneer RT-2000 system. It had
    modular electronics and interchangeable half-track two-channel &
    quarter-track four-channel head blocks. It was a clever and useful idea,
    poorly executed.

    When I started making live recordings, it occurred to me to position my dbx
    II noise-reduction units between the Pioneer's external electronics
    (containing the mic preamps and mixers) and the transport. To my surprise,
    there was no improvement in the S/N ratio.

    The 0dB sensitivity of the transport electronics was an unbelievably low
    0.1V, way below what is commonly taken as line level. When I measured the
    S/N ratio of the external electronics at 0.1V output, it was a miserable
    50dB. No wonder noise reduction had no effect.

    I was obliged to purchase external mic preamps. Re-engineering Pioneer's
    crappy electronics might have been a worthwhile project if I were trying to
    improve my skills in circuit design. But I wasn't, so what would be the
    point? Life is too short.

    I BEG YOUR FORGIVENESS for trying to see through to the heart of an issue,
    of trying to find simple solutions to "complicated" problems -- or of
    recognizing that there really is NO PROBLEM at all.

    You would do well to pay attention to this "knob". You might learn something
    about problem solving. But, of course, you already know everything, right?
  9. You would do well to pay attention to this "knob".
    So I take it that, over the years, you have learned nothing new about
    problem solving from me? That's a shame.
  10. mike


This is a bad idea on many levels.
Check the operating temperature range; it's unlikely to include boiling
water. Even if the sensor survives...
Most electronic temperature measurement methods rely on some form
of linearization. Calibrating at twice the intended operating temperature
range is unlikely to improve the readings at normal temps.

    Do the math on boiling temperature. You've just converted your
    inability to measure temperature into an inability to measure
    atmospheric pressure.
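Doing the math on boiling temperature, as suggested above: the boiling point of water shifts with barometric pressure, so a boiling-water calibration inherits the weather. A rough sketch using the Clausius-Clapeyron relation, treating the heat of vaporization as constant (a textbook approximation):

```python
import math

# Clausius-Clapeyron estimate of water's boiling point at pressure p:
#   1/T = 1/T0 - (R / dHvap) * ln(p / p0)
R = 8.314        # J/(mol*K), gas constant
DHVAP = 40660.0  # J/mol, heat of vaporization of water (approximate)
T0 = 373.15      # K, boiling point at p0
P0 = 101325.0    # Pa, one standard atmosphere

def boil_K(p_pa):
    """Boiling point of water (kelvin) at pressure p_pa (pascal)."""
    return 1.0 / (1.0 / T0 - (R / DHVAP) * math.log(p_pa / P0))

# Compare a deep low (980 hPa) with a strong high (1040 hPa):
lo = boil_K(98000.0) - 273.15
hi = boil_K(104000.0) - 273.15
print(f"water boils between {lo:.2f} C and {hi:.2f} C")
print(f"spread: {hi - lo:.2f} C ({(hi - lo) * 9 / 5:.1f} F)")
```

That pressure swing alone moves the "fixed" point by roughly 1.7 C (about 3 F), comparable to the very error being chased.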
  11. mike


Well, we don't know where the OP is. I assumed he was concerned about
being off by 3F. You admit to a 3.6F variance due to pressure.
I suggest that it's definitely a factor in increasing the absolute
accuracy of his temp gauge... even if he's in C territory. If I hadn't
mentioned it, I don't think anybody would have considered it as an
error term. And you still don't.
These types of things use a "precision" thermistor that is supposed to
be good to a degree or so against an internal fixed reference
resistor. There is no individual calibration. You could try to trim
the reference resistor, but personally I'd forget about it. 3 degrees
(F?) could be the difference between being in the sun and not, for
example.
    What do you expect for < $10?

    Best regards,
    Spehro Pefhany
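The thermistor-against-a-reference-resistor readout described above can be sketched with the standard beta-parameter model. The 10 kohm / beta = 3950 values are assumed for illustration, not taken from any particular unit; the point is that an error in the reference resistor skews the reading by a different amount at each temperature, so trimming it can only zero the error at one point.

```python
import math

# Beta-model thermistor read out against a fixed reference resistor.
# Component values are illustrative assumptions, not from any real unit.
BETA = 3950.0  # K, thermistor beta constant
R25 = 10000.0  # ohms at 25 C
T25 = 298.15   # K

def r_therm(t_c):
    """Actual thermistor resistance at t_c degrees Celsius (beta model)."""
    return R25 * math.exp(BETA * (1.0 / (t_c + 273.15) - 1.0 / T25))

def t_inferred(r_measured):
    """Temperature the meter computes back from a measured resistance."""
    return 1.0 / (1.0 / T25 + math.log(r_measured / R25) / BETA) - 273.15

# If the reference resistor is 3% high, every inferred resistance is 3% low:
for t in (-10.0, 10.0, 30.0):
    err = t_inferred(r_therm(t) / 1.03) - t
    print(f"{t:6.1f} C reads {err:+.2f} C off")
# The error changes with temperature, so a single trim resistor can only
# zero it at one point on the scale.
```

This is consistent with the chart approach earlier in the thread: a per-temperature correction table (or line fit) handles what a single trim cannot.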
  13. gregz


I got two different devices with a fairly constant 3 degree difference. The
one with a wind gauge I don't believe; I think it reads half. I'm going to
take it for a ride some day.
One transmitter failed after a few years, so I bought a replacement. I have
had other cheap units go bad, but my current models are half decent.

  14. Shaun


You bought it at Wal-Mart and you're surprised that it is not accurate??
3 degrees is not bad for something that sells for less than $10.00.