
What is a DVM normalizing circuit used for?

Discussion in 'Electronics Homework Help' started by Simmon, Feb 14, 2014.

  1. Simmon

    What is this DVM normalizing circuit used for?

    The schematic and pictures are attached.
     


  2. shumifan50

  3. jpanhalt

    Hi Simmon,

    When you have multiple sensors, it is often convenient to have the full-scale range read the same for each. Since we tend to think in decimal, it can also help to adjust the output of even a single sensor to some convenient value, such as "10."

    That appears to be what the circuit does and the instructions describe. The voltage divider, made up of a 1 M resistor and a 100K pot/decade box, lets you adjust the reading of a reference voltage from the sensor/computer to 10 V.
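
    If it helps, here is a small Python sketch of how such a trim divider behaves. The topology is my assumption (the 100K pot/decade box in series with the signal and the 1 M resistor as the shunt across the DVM input, which fits the parallel-resistor point later in this thread); the real arrangement is whatever your attached schematic shows, and the 10.5 V reference is a made-up number.

        R_SHUNT = 1_000_000.0   # the 1 M resistor across the DVM input (assumed topology)

        def divider_output(v_ref, r_pot):
            # Voltage the DVM sees for a given reference voltage and pot/decade-box setting.
            return v_ref * R_SHUNT / (R_SHUNT + r_pot)

        # Example: trim a 10.5 V reference (made-up value) down to a 10 V reading.
        v_ref = 10.5
        for r_pot in (0, 25_000, 50_000, 75_000, 100_000):
            print(f"pot = {r_pot / 1000:5.0f}K -> DVM reads {divider_output(v_ref, r_pot):.3f} V")

        # Pot setting that makes the DVM read exactly 10 V:
        r_needed = R_SHUNT * (v_ref / 10.0 - 1.0)
        print(f"set the pot/decade box to about {r_needed / 1000:.0f}K")

    With those values the divider can only trim a reference between 10 V and roughly 11 V down to exactly 10 V, which is consistent with a calibration adjustment rather than a range switch.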

    Can you link to the procedure manual? Some important steps may have been lost in the snip you show.

    John
     
  4. Simmon


    Why do they use a reference voltage?

    If they didn't use a reference voltage or this normalizing circuit, what would be different?

    Does the 1 meg ohm resistor change the DVM input impedance?

    When would you mostly use this normalizing circuit for measurements? For what type or kind of circuits?
     
  5. jpanhalt

    You are giving us just a small peek at the procedure, so it is impossible to answer all of your questions from it.

    Let me step back and ask you: if you have a voltmeter, and you want it to read "10" as the full-scale value for the maximum voltage from a sensor that produces more than 10 V (only for example, you have not given enough information to be sure), how would you go about doing that? Would you not consider getting a voltage that is the same as the maximum sensor output (i.e., a "reference"), putting that across a voltage divider, and adjusting that divider so it reads "10" on the voltmeter? Sure, there are other ways to do it, and a simple divider won't work if the source is less than "10," but your initial question was, basically, what does this voltage divider do. Have you read the manual? Do you now understand the concept of "normalizing"?
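
    To put numbers on that idea, here is a tiny sketch with a made-up 12 V sensor maximum (the real figure would come from your manual): the divider is set once so that the reference reads "10", and from then on every reading is just a fraction of full scale.

        # Made-up numbers: a sensor whose maximum output is 12 V; the real value
        # would come from the procedure manual, which we have not seen.
        SENSOR_MAX = 12.0
        FULL_SCALE = 10.0

        # The divider is adjusted once so that SENSOR_MAX shows FULL_SCALE on the DVM.
        scale = FULL_SCALE / SENSOR_MAX

        def dvm_reading(sensor_volts):
            # What the normalized DVM displays.
            return sensor_volts * scale

        def actual_volts(displayed):
            # Going back from the displayed value to the real sensor voltage.
            return displayed / FULL_SCALE * SENSOR_MAX

        print(dvm_reading(12.0))   # 10.0 -> full scale
        print(dvm_reading(6.0))    # 5.0  -> half scale, instantly readable as 50%
        print(actual_volts(5.0))   # 6.0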


    And yes, you have put a 1M resistor in parallel with the DVM's input impedance. That will change the impedance the source sees a little, but if the output impedance of the source is only 100 ohms, the effect will be pretty small. That is not always the case and is one reason there are other methods to "normalize" meter readings.
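
    A rough sketch of that loading effect, assuming a typical 10 M DVM input impedance (an assumption on my part, not something from your manual):

        # How much the 1 M resistor (in parallel with the DVM input) loads the source.
        def parallel(r1, r2):
            return r1 * r2 / (r1 + r2)

        R_DVM_IN = 10e6      # assumed DVM input impedance
        R_NORMALIZE = 1e6    # the 1 M resistor in the normalizing network
        R_SOURCE = 100.0     # source output impedance from the example above

        r_load = parallel(R_DVM_IN, R_NORMALIZE)     # what the source now drives
        error = R_SOURCE / (R_SOURCE + r_load)       # fraction of the signal dropped across the source

        print(f"load seen by the source: {r_load / 1e6:.3f} M")
        print(f"loading error: {error * 100:.4f} %")   # about 0.011 %, negligible here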

    John
     
  6. Simmon

    I don't understand the theory, or why you would want to "normalize" the meter readings.

    For what circuits or tests would you want to normalize the meter readings?

    No, I don't. Please explain what it is and what it is used for.
     
  7. shumifan50

    Have you read the article at the link that I posted? It explains that and many more concepts about instrumentation.
     
  8. jpanhalt

    Normalization, standardization, and calibration are terms that refer to measurements and all have something in common. Since we are talking about normalization, let's look at that:

    Let's assume you drive a car. Have you ever noticed that the "normal" meter indications for water temperature, oil pressure, and battery charge put the indicator needle vertical? In the USA, where the "normal" speed limit is about 60 mph, the speedometer needle is also vertical at that speed. If you are flying an airplane, the course deviation indicator is vertical when you are lined up on course. North usually points straight up.

    If you look at an analog clock, where do the hands point at noon or midnight? Do they point straight up (or down), or do they point to some position off that centerline?

    There is no technical reason for any of those normalizations. The normal oil pressure could just as easily be 1/4 way up the scale and the normal temperature could be 3/4 way up the scale, and so forth.

    Normalization makes it easy for humans to interpret instrument readings and to react accordingly. If you can't understand the need to do that, maybe someone else here can explain it better.
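
    If a formula helps, the whole idea is just mapping whatever range an instrument covers onto one convenient scale. The sensor ranges below are made-up examples, not values from your circuit:

        def normalize(reading, lo, hi, full_scale=10.0):
            # Map a raw reading in [lo, hi] onto a 0..full_scale display value.
            return (reading - lo) / (hi - lo) * full_scale

        # Two very different sensors, one display convention:
        print(normalize(2.5, 0.0, 5.0))      # 0-5 V sensor at mid-range  -> 5.0
        print(normalize(12.0, 4.0, 20.0))    # 4-20 mA loop at mid-range  -> 5.0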

    John
     
  9. Simmon

    When have you used it to "normalize" the DVM readings?

    For what circuits or tests would you want to normalize the DVM readings?

    Without normalizing the DVM readings, what would happen? Would the readings be too hard to make sense of?
     
  10. jpanhalt

  11. Simmon

    What is the difference between a non-normalized measurement and a normalized measurement?
    1.) The percentage is different?
    2.) The range is different?
    3.) What else?
     