# too much input voltage?

Discussion in 'General Electronics Discussion' started by camphor1122, Oct 21, 2014.

1. ### camphor1122

Hi guys,

I have a buffer amplifier running from a single-rail 5 V supply.
Suppose the analogue input signal to the buffer sometimes exceeds 5 V (0 V to 5.5 V).
In that case the output can't exceed 5 V, so the signal will clip and data will be lost.
Since this is a problem: is there a method to track the input voltage, determine when it exceeds 5 V, and scale the output down so that it does not clip while still giving the maximum possible output swing?

Any help is appreciated!

2. ### (*steve*) ¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd (Moderator)

You could attenuate the input signal slightly so that 0 - 5.5 V becomes 0 - 4.5 V. But then it is no longer a unity-gain buffer.
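The attenuation above is just a resistive divider. A minimal sketch of the arithmetic, assuming an illustrative 10 kΩ series resistor (the value is not from the thread):

```python
# Resistive divider to attenuate 0-5.5 V down to 0-4.5 V.
# R_TOP is an assumed, illustrative value; only the ratio matters here.
R_TOP = 10_000        # series (upper) resistor, ohms
GAIN = 4.5 / 5.5      # required attenuation, ~0.818

# Vout = Vin * R_bot / (R_top + R_bot)  =>  R_bot = R_top * gain / (1 - gain)
R_BOT = R_TOP * GAIN / (1 - GAIN)

def vout(vin: float) -> float:
    """Divider output for a given input voltage."""
    return vin * R_BOT / (R_TOP + R_BOT)

print(round(R_BOT), round(vout(5.5), 3))  # 45000-ohm lower leg; 5.5 V -> 4.5 V
```

With these values a 5.5 V peak lands at 4.5 V, comfortably inside the 5 V rail, at the cost of the buffer no longer being unity gain.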

You would be best off either ensuring the input cannot exceed the limits for the device (often the supply rails) or connecting diodes between the input and the supply rails to limit excursions beyond them.
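An idealized model of the diode-clamp behaviour, assuming a typical 0.6 V silicon diode drop (the drop and the 5 V rail value are assumptions for illustration):

```python
# With a diode from the input to each supply rail, the diodes conduct once
# the input goes more than one diode drop beyond a rail, so the voltage
# the buffer actually sees is limited to roughly that window.
V_SUPPLY = 5.0   # positive rail, volts
V_DIODE = 0.6    # assumed silicon diode forward drop, volts

def clamped(vin: float) -> float:
    """Input voltage seen by the buffer after the clamp diodes conduct."""
    return max(-V_DIODE, min(vin, V_SUPPLY + V_DIODE))

print(clamped(6.0), clamped(-1.0))  # 5.6 -0.6
```

In practice a series resistor ahead of the clamp is needed to limit the diode current; note that a 5.5 V input is below the 5.6 V clamp point with ordinary silicon diodes, so Schottky diodes (lower drop) would clamp closer to the rails.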

3. ### Harald Kapp (Moderator)

That's what a compressor does. Note that the circuit will no longer be linear, as the compressor changes gain depending on volume.
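A minimal sketch of a static compressor curve, just to show the non-linearity: unity gain below a threshold, reduced gain above it. The 4 V knee and 4:1 ratio are illustrative assumptions, not from the thread:

```python
# Static compressor transfer curve: below THRESHOLD the signal passes
# unchanged; above it, the excess is divided by RATIO. The gain therefore
# depends on level, which is exactly why the result is non-linear.
THRESHOLD = 4.0   # assumed knee point, volts
RATIO = 4.0       # assumed 4:1 compression above the knee

def compress(vin: float) -> float:
    if vin <= THRESHOLD:
        return vin
    return THRESHOLD + (vin - THRESHOLD) / RATIO

print(compress(3.0), compress(5.5))  # 3.0 4.375
```

With these values a 5.5 V peak comes out at 4.375 V, under the 5 V rail, while small signals are untouched.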

4. ### KrisBlueNZ (Sadly passed away in 2015)

A compressor acts on the amplitude of an AC signal (usually an audio-frequency signal). I think in this case the input signal is just a voltage level, not an AC signal with varying amplitude, and its range may exceed the 0 to 5 V range of the buffer amplifier.

The best answer to the question depends on a number of factors. Please describe the whole project and application in detail so we can suggest the best way(s) of detecting overvoltage.