Bill yg
Dear All,
Perhaps this is a dumb question, but I am really confused.
I am building a simple RC-phase shift oscillator using an opamp, with
+/- 15V power supplies. I used a trimmer in the feedback resistor
across the opamp. However, I found something interesting.
The output amplitude is normally +/- 15V, and the waveform is a
reasonably clean sine wave.
If I set a larger opamp gain, the output waveform is distorted and
looks more like a square wave.
If I set a smaller opamp gain, just enough to start oscillation, the
result is a +/- 12V or smaller sine wave.
The question is:
Why is this so? Why doesn't the +/- 12V sine wave grow to +/- 15V?
What I mean is: in theory, the oscillation should grow without
bound, so why is it bounded by +/- VCC? And with a smaller gain,
why is it bounded by only +/- 12V?
Thank you very much.
Billy yg
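The effect being asked about can be sketched numerically. Below is a minimal simulation, with all component values, the buffered-stage simplification, and the tanh soft-saturation model assumed purely for illustration (they are not measurements of the actual circuit): three buffered RC low-pass stages around an inverting amplifier whose output saturates a little below the +/-15V rails. The amplitude grows from a tiny seed until the saturation pulls the effective loop gain back down to exactly 1, and then it levels off below the rails.

```python
import math

# Minimal numerical sketch (all values are assumptions for illustration):
# a phase-shift oscillator modelled as three *buffered* first-order RC
# low-pass stages around an inverting amplifier.  With buffered stages
# the loop phase reaches 180 degrees at w = sqrt(3)/(R*C), where each
# stage attenuates by 1/2, so a gain of at least 8 is needed to
# oscillate.  The amplifier output soft-saturates (tanh) at an assumed
# +/-13 V, a little below the +/-15 V rails, as a real op-amp output
# stage does.

R, C = 10e3, 10e-9           # assumed: 10 kOhm, 10 nF -> tau = 100 us
TAU = R * C
VSAT = 13.0                  # assumed output-swing limit, below the rails
GAIN = 10.0                  # small-signal gain, a bit above the threshold of 8
DT = TAU / 200               # forward-Euler time step
STEPS = int(600 * TAU / DT)  # long enough for the amplitude to settle
W = int(50 * TAU / DT)       # measurement window, roughly 15 cycles

v1, v2, v3 = 0.05, 0.0, 0.0  # tiny seed so the oscillation can start
peak_prev = peak_last = 0.0
for n in range(STEPS):
    # inverting amplifier, softly clipped at +/-VSAT
    vout = VSAT * math.tanh(-GAIN * v3 / VSAT)
    # three buffered RC low-pass stages (forward Euler)
    v1 += DT * (vout - v1) / TAU
    v2 += DT * (v1 - v2) / TAU
    v3 += DT * (v2 - v3) / TAU
    # peak output amplitude over the last two windows
    if STEPS - 2 * W <= n < STEPS - W:
        peak_prev = max(peak_prev, abs(vout))
    elif n >= STEPS - W:
        peak_last = max(peak_last, abs(vout))

print(f"steady-state output peak: {peak_last:.1f} V (rails at +/- 15 V)")
```

In this sketch the steady-state peak settles well below the rails, which mirrors the circuit's behaviour: with the trimmer just above threshold the soft saturation limits the sine at roughly the observed +/-12V level, and with a much larger gain the output spends most of each cycle in saturation and squares off.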