# How to fix voltage drops at 220V in this inverter Circuit

Discussion in 'Power Electronics' started by said az, Feb 24, 2018.

1. ### said az

Feb 24, 2018
How can I fix the voltage at 220V? When I connect the voltmeter with no load it shows 220V, but when I add a lamp it drops to 87V.
What can I do to improve the circuit, and what are your opinions about it?

[3 image attachments: circuit schematics]
2. ### Audioguru

Sep 24, 2016
When one IRL540 MOSFET conducts 20A, the voltage drop across it is about 1V, so the transformer gets 11V instead of 12V.

But the unknown transformer is probably designed for a 220V input to produce 14V unloaded, or 12V at its full load of maybe 20A.

Then the 11V from your circuit into the 220V-to-14V transformer is stepped up to only (220/14) x 11V = 173V when the 220V load is 1.3A or 225W. The output is a square wave, but your meter assumes it is a sine wave with a peak voltage 1.414 times the RMS value, so the 173V shows as only 122V on your meter.
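The arithmetic above can be sketched in a few lines. All figures are the ones assumed in this post (12V battery, ~1V MOSFET drop at 20A, a 220V:14V transformer, and an average-responding meter calibrated for sine waves):

```python
# Step-up and meter-reading estimate, using the post's assumed figures.

battery_v = 12.0
mosfet_drop = 1.0                          # IRL540 drop at ~20A (per the post)
primary_v = battery_v - mosfet_drop        # 11V actually reaching the transformer

turns_ratio = 220.0 / 14.0                 # 220V : 14V winding ratio
out_rms_square = primary_v * turns_ratio   # true RMS of the square-wave output

# A meter calibrated for sine waves under-reads a square wave; the post
# models this as dividing by the sine crest factor of 1.414.
meter_reading = out_rms_square / 1.414

print(round(out_rms_square))   # -> 173
print(round(meter_reading))    # -> 122
```

This is only an estimate; a true-RMS meter would show the full 173V.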

You need to know how much output current you want.
You need to know how much voltage loss the transformer produces at your current.
You need to know if the transformer can produce that much current.

A light bulb's resistance is low when it is cold, so its inrush current and power can be maybe 10 times higher than when it is white hot. Maybe the transformer cannot supply 2250W.
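As a rough illustration of that inrush effect (the hot power and the 10x cold-resistance factor are the post's figures; real filaments vary):

```python
# Incandescent-lamp inrush estimate, assuming a 225W lamp at 220V and
# a cold filament resistance ~10x lower than the hot resistance.

supply_v = 220.0
hot_power = 225.0                     # rated power when white hot

hot_r = supply_v**2 / hot_power       # ~215 ohms when hot
cold_r = hot_r / 10.0                 # ~21.5 ohms when cold

inrush_power = supply_v**2 / cold_r   # momentary power at switch-on
print(round(inrush_power))            # -> 2250
```

So a lamp rated 225W can momentarily demand around 2250W, which is why an undersized transformer sags or stalls at turn-on.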

The simple circuit has no voltage regulation, so select a transformer that produces maybe 240V with no load and 200V at a full load of about 1.1A (242W) at 220V. It will probably be a 220V to 20V center-tapped type.
If your load draws more than 1.1A at 220V, then you need more MOSFETs in parallel to share the current and heat.
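One way to size the parallel MOSFETs is to budget the power each device may dissipate. The load power, dissipation budget, and Rds(on) below are illustrative assumptions (the 0.077 ohm figure is the IRL540's typical datasheet value; check your own part and heatsinking):

```python
# Sketch: estimate how many parallel MOSFETs are needed so each stays
# within a per-device dissipation budget. All figures are assumptions.

load_w = 500.0                  # desired 220V-side load power (assumed)
battery_v = 12.0
primary_a = load_w / battery_v  # ~42A drawn on the 12V side

rds_on = 0.077                  # IRL540 typical Rds(on), ohms
max_diss_w = 15.0               # per-device heat budget (assumed)

# With n devices in parallel, each carries I/n and dissipates (I/n)^2 * Rds.
n = 1
while (primary_a / n) ** 2 * rds_on > max_diss_w:
    n += 1

print(n)   # -> 3
```

In practice you would also derate for switching losses and gate-drive limitations, but this shows why a single device quickly runs out of headroom at inverter currents.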
