Do batteries in general consume more power when charging compared to the power they output when fully charged?
Yes, because they are never 100% efficient. But you need to get your terminology clear.
A battery stores energy, not power. Energy is measured in joules, and one joule is the amount of energy transferred when one watt of power flows for one second.
Take a typical rechargeable cell such as NiMH, rated at 2500 mAh (milliamp-hours), fully charged. For simplicity let's assume that its terminal voltage remains constant at 1.2V during the discharge process.
Because the cell is rated at 2.5 Ah, its "1C" (1 × capacity) charge and discharge current is 2.5A. Nominally, if you charge it fully then discharge it at a steady 2.5A, it will last for an hour before its stored energy is exhausted. At least, that's what the specification claims.
We have enough information here to calculate how much energy it releases when discharged at a rate of 1C. First we need to know how much power it delivers. Power is voltage × current, which is 1.2V × 2.5A which is 3W. So while the cell is discharging, it is delivering a steady 3W of power.
Since we are discharging it at 1C, the discharge time is one hour (its rating is 2.5 amp-hours). One hour is 60 × 60 = 3600 seconds. So the total energy it delivers during its discharge is 3W × 3600 seconds = 10,800 joules, or 10.8 kJ (kilojoules).
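The discharge arithmetic above can be sketched in a few lines of Python, using the example cell's figures (constant 1.2V terminal voltage, 2.5 Ah capacity, 1C discharge):

```python
# Discharge energy for the example NiMH cell,
# assuming a constant 1.2 V terminal voltage throughout.
voltage = 1.2          # V, assumed constant during discharge
capacity_ah = 2.5      # Ah (the 2500 mAh rating)
current = capacity_ah  # 1C discharge current in amps

power_w = voltage * current                        # 1.2 V * 2.5 A = 3 W
discharge_time_s = (capacity_ah / current) * 3600  # one hour = 3600 s
energy_j = power_w * discharge_time_s              # 3 W * 3600 s

print(power_w)   # 3.0 (watts)
print(energy_j)  # 10800.0 (joules, i.e. 10.8 kJ)
```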
Typically, to fully charge a cell like that, you use a complicated sequence of charging stages - precharge, bulk charge, and top-off, at least. During this time, the total amount of energy you need to put into the cell is a lot higher than the energy you will get back out of it. Some of this extra energy is converted to heat during the charge process and lost forever. Some of it is lost as heat on discharge.
A typical complete charge process on an empty cell at a charge current of 1C (2.5A for this cell) might take 100 minutes and raise the terminal voltage to 1.3V on average. So the charging power will be 1.3V × 2.5A = 3.25W and the charge time will be 100 × 60 seconds, so the total charge energy will be 3.25W × 6000 seconds = 19,500 joules, nearly twice the amount of energy you get out. That's pretty typical for the type of cells I'm familiar with. Newer ones like Li-ion and Li-pol might be better.
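Putting the charge and discharge figures together gives a rough round-trip efficiency. This sketch just reuses the numbers from the example above (1.3V average charge voltage, 100-minute charge at 1C); real chargers and cells will differ:

```python
# Charge energy and round-trip efficiency for the example cell.
charge_voltage = 1.3       # V, assumed average during charge
charge_current = 2.5       # A (1C)
charge_time_s = 100 * 60   # 100 minutes in seconds

charge_energy_j = charge_voltage * charge_current * charge_time_s
discharge_energy_j = 10800.0  # from the discharge calculation

efficiency = discharge_energy_j / charge_energy_j

print(charge_energy_j)            # 19500.0 (joules)
print(round(efficiency * 100))    # 55 (% energy recovered)
```

So with these numbers you only get back a little over half the energy you put in, which matches the "nearly twice" figure in the answer.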
This is just a very simple description to familiarise you with the quantities involved. In the real world there are a huge number of complicating factors. Here are just two examples: I assumed a constant terminal voltage during charging and discharging, but as you know, this never happens; capacity and available energy depend on the discharge rate (you will get more energy out of a charged battery if you discharge it more slowly - for example, discharging it at half the current will more than double the discharge time).
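One common empirical model of that rate-dependence is Peukert's law, which predicts that halving the current more than doubles the run time when the exponent k is greater than 1. It was developed for lead-acid cells (k is typically around 1.1 to 1.3 there), so treat the exponent here as purely illustrative for a NiMH cell:

```python
# Peukert's law (empirical): t = H * (C / (I * H)) ** k
# H = rated discharge time in hours, C = rated capacity in Ah,
# I = actual discharge current in A, k = Peukert exponent.
# k = 1.2 is an illustrative value, not a measured NiMH figure.
def discharge_time_hours(current_a, capacity_ah=2.5, rated_hours=1.0, k=1.2):
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

print(discharge_time_hours(2.5))   # 1.0  -- one hour at the 1C rate
print(discharge_time_hours(1.25))  # about 2.3 -- more than double at half the current
```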
Besides the heat losses, what about the voltage drop that happens during the discharge of the battery? In other words, the voltage won't stay at a fixed level for the battery's whole charge and then suddenly drop to zero when the battery reaches 0%.
Doesn't that count as a power loss?
Sort of. During the discharge process, the terminal voltage does drop, and for most loads, the power will drop as well, but the important quantity when you're calculating battery efficiency is energy, not power.