So is back EMF, counter electromotive force, basically the resistance the motor produces against the source which drives it? So D3 would be there to stop feedback to the FET?
Actually, back EMF is not the best name to use for this phenomenon because back EMF occurs whenever the magnetic field in an inductor is changing, not just when it is collapsing. The proper name is inductive kickback. It's also called flyback, inductive kick, and inductive spike.
It's a bit difficult to explain clearly, but I'll give you an introduction and a list of links. You may find that you need to understand voltage and current more clearly before you can understand inductance and inductive kickback. For me, inductance was the last of the basic electronics quantities to click.
Inductive kickback is a phenomenon that happens with any inductive component when it has built up a magnetic field and the current flow is then interrupted. Inductive components include inductors, electromagnets, relay coils, and motors - any component that has a coil of wire wound around a core. The coil gives the component a characteristic called inductance, which has some interesting behaviours.
In the circuit in post #4 on this thread, the MOSFET (Q1) is driving an inductive load - a motor. It is being switched ON and OFF a few thousand times per second by a control signal generated by the 555. But it's easier to visualise a single instance of inductive kickback - the situation that occurs when a relay coil (for example) has been energised for a period of time, and is then switched OFF.
While the control signal is high (typically around +10V relative to the 0V rail), the MOSFET is ON, i.e. conducting, and it completes the circuit and applies the supply voltage across the inductive load. Current flows as shown by the red arrows. The current, and the magnetic field, build up in the inductive load until the current is limited only by the DC resistance of the load.
If the MOSFET remains ON, the circuit stabilises with a steady current flowing as shown, and a constant, steady magnetic field in the load. If the load is an electromagnet, it will be magnetised; if it's a relay, it will be activated (pulled in).
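If it helps to see the numbers, here's a little Python sketch of how the current builds up after switch-on. The coil values (12V supply, 120Ω coil resistance, 0.5H inductance) are made up for illustration - they're in the right ballpark for a small relay, but check your own part's datasheet.

```python
import math

def inductor_current(t, v_supply, r, l):
    """Current in an RL circuit t seconds after the switch closes:
    i(t) = (V/R) * (1 - e^(-t*R/L)). Ideal model, no diode or saturation."""
    return (v_supply / r) * (1 - math.exp(-t * r / l))

# Hypothetical relay coil: 12 V supply, 120 ohm winding, 0.5 H inductance
v, r, l = 12.0, 120.0, 0.5
tau = l / r  # time constant L/R, about 4.2 ms here

print(inductor_current(0, v, r, l))        # 0.0 - no current at the instant of switch-on
print(inductor_current(5 * tau, v, r, l))  # ~0.099 A - essentially the steady-state V/R = 0.1 A
```

After about five time constants the current has reached its steady value, set only by the DC resistance, exactly as described above.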
Now have a look at what happens when the MOSFET is turned OFF. This happens when the control signal goes to 0V. The MOSFET responds by turning OFF, i.e. it stops conducting. It's like a switch that has just been opened, to break the circuit.
There is no longer any path for current to flow to or from the battery; the battery and the MOSFET are completely out of the circuit. But because of the nature of inductance, and the magnetic field stored in the core of the inductor, the inductor current cannot change quickly. I found this hard to grasp, but that's really what happens: the inductor "tries" to keep the current flowing through it.
If there is no direct path for this current, the inductor will generate a voltage across itself, and this voltage can be very high - hundreds, even thousands of volts - high enough to make something in the circuit break down and conduct. The inductor will do whatever it takes to keep the current flowing. The breakdown could happen in the insulation on the wires of the inductor itself, or the wires going to it; it could happen along the surface of the circuit board between the inductor pins; or it could happen within the MOSFET. This voltage can be very destructive. It can generate some nasty electromagnetic interference and make your monitor display jump. It can also give you a short but painful shock. This effect is called inductive kickback.
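You can see why the voltage gets so big from the ideal inductor equation V = L × di/dt. A quick Python sketch, again with made-up numbers (0.5H coil, 0.1A interrupted in a microsecond):

```python
def kickback_voltage(l, di, dt):
    """Ideal inductor voltage V = L * di/dt.
    The real peak is lower - limited by stray capacitance and by
    whatever breaks down and conducts first."""
    return l * di / dt

# Hypothetical: 0.5 H coil, 0.1 A interrupted in 1 microsecond
print(kickback_voltage(0.5, 0.1, 1e-6))  # tens of kilovolts, in the ideal case
```

The faster the current is interrupted (smaller dt), the bigger the spike - and a MOSFET turns off very fast indeed.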
The actual voltage that the inductor generates when the external current path is interrupted depends on the external circuit. The inductor doesn't care much; it just wants the current flow to continue. If there's no direct path for that current, the inductive kickback voltage rises rapidly until something breaks down and conducts. But you can clamp the inductive kickback voltage by connecting a diode across the inductive load, as shown in the schematic. The diode provides an easy path for the current to follow when the MOSFET turns OFF. The current flows around the little loop shown in the second diagram. No significant voltage is generated, so components in the circuit aren't stressed to the point of breakdown. In particular, the MOSFET is protected from a voltage spike that would otherwise reach hundreds of volts.
This is the reason why inductive loads should always have some kind of suppression. A diode is a common method when the inductor is driven by DC, because the polarity of the voltage spike generated by the inductor is opposite to the polarity of the voltage that was applied to it, so the diode has no effect when the MOSFET is ON; it only takes over when the MOSFET is OFF.
The current (and voltage) generated by inductive kickback is usually pretty brief. Losses in the circuit allow the magnetic field in the inductor to collapse, and the inductor returns to its idle state after a short time.
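Here's a rough sketch of that decay through the diode, using the same made-up coil values as before. This is a simplified model - I'm treating the diode drop as a constant 0.7V and ignoring everything else:

```python
import math

def freewheel_current(t, i0, r, l, v_diode=0.7):
    """Approximate current circulating through the flyback diode t seconds
    after switch-off. With coil resistance R and a constant diode drop Vd,
    the current decays roughly as:
        i(t) = (i0 + Vd/R) * e^(-t*R/L) - Vd/R
    until it reaches zero, at which point the diode stops conducting."""
    i = (i0 + v_diode / r) * math.exp(-t * r / l) - v_diode / r
    return max(i, 0.0)

# Hypothetical relay coil again: 0.1 A flowing, 120 ohm winding, 0.5 H
print(freewheel_current(0.0, 0.1, 120.0, 0.5))  # 0.1 A at the instant of switch-off
print(freewheel_current(1.0, 0.1, 120.0, 0.5))  # 0.0 - long gone after a second
```

The losses in the coil resistance and the diode eat the stored energy within a few milliseconds in this example, which is why the kickback event is so brief.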
If that made sense, here's some more reading material:
http://www.allaboutcircuits.com/vol_3/chpt_3/9.html
https://www.wisc-online.com/learn/career-clusters/stem/ace5803/the-inductive-kick-of-an-inductor
But none of this relates to your LED dimmer project, because LEDs are not inductive!
This is where I get stuck!
How do you know what component types to choose for a certain job - is it experience? Or is it all in the description, i.e. 470µF 25V? I see it's polarised; does that tell you which type it is?
It can be hard to know. In general, component parameters that aren't important in the application aren't shown on the schematic or in the parts list - at least, that's true with properly presented designs; not always with designs from the inexperienced.
Some components - capacitors and inductors for example - can be described with many parameters in addition to the basic numbers.
Capacitors have maybe the widest range of values of any component - from less than 0.1 pF (10⁻¹³ farads) to over 10 farads - a range of at least 14 orders of magnitude. There are many types of capacitors, each of which covers part of this range. Each type has specific advantages and disadvantages, and is a compromise between good and bad qualities chosen to suit the application (or vice versa, really).
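If you want to check that "14 orders of magnitude" claim, it's a one-liner:

```python
import math

# Range of practical capacitor values: 0.1 pF up to a 10 F supercapacitor
c_min = 0.1e-12  # 0.1 pF = 1e-13 F
c_max = 10.0     # 10 F
orders = math.log10(c_max / c_min)
print(round(orders))  # 14
```

Compare that to resistors, which span maybe 9 or 10 orders in common use - capacitors really are the extreme case.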
For capacitors, the basic specifications are capacitance and rated voltage. Capacitance is almost always important, although electrolytics typically have an initial tolerance of ±20% and in some applications, other factors such as ESR are more important. Rated voltage may not be specified for low-voltage circuits where a voltage rating of 16V or 25V can be assumed for large capacitors and 50V can be assumed for small capacitors (which aren't usually made with voltage ratings less than 50V).
Other capacitor specifications include the type and dielectric (the materials it's made from and the way it's constructed), which are important in many applications; the initial capacitance tolerance (accuracy) and sometimes the variation of capacitance with temperature; the ESR (effective series resistance) and inductance, which usually must be low for capacitors used to smooth the outputs of switching power supplies; the maximum allowable ripple current, which often must be high for smoothing capacitors; leakage current, which is sometimes important; and of course, the package (style, size, weight, type of connections), and whether the capacitor is polarised or not.
Most capacitors above around 10 µF are either aluminium electrolytic or tantalum, and both of these types are polarised (special non-polar electrolytics are available in a limited range). Supercapacitors are a special type of electrolytic with a very large capacitance; these are used for backup in some types of appliances.
Capacitors below around 1 µF are generally one of the non-polarised types, of which there are many. The most common is the ceramic capacitor, particularly the MLCC (multi-layer ceramic capacitor); these are available in a very wide range of capacitance values, with various dielectrics, and are used for a wide range of applications, such as timing and decoupling.
There are various kinds of plastic film capacitors - polystyrene, polyester, polypropylene and others - and less common types such as silvered mica, which are used in high-frequency applications.
Inductors have a similarly complicated situation to capacitors. There's a huge range of inductances and physical sizes, and there are many parameters that are significant, because, like capacitors, inductors are not usually very close to ideal components. There are many compromises for reasons of cost, manufacturability etc which the designer needs to be aware of. The situation with transformers is even worse, and they are often custom made. I won't go into any more detail here.
Resistors are mainly specified by their resistance (duh) and their maximum power dissipation in watts. In most parts of most circuits, resistors don't dissipate much power and you can use standard sized low-power resistors, which commonly have ratings of 1/4W, 1/3W, 1/2W and 0.6W (and often 1/8W). If power dissipation is significant, a power rating will be specified on the diagram or in the text. Resistors also have an initial resistance tolerance, which nowadays is normally ±1%, or ±5% for some larger resistors.
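Checking whether the power rating matters in a given position is just Ohm's law. Here's a little helper (the 12V / 470Ω example is made up):

```python
def resistor_power(v=None, i=None, r=None):
    """Power dissipated in a resistor, from any two of voltage across it (V),
    current through it (A), and resistance (ohms):
    P = V^2/R = I^2*R = V*I."""
    if v is not None and r is not None:
        return v * v / r
    if i is not None and r is not None:
        return i * i * r
    if v is not None and i is not None:
        return v * i
    raise ValueError("need two of v, i, r")

# Hypothetical example: 12 V across a 470 ohm resistor
p = resistor_power(v=12.0, r=470.0)
print(round(p, 3))  # 0.306 W - too much for a 1/4 W part; pick 1/2 W or better
```

A common rule of thumb is to choose a rating of at least double the calculated dissipation, so the resistor runs cool and lasts.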
For general applications, in the past, 1/4W or 1/3W carbon resistors with ±5% tolerance were used as standard, but I recommend using metal film resistors with ±1% tolerance, even if this accuracy is not needed, because the cost difference is tiny and metal film resistors have much better performance and life span. Also they are often rated at 0.6W. Keeping separate stocks of 5% carbon resistors and 1% metal film resistors nowadays is false economy.
Some very old circuits may specify ±5% (as opposed to ±10%) and/or carbon for resistors; you can always replace these with ±1% metal film resistors. (Unless the resistor is being used as a noise source!)
A resistor's mass is roughly proportional to its power rating. Resistors with ratings up to about 10W can be mounted directly on circuit boards (often raised on their leads, sometimes with spacers); larger resistors may need to be mounted on heatsinks or a chassis. Resistors that get hot are a common cause of dry joints between their leads and the copper traces on the circuit board, because the constant heating and cooling, and expansion and contraction, can weaken the solder joint. The heat can also cause the circuit board to discolour, and even become conductive.
Resistors rated for 1W and more may be wirewound - that is, made from a coil of "resistance wire". If these are wound in the normal helical way, they actually behave partly like inductors, and are not suitable for certain circuit positions - such as the two 4.7Ω resistors in your amplifier. In this case, they should be marked "carbon", "film", or "non-inductive" on the schematic, so you know not to use a standard wirewound resistor. "Non-inductive" wirewound resistors are available; they are wound in a different way, so they don't behave like inductors. But other non-inductive types of resistors (thin film and thick film) are available with power ratings up to about 20W I think, and can be used where inductance is a problem. Or you can use a bunch of lower-power resistors in series (resistances add together) or parallel (see
https://en.wikipedia.org/wiki/Series_and_parallel_circuits#Resistors_2).
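The series/parallel arithmetic from that link is easy to sketch:

```python
def series(*rs):
    """Equivalent resistance of resistors in series: the values add."""
    return sum(rs)

def parallel(*rs):
    """Equivalent resistance of resistors in parallel:
    the reciprocal of the sum of the reciprocals."""
    return 1.0 / sum(1.0 / r for r in rs)

# Two 10 ohm 5 W resistors in parallel act like one 5 ohm resistor
# that can dissipate roughly 10 W (assuming the current shares evenly)
print(parallel(10.0, 10.0))  # 5.0

# Or build an awkward value from standard ones in series
print(series(2.2, 2.5))      # ~4.7
```

Paralleling identical resistors is a handy trick: the power handling adds up, and the inductance of the combination is lower than that of a single big wirewound part.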
With semiconductors, such as diodes, transistors, MOSFETs etc, the part number should be shown on the diagram, and this is a pretty specific definition of what you need, although there may be suffixes on the part number that specify variables such as package type, quality grade, and temperature grade (these apply mostly to ICs), and gain and sometimes voltage rating (these apply mostly to transistors). The designer should specify any that are important, and may leave others up to your discretion, depending on your construction method, budget, accuracy requirements, etc.
Would this be the correct one?
Yes, that will be fine for your LED dimmer. No special requirements there.