An LED rated at 3.6V and 20mA will draw 20mA if supplied with 3.6V. Correct me if I am wrong.
LEDs aren't "rated" for a particular voltage. The best way to look at it is that you need to supply them with a certain amount of current (the light output is roughly proportional to this current), and that current produces a certain voltage drop (the forward voltage) across the LED.
The forward voltage varies with temperature and other factors, but is not a tightly controlled specification. Data sheets usually specify a typical and maximum forward voltage for a particular current. Here's one that specifies a minimum forward voltage as well. These numbers cover JUST manufacturing variations.
However, as it heats up the current draw may vary. And in the real world, the voltage is never exact.
It's better to say that the voltage varies for a given current. The current is the figure that's important, because the light output is roughly proportional to it. The voltage is more of a side effect. Here's a graph of typical forward voltage vs. current for a typical LED.
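If it helps to see numbers rather than a graph, here's a rough Python sketch of that nonlinear relationship using the ideal Shockley diode equation. The saturation current and ideality factor below are made-up illustrative values, picked only so the curve lands near 1.9V at 10 mA; a real LED won't match them exactly.

```python
# Rough sketch of the nonlinear LED I-V relationship (ideal Shockley
# diode equation). I_S and N are made-up illustrative values, chosen
# only so the curve lands near 1.9 V at 10 mA -- not from a data sheet.
import math

I_S = 1e-18      # saturation current in amps (assumed, illustrative)
N   = 2.0        # ideality factor (assumed, illustrative)
V_T = 0.02585    # thermal voltage at ~300 K, in volts

def forward_voltage(i_forward):
    """Forward voltage for a given forward current, ideal diode model."""
    return N * V_T * math.log(i_forward / I_S + 1.0)

for i_ma in (1, 5, 10, 20):
    print(f"{i_ma:>3} mA -> Vf ~ {forward_voltage(i_ma / 1000):.2f} V")
```

With these numbers a 20-fold change in current (1 mA to 20 mA) only moves the forward voltage by about 0.15V, which is exactly why you control the current and treat the voltage as a side effect.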
So a resistor basically acts as a "safety" device in that it limits the current draw of an LED to safe levels, so it does not burn out. In addition, it gives the LED flexibility in case of a varying voltage, and thus protects it.
Not really. The resistor is there to control the current. Unlike the LED, the resistor has a linear voltage vs. current relationship, so variations in the voltage across it will have a defined, linear effect on the current that flows through it.
Take the LED in the data sheet above. Say we want to run it at 10 mA and our supply voltage is 5V. We start with the typical forward voltage, 1.9V, and calculate the resistor from the voltage that will be dropped across it: (5V - 1.9V) = 3.1V. Using Ohm's Law with I = 0.01 amps we calculate R = V / I = 3.1 / 0.01 = 310 ohms.
Now check out what happens when the forward voltage is at the extremes.
Minimum forward voltage (1.5V) leaves 3.5V across 310 ohms; I = 11.3 mA
Typical forward voltage (1.9V) leaves 3.1V across 310 ohms; I = 10 mA
Maximum forward voltage (2.4V) leaves 2.6V across 310 ohms; I = 8.4 mA
So even with that wide range of forward voltage variation, the operating current of the LED only moves by about 16% at most from the nominal 10 mA. THAT is the reason for the resistor.
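Here's the same arithmetic in a few lines of Python, in case you want to try other supply voltages or target currents. It's just the Ohm's Law calculation above, nothing more.

```python
# Resistor sizing for the example above: 5 V supply, 10 mA target,
# forward voltage 1.5 V (min) / 1.9 V (typ) / 2.4 V (max).
SUPPLY_V   = 5.0
TARGET_I_A = 0.010

# Size the resistor from the typical forward voltage.
R_OHMS = (SUPPLY_V - 1.9) / TARGET_I_A   # = 310 ohms

# Now see how much the current moves at the forward-voltage extremes.
for label, vf in (("min", 1.5), ("typ", 1.9), ("max", 2.4)):
    i_ma = (SUPPLY_V - vf) / R_OHMS * 1000
    print(f"Vf {label} = {vf} V -> I = {i_ma:.1f} mA "
          f"({(i_ma - 10) / 10:+.0%} from nominal)")
```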
As Steve pointed out, a current source is a better option; there would be NO variation in current with a current source. But a resistor is adequate, especially if the ratio of the voltage across the resistor to the voltage across the LED is high, i.e. the supply voltage is a lot higher than the LED forward voltage.
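To put a number on that last point: rerun the same calculation with a 12V supply (same hypothetical LED, same Ohm's Law arithmetic) and the current spread across the forward-voltage extremes shrinks to roughly ±5%.

```python
# Same LED, same Ohm's Law calculation, but with a 12 V supply so more
# of the voltage is dropped across the resistor.
SUPPLY_V   = 12.0
TARGET_I_A = 0.010

R_OHMS = (SUPPLY_V - 1.9) / TARGET_I_A   # = 1010 ohms

for label, vf in (("min", 1.5), ("typ", 1.9), ("max", 2.4)):
    i_ma = (SUPPLY_V - vf) / R_OHMS * 1000
    print(f"Vf {label} = {vf} V -> I = {i_ma:.1f} mA")
```

The more of the supply voltage you drop across the resistor, the less the LED's forward-voltage variation matters, at the cost of wasting more power in the resistor.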
Another thing I realized is that an LED isn't a resistor like a regular light bulb (which I assume is basically a resistor), correct me if I am wrong about light bulbs being resistors.
Kind of. The resistance of a light bulb varies considerably with the temperature of its filament (a cold filament measures much lower than a lit one). But apart from that, yes, a light bulb is resistive; it's not a semiconductor.