You can assume that a very small change in voltage across an LED will result in a very large (and not easily calculable) change in current. This is because a diode's current rises roughly exponentially with its forward voltage.
For example, if a string of LEDs has 20mA flowing through it with 10V across the string, an increase of 0.5V would increase the current, but it's not easy to say by how much. It might rise to 40mA, 100mA, or higher. If the LEDs have a maximum rating of 30mA, the smoke might escape.
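To see why, here's a rough sketch using the Shockley diode equation; the saturation current and ideality factor below are illustrative guesses for a red LED, not datasheet values:

```python
import math

# Shockley diode equation: I = I_S * (exp(V / (n * V_T)) - 1)
I_S = 1e-18   # saturation current (A), assumed for illustration
N = 2.0       # ideality factor, assumed
V_T = 0.026   # thermal voltage at room temperature (V)

def led_current(v_led):
    """Current through one LED at a given forward voltage."""
    return I_S * (math.exp(v_led / (N * V_T)) - 1)

# Find the forward voltage that gives 20mA, then bump it slightly.
v_20ma = N * V_T * math.log(0.020 / I_S)
print(f"V at 20mA: {v_20ma:.2f}V")
print(f"I at +0.1V: {led_current(v_20ma + 0.1) * 1000:.0f}mA")
```

With these assumed numbers, a 0.1V rise per LED (about 0.5V across a five-LED string) multiplies the current roughly sevenfold, from 20mA to around 137mA.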
Having a resistor in series changes this: since the LED voltage stays roughly constant, any change in supply voltage appears across the resistor, and the change in current is proportional to that change in voltage. Thus, a change in voltage of x volts will result in a change in current of x/R amps, where R is the series resistance.
As an example, if a 100Ω resistor has 2V across it (with, say, 10V across the series LEDs), and the supply voltage rises by 0.5V, the current will increase from 20mA to just 25mA. If the LEDs are rated for a maximum of 30mA, they'll still be fine.
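A quick sketch of the same arithmetic (the 12V supply is assumed from the 10V string plus 2V across the resistor):

```python
# With the series resistor, the change follows Ohm's law on the resistor,
# assuming the voltage across the LED string stays roughly constant.
R = 100.0      # series resistor (ohms)
V_LED = 10.0   # voltage across the LED string (V)

for v_supply in (12.0, 12.5):       # nominal supply, then supply +0.5V
    i = (v_supply - V_LED) / R      # current through resistor and LEDs
    print(f"Vsupply={v_supply:.1f}V -> I={i * 1000:.0f}mA")
# Prints 20mA then 25mA: a 0.5V rise adds only 5mA, well under a 30mA limit.
```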
The key point is that the voltage across the resistor should be large compared to fluctuations in the supply voltage. This keeps the current relatively constant despite small changes in the supply voltage or in the LED forward voltage (which varies with temperature).
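As a rough sizing sketch (the helper names and values here are hypothetical, not a standard formula beyond Ohm's law):

```python
def series_resistor(v_supply, v_led_string, i_target):
    """Pick a series resistor for a target LED current (Ohm's law)."""
    v_resistor = v_supply - v_led_string
    return v_resistor / i_target

def current_swing(r, dv_supply):
    """Worst-case current change for a supply fluctuation of dv_supply."""
    return dv_supply / r

r = series_resistor(12.0, 10.0, 0.020)   # 100 ohms
print(f"R = {r:.0f} ohms")
print(f"+/-0.5V supply swing -> +/-{current_swing(r, 0.5) * 1000:.0f}mA")
# A bigger voltage drop across the resistor means a smaller relative
# current swing, at the cost of more power wasted in the resistor.
```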