VA = W if the power factor is 1, that is, if the current drawn is in phase with and proportional to the voltage. This is the case when the load is a resistor.
Once you introduce other elements (diodes, capacitors, etc.), things change and VA > W.
So for a typical linear power supply (say, a bridge rectifier and filter capacitor) delivering 10W of output, the VA rating of the transformer has to exceed that considerably.
Here is a useful guide:
http://www.hammondmfg.com/pdf/5c007.pdf
If you look down to the full wave bridge rectifier with capacitor input load, you'll see that Vdc = 0.9 Vac and Idc = 0.62 Iac.
So, if you have a 30VA transformer (let's say it's 15VRMS, 30VA) then...
Vdc = 0.9 × 15 = 13.5V (this is under load)
Idc = (30/15) × 0.62 = 1.24A
So for an output power of 13.5 × 1.24 = 16.74W, we need a 30VA transformer.
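The arithmetic above can be sketched as a short calculation; the 0.9 and 0.62 factors are the Hammond guide's figures for a full wave bridge with capacitor input, and the function name is just for illustration:

```python
# Conversion factors from the Hammond guide, full wave bridge rectifier
# with capacitor input load: Vdc = 0.9 * Vac (loaded), Idc = 0.62 * Iac.
VDC_FACTOR = 0.9
IDC_FACTOR = 0.62

def dc_output_power(va_rating, v_secondary_rms):
    """Estimate usable DC output power from a transformer's VA rating."""
    i_ac = va_rating / v_secondary_rms   # rated secondary RMS current
    v_dc = VDC_FACTOR * v_secondary_rms  # DC output voltage under load
    i_dc = IDC_FACTOR * i_ac             # available DC output current
    return v_dc * i_dc

# The 15VRMS / 30VA example from above:
p_out = dc_output_power(30, 15)
print(f"{p_out:.2f} W")          # 16.74 W
print(f"{30 / p_out:.2f} VA/W")  # about 1.8, close to the 2x rule of thumb
```

Running it for other transformers in the same configuration just means changing the two arguments; the factors stay the same (subject to the note below about capacitance).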
A simple rule of thumb, then, is to double the output power to determine the required VA rating in this configuration.
Note that the actual figures will change with the amount of capacitance.
If you repeat this calculation using the choke input load, you'll get a surprising result. However, you need to consider the size of the inductor required (this PDF says nothing about optimum values for inductance or capacitance).
This page gives more information which, whilst it does not directly answer your question, supports the calculation above.