I spent several hours trying to find the answer to this question: what limits the power transfer capability of a real transformer? The bottom line appears to be losses creating an unacceptable temperature rise. I couldn't find much, except a sidebar on Wikipedia that relates winding e.m.f. to excitation frequency, core cross-sectional area, number of turns, and the peak value of the magnetic flux density in the core, assuming a sinusoidal field. If you multiply both sides of that equation by the current in the winding... voila! you have a formula for apparent power. Clearly power transfer depends on the core area and the length of the magnetic path (which set the induced field in the core), but where does magnetizing current play a role?
Everything I found on magnetizing current says you want to keep it small, around one percent of the load current in the primary. Why? Why would you want or need any magnetizing current at all? More research is required.
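For concreteness, the Wikipedia relation is the universal EMF equation, E_rms = 4.44 · f · N · A · B_peak (4.44 ≈ √2·π). A minimal sketch of the EMF-times-current idea; the 60 Hz frequency, turn count, core area, flux density, and current below are invented numbers for illustration, not from any real design:

```python
import math

def winding_emf_rms(f_hz, n_turns, core_area_m2, b_peak_t):
    """Universal EMF equation for a sinusoidal core flux:
    E_rms = 4.44 * f * N * A * B_peak, with 4.44 ~= sqrt(2) * pi."""
    return 4.44 * f_hz * n_turns * core_area_m2 * b_peak_t

# Hypothetical 60 Hz transformer: 200-turn winding, 25 cm^2 core,
# 1.2 T peak flux density (all assumed values).
e_rms = winding_emf_rms(60.0, 200, 25e-4, 1.2)   # ~159.8 V

# Multiplying the EMF by the winding current gives apparent power:
i_rms = 10.0          # assumed winding current, amperes
s_va = e_rms * i_rms  # apparent power in volt-amperes
```

Note this gives volt-amperes, not watts; the real power delivered also depends on the load power factor.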
Pragmatically, we know that more power transfer in a transformer requires more ferromagnetic material in the core; otherwise, electrical utilities would purchase much smaller transformers!
As for wiring transformers back-to-back: this does work. You should match the VA ratings of the two transformers as closely as possible, and limit the power transfer to the rating of the transformer with the smaller VA specification.
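A trivial sketch of that rule of thumb (the VA figures below are invented for illustration):

```python
def safe_load_va(va_rating_1, va_rating_2):
    """Back-to-back transformers: the smaller VA rating limits the pair,
    since all transferred power passes through both units."""
    return min(va_rating_1, va_rating_2)

# e.g. a hypothetical 500 VA unit cascaded with a 300 VA unit
# should carry no more than 300 VA.
limit = safe_load_va(500, 300)
```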
Sorry to be of so little help. Transformer design was not covered in my one electrical power engineering course, although I did learn how a three-phase power source can create a constant-amplitude rotating magnetic field. Fascinating, as Mr. Spock would say.
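That rotating-field result is easy to check numerically: three coil axes 120° apart, each carrying a current 120° out of phase with its neighbours, sum to a field vector of constant magnitude 1.5·B_max that rotates at the supply frequency. A sketch under idealized assumptions (identical coils, B_max normalized to 1):

```python
import math

def resultant_field(omega_t, b_max=1.0):
    """Sum the fields of three coils whose magnetic axes are 120 degrees
    apart, each driven 120 degrees out of phase with its neighbours."""
    bx = by = 0.0
    for k in range(3):
        theta = k * 2 * math.pi / 3              # coil axis orientation
        b_k = b_max * math.cos(omega_t - theta)  # that coil's instantaneous field
        bx += b_k * math.cos(theta)
        by += b_k * math.sin(theta)
    return bx, by

# The magnitude is constant (1.5 * b_max) at every instant;
# only the direction of the resultant vector rotates.
for wt in (0.0, 0.7, 2.1):
    bx, by = resultant_field(wt)
    assert abs(math.hypot(bx, by) - 1.5) < 1e-9
```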