It will depend a lot on the devices you're hooking up and the host port you're using. FWIW, it's actually the device that decides how much current to draw: it looks at the data lines (either static voltage levels or actual USB communication) to learn how much current it is allowed to pull. It's up to the host to protect itself from excessive current draw, and I'm not sure there's a standard for how it does that. As Harald pointed out, this is unlikely to work correctly with two devices in parallel on the same data lines unless the current drawn by both devices is small.
The worst case is that you hook up two devices to that one port, both of them try to pull the maximum current the port can supply, and you blow out your USB port.