I'm working on a little DIY project I bought while I was in Japan.
The only problem is I'm now in the US.
The device's manual says it's rated for 12-15VAC input.
The transformer that came with the kit is rated at 100VAC Input @ 50-60Hz and puts out 12VAC 400mA.
So here's what I'm thinking... In Japan they use 100V. In the US we use 120V.
My device is rated to take in 12-15V. So if I plug this transformer into the wall here in the US, it will put out 14.4VAC and I'll be within my 12-15VAC range (as in: 120V/100V = 1.2, and 12V * 1.2 = 14.4V).
Am I correct in my thinking or would it be safer to buy a 120V rated power adapter?
And will this change over also increase the amperage or will that remain at 400mA?
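For what it's worth, here's a quick sketch of that linear-scaling reasoning. It assumes an ideal transformer (output voltage scales with input voltage through a fixed turns ratio, ignoring core saturation and losses), which is the assumption behind the 14.4V figure:

```python
# Ideal-transformer assumption: output scales linearly with input
# via a fixed turns ratio derived from the nameplate ratings.
rated_in = 100.0   # VAC, transformer's rated input (Japan)
rated_out = 12.0   # VAC, transformer's rated output
actual_in = 120.0  # VAC, US mains

turns_ratio = rated_out / rated_in        # 0.12
actual_out = actual_in * turns_ratio      # 14.4 VAC

in_spec = 12.0 <= actual_out <= 15.0
print(f"Estimated output: {actual_out:.1f} VAC, within 12-15 VAC: {in_spec}")
```

Note this only models the voltage ratio; it says nothing about whether the transformer core is happy running 20% over its rated input.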