Can I operate a 50Hz transformer on a 60Hz power supply?

Well, first let's get one thing straight for transformers: the higher the line frequency, the lower the core (iron) losses! The core power loss is approximately proportional to k*f*B^2 for any machine, dynamic or static. But a transformer is a self-excited static machine, meaning the flux density B is inversely proportional to the line frequency (at constant applied voltage), therefore Pcoreloss = k*f*B^2 = k*f*(1/f)^2 = k/f… so the higher f, the lower the losses.

However, increasing the frequency also increases the magnetizing reactance (Xm = 2*pi*f*Lm), which lowers the magnetizing current. Since B drops with frequency, you could raise the voltage proportionally (keeping V/f constant) to make full use of the core. But of course this is not usually practical, as the line voltage of 60Hz systems is usually lower than that of 50Hz systems. So operating a 50Hz transformer at 60Hz should be safe, but it may result in a slightly higher voltage drop under load, because the reactance of the leakage inductance (the series inductance) also increases with frequency.
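
To make the scaling concrete, here's a minimal numerical sketch of what happens when a transformer designed for 50Hz is fed the same voltage at 60Hz. The absolute values (230V, Bmax = 1.5T, etc.) are purely illustrative assumptions, not data for any particular transformer; only the ratios matter.

```python
# Minimal sketch: how key quantities scale when a 50 Hz design is run at
# 60 Hz with the SAME applied voltage. V and Bmax_50 are assumed values.

f1, f2 = 50.0, 60.0          # design frequency and new line frequency (Hz)
V = 230.0                    # applied RMS voltage, held constant (assumed)
Bmax_50 = 1.5                # assumed peak flux density at 50 Hz (T)

# From V ≈ 4.44 * f * N * A * Bmax, at constant V: Bmax is proportional to 1/f
Bmax_60 = Bmax_50 * (f1 / f2)

# Core loss ≈ k * f * Bmax^2  ->  at constant V it scales as 1/f
core_loss_ratio = (f2 * Bmax_60**2) / (f1 * Bmax_50**2)    # = f1/f2 ≈ 0.833

# Magnetizing reactance Xm = 2*pi*f*Lm  ->  magnetizing current scales as 1/f
magnetizing_current_ratio = f1 / f2                        # ≈ 0.833

# Leakage (series) reactance grows with f -> slightly larger drop under load
leakage_reactance_ratio = f2 / f1                          # = 1.2

print(f"Bmax at 60 Hz:             {Bmax_60:.2f} T (was {Bmax_50:.2f} T)")
print(f"Core loss ratio (60/50):   {core_loss_ratio:.3f}")
print(f"Magnetizing current ratio: {magnetizing_current_ratio:.3f}")
print(f"Leakage reactance ratio:   {leakage_reactance_ratio:.3f}")
```

In other words, at the same voltage the flux density and core loss both drop by roughly 50/60 ≈ 0.83, while the leakage reactance rises by 60/50 = 1.2.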

It is true that, at a given flux density, the higher the frequency, the higher the hysteresis (and eddy current) losses will be. But it is a common misconception to assume higher power losses when frequency increases in a transformer, simply because the hysteresis losses depend not only on frequency, but on the maximum magnetic flux density as well (roughly Bmax^2). The flux density is inversely proportional to the line frequency, which ultimately gives lower core losses as you raise the frequency. This holds true for the low and mid frequency ranges. At higher frequencies, skin effect and eddy currents dominate, so the picture may be different. However, iron-core transformers do not operate at such high frequencies; we use ferrite cores instead. In a practical transformer model, the core losses are represented by a parallel resistor (Rc). The resistor's value is linearly dependent on the line frequency (Rc = k*f), and the core losses are given by Pc = U^2/Rc… Of course this model is limited to low and mid frequencies…
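
As a rough illustration of that parallel-resistor model (again with assumed values of k and U, just to show the trend), the core losses it predicts fall as 1/f at constant voltage:

```python
# Sketch of the parallel core-loss resistor model: Rc = k * f, Pc = U^2 / Rc.
# k and U below are assumed illustrative values, not measured data.

def core_loss(U, f, k=10.0):
    """Core loss from the simple model Rc = k * f (low/mid frequencies only)."""
    Rc = k * f           # core-loss resistance grows linearly with frequency
    return U**2 / Rc     # so Pc = U^2 / (k * f) falls as 1/f at constant U

U = 230.0  # RMS line voltage, assumed constant
for f in (50.0, 60.0):
    print(f"f = {f:.0f} Hz -> Pc = {core_loss(U, f):.1f} W (illustrative)")
```

The resulting 60Hz/50Hz loss ratio is 50/60 ≈ 0.83, consistent with the k/f scaling above.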