Avoiding voltage drop effects

My cable and transformer sizing should give a maximum 3% steady-state voltage drop, or at worst 6% to 10% during motor starting. If the motor is the only equipment on the system, then maybe you can tolerate 15%; if not, the starting dip may affect sensitive equipment and lighting.
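
As a quick illustration of checking a design against that 3% figure, here is a minimal sketch using the common three-phase approximation dV = sqrt(3) * I * L * (R cos phi + X sin phi). The 400 V system and the R/X values for 35 mm^2 copper are illustrative assumptions, not figures from this discussion:

```python
import math

def voltage_drop_pct(i_amps, length_m, r_ohm_per_km, x_ohm_per_km,
                     v_line=400.0, power_factor=0.85):
    """Approximate steady-state three-phase voltage drop in percent,
    using dV = sqrt(3) * I * L * (R*cos(phi) + X*sin(phi))."""
    phi = math.acos(power_factor)
    l_km = length_m / 1000.0
    dv = math.sqrt(3) * i_amps * l_km * (
        r_ohm_per_km * math.cos(phi) + x_ohm_per_km * math.sin(phi))
    return 100.0 * dv / v_line

# 100 A over 120 m of 35 mm^2 copper (illustrative table values for R and X)
print(f"{voltage_drop_pct(100, 120, 0.668, 0.08):.2f} %")  # ~3.17 %: fails a 3 % budget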

This is very annoying for office staff: every time a machine starts, the lights dim. It does not matter what standard you quote; I cannot accept 10% to 15%. Make a precise calculation and add a 10% tolerance to avoid it.
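
One way to make that "precise calculation" before equipment is ordered is a quick screening estimate of the starting dip. This sketch uses the common approximation dip ~ S_start / (S_start + S_sc); the motor size, locked-rotor multiple, and fault level are assumed example values, and a real design should rely on a proper motor-starting study:

```python
def start_dip_pct(motor_kva, lr_multiple, fault_kva):
    """Rough motor-starting voltage dip at the supply bus.

    Compares the motor's locked-rotor kVA against the available
    short-circuit kVA: dip ~= S_start / (S_start + S_sc).
    A screening check only, not a motor-starting study.
    """
    s_start = motor_kva * lr_multiple      # locked-rotor kVA
    return 100.0 * s_start / (s_start + fault_kva)

# 75 kVA motor, locked-rotor draw 6x, 5 MVA fault level at the bus
dip = start_dip_pct(75, 6.0, 5000)
print(f"dip ~ {dip:.1f} %")   # ~8.3 %, inside a 10 % tolerance
```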

In most cases this problem comes from cable undersizing, so we end up settling for whatever standard allows 15% maximum.

Just recently I had to order a transformer and cable change for a project that was grossly undersized.
I have had to redesign the electrical portion of a conveyor and crushing system to bring it into compliance with the applicable safety codes. The site was outdoors at a mine in Arizona, where ambient temperatures reach 120 °F. The electrical calculation and design software did not include any derating of conductor sizes for cable spacing and density within cable trays, the number of conductors per raceway, ambient temperature versus cable temperature rating, etc. Few of the cables had been increased in size to compensate for voltage drop between the power source and the respective motor or transformer loads.
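
For a site like that, the two deratings that bite hardest are ambient temperature and conductor fill. Below is a minimal sketch assuming NEC-style factors: the ambient correction uses the square-root formula behind NEC Table 310.15(B)(1), and the fill percentages are adapted from Table 310.15(C)(1). The helper names and the 130 A sample ampacity are my own illustrative assumptions; verify every factor against the code edition in force:

```python
import math

def ambient_correction(t_ambient_c, t_rating_c=90, t_table_c=30):
    """NEC-style ambient correction: sqrt((Tc - Ta) / (Tc - Ttable)),
    the formula behind NEC Table 310.15(B)(1)."""
    return math.sqrt((t_rating_c - t_ambient_c) / (t_rating_c - t_table_c))

def fill_adjustment(n_conductors):
    """Adjustment for more than three current-carrying conductors
    (values adapted from NEC Table 310.15(C)(1); verify against
    the edition in force)."""
    if n_conductors <= 3:  return 1.00
    if n_conductors <= 6:  return 0.80
    if n_conductors <= 9:  return 0.70
    if n_conductors <= 20: return 0.50
    if n_conductors <= 30: return 0.45
    if n_conductors <= 40: return 0.40
    return 0.35

# 120 F ~= 49 C ambient, 90 C insulation, 9 conductors in one raceway
base_ampacity = 130.0                     # table ampacity, amps (assumed)
derated = (base_ampacity
           * ambient_correction(49)       # ~0.83
           * fill_adjustment(9))          # 0.70
print(f"usable ampacity ~ {derated:.0f} A")  # ~75 A, not 130 A
```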

Feeder cables to remote power distribution centers were too small, because voltage drop had not been incorporated in the initial design. Feeder voltage drop should not exceed 3%, since other factors (alternating loads, system voltage variation, etc.) can push the overall drop to 5%.
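
A minimal sketch of that budget check, using the 3% feeder / 5% overall figures from the paragraph above (these come from an NEC informational note, i.e. a recommendation rather than an enforceable rule); the function name and the sample drops are assumed for illustration:

```python
def within_budget(feeder_drop_pct, branch_drop_pct,
                  feeder_limit=3.0, total_limit=5.0):
    """Check a feeder/branch voltage-drop budget: keep the feeder
    at or under 3 % so the combined drop stays within 5 %."""
    total = feeder_drop_pct + branch_drop_pct
    return feeder_drop_pct <= feeder_limit and total <= total_limit

print(within_budget(2.4, 2.2))  # True: 2.4 % + 2.2 % = 4.6 % total
print(within_budget(3.5, 1.0))  # False: the feeder alone exceeds 3 %
```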

The electrical system had to be redesigned with larger cables, transformers, MCCs, etc., because none of the design software factored in the required deratings specified in the National Electrical Code (NFPA 70) or the Canadian Electrical Code, which references the NEC.