Why is a 4-20 mA signal preferred over 0-10 V?
My experience has been that most process automation sensors use 4-20 mA because a current loop is unaffected by voltage drop along the wire and is far more resistant to interference over long runs (hundreds or thousands of feet through a process plant). The 4 mA "live zero" also lets you tell a dead sensor or broken wire (0 mA) apart from a legitimate low reading.
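To make the live-zero point concrete, here is a minimal sketch (my own illustration, not from any particular PLC library) of scaling a measured loop current into engineering units, with a hypothetical fault threshold a bit below 4 mA to catch an open loop or dead transmitter:

```python
def scale_4_20ma(current_ma, span_low, span_high, fault_threshold_ma=3.6):
    """Map a 4-20 mA reading onto [span_low, span_high]; flag implausibly low readings."""
    if current_ma < fault_threshold_ma:
        # Anything well below 4 mA means the loop is open or the transmitter is dead,
        # which a 0-10 V signal cannot distinguish from a genuine zero.
        raise ValueError(f"Loop fault: {current_ma:.2f} mA (open circuit or dead sensor?)")
    fraction = (current_ma - 4.0) / 16.0          # 4 mA -> 0.0, 20 mA -> 1.0
    return span_low + fraction * (span_high - span_low)

# Example: a 4-20 mA pressure transmitter spanned 0-100 psi
print(scale_4_20ma(12.0, 0.0, 100.0))   # 12 mA is mid-scale -> 50.0 psi
```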
In industrial automation (primarily the automated-equipment / motion-control world), 0-10 VDC (or 0-5 V or 0-12 V) seems to be more common. Most servo motor drives take a +/-10 VDC signal as a torque or velocity demand, and most position/distance sensors provide an analog voltage output. In these cases the distances are generally in feet or tens of feet, not hundreds or thousands, so voltage drop and noise pickup are much less of a concern.
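For comparison, a rough sketch of the voltage-demand side (the 3000 rpm full scale is just an assumed drive configuration for illustration):

```python
def velocity_to_volts(demand_rpm, full_scale_rpm=3000.0, full_scale_volts=10.0):
    """Map +/-full_scale_rpm onto a +/-10 V analog command, clamping over-range demands."""
    volts = (demand_rpm / full_scale_rpm) * full_scale_volts
    return max(-full_scale_volts, min(full_scale_volts, volts))

print(velocity_to_volts(1500.0))    # half of full scale -> 5.0 V
print(velocity_to_volts(-4000.0))   # over-range demand clamps to -10.0 V
```

Note there is no live zero here: 0 V is a valid command, so a broken wire just looks like a demand of zero.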