The second part of our HV efficiency blog series aims to clarify the difference between tapping down HV distribution transformers and voltage management technologies such as voltage optimisation as means of reducing a site’s voltage.
To avoid any confusion between HV distribution transformers and our Powerstar range of voltage optimisation products, it’s worth clarifying their roles. An HV transformer reduces high voltage (HV) electricity from the grid, typically 11,000V or 6,600V, to low voltage (LV) electricity, typically 400V–433V, so that it can be used safely; this transforms the entire power supply to a site from HV to LV. In contrast, voltage optimisation technology reduces the LV voltage, when it is higher than necessary, to the optimal voltage for the specifications of the electrical equipment on site, and it does this by subtraction rather than by transformation.
Can tapping my transformer achieve greater efficiencies?
With distribution transformers, voltage reduction can be achieved by changing the tapping of the transformer, although reducing voltage in this manner is not recommended. Tapping changes the turns ratio, giving some control over the output voltage and therefore a crude form of voltage management. However, altering the tapping affects the number of coils in circuit, and therefore the impedance, of the transformer. This increases the transformer current when the voltage is reduced, or decreases it when the voltage is increased.
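The link between tap position, turns ratio, and current can be sketched numerically. The supply voltage, tap steps, and load figure below are illustrative assumptions for a generic distribution transformer, not values from any specific Powerstar product:

```python
# Illustrative sketch: how off-nominal tap positions shift a distribution
# transformer's secondary voltage, and how the current drawn by a
# constant-power load rises as that voltage falls.
# All figures are example values, not a specific transformer's ratings.

NOMINAL_LV = 415.0       # secondary voltage at the nominal tap (V)
LOAD_KW = 100.0          # assumed constant-power load (kW)

# Typical off-load tap steps of +/-2.5% and +/-5% on the HV winding;
# adding turns on the primary lowers the secondary voltage.
tap_steps = [-0.05, -0.025, 0.0, 0.025, 0.05]

for step in tap_steps:
    lv = NOMINAL_LV / (1 + step)          # secondary voltage at this tap (V)
    current = LOAD_KW * 1000 / lv         # current at constant power (A)
    print(f"tap {step:+.1%}: {lv:6.1f} V, {current:6.1f} A")
```

Running this shows the trade-off the text describes: the +5% tap drops the secondary voltage by roughly 5%, and the current drawn by the same load rises by roughly the same proportion.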
Ideally, HV transformers should not be tapped beyond the settings they were specified and built to, as this is where they operate at their optimal efficiency. Using tapping as a method of reducing voltage is inefficient because of the relationship between voltage and current: reducing the transformer’s output voltage by 5% increases the current by roughly the same percentage. In this example, the losses in the transformer would increase by 25%. For a typical 1,000kVA transformer with a load loss of 8kW at 75% load, reducing the voltage by 5% would increase these losses to 10kW, adding 17,520kWh to the site’s consumption each year and inflating its energy bill.
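The annual figure above follows directly from the example’s own numbers. A minimal sketch of that arithmetic, assuming the transformer runs continuously all year:

```python
# Annual extra consumption from increased transformer load losses,
# using the example figures from the text (8 kW rising to 10 kW).

BASE_LOSS_KW = 8.0       # load loss at the specified tap, 75% load
TAPPED_LOSS_KW = 10.0    # load loss after a 5% voltage reduction by tapping
HOURS_PER_YEAR = 8760    # continuous operation assumed (365 * 24)

extra_kw = TAPPED_LOSS_KW - BASE_LOSS_KW
extra_kwh_per_year = extra_kw * HOURS_PER_YEAR
print(f"Extra consumption: {extra_kwh_per_year:.0f} kWh/year")  # 17520 kWh/year
```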
For this reason, HV transformers should be set to a tap based on the site requirements, such as site capacity, to ensure that they operate at maximum efficiency.
Optimal efficiency through voltage management
The most effective way to reduce voltage and achieve savings is purpose-built voltage management technology delivered as a combined solution: an amorphous core distribution transformer integrated with electronic-dynamic voltage management, such as the Powerstar HV MAX. The HV MAX combines the benefits of a super low loss amorphous core transformer, which reduces core losses by up to 75% compared with a traditional CRGO distribution transformer, with the benefits of LV electronic-dynamic voltage management, which reduces and stabilises the on-site voltage to deliver energy consumption and cost savings.
Due to this combination of technologies, Powerstar HV MAX is suitable for sites operating their own HV/LV supply, as well as sites with a high incoming voltage profile and an inefficient HV/LV distribution transformer.