Transformer Efficiency: Is it worth it?

PUE often encompasses all of the losses from the utility down to the server. This includes the transformers that step power down to 400 V or 208 V three-phase power. To understand your "total" efficiency, you first have to understand transformer efficiency and costs.

Here are a few things of interest to data center operators:
- There are several factors that affect transformer inefficiency (electrical losses):
- Hysteresis losses from magnetizing the iron core (which are essentially constant and NOT proportional to load).
- Resistive (I²R) losses in the copper windings (which grow with the square of the load).
- There are several different types of transformers:
- "Standard temperature rise" – More efficient at partial loads, where data centers actually operate.
- "Low temperature rise" – These run at a lower surface temperature (less heat coming off the transformer). They have larger iron cores, which makes them more efficient at high loads but less efficient at partial loads, where data centers actually operate.
- "DOE 2016" – In the US, the Department of Energy released a new efficiency standard for all transformers sold. These transformers are more efficient across the board.
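The two loss mechanisms above can be combined into a simple efficiency model: a fixed core (hysteresis) loss plus a copper loss that scales with the square of the load. A minimal sketch in Python, using illustrative loss figures for a hypothetical 75 kVA transformer (the ratings and loss numbers are assumptions, not data for any real unit):

```python
# Simple transformer loss model: fixed core loss + load-dependent copper loss.
# All numbers below are illustrative assumptions for a hypothetical unit.

RATED_KVA = 75.0            # assumed transformer rating
CORE_LOSS_KW = 0.3          # assumed no-load (hysteresis) loss -- constant
COPPER_LOSS_FULL_KW = 1.2   # assumed winding (I^2 R) loss at 100% load

def efficiency(load_fraction, power_factor=1.0):
    """Efficiency = output power / (output power + total losses)."""
    output_kw = load_fraction * RATED_KVA * power_factor
    losses_kw = CORE_LOSS_KW + COPPER_LOSS_FULL_KW * load_fraction ** 2
    return output_kw / (output_kw + losses_kw)

for load in (0.2, 0.4, 0.5, 0.8, 1.0):
    print(f"{load:4.0%} load -> {efficiency(load):.2%} efficient")
```

Note that this model peaks where core loss equals copper loss (here at 50% load, since sqrt(0.3 / 1.2) = 0.5), which is why a transformer tuned for full-load efficiency can underperform at the partial loads data centers actually see.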
Figure 1: Transformer efficiency curves for standard, low temperature, and DOE 2016 type transformers
Premium efficiency transformers cost around $0.02 per watt more. Upgrading from a standard or low temperature rise transformer is well worth the money: assuming a 40% load and electricity at $0.10/kWh, a premium transformer pays for itself in about 2 years versus a standard, low-efficiency unit.
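The payback claim above can be checked with back-of-the-envelope math. The sketch below uses the $0.02/W premium, 40% load, and $0.10/kWh figures from the text; the standard- and premium-unit efficiencies at 40% load are illustrative assumptions, so substitute the curves for your actual transformers:

```python
# Back-of-the-envelope payback for a premium-efficiency transformer upgrade.

RATED_W = 75_000          # assumed transformer size
PREMIUM_PER_W = 0.02      # upgrade cost ($/W), from the text
LOAD_FRACTION = 0.40      # from the text
PRICE_PER_KWH = 0.10      # from the text
HOURS_PER_YEAR = 8760

EFF_STANDARD = 0.960      # assumed standard-unit efficiency at 40% load
EFF_PREMIUM = 0.987       # assumed premium-unit efficiency at 40% load

def payback_years():
    upgrade_cost = PREMIUM_PER_W * RATED_W
    load_w = LOAD_FRACTION * RATED_W
    # Loss = output * (1/efficiency - 1); savings is the loss difference.
    saved_w = load_w * ((1 / EFF_STANDARD - 1) - (1 / EFF_PREMIUM - 1))
    annual_savings = saved_w / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH
    return upgrade_cost / annual_savings

print(f"Payback: {payback_years():.1f} years")
```

With these assumed efficiencies the payback lands right around the 2-year figure; a smaller efficiency gap between the two units stretches it out proportionally.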
Have something to add? Leave a comment below!
To learn more about Minimus Servers contact us at firstname.lastname@example.org or call +1 512-692-8003.
Update: Figure 1 has been updated. The earlier version had been labeled incorrectly.