Heat rate is one measure of the efficiency of electrical generators and power plants that convert a fuel into heat and then into electricity. The heat rate is the amount of energy a generator or power plant uses to generate one kilowatthour (kWh) of electricity. The U.S. Energy Information Administration (EIA) expresses heat rates in British thermal units (Btu) per net kWh generated. Net generation is the amount of electricity a power plant supplies to the power transmission line connected to the plant; it excludes (deducts) all the electricity the power plant itself consumes to operate the plant’s generator(s) and other equipment, such as fuel feeding systems, boiler water pumps, cooling equipment, and pollution control devices.
To express the efficiency of a generator or power plant as a percentage, divide the equivalent Btu content of a kWh of electricity (3,412 Btu) by the heat rate. For example, a heat rate of 10,500 Btu per kWh corresponds to an efficiency of about 33%, and a heat rate of 7,500 Btu per kWh corresponds to an efficiency of about 45%.
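The conversion described above can be sketched in a few lines of Python; the function name and rounding choice here are illustrative assumptions, not part of EIA's methodology:

```python
# Convert a heat rate (Btu per net kWh) to a thermal efficiency percentage.
# 3,412 Btu is the equivalent heat content of one kilowatthour of electricity.

KWH_BTU = 3412  # Btu content of 1 kWh

def heat_rate_to_efficiency(heat_rate_btu_per_kwh: float) -> float:
    """Return generator efficiency as a percentage (illustrative helper)."""
    return 100 * KWH_BTU / heat_rate_btu_per_kwh

# Examples from the text (results are approximate):
print(round(heat_rate_to_efficiency(10500), 1))  # ≈ 32.5, i.e. about 33%
print(round(heat_rate_to_efficiency(7500), 1))   # ≈ 45.5, i.e. about 45%
```

Note that a lower heat rate means a more efficient plant: fewer Btu of fuel are needed per kWh delivered to the grid.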
For information on EIA’s methodology for estimating energy consumption for generating electricity with non-combustible renewable energy sources (geothermal, hydro, solar, and wind energy), see Monthly Energy Review, Appendix E: Alternative Approaches for Deriving Energy Contents of Noncombustible Renewables.
Historical average annual heat rates for fossil fuel and nuclear power plants
Average annual heat rates for specific types of fossil-fuel generators and nuclear power plants
Approximate Heat Rates for Electricity, and Heat Content of Electricity (average annual heat rates from 1949 to most recent year available)
Last updated: August 17, 2020