Saturday, April 5, 2014

How the Btu Came to Be the Standard for Measuring Heat

The HVAC industry has measured heat in British thermal units (Btu) for generations. A furnace is rated by how many Btu it can produce per hour; an air conditioner, by how many Btu of heat it can remove from a room per hour.
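
For a sense of scale, here is a rough worked example (a modern aside, not part of the original story, using the SI equivalence of about 1,055 joules per Btu):

    A 12,000 Btu/h air conditioner (one "ton" of cooling) removes
    12,000 Btu/h × 1,055 J/Btu ÷ 3,600 s/h ≈ 3,517 W, or about 3.5 kW of heat.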

Articles on how the Btu came to be are scarce, and nobody is certain of its origin. The best available evidence comes from statements made by 19th-century scientists on the measurement of heat; the earliest recorded use of the term dates to 1876. At the time, the Btu was known simply as the “heat unit,” arguably a simpler and more universal name.

The first to mention the Btu was James Hargreaves (not the 18th-century inventor of the spinning jenny, who died in 1778, a century before the term appeared, but a 19th-century chemist of the same name). Hargreaves specified that the heat measured by his thermo-radiometer could be expressed either in Btu or in calories. Since heat is a form of energy, measuring it in calories was neither novel nor erroneous.

It was William Anderson, however, who gave the Btu its modern definition. In an 1884 lecture, Anderson defined the Btu as the amount of heat required to raise the temperature of one pound of water by one degree Fahrenheit. That definition has been in use in the scientific community ever since. Despite its sketchy origins, the world has embraced the Btu as a standard.
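
In modern terms, Anderson's definition can be checked against the familiar heat equation (a back-of-the-envelope aside using today's SI values, not figures from the lecture itself):

    Q = m × c × ΔT
      ≈ 0.4536 kg × 4,186 J/(kg·K) × (5/9) K
      ≈ 1,055 J

So one Btu works out to roughly 1,055 joules, a little over a kilojoule.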


