Why are there 5280 feet in a mile, and why are nautical miles different from the statute miles we use on land? Why do we buy milk and gasoline by the gallon? Where does the abbreviation “lb” come from? Let’s take a look at the origins of a few units of measurement that people in the United States use every day.
1. The Mile
The basic concept of the mile originated in Roman times. The Romans used a unit of distance called the mille passus, which literally translates to “a thousand paces.” Because each pace was considered to be five Roman feet—which were a bit shorter than our modern feet—the mile ended up being 5000 Roman feet, or roughly 4850 of our modern feet.
If the mile originated with 5000 Roman feet, how did we end up with a mile that is 5280 feet? Blame the furlong. The furlong wasn’t always just an arcane unit of measure that horseracing fans gabbed about; it originally referred to the length of the furrow a team of oxen could plow without resting. In 1592, the English Parliament set about determining the length of the mile and decided that each one should be made up of eight furlongs. As a furlong was 660 feet, we ended up with a 5280-foot mile.
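For a quick sanity check, here is that definition as a short Python sketch, using the 660-foot furlong and eight-furlongs-per-mile figures from the paragraph above:

```python
# The statute mile as eight furlongs of 660 feet each.
feet_per_furlong = 660
furlongs_per_mile = 8

feet_per_mile = furlongs_per_mile * feet_per_furlong
print(feet_per_mile)  # 5280
```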
2. The Nautical Mile
So if the statute mile is the result of Roman influences and plowing oxen, where did the nautical mile get its start? Strap on your high school geometry helmet for this one. Each nautical mile originally referred to one minute of arc along a meridian around the Earth. Think of a meridian around the Earth as being made up of 360 degrees, and each of those degrees consists of 60 minutes of arc. Each of these minutes of arc is then 1/21,600th of the distance around the Earth. Divide a meridian circumference of roughly 24,860 statute miles by 21,600 and you end up with a nautical mile of about 6076 feet.
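Here is that division worked through in a short Python sketch. The circumference figure and the meters-to-feet conversion are approximate values assumed for the check, and the 1,852-meter figure is the modern international definition of the nautical mile:

```python
# One minute of arc along a meridian, worked out from an approximate circumference.
METERS_PER_FOOT = 0.3048

meridional_circumference_m = 40_008_000   # Earth's circumference through the poles (approx.)
arc_minutes = 360 * 60                    # 21,600 minutes of arc in a full circle

one_minute_of_arc_ft = meridional_circumference_m / arc_minutes / METERS_PER_FOOT
print(round(one_minute_of_arc_ft))        # ~6077 feet

# The modern international nautical mile is defined as exactly 1,852 meters.
print(round(1852 / METERS_PER_FOOT))      # 6076 feet
```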
3. The Acre
Like the mile, the acre owes its existence to the concept of the furlong. Remember that a furlong was considered to be the length of the furrow a team of oxen could plow without resting. An acre—which gets its name from an Old English word meaning “open field”—was originally the amount of land that a single farmer with a single ox could plow in one day. Over time, the old Saxon inhabitants of England established that this area was equivalent to a long, thin strip of land one furlong in length and one chain—an old unit of length equivalent to 66 feet—wide. That’s how we ended up with an acre that’s equivalent to 43,560 square feet.
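The arithmetic is the same kind of back-of-the-envelope check as before, using the furlong and chain lengths given above:

```python
# One acre as a strip one furlong long and one chain wide.
furlong_ft = 660   # length of the strip
chain_ft = 66      # width of the strip

acre_sq_ft = furlong_ft * chain_ft
print(acre_sq_ft)  # 43560 square feet
```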
4. The Foot
As the name implies, scholars think that the foot was actually based on the length of the human foot. The Romans had a unit of measure called a pes that was made up of 12 smaller units called unciae. The Roman pes was a smidge shorter than our foot—it came in at around 11.6 inches—and similar Old English units based on the length of people’s feet were also a bit shorter than our 12-inch foot. The 12-inch foot didn’t become a common unit of measurement until the reign of Henry I of England during the early 12th century, which has led some to believe it was standardized to match the length of the king’s own foot.
5. The Gallon
The gallon we use for our liquids comes from the Roman word galeta, which meant “a pailful.” There have been a number of very different gallon units over the years, but the gallon we use in the U.S. is probably based on what was once known as the “wine gallon” or the Queen Anne gallon, which was named for the reigning monarch when it was standardized in 1707. It held 231 cubic inches, and some believe the wine gallon corresponded to a vessel that was designed to hold eight troy pounds of wine.
6. The Pound
Like several other units, the pound has Roman roots. It’s descended from a Roman unit called the libra. That explains the “lb” abbreviation for the pound, and the word pound itself comes from the Latin pondo, for “weight.” The avoirdupois pounds we use today have been around since the early 14th century, when English merchants invented the measurement to sell goods by weight rather than volume. They defined their new unit as being equivalent to 7000 grains, an existing unit, and then divided each 7000-grain avoirdupois pound into 16 ounces.
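As a quick check on those numbers (both figures come from the paragraph above; the grains-per-ounce result is simply the division):

```python
# The avoirdupois pound: 7000 grains split into 16 ounces.
grains_per_pound = 7000
ounces_per_pound = 16

grains_per_ounce = grains_per_pound / ounces_per_pound
print(grains_per_ounce)  # 437.5 grains in each avoirdupois ounce
```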
7. Horsepower
Late 18th-century steam engine entrepreneurs needed a way to express how powerful their machines were, and the industrious James Watt hit on a funny idea for comparing engines to horses. Watt studied horses and found that the average harnessed equine worker could lift 550 pounds at a clip of roughly one foot per second, which works out to 550 foot-pounds of work per second, or 33,000 foot-pounds of work per minute.
Not all scholars believe that Watt arrived at his measurement so scientifically, though. One common story claims that Watt actually did his early tests with ponies, not horses. He found that ponies could do 22,000 foot-pounds of work per minute and figured that horses were half again as strong as ponies, so he got the ballpark figure of 33,000 foot-pounds of work per minute.
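Either route lands on the same number; here are both versions of the arithmetic, using the figures from the two paragraphs above:

```python
# Watt's figure: 550 pounds lifted at one foot per second, over a minute.
pounds = 550
feet_per_second = 1
seconds_per_minute = 60

print(pounds * feet_per_second * seconds_per_minute)  # 33000 foot-pounds per minute

# The pony story: 22,000 foot-pounds per minute, scaled up by half again for a horse.
print(22_000 * 1.5)  # 33000.0
```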
A version of this story originally ran in 2017; it has been updated for 2023.