Why Does the NFL Have a Two-Minute Warning?

Hannah Foslien/Getty Images

The two-minute warning comes up at the end of each half in every NFL football game. Most fans take it for granted, but why does the NFL stop the clock with two minutes or so left in each half? Is it just so the NFL can sneak an extra commercial break into the action? Here's a quick story you can pass on to your relatives later today.

The custom of giving teams a two-minute warning dates all the way back to the NFL's first years. In those days, fans and coaches couldn't just take a look at the stadium clock to see how much time remained in the half. The official game clock resided in the pocket or on the wrist of one of the officials, and the stadium's clock was just a rough estimate of how much time remained. Thus, the NFL instituted a two-minute warning where the referee would stop the clock and let both teams know exactly how much time remained in the half.

Obviously, that sort of "warning" is no longer necessary.

Starting in the 1960s, the NFL made the stadium clock the official game clock, which is why you occasionally see officials stop play to request that time be put back on the clock. The league didn't want to do away with the two-minute warning, though: it had become an important strategic part of the game, helped build excitement during game-closing drives, and offered broadcasters an opportunity to sell an extra set of commercials. As a result, the two-minute warning stuck around, which is why we still have to wait a few extra minutes to see the climaxes of games.

Other leagues either disregard the two-minute warning or have their own twists on it. There's no clock-stopping two-minute warning in NCAA football, but the rules require the referee to give both coaches and each team's captains a verbal warning when there are two minutes left in the game. Thanks to their short fields and frenetic pacing, arena football leagues only use a one-minute warning, and the Canadian Football League opts to use a three-minute warning instead of its shorter American cousin.

Are Any of the Scientific Instruments Left on the Moon by the Apollo Astronauts Still Functional?

Apollo 11 astronaut Neil Armstrong left the first footprint on the Moon on July 20, 1969.
Heritage Space/Heritage Images/Getty Images

C Stuart Hardwick:

The retroreflectors left as part of the Apollo Lunar Ranging Experiment are still fully functional, though their reflective efficiency has diminished over the years.

This deterioration is actually now delivering valuable data. It has multiple causes, including micrometeorite impacts, dust deposition on the reflector surfaces, and chemical degradation of the mirrored undersides, among other factors.

As technology has advanced, ground station sensitivity has been repeatedly upgraded faster than the reflectors have deteriorated. As a result, measurements have gotten better, not worse, and measurements of the degradation itself have, among other things, lent support to the idea that static electric charge gives the Moon an ephemeral, periodic near-surface pseudo-atmosphere of electrostatically levitated dust.

No other Apollo experiments on the Moon remain functional. All the missions except the first included experiment packages powered by radioisotope thermoelectric generators (RTGs), which operated until they were ordered to shut down on September 30, 1977. This was done to save money, but also because by then the RTGs could no longer power the transmitters or any instruments, and the control room used to maintain contact was needed for other purposes.

Because of fears that some problem might force Apollo 11 to abort back to orbit soon after landing, the mission deployed a simplified experiment package that included a solar-powered seismometer, which failed after 21 days.

This post originally appeared on Quora.

What Makes a Hotel Breakfast 'Continental'?

Hotels often offer a complimentary pastry and fruit breakfast.
tashka2000/iStock via Getty Images

The continental breakfast, which is typically made up of pastries, fruit, and coffee, is often advertised by hotels as a free perk for guests. But why is it called continental, and why don’t patrons get some eggs and bacon along with it?

The term dates back to 19th-century Britain, where residents referred to mainland Europe as "the continent." Breakfast in that region was usually something light, whereas an English or American breakfast incorporated meat, beans, and other "heavy" menu options.

American hotels that wanted to appeal to European travelers began advertising "continental breakfasts" as a kind of flashing neon sign indicating that guests wouldn't be limited to the American breakfast fare they found unappealing. The strategy was ideal for hotels, which saved money by offering some muffins, fruit, and coffee and calling it a day.

That affordability, along with convenience (pastries and fruit are shelf-stable, requiring no heat or refrigeration to maintain food safety), is a big reason continental breakfasts have endured. It's also a carryover from a hybrid of two pricing models: American hotels typically folded the cost of meals into one bill, while European hotels billed for food separately. By offering a continental breakfast, hotels gave guests the best of both worlds. And while Americans were initially aghast at the lack of sausages and pancakes on offer, they've since come around to the appeal of a muffin and some orange juice to get their travel day started.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
