12 Fun Facts About the U.S. Flag

iStock/MarianVejcik

Happy Flag Day! If you’re wondering what June 14th has to do with the Stars and Stripes, why the flag looks the way it does, who came up with it, who paid for it, what you can and can’t do with it, and how those flags on the moon are holding up … we salute you!

1. THE FIRST FLAG WAS COMMISSIONED WITH A PAYMENT OF "THREE STRINGS OF WAMPUM."

In 1777, the U.S. was still waffling on the exact look of its flag. This was a cause for concern for Thomas Green, an American Indian who wanted the protection of an official flag while traveling through treacherous territory to Philadelphia. Green asked Congress for help, throwing in the aforementioned payment to sweeten the deal. Within 10 days, a resolution was passed establishing a flag of 13 stars and 13 stripes. The date: June 14th, 1777.

2. BETSY ROSS MIGHT NOT BE AS TIED TO THE FLAG AS WE THOUGHT.

She may have sewn quite a few in her day, but there is no actual evidence that Betsy Ross was responsible for the design of the U.S. flag. In fact, Betsy’s name didn’t even come up in conjunction with the deed until 1876, 40 years after her death. The first person to publicly claim credit for the design was New Jersey Congressman Francis Hopkinson, who in 1780 hoped (in vain) to earn a "quarter cask of the public wine" for his efforts. Apparently, he didn't take wampum.

An aside: There also seems to be some dispute as to whether Betsy Ross ever actually lived in Philadelphia’s popular Betsy Ross House.

3. THE FLAG HAS ALWAYS HAD 13 STRIPES … EXCEPT WHEN IT DIDN'T.

Upon welcoming Vermont and Kentucky—states 14 and 15—into the Union, a new version of the flag was created with 15 stars and 15 stripes. As the U.S. continued to add states, there was concern about having to add a stripe for each one. The solution: revert to 13 stripes to represent the original 13 colonies, and let the stars do the heavy lifting.

4. SOME OF THE STAR FIELDS HAVE BEEN PRETTY STRANGE LOOKING.

As of 1818, conventions concerning the numbers of stars and stripes were cemented and remain in place today. However, one thing remained uncodified: the star layout. With no official guidelines, some designers got creative…in kind of a Microsoft Paint way.

26-star "star" flag:

33-star Ft. Sumter flag:

Which looks a lot like this, yes?

Courtesy of BBRCreative

38-star concentric creation:

5. THE DAKOTAS THREW OFF THE STAR-DESIGN PLANS.

There have been 27 official versions of the U.S. flag, each with a different number of stars. A 39-star version is not among them, but that didn’t stop some enterprising flag manufacturers from producing one for the marketplace. The reason for the miscalculation: Some thought North Dakota and South Dakota were going to be admitted as one state.

6. THE 50-STAR PATTERN WAS CREATED BY A HIGH SCHOOL STUDENT.

When Alaska and Hawaii became states 49 and 50, President Eisenhower received thousands of ideas for an updated flag. Almost all of them were for a 50-star flag, including one from Robert G. Heft, a 17-year-old student at Lancaster (Ohio) High, who created the design for a class project. He was one of three to submit the version that was accepted and remains in use today.

Robert got a B- on his project.

7. THE 50-STAR FLAG IS THE FIRST ONE TO HAVE LASTED 50 YEARS.

In contrast, over a 50-year period in the early 1800s, the flag went through 17 different versions.

8. THE ACTUAL FLAG THAT INSPIRED "THE STAR SPANGLED BANNER" STILL EXISTS.

The flag that flew at Ft. McHenry during the War of 1812, immortalized in Francis Scott Key’s tune, is one of the few remaining specimens of a 15-star, 15-stripe flag. What’s left of it is on permanent display at the Smithsonian’s National Museum of American History.

9. A SNIPPET OF THAT FLAG SOLD AT AUCTION IN 2011 FOR $38,000.

We say "what’s left of it" because the flag in question was a victim of "souveniring," a once-common practice where sections from flags were snipped off and sold as mementos. The 2" x 5" swatch in question was taken from the flag in the 1800s.

10. THE FLAG DESECRATION AMENDMENT FAILED IN 2006.

The proposed constitutional amendment would have prohibited not only burning the flag (for political reasons) but also printing it on disposable items such as t-shirts or napkins. The amendment fell one vote short in the Senate.

11. EVEN IF IT HAD PASSED, BURNING A FLAG IS A-OK...

…as long as it’s already damaged beyond repair. It’s one way that the flag may be disposed of in a “dignified manner,” according to the U.S. Flag Code.

Then again, if the U.S. Flag Code got its way, the stars and stripes wouldn’t appear in advertising either.

12. OF THE SIX FLAGS PLANTED ON THE MOON, FIVE OF THEM ARE STILL STANDING.

The one that’s not: the first one, planted by Neil Armstrong during the Apollo 11 mission. Readers of a certain age might also recognize the now-fallen flag from the original MTV bumper.

See Also: The Pledge of Allegiance was written in part to sell flags to schools.

A version of this story originally ran in 2013.

Has An Element Ever Been Removed From the Periodic Table?

lucadp/iStock via Getty Images

Barry Gehm:

Yes, didymium, or Di. It was discovered by Carl Mosander in 1841, and he named it didymium from the Greek word didymos, meaning twin, because it was almost identical to lanthanum in its properties. In 1879, a French chemist showed that Mosander’s didymium contained samarium as well as an unknown element. In 1885, Carl Auer von Welsbach showed that the unknown element was actually two elements, which he isolated and named praseodidymium and neodidymium (although the di syllable was soon dropped). Ironically, the twin turned out to be twins.

The term didymium filter is still used to refer to welding glasses colored with a mixture of neodymium and praseodymium oxides.

One might cite as other examples various claims to have created/discovered synthetic elements. Probably the best example of this would be masurium (element 43), which a team of German chemists claimed to have discovered in columbium (now known as niobium) ore in 1925. The claim was controversial and other workers could not replicate it, but some literature from the period does list it among the elements.

In 1937, Emilio Segrè and Carlo Perrier isolated element 43 from molybdenum foil that had been used in a cyclotron; they named it technetium. Even the longest-lived isotopes of technetium have half-lives that are short by geological standards (millions of years), and the element has only ever been found naturally in minute traces as a product of spontaneous uranium fission. For this reason, the original claim of discovery (as masurium) is almost universally regarded as erroneous.

As far as I know, in none of these cases with synthetic elements has anyone actually produced a quantity of the element that one could see and weigh that later turned out not to be an element, in contrast to the case with didymium. (In the case of masurium, for instance, the only evidence of its existence was a faint x-ray signal at a specific wavelength.)

This post originally appeared on Quora.

Graham Crackers Were Invented to Combat the Evils of Coffee, Alcohol, and Masturbation

tatniz/iStock via Getty Images

Long before they were used to make s’mores or the tasty crust of a Key lime pie, graham crackers served a more puritanical purpose in 19th-century America. The cookies were invented by Sylvester Graham, an American Presbyterian minister whose views on food, sex, alcohol, and nutrition would seem a bit extreme to today's cracker-snackers. Much like the mayor in the movie Chocolat, Graham and his thousands of followers—dubbed Grahamites—believed it was sinful to eat decadent foods. To combat this moral decay, Graham started a diet regimen of his own.

Graham ran health retreats in the 1830s that promoted a bland diet that banned sugar and meat. According to Refinery29, Graham's views ultimately inspired veganism in America as well as the “first anti-sugar crusade.” He condemned alcohol, tobacco, spices, seasoning, butter, and "tortured" refined flour. Caffeine was also a no-no. In fact, Graham believed that coffee and tea were just as bad as tobacco, opium, or alcohol because they created a “demand for stimulation.” However, the worst vice, in Graham's opinion, was overeating. “A drunkard sometimes reaches old age; a glutton never,” he once wrote.

Graham’s austere philosophy was informed by the underlying belief that eating habits affect people’s behaviors, and vice versa. He thought certain foods were "overstimulating" and led to impure thoughts and passions, including masturbation—or “self-pollution,” as he called it—which he believed to be an epidemic that caused both blindness and insanity.

Illustration of Sylvester Graham
Library of Congress, Public Domain, Wikimedia Commons

Graham's views directly influenced Victorian-era corn flake inventor John Harvey Kellogg, who was born a year after Graham died. Like his predecessor, Kellogg also believed that meat and some flavorful foods led to sexual impulses, so he advocated for the consumption of plain foods, like cereals and nuts, instead. (Unsurprisingly, the original recipes for both corn flakes and graham crackers were free of sinful sugar.)

In one lecture, Graham told young men they could stop their minds from wandering to forbidden places if they avoided “undue excitement of the brain and stomach and intestines.” This meant swearing off improper foods and substances like tobacco, caffeine, pepper, ginger, mustard, horseradish, and peppermint. Even milk was banned because it was “too exciting and too oppressive.”

So what could Graham's followers eat? The core component of Graham’s diet was bread made of coarsely ground wheat or rye, unlike the refined white flour loaves that were sold in bakeries at that time. From this same flour emerged Graham's crackers and muffins, both of which were common breakfast foods. John Harvey Kellogg was known to have eaten the crackers and apples for breakfast, and one of his first attempts at making cereal involved soaking twice-baked cracker bits in milk overnight.

Slices of rye bread, a jug of milk, apples, and ears of corn on a wooden table
SomeMeans/iStock via Getty Images

However, Kellogg was one of the few remaining fans of Graham’s diet, which began to fall out of favor in the 1840s. At Ohio’s Oberlin College, a Grahamite was hired in 1840 to strictly enforce the school’s meal plans. One professor was fired for bringing a pepper shaker to the dining hall, and the hunger-stricken students organized a protest the following year, arguing that the Graham diet was “inadequate to the demands of the human system as at present developed.” Ultimately, the Grahamite and his tyrannical nutrition plan were kicked out.

Much like Kellogg’s corn flakes, someone else stepped in and corrupted Graham’s crackers, molding them into the edible form we now know—and, yes, love—today. In Graham’s case, it was the National Biscuit Company, which eventually became Nabisco; the company started manufacturing graham crackers in the 1880s. But Graham would likely be rolling in his grave if he knew they contained sugar and white flour—and that they're often topped with marshmallows and chocolate for a truly decadent treat.
