Weird Facts About Leap Years

By Alen Boes, About Entertainment


1.  Blame It On Augustus

We owe leap years to Julius Caesar, but also to his successor, the Emperor Augustus.

The Ancient Romans used to follow a calendar that had 355 days a year, but it eventually grew hopelessly out of sync with the seasons, making it difficult to celebrate festivals at the same time each year. So in 45 BC, Julius Caesar decreed that a new, reformed calendar would be adopted that had 365 days a year, with an extra day every “leap year” in order to keep the seasons and calendar properly in sync.

However, the Roman priests who devised the new calendar initially made a mistake. They set the leap year to occur every third year. The priests realized soon enough that this wouldn’t work, and in 8 BC Emperor Augustus officially corrected the calendar so that leap years came every fourth year.

So Caesar can take credit for leap years in general, but the four-year tradition is Augustus's.
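The four-year rule Augustus settled on survives in the modern Gregorian calendar, with one later refinement (added in 1582, long after the events above): century years are only leap years if divisible by 400. A minimal sketch of that check:

```python
def is_leap(year: int) -> bool:
    """Gregorian leap-year rule: every fourth year is a leap year,
    except century years, unless they are divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2016 qualifies; 1900, a century year, does not; 2000 does.
print(is_leap(2016), is_leap(1900), is_leap(2000))  # True False True
```

Under the original Julian rule, by contrast, 1900 would simply have been another leap year.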

And have you ever wondered why February is shorter than every other month? That’s also because of Augustus. The Roman Senate, to honor him, renamed the month of Sextilis as Augustus (August). But originally August was only 30 days long, and this was a problem because Julius Caesar’s month (July) was 31 days long. It wouldn’t do for Augustus to have a shorter month than Caesar!

To make August as long as July, they borrowed a day from February, reducing it from 30 days to 29 in a leap year and to 28 every other year. This permanently left February as the odd, shortened month that it is.

2.  The Extra Day Swindle

In February 1997, John Melo was convicted of home invasion and sentenced to ten years and one day in prison. Seven years later, he filed a motion complaining that the Department of Correction had miscalculated the length of his sentence. Why? Because it had failed to credit him for the additional days he had to serve on account of the February 29s during leap years.

Melo’s motion was allowed, but he didn’t win the case. In 2006 the Superior Court ruled (Commonwealth vs. John Melo) that not only did his case have no merit, but it had been a mistake to ever allow it to proceed in the first place, noting that he had clearly been sentenced to a term of years, no matter how long each year may be.

Melo may not have had a compelling case. However, it is true that the extra day in February can be somewhat unfair. For instance, if you’re a salaried employee you essentially have to work an extra day for free during a leap year, whereas hourly employees get an extra payday. Similarly, banks often don’t include February 29 when they calculate the interest they owe their customers, thereby giving themselves an extra bonus day of profit at everyone else’s expense.

3.  Leap Year Capital of the World

In 1988, the town of Anthony, Texas, with a population of 8000, declared itself to be the “Leap Year Capital of the World.”

Its justification for this title was that two members of its Chamber of Commerce were born on leap year days. But in a moment of honesty a member of the Chamber also admitted that, “We just voted arbitrarily to name this as the leap year capital of the world because no one else has.”

As of 2016, the town of Anthony continues to pride itself on being the Leap Year Capital, with festivities planned for February 29.

4.  Leap Year Mother and Daughter

On February 29, 2008, Michelle Birnbaum of Saddle River, New Jersey gave birth to her daughter, Rose. What made this unusual was that Michelle herself was also a “leapling,” having been born on February 29, 1980.

The odds of a child being born on February 29 are 1 in 1461. However, the odds of both a mother and daughter sharing that birthday are somewhere in the range of 2 million to one. [2/24/2012]
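The birthday odds come straight from the four-year leap cycle, which contains 1,461 days, exactly one of them a February 29. A quick sketch of the arithmetic (assuming births are independent and spread evenly, and ignoring the Gregorian century exceptions and seasonal birth-rate variation):

```python
# One four-year cycle: three common years plus one leap year.
days_in_cycle = 3 * 365 + 366        # 1461 days
leap_days_in_cycle = 1               # one February 29 per cycle

odds = days_in_cycle / leap_days_in_cycle
print(f"1 in {odds:.0f}")            # 1 in 1461

# If mother and daughter were independent leaplings, the combined odds
# would be roughly 1461 squared, in line with the "2 million to one" figure.
print(f"about 1 in {odds ** 2:,.0f}")  # about 1 in 2,134,521
```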

Though quite long, those odds are still much better than the odds of winning the Powerball Lottery — approximately 292 million to one.

5.  Happy Aldrin Day!

Over the years, would-be calendar reformers have proposed many alternative ways of dividing up the year. Often these plans would give special status to the leap day.

For instance, in July 1989 Jeff Siggins published an article in Omni Magazine proposing that the Gregorian Calendar be scrapped and replaced by his “Tranquility Calendar.”

This would be a scientifically-based calendar that would set July 20, 1969 (when humans first landed on the Moon in the Sea of Tranquility) as Day Zero. All years after that would be referred to as “After Tranquility” (AT). So, as of February 2016, we are in the year 46 AT.

Siggins would rename the months after famous scientists — such as Archimedes, Copernicus, Darwin, etc. — and he would designate the leap day as Aldrin Day, after the astronaut Buzz Aldrin.

Taking a more mystical approach, Randy Bruner, a Cincinnati psychic, came up with the Dreamspell Calendar based on the Mayan Calendar. His system would transform the leap day into a “Day out of time,” which means it wouldn’t be included as a day of the week. It would be a non-day when people could “celebrate time is art.” [What exactly does that mean? Your guess is as good as mine.]

One of the most popular alternative calendar systems of the 20th Century was the World Calendar, created by Elisabeth Achelis of Brooklyn, New York in 1930. It would have shifted February 29 to June 31 and made it a world holiday.

Finally, we would like to propose that February 29 be designated as Official Weird Day — in honor of all things that don’t quite fit in.

25 “facts” that really aren’t: You’ve heard them all your life and they just aren’t true

by Salon


This article originally appeared on AlterNet.

There are scientific facts that most of us accept as true. The Earth is round. Check. E = mc². Check. Humans evolved from apes. Not exactly.

It might come as a surprise that science doesn’t necessarily back many scientific “facts.” Here are 25 facts that ain’t necessarily so.

1. Humans evolved from apes.

Evangelicals will be relieved to note that humans did not evolve from apes. Of course, that has never been the scientific claim that human evolution is based on. Humans and apes (chimps, most closely) share almost identical DNA (98.8% the same). What this means is that somewhere down the line, humans and apes shared a common ancestor, some of whose offspring evolved, over millions of years, into today’s apes, and some of whose offspring evolved, over millions of years, into us.

2. Milk builds strong bones.

The advertising campaigns for milk (“Got milk?”) have been some of the most popular and successful marketing in modern history. It is gospel among most people that milk builds strong bones and teeth. However, it’s not true. Many studies over the years have failed to find a link between milk consumption and a lower incidence of bone fractures.

3. Eating ice cream when you have a cold makes you more congested.

A prevailing myth about dairy in general and ice cream in particular is that consuming it increases mucus production. Eat up. According to the National Institutes of Health, “no statistically significant overall association can be detected between milk and dairy product intake and symptoms of mucus production in healthy adults, either asymptomatic or symptomatic, with rhinovirus infection.”

4. Sugar makes children hyperactive.

The longstanding belief among parents of all persuasions is that sugar makes their kids crazy. The popular anecdotal example tends to be the children’s birthday party, where kids are loud, raucous and overly excited, with the blame pointing squarely at the sugary treats they have consumed. Truthfully though, no scientific evidence firmly links sugar to hyperactivity. The reason for the party hijinks more likely rests on the general excitement of kids being surrounded by their friends in a celebratory environment. The National Institute of Mental Health states that, “The idea that refined sugar causes ADHD or makes symptoms worse is popular, but more research discounts this theory than supports it.”

5. The 5-second rule applies to food dropped on the floor.

It’s okay to eat food you drop on the ground as long as you grab it within five seconds of the drop, or so the myth goes. Believe this fact at your own risk. If the dropped food item is dry, and the surface of the floor is clean, you’ll probably survive. But if the food item is moist, or if the ground is not clean, bacteria will be swarming the food practically upon impact, and there is no telling if that bacteria is harmless or not.

6. An apple a day keeps the doctor away.

No one believes this one anyway, but just to clarify, apples don’t protect you from anything in particular (well, maybe scurvy). They have plenty of fiber and vitamin C, and they make a healthier snack than candy, but in lieu of a flu shot, not so much.

7. Honey is healthier for you than refined sugar.

Just because your candy bar is sweetened with honey instead of sugar, don’t be fooled into thinking it’s healthy. Most scientists agree, sugar is sugar. The body processes honey essentially the same as high fructose corn syrup. The only advantage to honey may be a lower calorie count, since honey tends to be sweeter than sugar and thus less of it is used in the product. And, by the way, brown sugar is just white sugar with a bit of molasses mixed in. There are some traces of nutrients in the molasses, but not enough to even remotely call it a health product.

8. Sharks do not get cancer.

The popular myth that sharks don’t get cancer, popularized by a nutritionist, I. William Lane, in the early 1990s, led to a health supplement craze of consuming powdered shark cartilage in the hopes of staving off or curing cancer. Not only does this not work, it has helped lead to the needless slaughter of millions of sharks in the past decades. And guess what? Sharks do too get cancer. Tumors have been found on many species of sharks.

9. Sharks must continually swim or they die.

Not only do sharks get cancer, but they can stop swimming and they won’t croak. At least most of them can. While a few species do have to keep moving to pass water over their gills, most can hang out just fine. However, all sharks do lack swim bladders, so if they stop swimming, they sink.

10. Frogs give you warts.

No, they don’t. Frogs and toads may look bumpy, but those aren’t warts, and you can’t catch them. Humans give you warts. Or more specifically, the human papillomavirus, which is usually passed around by shaking the hand of an infected person.

11. Bats are blind.

Bats do use their sonar (echolocation) to find their way around, but they are not blind. In fact, their eyesight is almost as good as a human’s.

12. Poinsettias can kill you.

The widespread belief that the popular Christmas plant, the poinsettia, is a deadly plant that will kill your pets or you if consumed is false. It’s not exactly edible, and it may make you a bit queasy, but its deadliness is a myth.

13. Organic fruits and vegetables are safer and more nutritious.

It has been widely believed for the past several decades that organic foods are both safer and more nutritious than conventionally grown foods. There are valid reasons why organic may be a better choice for the consumer, chiefly the negative environmental impact that conventional farming has on the soil and water, and birds and wildlife. However, safer and more nutritious are not among the reasons to go organic. Organic food, contrary to popular belief, is not necessarily pesticide-free. Many organic foods are grown using natural pesticides that are as unsafe to the human body as chemical pesticides. And in any event, the amount of pesticide residue on both organic and conventional foods is tiny and safe for human consumption. Meanwhile, almost 100,000 studies were reviewed on the nutritional content of organic, and the consensus is that organic has no more nutritional value than conventional.

14. Dropping a coin from the Empire State Building could kill a person.

A coin dropped from the height of the Empire State Building, or any tall skyscraper, will gain quite a lot of speed, up to 50 miles per hour, but the most it will do is cause a wicked sting. No manslaughter charges need be filed.

15. Lightning never strikes the same place twice.

Sure it does. The same Empire State Building you dropped the coin from suffers around 100 lightning strikes a year.

16. Water conducts electricity.

While it is true that if you are taking a bath and you drop a live electrical wire into the tub, you are in for a hot time, the real truth is that pure water is a poor electrical conductor. The reason we are shocked in our water is that regular tap water contains lots of other stuff, like dirt, minerals and the like, that are conducting the electricity.

17. Diamonds are formed from coal.

Many of the comic nerds among us fondly recall scenes of Superman squeezing a lump of coal and turning it into a diamond. Unfortunately, the science is a little off. In fact, diamonds are formed about 90 miles deep in the Earth, made from compressed carbon. Coal is only about two miles down.

18. People in the Middle Ages thought the Earth was flat.

Maybe way, way back this was true, but by the Middle Ages, most learned people believed the world was round. Certainly Columbus knew his trip was fraught with danger, but falling off the Earth wasn’t one of his concerns.


19. Carrots give you better vision.

Carrots contain lots of vitamin A, which is an essential nutrient for the eye. Not getting enough vitamin A will certainly impact the health of your eyes, but there is no evidence that eating lots of carrots will improve your vision.

20. Hair and nails grow even after you die.

Once you die, your biological processes cease. No more growing. This myth appears to be due to the fact that the body dries out and shrinks after death, making it appear as though the nails and hair have grown.

21. You only use 10% of your brain.

This myth is 100% wrong. People use all their brain. The idea that an organ like the brain would evolve and be 90% useless is ridiculous on the surface. In fact, the brain is so active that although it comprises only about 2% of body mass, it uses 20% of the body’s energy.

22. Right brain is creative, left brain is logical.

It’s true that the left side of the brain (which controls the right side of your body) seems to be responsible for processes like language and math skills, while the right side of your brain (which controls the left side of the body) controls things like spatial relationships and music processing. But the brain is an incredibly complex organ, and the simplistic notion that one side is dominant over the other is wrong. There are plenty of left-handed people (right-brained) who are analytical (look no further than the professorial Barack Obama), and plenty of right-handed people (left-brained) who are artistic. No scientific study has shown that overall right or left brain dominance is a real thing.

23. Drink eight glasses of water a day.

Human beings are composed primarily of water, so keeping hydrated is certainly an important goal. However, no scientific study has ever come to the conclusion that eight glasses of water is the magic number. People should drink water when they are thirsty, and water is preferable to calorie-laden sodas and juices.

24. Drinking alcohol kills brain cells.

Actually, drinking a lot will screw up the connections between your brain cells, but your brain cells remain intact.

25. Outer space is freezing cold.

The vacuum of space is cold in some places, like the furthest reaches of the universe. In other places, though, like in the sunlight near Earth, it can get quite hot. Like 250 degrees Fahrenheit hot.

Larry Schwartz is a Brooklyn-based freelance writer with a focus on health, science and American history.

The Explosive Truth Behind the Movie Theater Projection Room

You might remember how flammable nitrate film was in the movie Inglourious Basterds. The story took a lot of liberties with history, but the Nazis actually did try to do away with nitrate film, also called celluloid, before World War II was over. Even today, Germany is working to rid itself of celluloid movies. Celluloid was the medium of choice for filmmakers up until 1951.

Celluloid is also extremely dangerous. It is essentially a solid form of nitrocellulose, a chemical cousin of nitroglycerin, dragged across superhot carbon rods at extremely high speed. If celluloid combusts, which it can do at “car parked in the sun” temperatures, the fire generates its own oxygen, creating a flame that cannot be extinguished. It can burn underwater. It can burn beneath a fire blanket. It burns until the celluloid is gone, and any attempt to smother it creates clouds of poison gas.

That doesn’t sound like something you’d want to have in a crowded theater, yet for decades it was exactly what every crowded theater had. To ensure safety, celluloid film was segregated from the audience in the projection room, which was designed specifically to mitigate the danger of a film fire. Read about nitrate film and the way theaters had to treat it at Atlas Obscura.

Ways People From the 1910s Thought Movies Were Ruining Civilization

by Anne Green, Mental Floss

In the 1910s, the movies were just starting to come into their own as a popular art form. Feature-length films were on the rise, a handful of actors and directors were gaining respect from critics, and the medium was moving from a cheap novelty to widespread popular entertainment. But while the nascent Hollywood film community was celebrating, not everyone was so happy about the ascendance of the moving pictures.

Journalists and concerned citizens began penning articles and editorials warning against the multifarious dangers of cinema. Their concerns ranged from the health effects of film to more general fears about morality. And while some people were simply skeptical of the artistic value of the new medium, others made it seem like the movies were on the verge of destroying civilization. Here are a few reasons to stay far, far away from your local movie theater, according to a range of concerned citizens between 1910 and 1919.


Nowadays, parents worry about their kids rotting their minds with too much television or too many video games. But in the 1910s, parents worried that the movies were dumbing down their kids: “It is not only the artistic side of the cinema to which objections may be raised,” wrote one concerned citizen to The New Age in 1917 [PDF]. “It is rather, the educational side, for it is a well-known fact that children frequent the picture-palace—as it is often called—to a very large extent.” He continued:

Now, students of child-life know that the mere passing of knowledge without assimilating it is not merely useless but distinctly harmful to a child. The process of thought must proceed on natural and not artificial lines. It is true moving pictures arrest the attention, but thought is difficult or impossible. By one sense alone—that of sight—the mind is for the time being employed, and the rapidity of motion pictures produces a confusion of ideas. As every teacher is aware, education can only be received in a limited quantity at a time, and by associating an object with something that is known. The mere gazing at an infinite number of pictures in rapid succession must produce perplexity. There cannot be any real assimilation of the food thus provided. The brain becomes [exhausted], and unable to receive influences of a really educational nature, and, in fact, becomes demoralized.


The fact that movies were silent apparently didn’t stop actors from using dirty language. In a 1910 article, The Oregonian reported that “deaf mutes” had caught actors in several movies using “unprintable language.” The article quoted one scandalized audience member who exclaimed, “I am ashamed to repeat what that actor has just said … If the police could have heard the last remark of that man on the screen, they would arrest the manager of the show.”


One of the most common critiques of movies was that, as art, they just weren’t any good. Many journalists looked down their noses at film, dismissing it as a fad and a cheap novelty. But a few theater critics took a more extreme stance, arguing that film was a threat to art itself. “In the sacred name of truth, let us abolish this new cliché: to speak of ‘the art of the movie’ is to employ a vast farce of a phrase that is a contradiction in terms,” wrote one journalist in a 1916 Harper’s Weekly article called “Movies Destroy Art.” He continued:

Art is the effort on the part of a human being to express life as he sees it by brush, pen, chisel, song, or stave. Art is far from the movies—not merely in absence, but in positive antithesis—because the chief effort of the movie seems to be to present something that shall express life, not as the manufacturer sees it, but as he imagines somebody else wants to see it. This is not art but artifice.


At a time when many were calling for increased censorship of immoral content, a few journalists actually argued that movies were overly moral. “The movies have instituted a self-censorship,” wrote Floyd Dell in 1915:

In this respect they are unlike all the other arts, which have wantonly desired freedom, and chafed under restraint. The movies on the contrary, pay the expenses of a National Board of Censorship, to which they invite moral experts to belong, and to which they submit their productions. Anything improper is cut out of the reel. If a kiss is too realistic, several feet are cut right out of the middle of it.

As a result, writes Dell, the movies are “sterilized, emasculated, completely innocuous.”


Movie theater fires were a real danger in the 1910s. The nitrate film which movies were projected from was extremely flammable, and anything from the heat of the projector lamp to a careless projectionist’s cigarette ash could send a theater up in flames. Theater fires were a problem that predated the moving pictures, but according to journalists, the combination of flammable film and cramped screening spaces without adequate fire exits created an increased threat. In some cases, the claustrophobic theater and fear of fire were enough to cause life-threatening panic (film historian Gary Rhodes dedicates an entire chapter to movie theater fires in his book The Perils of Moviegoing in America, 1896-1950). In 1911, The New York Times reported that 26 people were killed when calls of “Fire” broke out in a Pennsylvania theater, writing, “Yet this panic would not have resulted so seriously if the picture show had not been exhibited on the second floor of a building, with a crooked hallway, an ill-lighted stairway, and insufficient exits. … [The theater] was always prepared for a great slaughter. The scenery was set for the tragedy.”


It turns out all of our fears about smartphones and tablets ruining our vision got their start a long, long time ago. In 1912, a doctor named George Gould published an article in the Journal of the American Medical Association titled “Acute Reflex Disorders Caused By The Cinematograph,” in which he wrote, “That the moving-picture shows cause in many spectators, functional diseases similar to those of eye-strain and ocular labor must have been noticed by every general practitioner and oculist of the cities, and yet, so far as I know, none has publicly directed attention to this important fact.” Gould continued:

I have had so many patients who have been made sick at these places of amusement that I now ask routine questions to elicit this etiologic factor. … If it is true that about five million spectators are in daily attendance at the picture-show theaters, the consequent eye-strain injuries and sufferings must be enormous, however conservatively estimated, and there is little likelihood of their exaggeration by hygienists and physicians.


Some worried about what could happen in a dark theater once the lights went down. Among them was Mayor Gaynor of New York City, who in 1910 gave the commissioner of licenses, Francis Oliver Jr., authorization to force movie theaters to turn on their lights. The order sent out to the theaters read:

Many of the moving picture shows in this city are given in rooms which are totally dark, or almost dark, while the pictures are being displayed. Tests have proven that it is possible to display pictures in well-lighted rooms. If moving picture shows are given in darkened rooms it is possible for many actions to take place without the knowledge of the owners or managers, which would not be tolerated if the owners or managers were aware of them.


In a 1910 Good Housekeeping article called “The Moving Picture: A Primary School For Criminals,” William McKeever wrote:

If the citizens of any community should assemble with the purpose of laying plans and devising means whereby to teach immorality, obscenity and crime, I can think of no better way definitely and certainly to bring about such results than the use of the moving-picture show as it is now conducted. It is a serious matter, this picture business. We tax ourselves heavily for educational purposes, and employ teachers in the schools to inculcate, among other things, certain higher moral principles. In fact, we agree that the end of all teaching in the schools is moral character, and then we permit and license these cheap and vitiating shows to run, and we permit our children to attend, and not only unlearn all the moral lessons of the schools, but learn directly many of the immoral lessons that were once confined to the worst centers of our largest cities. In fact, the motto of these moving-pictures organizations might be this: “A red-light district in easy reach of every home. See the murders and the debauchery while you wait. It is only a nickel.”


While most criticisms of movies in the 1910s centered around a single topic, others were more general. One contributor to The New Age [PDF] summed up some of the anti-film sentiments of the 1910s when he concluded a diatribe against the movies (which was signed “An Actor”) by claiming, “The cinema to-day is the microcosm of every evil with which our society is threatened. It will rob us not only of our souls, but also of our daily bread.”