Y7 June 2018


TOP 10 LIST ISSUE

TOP 10: Things that happen when you sleep • Disgusting beauty treatments • Genes that could make you superhuman • Diseases that came from outer space • AND MORE...



Copyright © 2018 by HSF Media Inc. All rights reserved. No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, including photocopying, recording, or other electronic or mechanical methods, without the prior written permission of the publisher, except in the case of brief quotations embodied in critical reviews and certain other noncommercial uses permitted by copyright law. For permission requests, write to the publisher, addressed “Attention: Permissions Coordinator,” at the address below. HSF Media Inc 300 Cadman Plaza West, Floor 12 Brooklyn, New York 11202 submissions@ya7elweenmagazine.com

Ordering Information: Quantity sales. Special discounts are available on quantity purchases by corporations, associations, and others. For details, contact the publisher at the address above. Orders by U.S. trade bookstores and wholesalers: please contact HSF Media Inc. or visit www.y7humanitiesmagazine.com. Printed in the United States of America.

EDITORIAL BOARD
Hassan Farhat, MD: President / Editor in Chief
April Khan: Vice President / Chief Business Officer / VP Marketing
Dinah Rashid: Senior Editor / Chief Content Officer
Saood Mukhtar: Deputy Design Director / Deputy Web Editor
Catherine Finley: Articles Editor


Top 10 Amazing Feats And Facts About Glass
CLARA JORDAN

There is more strangeness and capability in that office window than most people give it credit for. Hand ordinary glass to Shaolin monks and scientists, and things get downright freaky. From puzzling performance feats to crazy alloys, glass is not as simple or weak as it seems. Research can now explain old mysteries and create advanced technology, but most amazing is what laboratory-forged glass can do: heal itself and even outlive human civilization.

Missing Crater’s Glass Trail

Photo credit: Live Science

About 800,000 years ago, a hefty meteor smashed into Earth. The rock measured 20 kilometers (12 mi) wide and, after the impact, threw glassy debris into the atmosphere. This fallout rained glass over an area of 22,500 square kilometers (8,700 mi²). Despite this massive footprint of glass across Australia and Asia, the crater has never been discovered. In 2018, more glass beads showed up in Antarctica. Each about the width of a human hair, they were soon identified as part of the mysterious meteor's debris. Called microtektites, their chemical composition grabbed scientists' attention. The low levels of sodium and potassium suggested that the beads were likely the farthest fringe of the elusive crater's debris field. Sodium and potassium leach out under extreme temperatures, and hotter debris also spreads farther from an impact.[1] When the Antarctic microtektites were compared to ones from Australia, the latter had higher levels of sodium and potassium and, theoretically, formed a step closer to the crater. Following this formula, going from hot to cooler, researchers expect to find the crater somewhere in Vietnam. If correct, then the Antarctic beads traveled an incredible distance—about 11,000 kilometers (6,800 mi).

The Shaolin Needle Trick

Shaolin monks are known for their impressive displays of agile martial arts. But recently, one man did something really unusual. Feng Fei threw a needle through a pane of glass—without shattering the glass. The monk hurled the needle with such strength that it popped a balloon on the other side. For all intents and purposes, it should have broken the entire pane. When the superfast trick was viewed in slow motion, it looked like the needle's point pierced the glass on some throws. On others, it seemed that the needle merely cracked the glass and popped the balloon with small shards released on the other side. Either way, it remains an incredible feat. The answer boils down to how glass breaks at a molecular level. Glass is tough. Its molecules are linked in a network that shares (and thus weakens) any pressure against it. If you push a fingertip against a pane, the entire window will resist you. Cracks happen when molecular links fail and pressure is forced to follow the crack to its end.[2] If a needle can avoid bending and is thrown with enough accuracy and muscle, a deep crack will form. Once that is achieved, there is little resistance left to stop the needle from passing through.

Glass Wants To Be A Crystal

Photo credit: Live Science

Scientists are not sure what kind of matter glass is. A sheet of glass is not solid even though it may appear that way. Bizarrely, it sometimes behaves like a liquid and a solid at the same time. Glass atoms are caught in the same trap as a gel's—slow-moving atoms that never get anywhere because they block each other's way.


In 2008, a breakthrough occurred when the focus turned to the pattern formed by glass atoms as they cooled down. They formed structures called icosahedrons, which resemble 3-D pentagons. Since pentagons cannot be arranged in an orderly, repeating way, the glass atoms appeared as a random mess. The same study also found that glass tries its best to be a crystal. But for that to occur, molecules must arrange themselves in a highly regular pattern, and the 3-D pentagons prevent this from happening. In other words, glass is neither solid nor liquid, has properties of a gel, and is something of a crystal suffering arrested development.[3]

Radioactive Clue To Moon’s Birth

Photo credit: phys.org

How our Moon was born remains a bone of contention among scientists. Glass left behind by the first atomic explosion may help prove the theory that the Moon resulted from a collision between Earth and a planet-sized body about 4.5 billion years ago.

In 2017, researchers found glass forged by the 1945 nuclear test in New Mexico. Called trinitite, it was green and radioactive. By measuring the different chemical compositions in the glass, they found the first solid clue about the Moon's formation. The trinitite nearest the explosion zone was empty of volatile elements, including zinc. Such elements vaporize under extreme heat, similar to what happens when a planet forms. Until now, this was pure theory. But after the nuke stripped those elements out, scientists now have their first physical evidence. Trinitite and lunar material are similar enough in their lack of water and volatile elements to show that volatiles react the same way to high temperatures whether on Earth or in space.[4]


Prince Rupert’s Exploding Glass

Photo credit: Live Science

They look like teardrops or tadpoles. But Prince Rupert's drops blend two polar opposites into one shape: a hair-trigger fragility and a strength that can withstand a hammer. The unusual drops are created when molten glass is dripped into ice water. In the 1600s, Prince Rupert of Bavaria tried to figure out the mystery. When the head of the teardrop-shaped bead was hammered on an anvil, the glass refused to break. However, the moment that the thin tail was snapped off, the entire drop, head included, exploded into a puff of powder. King Charles II, Rupert's cousin, ordered the Royal Society to unravel this secret, but they found no answer.[5] In 1994, high-speed photographs showed that a broken tail sent cracks barreling toward the head at over 6,400 kilometers per hour (4,000 mph). In addition, scientists discovered that cooling was behind the drops' strange qualities. When molten glass hit the cold water, the outside cooled rapidly while the inside solidified much more slowly, creating a surface tension tight enough to withstand a beating. On the inside, however, that same tension blows the drop apart at the first hairline crack.



Glass As Radioactive Storage

Photo credit: eurekalert.org

One of the main issues with hazardous material is storing the waste—and globally, there is an unimaginable amount. Often, containers leak and toxic spills contaminate the ground, water sources, and even people. In 2018, the US Department of Energy found a novel way to store radioactive waste: as glass. At a former weapons factory called Hanford, tanks of waste are kept underground. Researchers chose low-activity radioactive trash for a test run of the theoretically spill-proof idea.

The liquid waste was mixed with glassmaking ingredients and then gradually injected into a melter. The 11 liters (3 gal) of waste went into the furnace and, after 20 hours, came out completely vitrified. This first attempt was hugely successful and safely encased the radioactive material within glass. A full-scale program will now tackle the millions of gallons of toxic waste that remain in tanks below Hanford.[6]

Glass As Tough As Steel

In 2015, the University of Tokyo whipped up a novel kind of material—transparent glass nearly as macho as steel. Think along the lines of windows surviving car collisions or unbreakable wine glasses. All that had to be done was to figure out a way to mix alumina with glass. As far as toughness is concerned, alumina is close to a diamond’s hardness. It is also the additive that makes paints and plastics hard. For years, all attempts failed. The glass-alumina mixture crystallized the moment that it was poured into any container. In an innovative move, a new technique blended them in the air. Besides being transparent, the 50 percent alumina mix forged a glass as elastic and rigid as steel. The integrity remained even on a microscopic level.[7] This opens the door to advancements in phones, computers, and future electronics.


Glass That Heals Itself

In 2017, Japanese researchers were analyzing new adhesives when they accidentally invented something fantastic: self-repairing glass. While running tests, one scientist noticed that when cut pieces of the glass were pressed together, the edges fused. Follow-up trials showed the material was not a one-time wonder. The magic element was a polymer (a substance consisting of many repeating units) called polyether-thiourea. When cut, it glommed on to itself after being pressed together for 30 seconds. The best part was that this happened at room temperature. Usually, materials need extreme heat to fuse, which makes the glass unique among self-healing materials. Polyether-thiourea also mends the fastest of them.[8] Despite being as robust as normal glass, the new polymer is earmarked for a wide variety of applications. One suggestion was almost immediate—a cure for annoying mobile phone screen cracks. The medical field is also waiting in the wings, where shatterproof substances could help with repairs inside the human body.

Replacing Bones With Glass

Photo credit: BBC

Nobody relishes the idea of replacing a sturdy part of one's skeleton with glass. As creepy as it sounds, surgeons feel that it is the perfect solution for broken bones. Forget the windowpane type; the material that could revolutionize medicine is called bioglass. Stronger than bone, bioglass is also flexible and antiseptic.

In 2002, the first implant replaced a shattered orbital floor. Without this wafer-thin bone, the eye rolls back. In this case, the man had also gone color-blind, and no conventional surgery helped. A plate of bioglass was inserted beneath the patient's eye, and almost immediately, full sight was restored, including color perception.


Remarkably, bioglass fools the immune system into accepting it as a part of the body. Safe from rejection, it releases ions that fight infection and direct healing cells. The latest version of bioglass, which is not yet commercially available, is more rubbery but tougher. It has been designed to let people with freshly broken legs walk without pins or crutches.[9] Also, to finally succeed where all else has failed, bioglass is designed to replicate how cartilage heals. Since bioglass fuses with the body and stimulates regrowth, it might just be the holy grail of cartilage surgery.

Billion-Year Data Storage

Photo credit: theverge.com

A recently invented storage device might just outlast human civilization. A glass disc, resembling a tiny CD, is a new 5-D concept capable of storing 360 terabytes of data. This is great news for storage junkies, considering that each day adds the data equivalent of 10 million Blu-ray discs to the world.

The brainchild of researchers from the University of Southampton, each glass plate is created with a technique called femtosecond laser writing. Pulses of an ultrafast laser scribble information in three layers. The data is not written in the conventional sense; instead of words, massive archives such as libraries and museums can store their records as dots. These nanostructures are about 5 micrometers (0.005 millimeters) apart. The three-dimensional position of every dot, plus its size and orientation, makes the disc a 5-D device. It can only be read using a special microscope with a light filter. Besides hugging insane quantities of information to its bosom, each disc can withstand 1,000 degrees Celsius (1,832 °F) and will probably last for about 13.8 billion years.[10]
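The "5-D" label is essentially a data-encoding idea: beyond where a dot sits in three dimensions, its size and orientation can carry extra bits. Purely as a toy illustration of that packing idea (not the Southampton team's actual format), here is a short Python sketch; the number of distinguishable orientations and sizes, and the function name, are assumptions invented for this example.

import math

# Toy model of "extra dimensions carry data": a dot's 3-D position acts as
# its address, while its orientation and size encode the payload.
# ORIENTATIONS and SIZES are illustrative assumptions, not measured values.
ORIENTATIONS = 8   # assumed number of distinguishable dot orientations
SIZES = 4          # assumed number of distinguishable dot sizes

BITS_PER_DOT = int(math.log2(ORIENTATIONS * SIZES))  # 5 bits per dot here


def encode(data: bytes):
    """Map a byte stream onto one (orientation, size) state per dot."""
    bits = ''.join(f'{b:08b}' for b in data)
    bits += '0' * (-len(bits) % BITS_PER_DOT)  # pad to a whole number of dots
    states = []
    for i in range(0, len(bits), BITS_PER_DOT):
        symbol = int(bits[i:i + BITS_PER_DOT], 2)
        states.append((symbol % ORIENTATIONS, symbol // ORIENTATIONS))
    return states


print(len(encode(b"Hi")), "dots at", BITS_PER_DOT, "bits per dot")

Under that assumed 5 bits per dot, the 360 terabytes quoted above would translate to on the order of 6 × 10^14 dots.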


" A RECENTLY INVENTED STORAGE DEVICE MIGHT JUST OUTLAST HUMAN CIVILIZATI ON. "


Top 10 Health Disorders Made Up To Sell Products
GREGOR MYERS

These days, many new alleged disorders, health scares, and other psychological ailments seem to have been largely aided and abetted in their spread around the world by the Internet’s current viral culture. People will tell you that what you eat is killing you, what you breathe is killing you, and even Wi-Fi and drinking water will damage you forever. Looking to take advantage of people’s fears, hucksters have used these scares to dramatize made-up disorders. Sometimes, they even create health scams themselves to sell products and make money off the gullible general public.

Some People Falsely Believe That Wi-Fi Is Dangerous Or That They Have A Specific Intolerance

A growing number of people claim to have electromagnetic hypersensitivity (EHS). These individuals believe that the radiation and other radio waves from Wi-Fi and mobile phones are making them sick on a regular basis. Some have even petitioned governments to grant them disability benefits for it. In France and Sweden, a few people have managed to get benefits for this fictitious disorder. The problem is that many of these people are being falsely legitimized because a couple of countries are confused by reports from the World Health Organization (WHO).


The WHO verified that EHS was real. But they also said that the electromagnetic part should be removed[1] because there isn’t a shred of evidence that Wi-Fi or other similar signals are actually causing any kind of specific disorder or symptoms. In fact, the people who constantly claim to have this disorder always have symptoms that are very common—such as headaches, nausea, or a feeling of being unwell—and can be attributed to almost anything. On top of that, studies have shown that EHS is likely to be an example of the “nocebo effect.” This occurs when someone comes to believe that something is dangerous for him. Before long, he will convince himself that it is making him sick. Then his body reacts in the opposite way it would to a placebo and sickens him for real. While the jury is still out on whether there are any long-term repercussions from constantly talking on a mobile phone, most researchers believe that any effects are slight and that your Wi-Fi itself is not something to seriously worry about.

Gluten Sensitivity Is Probably The Most Overhyped ‘Disease’ Ever

It has become incredibly trendy these days to cut gluten out of your diet entirely, citing claims that it somehow makes you feel sick, tired, or weak, without any actual evidence to back them up. Many of these individuals don't bother to go to a doctor to see if they actually have celiac disease or a wheat allergy. Still, some claim to have a sensitivity even though there is no way to actually test for such a thing. Lately, some news sites have declared that research has proved the existence of gluten intolerance, but it did no such thing. Research has shown that some people who don't test positive for a wheat allergy or celiac disease still claim to have symptoms when eating gluten. But there are a lot of factors at play. To begin with, our old friend the nocebo effect returns. Many people have been persuaded that gluten is bad for everyone, despite this being entirely untrue. So they psychologically convince their bodies that gluten is bad and make themselves sick. Doctors have also suspected that irritable bowel syndrome (IBS) may have a lot to do with this. Researchers performed a double-blind study with IBS sufferers who were supposedly gluten intolerant and found that gluten was no more reliable a trigger than a placebo. The researchers believe that wheat and a lot of other foods can be tough on the tummies of people with IBS. These patients are just sensitive to almost everything; gluten itself is a protein and not the culprit here. If you think you have issues with gluten, doctors recommend that you go for an official diagnosis first. There could be many non-gluten things causing your symptoms, and you could delay a proper diagnosis by thinking you have solved the issue on your own.[2] The truth is that a lot of people who claim to have a sensitivity to gluten probably just have sensitive stomachs in general.

Your Body Does Not Need Its ‘Energies’ To Be Balanced

These days, many people talk about their energies and the energies of those around them. They suggest that others have “negative energy” and that they have “positive energy.” They believe that this is not just people reacting to the emotions of others but some kind of actual intangible energy field that humans have around them—an aura, if you will. This belief has led to an industry of fraudsters who claim that they practice something called Reiki. A Reiki practitioner believes that he can bring out the energy of his own body and use it to influence the energy field of someone else to help that person for the better.


Some have even claimed that they can perform this allegedly amazing feat from a distance. These practitioners implement their energy-based healing from many miles away without ever meeting their “patients.” Here’s the truth: While it is often put forth in scientific-sounding language, Reiki is utter gibberish that is meant to confuse those who don’t understand the words being used. Your body has no special energy field, and it doesn’t emit any magnetic force.[3] You have energy to perform tasks, but that does not create a specific energy field. It is just a measure of how much your body is able to accomplish in a given time. Whenever someone starts talking about the electromagnetism or heat emanating from your body and how they can use that to balance your energies, you know that you are talking to a fraudster or, at the very least, someone who is extremely delusional.

Halitosis Is A Made-Up Disease Imagined By Listerine To Sell More Product

Some of you may have heard of the condition known as halitosis (aka bad breath). Most people take this for granted as a serious problem these days and will often have various levels of social anxiety about having potential bad breath. In fact, some individuals have worried enough to go to doctors about the issue. Many keep their Listerine handy and even some breath mints to avoid such an embarrassing problem. However, not long ago, people didn’t worry about bad breath. As it doesn’t cause pain or any life-threatening symptoms, they figured it couldn’t be considered a real disease. And they were right. It isn’t. During its early days, the company that made Listerine, which has been around since the late 1800s, was selling a decent amount of product to sterilize wounds, especially in the mouth. But they felt that they were not selling enough. So they made up the term “halitosis”[4] and started a marketing blitz to play on people’s insecurities.


It worked beautifully. People were so convinced by Listerine's ploy that almost no one wants to be caught with bad breath anymore. Unfortunately for Listerine's profits, a lot of people these days just carry some breath mints and brush their teeth regularly.

‘Detoxing’ Your Body Is Not A Real Thing—It Is Pushed By People Trying To Sell Useless Products

In health aisles and on health blogs, you will often see detox supplements or detox diets meant to quickly flush all the toxins out of your body so that you are clean and ready to go. The people who push these products sound the alarm about the buildup of bad things in your body and suggest that whatever you are drinking or eating to detox is nothing less than the precious elixir of life itself. And you can have as much as you need—for a price. As you may have guessed, their statements are not true. Perhaps one day, laws will become stricter and these fraudsters will get what they deserve. There are a few major problems with their claims. First, they don't understand what "detox" means. A detox[5] is used by medical professionals to get an addict to a point where he is safely off a drug. Addicts are often tapered off drugs so that the effects of quitting cold turkey don't kill them, which can happen with some drugs. It has absolutely nothing to do with cleaning out your insides. When discussing detox diets or supplements, the toxins to which people refer are totally nonspecific and don't even exist. The truth is that your body constantly flushes anything poisonous out of you on its own.


A good example of this is when your body slowly works the alcohol out of your system by using your liver. More importantly, if you really think your body is not getting the toxins out and you are becoming severely ill, then you may have something serious like organ failure and you should see a doctor immediately. A detox drink will not save you.

Vaginal Douching Is Completely Unnecessary

Vaginal douching is fairly common among women but actually has a rather short history. It only started to see widespread use in the last couple of centuries. At first, it was meant almost entirely as a method of birth control. Often, this meant vinegar or even harsher chemicals would be used, which could be quite dangerous. Over time, the purpose moved away from birth control and became more about cleaning or odors. Lysol was also heavily advertised in the early 1900s, suggesting that women should be using it to clean themselves. Although people eventually realized that it was a bad idea to apply Lysol to human skin, many women had become convinced that using products to clean their vaginas was something that should be done on a regular basis. However, the truth is that vaginas are self-cleaning and do not need any special products. On top of that, douching can upset the careful chemical balance and lead to an increased risk of infections and other complications. Doctors simply do not recommend doing this at all.[6] But it is an uphill battle to convince people otherwise because so many women have been douching for generations and passing the habit on to their daughters.


Hucksters Are Playing Up The Dangers Of Fluoride And Trying To Sell People Special Water Filters

Fluoride is one of the most controversial substances on the planet. It was made famous in popular culture by movies like Dr. Strangelove, in which a deranged general wants to launch a nuclear attack because he believes the communists are poisoning our precious bodily fluids with fluoride and must be stopped. Many people today feel similarly and think that fluoride is an incredibly dangerous substance that should never have been put anywhere near our drinking water. They cite bogus studies or reviews which claim that fluoride damages children's growing brains, causes cancer, and more. However, these alleged tests never hold up to scrutiny. The oft-cited research that claims neurological damage results from fluoride was simply a review of tests in China, and China has many different factors affecting the quality of its drinking water, not just fluoride. Despite heavy testing all over the world, there is no evidence that fluoride is harmful.[7] The worst it can do is create small, purely cosmetic white spots on your teeth if you get too much over time. However, there are people who try to make a buck off fluoride fears, and they will defend the claims of harm to the end. Some sites even charge exorbitant sums for special water filters that supposedly remove all of that evil fluoride.


Depression Is A Very Real Disease, But The Majority Of Those Diagnosed Do Not Actually Fit The Bill

Depression and major depressive disorder are absolutely real. There is more than enough scientific evidence to prove it—we are not arguing that. The issue is that many of the individuals diagnosed with depression do not properly fit the criteria. In a study at Johns Hopkins, researchers took a look at nearly 6,000 people who had previously been diagnosed with depression and found that less than 60 percent of them truly qualified as having major depressive disorder.[8] Even worse, antidepressant use in the United States rose by 400 percent in about 20 years, with over 10 percent of the teenage-and-up population taking antidepressants of some kind. Although depression does exist, many people are given drugs that could make them worse while not helping with their real issues. When someone who doesn't qualify as having depression feels sad, it is likely because life is difficult. These people may need to talk with a counselor. But taking unneeded drugs could harm the chemical balance of their brains over time. The reason for this is the risk of serotonin syndrome, given the main way that depression is treated. Most antidepressants help you produce more serotonin, a feel-good chemical in your brain that those with depression have trouble producing. However, if you have too much serotonin in your system over time or all at once, it can actually damage your ability to properly produce it. Excessive serotonin can also cause seizures in extreme cases. Although extreme reactions are only likely to happen if you overdose, taking antidepressants over time when you don't need them cannot be good for your chemical balance. You are basically tipping the scales in the wrong direction.


Trypophobia Is Not An Official Disorder And Is Very Played Up By Peer Pressure

Photo credit: Peripitus

Recently, a ridiculous new condition known as trypophobia has arisen on the web. While no one seems to have found a way to profit from it yet, you can bet someone will try as soon as they figure out a way. If nothing else, some web administrators have received decent traffic from making a huge deal about this supposed condition.

Trypophobia is the alleged fear of clustered holes. Some people claim that these holes freak them out and fill them with a horrible sense of revulsion. Supposedly, their skin gets itchy or they feel panicky or nauseous when looking at clustered holes. However, there is little reason to believe that this is a real condition. No professional psychologist or doctor of any kind recognizes this phobia or condition. The handful of studies that have been performed were small-scale and hardly conclusive of anything solid. Carol Mathews, a psychiatrist at the University of California, talked to NPR about the phenomenon. She believes that it isn't a true fear but simply a combination of priming, disgust, and people's good old "me too" social attitudes (aka peer pressure).[9] Trypophobia pictures are nearly always paired with images that most people would find disgusting whether they had clustered holes or not. This primes the brain—along with being told that trypophobia is a thing—to feel disgust or revulsion when you look at other such images. Mathews also pointed out that many of these pictures, such as those of sliced cantaloupes, might gross out any of us if we look at them too long. But that doesn't mean we actually have a condition or an instinctual revulsion. Disgust is not the same thing as fear.

Showering On A Daily Basis Can Be Bad For Your Health—It Is More About Smell And Expectations

Showering daily, sometimes more than once a day, or at least once a week is a habit ingrained so deeply in so many individuals in modern societies that the idea of not doing it is utterly foreign. Many people just cannot imagine a life without regular showers or baths. They spend a lot of money on shampoos and conditioners over the years. However, there is some reason to believe that the shampoo and conditioner companies are really selling you a big, fat load of social insecurity while happily taking your money to the bank. There appears to be no scientific basis for believing that showering is good for your health in the slightest. In fact, all evidence, while not as strong as some would like, points to the contrary. The research suggests that showers are actually bad for you because they constantly kill off healthy skin bacteria and mess with the delicate microbial balance that keeps you safe from diseases and other problems.[10] Frequent bathing was not widespread until the more modern world emerged. Even then, the kinds of shampoos that we use now became common only recently. In most people’s eyes, there was never any real need to shower so often until clever marketers convinced people that their natural odor was socially unacceptable. These marketers wanted to sell shampoos and other products like deodorant, so they created an industry that is now worth billions of dollars a year.



Top 10 Practical Ways Women Used To Handle Menstruation
HANNAH JANSSEN

These days, there are numerous commercials of girls diving into pools or frolicking through wildflower fields, all because they can, thanks to their awesome tampons and other feminine products. But the first pads were not invented until 1888, and even then, they weren't an everyday item. Women often could not afford the product, and it wasn't until much later that pads became more affordable and common. Tampons didn't come along until 1929. So what did women do before then?

Rags

Photo credit: Edal Anton Lefterov

Rags are an obvious stand-in for a pad. Fabric is absorbent, relatively long-lasting, and abundant. Since at least the 10th century, women used rags or some kind of cloth to absorb the flow. These were also reusable; once the rags had done their job, women would just wash them. This practice lasted up until the 19th century at least, since that is when the pad was invented. Of course, since not all women could afford pads at the time of their invention, it's likely, if not certain, that women continued to use rags well into the 20th century.


Papyrus

Photo credit: Wikimedia

The Ancient Egyptians supposedly used softened papyrus as a tampon. Papyrus is a plant that grows naturally in Egypt and was used for numerous purposes in antiquity, primarily as paper for writing. To make papyrus pliable, women would simply soak it in water. The water softened it and brought out a natural stickiness that held multiple pieces together. Its pliability and softness, as well as its abundance, certainly make it a decent tampon material. Unfortunately, we cannot know for sure that this was done. Since ancient texts were often written on papyrus itself, which is a very vulnerable material, any documentation of such a thing, if there ever was any, is lost.

Wool


In Ancient Greece, wool was supposedly used as a tampon. Typically, ancient practices must be inferred from evidence and reasoning, but in this case, there is a better record: the wool tampon appears as a treatment written by (or on behalf of) Hippocrates. Wool tampons are also logical, as wool was a resource the Greeks had. Hippocrates was a physician from Ancient Greece during the fifth and fourth centuries BC, and he is considered the father of medicine. He has many written works that describe his plethora of diagnoses and "discoveries." Some are groundbreaking, but others are not entirely so, as modern medicine has shown. For example, he said that fat women could not conceive because their fat crushed down on the uterus and that the only way to conceive, therefore, was to lose the weight. Of course, without proper technology and understanding, it is reasonable that someone of his time would deduce something that, in today's world, is outrageous.

Cedar Bark

Photo credit: Wikimedia

Cedar bark, as painful as that might sound, was used by Native American women as a menstrual pad and even as diapers. Typically, when we think of bark, we think of the rough, hard outside of a tree, and cedar bark is indeed hard while on the tree. However, it has a few special properties that could make it a decent, though still not very comfortable, pad. First, cedar bark is very lightweight and thin. Second, and most importantly, it is absorbent. Its moisture retention and lightness make cedar a good candidate for a pad or diaper, especially with relatively limited resources.

Buffalo Hide

Buffalo hide was used by the Arikara women as a sanitary pad.

Photo credit: George Catlin

The Arikara tribe, related linguistically to the better-known Pawnee, lived in the northern United States in North Dakota, Montana, and parts of Wyoming. Buffalo had a multitude of uses in Native American life. Of course, the meat was used for food, but the other parts of the beast were far from wasted. The bones were made into knives and tools or even boiled for glue. The hooves and horns were used for cups or other vessels. The sinews became bowstrings and threads for sewing clothes. Clothes were made from the buffalo skin, as were tipis and bags, among other useful items, including sanitary pads. Tanning buffalo hides involves soaking and scraping. The skin, fresh off the buffalo, is soaked in water, stretched, and then scraped to remove the hair, and this cycle of soaking, stretching, and scraping continues until the hide is clean. Then it is time to dry it. To dry the hide, the Native Americans would smoke it; that is, they would hold it over a fire and let the heat dry it while properties from the smoke enhanced the hide. By the end of this process, the skin becomes relatively soft and pliable. This would make it a decent menstrual pad, especially compared to cedar bark.


Natural Sponges

Photo credit: Kirt L. Onthank

In ancient times, women in coastal areas, like Greece, used natural sea sponges as tampons. Sponges, as we all know, are very absorbent. Whether using a sponge straight from the sea is safe is worth questioning, though.

Since this was thousands of years ago, there is little information on this topic available, so it is difficult, if not impossible, to know what harm using the sponges might have caused and whether the sponges were even treated in any way before they were used. Today, however, this has been looked at, and the verdict is that sponges might not be so safe. With increased fear of toxic shock syndrome, the use of sponges in modern times grew, so the Food and Drug Administration (FDA) stepped in and, after analyzing scientific studies, declared them "significant risk devices" due to bacteria, yeast, and other harms. Despite this, these sponges are still sold by numerous companies and used, and technology has advanced since the FDA made that statement in 1995. With more advanced technology come more thorough cleaning and disinfecting processes, so the risks might be lower. The ancient Mediterranean women, however, did not have the FDA, nor did they have much to clean the sponges with, except perhaps boiling water. If these sponges are risky in today's world, they were most likely much riskier thousands of years ago.

Grass

Grass was used in some form—a pad or a tampon—by women in Africa as well as Australia. The first form, a pad, was simply a bandage of sorts made of grass and vegetable fiber. Vegetable fibers are materials like flax or cotton that go into making fabrics. The tampons were made by constructing rolls of grass and roots.


The use of grass in either form could not have been very pleasant. Some species of grass, like carpet grass, can be soft enough to be suitable. Others, perhaps more often than not, are itchy, rough, dry, or painful, and Africa has many of these, like nine-awned grass. Of course, some grasses are not as pointy and painful, given how many animals graze on them. Nevertheless, grass in any form was probably not ideal for menstrual care. It is fair to note, however, that menstrual care in Africa is still lacking. In many places, women still have to use rags and rewash them daily. Sometimes the rags don't dry completely by the time they need to be used, so bacterial infections and other diseases can occur. Still others have to resort to leaves or paper.

Paper

Photo credit: Wikimedia

In Ancient Japan, women would supposedly use rolls of paper as a tampon and bandage them in place. The paper was held by a bandage called kama (totally unrelated to the Hindu text Kama Sutra). Understandably, this device had to be changed an average of about 10 times a day.

Paper in Japan at this time, though, was surprisingly durable and absorbent as far as paper goes. Good-quality paper, called washi, was being made in Japan at an unmatched pace by AD 800. This paper was made of plant fibers which were left long during production instead of being crushed up as in Western paper. Between the production method and the sheer nature of the plants, this paper was relatively strong, absorbent, and lightweight. These qualities are superior to today's paper, so if these women were changing 8 to 12 times a day, imagine how often women would have to change with today's inferior paper.


Rabbit Fur

Photo credit: Wikimedia

Supposedly, women used rabbit fur back in the day as a menstrual pad. The claim appears in multiple contexts, but sources are very limited, and few verify it. As frequently as it is written in passing, there might be some weight to it, but take it with a grain of salt. Just because there is a lack of sourcing does not necessarily mean it is not possible. Native American, African, and other cultures certainly used the fur of rabbits and many other animals for a variety of purposes, like clothing and blankets. Given how soft and pliable the fur is, it would not be surprising if women chose to use it to catch their monthly flow, but we do not know for sure.

Nothing!

Photo credit: People

In 19th-century Europe, women often just let nature run its course. This was due mostly to the fact that, first, commercial products had not been invented yet, so there was nothing to buy. Second, even if there had been, many women could not have afforded it. Third, they could have used a homemade pad, but rags and sheets might have been too costly to spare. In this case, it was mostly poor women who opted out of any kind of pad. This practice can't be centered on just 19th-century women, either. Common sense leaves room for these circumstances to apply to women throughout the ages. Even today, any woman can get in a bind and have no choice but to free bleed. In fact, some women even do it intentionally.


Top 10 Fascinating Things That Happen To You When You Sleep
DAMIAN COBURN

For most people, the word “sleep” conjures up peace and relaxation after a long day. However, your body undergoes many changes during the hours that you are unconscious. From undoing the day’s harm to preparing for the day ahead, the human body is constantly sorting, repairing, and refreshing. So next time you wake up feeling like a different person, know that there may be some truth to the thought.

Filing Away Memories

Human beings are very active organisms. We do things, go places, and interact with people, all the while creating memories. Memories are stored in the brain. But as anybody who has ever tried to find a file on a crowded desktop knows, saved things are of no use if they cannot be easily accessed and found. During sleep, the brain actually replays the day's events, catalogs them, and stores them in the brain's long-term memory centers. Simultaneously, the brain discards unnecessary memories. This storing of important long-term memories is critical to humans being able to function properly. This is because long-term memory is essentially limitless and set in stone, meaning that these memories will stay with you for your lifetime. Most people can remember specific memories from their childhood very vividly but have a much harder time remembering exactly what they did two days ago. This kind of memory prioritization is essential for learning and retaining information as well as developing skills like problem-solving or mastery of a sport or game. Most of this memory consolidation occurs during one of the deepest parts of sleep, slow-wave sleep, in which there is little other brain activity. As the mind transitions into REM (rapid eye movement) sleep, the brain then stabilizes the critical memories for quick recall later.

Blood Pressure And Core Temperature Drop

About 30 minutes before you fall asleep, your body begins to lower its core temperature. This slows your metabolism to the point where you can sleep for hours without going hungry. As a result, heart rate and blood pressure also decrease. While this may not seem like a big deal, your core body temperature drops by more than 1.1 degrees Celsius (2 °F) to around 35.6 degrees Celsius (96 °F), roughly one degree above hypothermia. Your body also needs less energy, so you are in no danger of freezing to death while taking a nap. Upon waking up, blood pressure and heart rate rapidly rise again to keep up with the demand for energy. But for a brief period of time, there is an imbalance, resulting in the sluggishness and clouded thinking experienced by people who have just awakened.


Paralysis

Photo credit: thesleepparalysisproject.org

Ever had a nightmare and found you couldn’t run or scream? While this can be terrifying, this phenomenon (known as “sleep paralysis”) is what keeps you from acting out all your dreams in real life (which is probably a good thing). During REM sleep in which dreams occur, the brain blocks neurotransmitters and receptors in muscles, effectively paralyzing you.

Occasionally, this can happen briefly when falling asleep or waking up, during which you are fully conscious but unable to move at all. This state is quite terrifying. It is also the root of many ancient legends, usually involving hallucinations, in which people are visited by demons or other creatures (from the Old Hag in Anglo-Saxon tradition to similar figures in Chinese folklore) and are unable to move.

Stretching

Throughout the day, downward pressure is exerted on the spine due to gravity and your vertebrae compress. As a result, fluid drains out from between your vertebral discs and you actually shrink up to 1 centimeter (0.4 in) by the end of the day. Similarly, when your back is relieved of all that stress at night, the fluid is allowed back into the joint connections, permitting the body to stretch an extra 1 centimeter (0.4 in) or so.


Although this height difference is not that significant, the lack of pressure also enables children and adolescents to grow while sleeping. In fact, we can only grow while asleep. This is because of both the pressure that is taken off the spine and legs when lying down and the growth hormones that are released while sleeping.

Sleepwalking

Although not everyone sleepwalks, enough of the general population (around 30 percent) has sleepwalked at least once in their lives to include it on this list. Also called somnambulism, sleepwalking is technically a sleep disorder in which the brain is in a semiconscious state, performing complex tasks such as getting out of bed, going to the kitchen, even driving. Obviously, this can be very dangerous. But sleepwalking is relatively common, especially among children. Parents, roommates, and friends often report that the somnambulist in question will act dazed and confused while performing bizarre behaviors such as preparing a meal, only to return to bed. Scientists still are not sure why people sleepwalk, although research has shown that it may be genetic. Sleepwalking usually occurs during slow-wave sleep, in which the brain is busy processing the day’s memories. This may explain why a sleepwalker’s short-term memory is not very active while in this trancelike state. In fact, the person in question will have no memory whatsoever of the previous night’s events when they wake up in the morning.


Body Spasms

Photo credit: buzztache.com

When you fall asleep, your body jerks. Every time. As described above, we are usually paralyzed while asleep to protect us from acting out our dreams. However, there is a gray area, a moment when the body is not physically asleep but not awake, either. This is when most people experience what is called a hypnic jerk. It is believed to be a delay between the brain sending the message to relax and the nervous system getting this message.

We do not fully understand how this reaction came to be. Some scientists suggest that it is left over from a primitive reflex that misinterprets falling asleep as falling out of a tree. Others say that it is just the nerves "misfiring" as they are turned off. Whatever the cause, hypnic jerks are one of the few reminders of the complex processes that occur while asleep that we can actually observe while awake. This is because a hypnic jerk upon falling asleep can be so violent in some cases that it actually wakes a person back up.

Brain Uses More Energy

The majority of energy produced while awake (about 80 percent) is used by various physical activities such as movement, breathing, and speaking. While asleep, this energy is obviously not being used, and the "energy surplus" is diverted to the brain. This means that the brain's energy consumption is actually higher in certain stages of sleep, such as REM, than it is while awake. This energy is put to good use, completing secretarial tasks that are backlogged while awake, such as creating and strengthening neural connections and removing waste products. The brain is too busy during the day with more urgent and energy-hungry tasks like decision-making for these other activities to occur. During sleep, however, the brain has some "free time" to tidy up.

Lose Weight

Ever wake up to find you are suddenly very thirsty? This is because your body actually loses more than 0.5 kilograms (1 lb) of water to the surrounding air at night. Think about it this way: The air inside your lungs is hot—about 36.7 degrees Celsius (98 °F)—and filled with moisture. Since most people's rooms are much cooler than that, the air you breathe out as you sleep cools and sheds its moisture, carrying water out of your body with every breath. The weight of the lost water is minuscule, only about 0.02 grams per breath. But over the course of the night, this can add up to more than 0.5 kilograms (1 lb) of lost weight. CO2 has a similar but lesser effect. Everyone knows that you breathe in oxygen (O2, two atoms per molecule) and breathe out carbon dioxide (CO2, three atoms per molecule). As one more atom is coming out than going in, an infinitesimal amount of mass is lost each time you take a breath. However, there are about a billion trillion carbon atoms in each breath you exhale, so this adds up to about 0.7 kilograms (1.5 lb) every night. This happens during the day, too, but then you more than replace the water and carbon through food and drink.
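Those totals come down to simple multiplication: breaths per night times mass lost per breath. As a rough, illustrative sketch only, here is a small Python calculation; the breathing rate, sleep duration, and per-breath carbon figure are assumptions chosen for this example (only the ~0.02 g of water per breath comes from the text above), so the output is an order-of-magnitude estimate rather than a confirmation of the figures quoted.

# Back-of-envelope estimate of overnight mass loss through breathing.
# All constants below are assumptions for illustration, except the
# per-breath water figure, which is the one quoted in the article.

BREATHS_PER_MINUTE = 14      # assumed typical resting respiratory rate
HOURS_ASLEEP = 8             # assumed length of a night's sleep
WATER_PER_BREATH_G = 0.02    # grams of water lost per exhale (from the article)
CARBON_PER_BREATH_G = 0.03   # grams of carbon exhaled as CO2 per breath (assumed)

breaths = BREATHS_PER_MINUTE * 60 * HOURS_ASLEEP
water_kg = breaths * WATER_PER_BREATH_G / 1000
carbon_kg = breaths * CARBON_PER_BREATH_G / 1000

print(f"Breaths overnight: {breaths}")
print(f"Water lost: {water_kg:.2f} kg, carbon lost: {carbon_kg:.2f} kg")
# Every input scales the result linearly, so modest changes in breathing
# rate or per-breath loss shift the overnight total considerably.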


Brain Cleansing

Photo credit: nih.gov

During waking hours, toxins and other waste products accumulate throughout the cells of the brain and body. As the rest of the body shuts down when you go to sleep, your brain gets to work. Essentially, it opens a valve that allows cerebrospinal fluid (pictured above) to flow from your spine into your brain, rinsing the tissue and taking all the toxins with it.

This process is part of a larger cycle known as cellular respiration, a series of reactions that help cells create energy from nutrients and keep the body running. The toxins removed at night are "leftovers" from this process. Although this cleansing occurs throughout the body, its effects are most noticeable in the brain, where inadequate sleep has clear consequences. The gunk left behind is one of the main reasons you often feel lousy after a night of too little sleep.

Dreams

How could we forget dreams? They are such a mundane but essential part of life, and yet scientists still understand almost nothing about them. This includes the question: Why do we even dream? If one takes a step back and thinks about it, dreams are quite strange. Every night when your body falls unconscious, your brain conjures up an imaginary reality that exists only in your head but that you think is real. When you wake up, almost all of it disappears from your memory. Despite how strange it sounds, dreams are taken for granted as a fact of life, much like brushing your teeth or going to work. Although the actual need for dreams remains unclear, the related activities that occur during REM sleep are understood and plentiful: long-term memory strengthening, flushing the brain of toxins, increased "secretarial" work, and so on. This makes it even stranger that we know so little about dreams themselves. These questions are not anything new. Speculation on the cause and meaning of dreams has been a fixture of human fascination for thousands of years, going at least as far back as the ancient Greeks and Egyptians. Despite the modern technology at our disposal—from MRI scanners to EEG detectors—theories about the origins and purpose of these mysterious experiences will, at least for now, remain theories.


10 Disgusting Beauty Treatments
BEN GAZUR

How far are you willing to go to get the hot new look? Much of human history has been driven by the desire to present ourselves in our best light. As it turns out, there’s not much that people won’t do in their struggle to be attractive. Here are ten of the grossest things people have tried in the name of beauty.

Eel Exfoliation Bath

Photo credit: Christopher Jones/REX via The Guardian

Some very expensive bubble baths leave you feeling slimy after you get out of the tub. Their manufacturers prefer to say “moisturized”—but for one treatment, “slimy” is definitely the word. To get that perfect glow in their skin, some people in China are taking baths filled with tiny eels.

Each eel is about the size of a pencil. They wriggle over the body and nibble at the dead skin covering it, leaving the youthful-looking living skin underneath shining through. Unfortunately, the eels haven't learned to differentiate between the skin on the outside and the internal membranes. A gentleman wearing loose underwear in the eel bath felt a sharp pain and found that an eel had found its way into his penis. It took a three-hour surgery to remove the errant fish, and the incident prompted other countries to watch out for those looking to import the eels.


Radiation Therapy

Photo via Cosmetics and Skin

Whenever a scientific discovery is made, there's always a scramble to find a marketable use for it. Sometimes, this benefits mankind; sometimes, it leads people to smear their faces with radioactive elements. The eerie glow of radiation made newly documented elements like radium and polonium seem like ideal boosters for that healthy glow everyone wants. The mysterious radiation rays were quickly touted as a cure for medical ailments and were also used in cosmetics. Radiation was added to face creams, soap, rouge, and powders. For those wanting extra-sparkly teeth, there was radioactive toothpaste. Of course, the downside to these radioactive products was that instead of health, they caused untold numbers of cancers. Sores and hair loss are common side effects of radiation poisoning as well. Those who worked with radium developed bone necrosis and incurable cancers. This soon put an end to the fad for radioactive products.

Lead, Arsenic, And Mercury

While today's must-have is a tan, which suggests we have the leisure to lounge on a beach, in the past, the reverse was the case. A tan showed that you spent your days toiling outside; to cultivate a pale face, you needed the money to stay indoors. Or you could take the shortcut of covering your face in white lead, as people have throughout history. The problems with this were not unknown even then. The lead would rot the skin, requiring ever more to be used to cover up the effects of the treatment. The skin would break open as it thinned. Lead also causes aggressiveness, headaches, vomiting, seizures, and eventually death—but at least you'd have the perfect pale flesh. Those wanting to remove spots, freckles, and other skin problems could turn to another dangerous element. "Dr. James P. Campbell's Safe Arsenic Complexion Wafers" promised to clear the complexion. Ironically, one of the organs hit hardest by arsenic poisoning is the skin. An overdose of arsenic can lead to hair loss, bloody vomit, diarrhea, and convulsions. But at least you won't have freckles. While those products are thankfully things of the past, there is a current vogue for appearing with as light skin as possible. Skin-lightening products very often contain mercury, an element that can cause many horrible side effects, especially kidney problems. You may end up with a fashionably Western face but a deeply unfashionable dialysis machine.

Bee Stings

You should always be wary of following celebrity advice. Being pretty and a good actor does not make you a doctor. Gwyneth Paltrow has gone on record about her beauty treatments, and one of them has a sting in the tail. The actress told an interviewer: "I've been stung by bees. It's a thousands of years old treatment called apitherapy. People use it to get rid of inflammation and scarring. It's actually pretty incredible if you research it. But, man, it's painful." The treatment is painful for Paltrow, but it can be deadly, and not just to the bees. One case of liver failure has been linked to apitherapy.


Those unwilling to be actually stung by bees can simply buy cosmetic products containing the venom. Not that it will do anything except speed the decline in bee populations and your bank balance.

Fish Pedicure

Photo credit: BeautySchool.com

Fish aren't picky eaters. If you dip your toes into a tank of Garra fish, they will quickly dart in and nibble off the dead skin from even the smelliest feet. They are toothless and usually stop at eating the tough outer skin, but there are reports of them taking their feeding frenzy too far and causing bleeding. While the risk of getting an infection from the fish is thought to be very low, there are dangers to dangling your feet in water used by both fish and other clients. Fungal infections could easily spread, and bacteria in the fish tank could cause boils. Leaving aside the "ick" factor of being eaten by fish that live solely on other people's stinky feet, some jurisdictions deem the fish to be unsanitary. One Arizona fish spa was shut down because cosmetic products had to be disinfected and dried before reuse—obviously not an option with fish.

Tapeworm Diets


Losing weight should be easy: if you burn more calories than you consume, your weight goes down. But food is so delicious that many find reducing their caloric intake too hard to do. So they look for other ways to tip the calorie balance (other than exercising, obviously). One is to get yourself a friendly tapeworm to take up residence in your intestines. This worm will eat a portion of your food, and it, rather than your waistline, will grow. While there is evidence that people in the past sold pills which supposedly contained tapeworms, there are recent cases of people actually going through with it. Tapeworm infections cause weight loss and loss of appetite—but also pain, malnutrition, diarrhea, blindness, convulsions, and death.

Placenta

The placenta is an organ that develops in pregnant mammals to pass oxygen and nutrients to the embryo and remove waste products. Most animals will eat the placenta after birth to regain its nutrients. Some humans do, too. Some aren't content with leaving it at that, though. Because of its association with youth and birth, placenta is included in some face creams in the hope of passing some of that goodness on to the user. Unfortunately, there is no evidence that they work. The most common form of placenta used comes from sheep, but those with deep pockets and no gag reflex can also buy creams with human placenta.


While placenta may not do much for the face, the side effects can be alarming. Its use in hair products has caused girls as young as 14 months old to begin to develop sexually due to the level of hormones it contains. Stopping use of the products reversed the effects.

Snails

Photo credit: Dennis Gray/AP via The Guardian

Snail facials are a beauty treatment in which live snails glide across your face. The trail of gel they leave behind is said to fight the signs of aging. Snails have been used for thousands of years to treat inflammation, so there may be something to it. Snail facial specialists say they reduce scars, acne, and stretch marks. There is no scientific confirmation of this, however—so perhaps wait before you plop some snails on your face. For those who cannot stand the idea of snails rasping at their skin with their radula (a toothed, tongue-like structure), there are creams that contain snail gel. The snails exude the gel, which is said to be more effective when they are stressed. There is no information on how the makers of snail gel cream go about stressing out their snails, but it’s unlikely these creams are animal-friendly.

Bull Semen

In the crowded marketplace of cosmetics, you need something that separates you from the competition. One way of doing that is to reveal that the magic ingredient in your face mask and hair conditioner is bull semen.


The inventor of the semen hair treatment had been looking for a high-protein recipe and apparently thought the best option would be bull sperm. She reassures her customers by saying, “It really works. The semen is refrigerated before use and doesn’t smell. It leaves your hair looking wonderfully soft and thick.” If you want the glamorous look without the mental images, Imprivo makes a range of products containing the coyly named BSP (Bull Seminal Plasma).

Foreskin Facials

“As smooth as a baby’s buttocks” is a common expression. Some cosmetics companies have taken that cliche and run with it. Taking the foreskins left over after baby boys are circumcised, they have turned human flesh into cash in the bank. There are face creams which use the cells from foreskins to enrich their products with growth factors, collagen, and other proteins that are claimed to reverse the signs of aging. Because the foreskin contains stem cells, a single foreskin can be grown in the lab to produce enough cells for thousands of treatments. This has not reduced the controversy around using them as an ingredient, especially among those who see circumcision as a form of genital mutilation. For those who want the direct benefits of stem cells, it is now possible to have cells derived from foreskins injected into your face. The fibroblast cells are claimed to reinforce the structure of the skin, and users say they detect improvements in their appearance. With each vial of cells costing around $1,000, though, you might hope for more dramatic results.


10 Mutant Genes That Could Make You Superhuman IAN MONTGOMERY

At some point, everyone has fantasized about being a superhero. Sure, the tights may be snug in some awkward places, but that’s a small price to pay to wield amazing powers against the forces of evil. Well, luckily, those dreams may be close to becoming reality. Geneticists are tracking down specific mutations responsible for some truly incredible abilities. From immunity to electricity to Hulk-like strength, these tremendous talents may soon be as normal as eating and breathing.

Unbreakable Bones

A broken bone is a fantastic way to ruin your entire day (your entire several months, really). Despite being one of the hardest substances in the human body, bone is definitely not invulnerable. Unless you find yourself with an extremely rare mutation of the LRP5 gene, that is. LRP5 is responsible for the density of your bones. Researchers have known for a while that mutations in this gene can lead to lower bone density, or osteoporosis. However, it has recently been found that they can lead to the opposite effect as well. One Connecticut family has been discovered to have LRP5 mutations that give their bones such incredible density that they are nearly unbreakable. Not one of them has ever suffered a broken bone. Concentrated in the spine, skull, and pelvis, this increased density gives members of the family the strongest known skeletons on Earth.


Researchers believe this mutation causes too many “bone-growth signals” to be sent, leading to beefier bones and a potential superhero. It is hoped that one day, a controlled form of the mutant gene could be used to end bone disease.

Super Speed

We all have the natural ability to run, if not always the will. Yet some people seem naturally better at this incredibly basic skill. Sure, it could be down to training or steroids, but geneticists believe the real answer lies elsewhere. It turns out naturally gifted runners may be more fit for the X-Men than the Olympics. The gene ACTN3 is present within every human body, but in a small percentage of people it carries a variant that helps produce a very special substance. This protein, alpha-actinin-3, is responsible for controlling the fast-twitch muscle fibers that allow us to run. Increased amounts lead to more explosive bursts of muscle power, which translate into better performance in all kinds of sports, especially sprinting. Interestingly, there are two versions of this mutant gene, and athletes with both have been found to perform consistently better than their conventionally chromosomed counterparts. We may be on the brink of a new age of performance enhancement.

Poison Immunity

When it comes to poison, the human body gets very fragile very quickly; a single drop of something like cyanide or ricin, and it’s all over. Whether accidentally ingested or snuck into our food, these sinister substances are hard to defend against. But for millennia, the villagers of San Antonio de los Cobres in Argentina have been sipping mountain water laced with 80 times the safe level of arsenic. And, surprisingly, there are still villagers in San Antonio de los Cobres. Despite extreme daily exposure to the deadly element, residents carry on completely unaffected. This is all down to a mutant gene that has been honed by thousands of years of natural selection. AS3MT is the name of this South American savior. It allows the body to process arsenic rather than let it build to dangerous levels, so owners of these microscopic mutants can chow down on as much of the stuff as they want. It is estimated that a total of 6,000 people now possess this gene.

Short-Sleeping

The life of a superhero is not an easy one. Working your mundane cover job during the day and prowling the rooftops by night doesn’t leave a lot of time for sleep. But if you’re lucky, your host of mutant abilities includes short-sleeping. The genes involved in the process of sleep are many and incredibly complex. One, however, stands out to researchers. DEC2 is responsible for regulating the amount of sleep we need each night to properly function. For most of us it demands a full eight hours or more, but about 5 percent of the population enjoy a slightly different mutant version. Tests on a mother-daughter pair with the mutation have revealed an ability to sleep just 4–6 hours each night. Mere mortals would begin to experience negative effects after just a few days of this, but these mutants function completely normally. Researchers are working to replicate this mutation in the hope of allowing crime fighters and the exceptionally busy to waste less time unconscious.


Shock-Proof Skin

Photo credit: odditycentral.com

Electricity is one of the most dangerous things we encounter each day. We tend not to think much about it because we’re so used to it, but we are almost always surrounded by more than enough power to kill us in an instant. That’s never bothered Serbian man Slavisa Pajkic, though, whose unique genetic makeup makes him largely immune to electricity.

The typical human is covered with millions of sweat glands, which normally give electric shocks a nice wet path directly into our bodies. Pajkic, on the other hand, has no sweat or salivary glands due to a rare genetic condition. This means electricity has no way of penetrating his body, so it skims harmlessly over his skin into whatever he happens to be holding. This unique talent has earned Pajkic the title of “Battery Man.” Able to cook food, boil water, and even set things on fire by passing electricity over his body, he has set a handful of records and appeared on several TV shows. He also uses his gift to treat (or at least claims to treat) various ailments like migraines and back pain in his native Serbia.

Super Partying

The lifestyle of rich and famous musicians is notorious for being pretty hard on the body. Countless lives have been snuffed out prematurely due to the stresses of constant drug and alcohol use. But bizarrely, one of the men associated most strongly with this life of dangerous excess has been going strong for decades. Rocker Ozzy Osbourne’s ironically long life may be due to more than simple luck.


Researchers analyzing the legendary musician’s genetic code have recently discovered a staggering number of mutant genes. Most of them have to do with the way the body breaks down alcohol and various other chemicals. For instance, a mutation of the ADH4 gene gives him increased amounts of proteins that remove alcohol from the body. Genetic variations like this help explain Osbourne’s continued existence despite the “swimming pools of booze . . . cocaine, morphine, sleeping pills, cough syrup, LSD, Rohypnol” he has admitted to indulging in over the years.

Metal Munching

Photo via modernnotion.com

There comes a moment in every superhero’s career when defeat looks certain. The villain’s doomsday device is ticking away, and all hope seems lost. But luckily, the hero still has a genetic variation or two up his sleeve. He grabs it, adds a dash of salt, and crams it down his mighty throat. The day is saved.

But only if his name is Michel Lotito. This phenomenal French showman spent his life chowing down on absolutely everything as the amazing Monsieur Mangetout—Mr. Eats-All. Televisions, shopping carts, beds, and even an entire airplane made their way through his super-powered stomach. Swallowing shards of glass and twisted scraps of metal would of course kill most people, but Lotito survived his deadly eating habits for decades. It is believed that Lotito’s stomach-turning talent was the result of a very specific genetic defect. Born with bizarrely thick lining in his stomach and intestines, his system was durable enough to avoid the inevitable shredding most people would experience. A few gulps of lubricating mineral oil were the only safety precaution he required.


Super Flexibility

Photo credit: Wikimedia

Inhuman flexibility has been a comic book staple for years. This is mostly because the ability to warp the body into bizarre and amazing shapes is dramatic and visually stunning. But its pop culture popularity may lead some to think of this talent as mere fiction. It isn’t.

For those born with the genetic condition known as Marfan Syndrome, tendons and ligaments may as well be rubber bands. Mutations in the gene responsible for producing the protein fibrillin-1 cause the body to create connective tissues with superhuman flexibility. Selectively dislocating joints to twist the body into shapes worthy of Mr. Fantastic is nothing for the typical Marfan sufferer. Of course, this gift comes with a price. Sufferers may develop unnaturally long limbs and facial disfigurements. Problems with the skeleton, nervous system, and even the heart can also arise from the genetic defect, some of which can be fatal.

Super Strength

Super strength is the quintessential super power; it has been the go-to ability for comic book writers since Superman first flew onto the scene. The raw power it represents is fascinating, leaving many yearning for rippling muscles and brute strength with absolutely no effort. Well, good news: It’s possible. Bad news: You have to be born with it. For a lucky few born with mutations of the gene responsible for producing the protein myostatin, the dream of effortless abs is a reality. Myostatin tells the body to stop producing muscle when enough has been created, but defective genes keep it from being made. This means muscles automatically grow to twice the average size, while fat deposits are halved. A handful of people around the world have this condition, and scientists are trying to harness its power. It is believed that by studying these mutant genes, we may one day be able to cure muscle conditions like muscular dystrophy.

Immunity To Pain

Pain. Whether by banging your toe on the corner of your bed, cutting yourself shaving, or walking into a door, we all experience this annoying sensation almost daily. Pharmaceutical companies make billions offering us ways to avoid it, but the secret to true painlessness may be hidden in the garbled genes of a very rare few. The SCN11A gene determines the amount of sodium in the body’s cells. This might not sound terribly impressive, until you realize that nerve cells use sodium to decide when to send a pain signal. With the mutant gene lowering sodium levels, nerve cells never have enough to send those signals, rendering the body completely immune to pain. Bizarrely, however, people with this seemingly enviable condition are prone to broken bones and accidental self-mutilation. Without pain to tell them not to do something, they have a tendency to injure themselves, especially as infants. Still, their mutant genes are incredibly rare and valuable, as they may be the key to revolutionary new pain medications.


10 Diseases That Possibly Came From Outer Space RAYMOND GERALD

For years, many scientists have believed that the seeds of life might have ridden to Earth on meteors. This theory is called panspermia, and it answers a lot of questions about our origins. Unfortunately, those same scientists believe that alien bacteria and viruses may still be raining down on us. These microscopic invaders have been blamed for all sorts of illnesses—from the everyday to the completely bizarre. Germophobes beware.

Pandoravirus

Photo credit: sci-news.com

If the terrifying name of this newly discovered virus isn’t enough to worry you, maybe the fact that it is 10 times larger than any ordinary virus will. Discovered by French scientists in 2013, this bizarrely unique microorganism is only found in two places on Earth: off the coast of Chile and in a single Australian pond. It shares only 6 percent of its genetic makeup with the rest of life on Earth.

This has led many to the conclusion that it isn’t actually native to Earth. It may seem silly, but researchers are seriously investigating the possibility that the Pandoravirus is alien in origin. But don’t worry. Even if this scary-sounding creepy-crawly ends up being from Mars, it’s really only harmful if you’re an amoeba.

Mad Cow Disease

Mad cow disease, or bovine spongiform encephalopathy, is a devastating illness that occasionally pops up to cripple the beef industry. Worse yet, it can pass to humans when infected meat is eaten, leading to dementia and even death. Bizarre infectious proteins called prions are responsible for the infection, and some believe that their strangeness isn’t just coincidence. Researchers in India recently announced that the brain-attacking agents may have come from a passing comet. The frozen balls of space dust have been found to contain chemical structures very similar to prions and other microorganisms. Tiny frozen aliens may have been blown out into a comet’s dust trail and left behind to be snagged by the Earth’s gravity. These scientists have since sent balloons into the upper atmosphere to search for the cow-killing germs.

Mutant Salmonella

This one isn’t just possibly from space; it definitely is. In 2006, space shuttle mission STS-115 was performing an experiment: The Salmonella bacterium was grown on board to observe its development in space. Expecting a routine study, the unwitting astronauts were shocked to find themselves plunged into a science fiction plotline. The bacteria grown aboard the spacecraft displayed drastic mutations in a whopping 167 genes, with alterations to the production of 73 proteins. Tests with lab mice showed the mutant germ to be much deadlier than the Earth variety, even when far fewer bacteria were present. Researchers were able to figure out that low gravity was to blame for the extreme transformation, suggesting that any bacterium in space could become a killer. Who knows? Astronauts may accidentally breed an apocalyptic disease in their tube of mashed potatoes.

Spanish Flu

In 1918, the world was in the grip of one of the deadliest epidemics of all time. Called Spanish flu, this disease infected one-third of the Earth’s population and claimed roughly 20 million lives. It was a uniquely deadly strain of a common virus, and English astronomer Sir Fred Hoyle had a theory as to why. Comets, he thought, deposited frozen alien viruses into the atmosphere. They were then blasted down to the Earth’s surface by energy generated by sunspot activity. This activity peaks every 11 years, pushing more of the tiny invaders to ground level. Convincingly, this 11-year cycle was shared by all global flu outbreaks for over 250 years, meaning that the Spanish flu disaster may have been Earth’s first large-scale alien invasion.

Ebola

Photo credit: uk.news.yahoo.com

Ebola is one of the most horrifying diseases imaginable. Causing heavy bleeding and an agonizing end, it really is a fate worse than death. So of course, the infamous 2014 outbreak had the world united in terror—but some for more reasons than others. Ashley Dale of England’s Bristol University had a theory.

Millions of years ago, he thought, the Ebola virus may have arrived on Earth from a meteorite collision. Citing evidence that microscopic life-forms have been shown to survive the vacuum of space, he believes that alien rocks would have made the perfect vehicle. The bizarrely deadly virus would have then adapted to our planet’s conditions before finding the perfect hosts: us. The evidence is shaky, but it does make for a tempting explanation of a disease that seems too terrible to come from boring old Earth.


Super Zika

Photo credit: medicaldaily.com

By now, the entire world has learned to fear the Zika virus. The disturbing birth defects caused by the illness are scary enough, but the virus’ ability to quickly mutate is another cause for concern. It changes faster than we can figure it out due to its ability to absorb foreign DNA. According to University of Buckingham researchers, that’s more dangerous than we realize. They believe that the only explanation for the virus’ random adaptations—such as the sudden ability to transmit via sexual contact—is the absorption of alien DNA. Claiming that alien microbes are constantly being delivered to Earth by space debris, the researchers think Zika is using the extraterrestrial genes to beef itself up. Unless action is taken, they say, the virus will mutate out of control and threaten humanity. So, uh, consider yourself warned.

SARS

In 2002, severe acute respiratory syndrome (SARS) first appeared in China. In no time, it had spread worldwide, keeping people too scared to leave their homes. It was undoubtedly deadly, but it was quickly contained. The sudden appearance of such a uniquely deadly disease gave many people, including scientists at England’s Cardiff Centre for Astrobiology, cause to wonder where it had come from. The scientists suspected that virus-filled space dust could have drifted down through the atmosphere and landed east of the Himalayas, where the stratosphere is at its thinnest.


Then these microbes would have started infecting the locals, whose immune systems would have been defenseless against the alien germs. The theory, while bizarre, accounts for the strangely sudden outbreak and deadliness of the virus.

HIV

When it comes to disease, the human immunodeficiency virus (HIV) takes the nightmare-inducing cake. Responsible for AIDS, HIV is one of the most feared viruses on the planet. However, Chandra Wickramasinghe, a professor at England’s Buckingham Centre for Astrobiology, believes it’s more than just scary. He thinks it’s alien. Wickramasinghe and his team believe that superviruses like HIV are constantly being introduced to the Earth by comet collisions. They even believe that one such virus may have been the real cause of the dinosaurs’ extinction. No terrestrial virus, they argue, could be responsible for such devastation. In the case of HIV, they feel that it was probably a dormant Earth virus which absorbed strands of alien genetic material and became deadly. Of course, this contradicts the accepted theory that the virus originated in nonhuman primates, leaving the professor with few believers.

The Common Cold

That case of the sniffles you deal with several times a year isn’t just annoying; it’s astronomically annoying. At least according to the 1979 book Diseases from Space. Years before Professor Chandra Wickramasinghe formed his theory about HIV, he and astronomer Sir Fred Hoyle were busy penning this exhaustive defense of panspermia. It suggests that most space dust consists of microscopic creatures, going so far as to claim that the common cold is actually an alien invader.


It simply drifted down into the atmosphere and waited for rain or snow to pick it up and carry it to its waiting victims. As difficult as it may be to believe, the research is surprisingly solid and the book makes a great case.

Morgellons Disease

Photo credit: timesocket.com

Imagine the constant, maddening sensation of something crawling beneath your skin. Imagine horrible sores popping up on your body. Now imagine that those sores began to sprout coarse, multicolored threads. Welcome to the mysterious world of Morgellons disease. For years, this mysterious ailment has plagued people around the world. Skin-crawling sensations lead to bizarre fibers growing from the victim’s flesh. Rotting teeth and sleeplessness are also symptoms. Most troubling, though, is that testing of the odd strings reveals that they have no cellular structure and can’t be identified as any known material.

Theories abound about this horrifying illness, with many convinced that it doesn’t come from Earth at all. Some even believe that it could be the result of alien parasites that rode in on the Genesis space probe, which crashed in the Utah desert in 2004. That may be unlikely, but the disease itself is very real and continues to stump experts to this day.


10 Eye-Popping Facts About Vision JANA LOUISE SMIT

There’s more to vision than just looking at stuff. The gift of sight is a complex marvel of mysteries and incredible feats, from colors only certain people can see to colors nobody can see, from the remarkable adaptations of the deaf to the dreams of the colorblind. These are just some of the things that make the window of the soul worth gazing into.

Blue-Eyed Chinese

Photo credit: ADG

A Chinese boy with sky-blue eyes can allegedly see and write in pitch darkness. According to Nong Yousui’s teachers and the reporters who tested him, he can complete a questionnaire in the dark, and when examined with a flashlight, his eyes flare a luminous green. At night, a cat’s eyes reflect light in the same way. This made some believe that Nong was born with an incredible mutation: acute night vision never before seen in humans. If he truly has feline eyes, the flash effect should show up on video—which it doesn’t. Also, scientists dismiss the very idea; any mutation of this caliber doesn’t happen overnight. Though Nong’s uncanny ability is plausible if his eyes have extra light receptors, the whole thing could also be fake. Either way, blue eyes are rare in a Chinese child, and Nong’s remain mesmerizing; scientists believe they could be a form of albinism.

Seeing Stars

Whether you’re seeing stars or flashes, experiencing visual disturbances from migraines, or enjoying a light show after rubbing your eyelids, it’s all caused by the same thing: pressure on or stimulation of the retina.


The eyeball is filled with a thick, gelatinous fluid that keeps the eye round. Sometimes this gel can press against the retina and its center, which is responsible for creating the pictures sent to the brain. It can happen when the eyes are rubbed vigorously, when a powerful sneeze shakes the retina or stimulates the optic nerve, or even when a person stands up too fast. The latter occurs when blood pressure drops and the oxygen-deprived brain affects the visual parts of the eye. Any message from the retina gets interpreted by the brain as light, whether there is actual light involved or not.

The Gender Difference

Men and women look at things differently. Watching the same movie together, men will be more sensitive to smaller details and movement, while women will be more aware of different color hues and how they change. During a conversation, the sexes also focus differently. Men are more likely to watch the other person’s lips while they’re talking and will be more easily distracted by movement behind the speaker. When women listen to someone talk, they tend to alternate their gaze between their companion’s eyes and body. Their visual attention is more likely to be divided by other people than by movement.

The Speed Of Color

Never mind the secret life of bees. Their eyesight is far more intriguing. The tiny insects can detect color three to four times faster than people do. At first glance, it appears to be a wasted ability. Most objects carry a permanent hue, and this type of vision expends a lot of energy. Yet bees have it. The little honey makers have evolved their eyesight around finding flowers, which means being able to identify certain colors with accuracy. While petals and blooms don’t exactly change their own shades every second, something else can. Researchers think this skill helps bees when they are confronted with flickering light. A speedy flight through a bush could cause colors to shift rapidly due to changing light and shadows. In such a case, fast color vision will allow bees to track each abrupt change in shade.

Deaf Vision

Individuals born deaf tend to have peripheral vision that is more sensitive to movement and light. The explanation could be a brain adaptation. Whenever a person looks at something, two pathways in the brain process the information. One assesses the object’s position and motion while the other is all about recognition. During motion-tracking experiments, the first pathway showed enhanced activity in the deaf and is most likely the reason why their peripheral sight is stronger. Another experiment suggests the deaf can also use their sense of touch to develop visual acuity. Two study groups received a flash near the corner of the eye. During this, the hearing participants received two beeps, and the deaf got two puffs of air to the face. Both reported the same hallucination of seeing two flashes. In deaf cats, the hearing part of the brain also appears to sharpen peripheral vision.

Why We See In 3-D

Three-dimensional vision helps with depth perception. Each eye views an object from a slightly different angle. This difference, called binocular disparity, helps the brain to gauge depth. It’s vital but not the only way to view the world in 3-D. The parallax phenomenon is the difference in speed at which things move as you pass them. It’s most notable while driving: Nearby trees will shoot past, while a radio tower in the distance moves at a snail’s pace. Other ways to calculate an object’s range include its size, being able to see more detail in closer objects, parallel lines that appear to converge, and the way items stand in relation to each other.
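For readers who like a number or two, the geometry behind binocular disparity can be sketched very roughly. This is a simplified stereo-camera model added for illustration, not a formula from the article, and the 6.5-centimeter eye spacing is just a typical adult value:

\[ Z \approx \frac{f \times B}{d} \]

Here B is the distance between the eyes (roughly 6.5 centimeters in adults), f is the effective focal distance of the eye, and d is the disparity, the small difference between where an object lands on each retina. The smaller the disparity, the farther away the brain judges the object to be, which is also why depth judgments get fuzzier for distant objects.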

The Forbidden Colors

Photo credit: Life’s Little Mysteries

There are colors humans can’t see. This isn’t colorblindness but something that happens to everybody. Called “forbidden colors,” they are two-toned hues that can’t be seen by the naked eye because they cancel out each other’s frequencies. These mischievous elves are red-green and yellow-blue.

The same retina cells that activate when we see something red also deactivate in the presence of green. This dampening of cell activity registers as green in the brain. Both signals can’t happen at the same time and thus cannot be perceived together as a single hue. The same thing happens with yellow and blue. Researchers are divided into two camps. Some believe forbidden colors can be produced and seen by humans during certain image experiments. Others argue that the results are yet-unnamed intermediate shades of the known colors, not the true forbidden colors.

The Gray Realm

Researchers have possibly uncovered why the world appears gray to depression sufferers. A study involving patients with major depression found that, compared to healthy individuals, their retinas were dramatically less responsive to contrast, especially between black and white. This was true even for the participants who were taking antidepressants. Researchers believe the link between depression and vision could be the substance dopamine. Healthy contrast vision depends on certain cells within the retina. Called amacrine cells, they connect the nerve cells in the retina with each other. Pulling double duty, dopamine is needed both for these cells to work properly and to make a person feel driven and focused. When the chemical is lacking, it can cause cheerlessness and possibly blunt the effectiveness of the amacrine cells. This could be why everything looks like an old photograph to sufferers during times of depression.

Dreamscape Of The Color-Blind

Individuals with colorblindness can dream in color. Just how much depends on when the person turned colorblind. When somebody is born seeing the world in shades of black, white, and gray, that will be the environment of their dreamscape. Should they become colorblind later, their colorful past can spill over into dreams. Anyone with other forms of colorblindness, such as the common inability to distinguish between reds and greens, will dream in their own range of the rainbow. For instance, they will dream of a green Santa Claus instead of a red one because that is also their waking reality. It is also extremely uncommon for those with normal vision to dream in black and white. Difficulty remembering colorful dreams could be because during sleep, the dreamer is more occupied with doing something or reaching a destination than focusing on their dreamscape’s hues.


The Rainbow Women

Some women can see more color than the rest of the population—not just an extra shade here or there but a technicolor world most of us can’t even comprehend. Called tetrachromats, they see vivid colors where others merely perceive monotone shades. It’s a literal rainbow world, and it seems to be an exclusively female one. Everyone has three kinds of cone cells in their eyes, and each reads a different light bandwidth. Their combined signals allow us to recognize individual colors. An extra cone adds hundreds of combinations and a huge extra set of colors. It’s estimated that about 12 percent of women might have an extra cone, but not all of them are tetrachromats. The true ones are rare, and they don’t always have it easy. Because this genetic condition remains widely unknown, they are seldom believed whenever they attempt to share their magical sight.
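To put that “huge extra set of colors” into rough numbers (an illustrative back-of-the-envelope estimate, not a figure from the article): if each cone type can distinguish on the order of 100 gradations of light, the possible combinations multiply with every extra cone.

\[ 100^{3} = 1{,}000{,}000 \qquad \text{versus} \qquad 100^{4} = 100{,}000{,}000 \]

On that math, a true tetrachromat could in principle tell apart around a hundred times as many colors as someone with the usual three cones.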


10 Fascinating Facts About Plastic Surgery MIKE FLOORWALKER

At the mention of plastic surgery, most people probably think of expensive, unnecessary procedures performed on rich people for purely aesthetic purposes. Of course, there is an element of truth to this as the vast majority of procedures performed are still nose jobs and breast augmentations. However, the techniques pioneered by plastic surgeons have a long history, and those specializing in them have had to fight diligently just to be taken seriously by their peers. As we will see, plastic surgeons have also been responsible for pioneering many life-enhancing procedures that go far beyond the cosmetic. But first, let’s answer the question that most of you likely have.

Its Name Has Nothing To Do With Plastic

Photo credit: Gasparo Tagliacozzi

The documented beginnings of plastic surgery techniques date all the way back to the 16th century when Italian physician Gaspare Tagliacozzi—who was himself copying techniques described in an Indian manual written roughly 1,000 years earlier—successfully reconstructed the damaged nose of a patient using tissue from the inner arm. But the term “plastic” was first used to describe these techniques in 1837—a good 18 years before the invention of plastic, the substance. The term is from the Greek plastikos, meaning to mold or shape, and specialists in these techniques were initially far more focused on the reconstruction of misshapen or damaged body parts than cosmetic augmentation. By the mid-19th century, advances in anesthesia and sterilization had made it possible for more daring procedures, such as the original nose job, to be attempted. Throughout this time, however, plastic surgery was not formally recognized as a branch of medicine despite its obvious potential. And while it is true that its early focus was helping those disfigured by injury or disease, we will take a brief aside to answer your other obvious question.

Breast Augmentation Has A Longer History Than You Think

The first successful breast augmentation was likewise reconstructive rather than cosmetic, as the patient had previously had a large tumor and a portion of her left breast removed. German surgeon Vincenz Czerny used a good-sized lipoma—a fatty, benign tumor—from the patient’s back to reconstruct the breast, and it’s safe to assume that the attempt was only possible because biological material from the patient herself was available to work with. This happened in 1895, and surgeons spent the next 70 years trying to come up with a viable material for commercial breast implants. Paraffin, alcohol-soaked sponges, and beeswax all failed to make the grade, but fortunately for breasts everywhere, Houston junior resident surgeon Frank Gerow came along in the early 1960s. Gerow conceived of the silicone implant after squeezing a blood bag and noting the similarity to a woman’s breast. His first experimental procedure was performed on a dog. It was successful, and before you ask, yes, the implants were removed once it was determined to be so. Timmie Jean Lindsey, his pilot human patient, was asked to volunteer for the procedure after coming in to consult about having a tattoo removed. She was thrilled with the results. As a testament to the viability of the procedure, she still retains her implants—the first ones ever—to this day.

Modern Reconstructive Surgery Was Pioneered During World War I

Photo credit: Daily Telegraph

While the aforementioned advances in anesthesia and antisepsis had plastic surgeons performing complex procedures on delicate areas by the early 1900s, the burgeoning specialty had never seen challenges such as those presented by World War I. Entire new categories of explosives and weapons were being deployed on the battlefield, and thousands of soldiers were returning home with the types of injuries that had literally never been seen before. It was in leading the response to these challenges that the field underwent perhaps its greatest sustained period of advancement, largely thanks to the efforts of New Zealand–born, London-based surgeon Harold Gillies, widely considered the father of modern plastic surgery. Recently uncovered records detail over 11,000 procedures performed on more than 3,000 soldiers in the eight years between 1917 and 1925, including groundbreaking skin and muscle grafting techniques that had never before been attempted. As antibiotics did not yet exist, infection was always a major concern. Dr. Gillies mitigated this by inventing the tube pedicle or “walking-stalk skin flap” technique, which involves rolling the graft to be used into a tube and “walking” it up to the target site. This technique alone likely spared thousands from infections.


When the war ended, Gillies and other wartime plastic surgery pioneers were frustrated to find that their techniques and expertise were not exactly welcomed with open arms by the medical community at large. The field was not well-defined, and its practitioners had no means of sharing expertise or defining areas of specialty until the American Society of Plastic Surgeons was founded in 1931.

A Plastic Surgeon Helped Make Cars Safer

Photo credit: National Institute of Standards and Technology Digital Collections, Gaithersburg, MD 20899

Debates over auto safety, which had been raging for some time prior, came to a head in 1935 with the publication of a Reader’s Digest article entitled “—And Sudden Death.” Author Joseph C. Furnas mainly took the tack of shaming careless drivers, attempting to shock them into better behavior by opining that for the reckless driver, the best hope was to be “thrown out as the doors spring open. At least you are spared the lethal array of gleaming metal knobs and edges and glass inside the car.” While it did not seem to occur to Furnas that optimizing the safety of the actual vehicle would be helpful, Detroit plastic surgeon Claire Straith arrived at this commonsense conclusion after several years of specializing in the reconstruction of the faces of car accident survivors.

After Straith sent a sternly worded letter to Walter P. Chrysler, five different Chrysler models were introduced in 1937 with features that were specifically designed with safety in mind, a first for any auto manufacturer. These features included rubber buttons instead of steel, rounded door handles, and recessed knobs. Although it would take a while for Straith’s other recommendations—padded dashboards and safety belts—to be implemented, it didn’t stop the good doctor from installing both in his own vehicle years before they became standard.

A Plastic Surgeon Performed The First Organ Transplant

Photo via Wikipedia

Although most people don’t think of transplant procedures as having much to do with plastic surgery, they involve many of the same small-scale techniques, such as reconstructing and reattaching nerves and tissue and dealing with the potential for rejection. Indeed, the first successful organ transplant of any kind—in this case, a kidney—was performed by renowned plastic surgeon Joseph E. Murray in 1954.

Murray was already highly regarded for his work furthering the treatment of burn victims and those with facial disfigurements. However, this transplant procedure was incredibly groundbreaking in that, up until it was actually achieved, nobody even knew whether or not it was possible. A decade of research and experimentation on the part of Dr. Murray had failed to yield positive results. Aided by a donor organ given by the patient’s identical twin, the successful 1954 procedure ignited the medical community with possibilities simply by establishing organ transplants as viable. Dr. Murray subsequently became an international authority on transplant and rejection biology, even helping to develop the first generation of immunosuppressants in the 1960s. In 1990, he was awarded the Nobel Prize in Physiology or Medicine for his pioneering work. He was one of only nine surgeons, and the only plastic surgeon, ever to receive the award.


A Plastic Surgeon Also Performed The First Successful Hand Transplant

Dr. Warren Breidenbach, who as of mid-2016 was chief of the Division of Reconstructive and Plastic Surgery at the University of Arizona, has had a long and storied career. His current focus includes the establishment of an institute for the study of composite tissue transplantation and leading-edge work on immunosuppressants. He is considered the world’s foremost authority on hand transplants, and for good reason: In 1999, he became the first surgeon to perform the procedure successfully. The recipient, Matthew Scott, had lost his hand in a fireworks accident an unbelievable 14 years prior to receiving the landmark surgery. Planning the procedure took three years. Breidenbach had to deal with the scrutiny of the entire medical community over ethics concerns, as once again there were serious questions as to whether the procedure was even viable. Previous attempts—one in 1964, when immunosuppressant drugs were in their infancy, and one just a year prior in 1998—had both resulted in the host’s immune system rejecting the donor hand. Since then, over 85 recipients have received hand or arm transplants worldwide, including children, amputees, and victims of explosives. Once again, the procedure could never have come to fruition without the advances already made by plastic surgeons, and it took one of the very best to do it successfully. As of 2016, Breidenbach has performed more hand transplants than any other surgeon and has trained the majority of the rest who are qualified to perform the procedure in the US.

‘Medical Tourism’ For Plastic Surgery Is Exploding

Photo credit: Pinaew


As our readers in the United States know and the rest of you may have heard, the US health care system leaves a little something to be desired. Although the quality of care and technology is generally good to great, waiting times for some procedures can be excruciating, and the cost for major surgeries tends to be . . . well, an arm and a leg. As such, those in the market for expensive procedures—both cosmetic and medical— have been increasingly looking to countries where the cost of health care is more manageable. But we’re not talking about stereotypical back-alley Mexican nose jobs. Although Mexico and Brazil are still getting their share of the so-called “medical tourism” market, newer major players like Dubai and Thailand are able to offer high-tech, quality care in a price range that is actually forcing the Western medical establishment to up its game in the face of their competition. Thailand, for example, has become a world leader in medical tourism with cutting-edge equipment, internationally trained surgeons, and hospitals that look and feel more like luxury hotels than medical facilities. In 2013 alone, the country brought in a whopping $4.3 billion solely from foreigners seeking medical treatment.

The Newest Techniques Don’t Involve Surgery At All

Of course, for minor and less invasive procedures such as tucks and face-lifts, newer techniques are always being sought out to reduce healing time and potential scarring. New York plastic surgeon Doug Steinbrech offers a surgery-free face-lift, thanks to a special device that slowly stretches the skin over the course of three hours (under anesthesia, of course). Although stitches are required, healing is complete in five days, and the whole thing only costs $35,000, making it ideal for those who sleep on piles of money and really, really hate knives. Fellow New Yorker Dr. Doris Day—who is, of course, a local media personality with a name like that—has also demonstrated nonsurgical techniques that use ultrasound to shrink problem areas, followed by Botox and laser treatments. Ultrasound can similarly be used in place of traditional liposuction. Day calls it “the newest kid on the block for helping to resculpt and melt fat. [ . . . ] It’s like liposuction, but it’s a nonsurgical approach. [ . . . ] It uses that high-density focus ultrasound to actually heat up and melt fat.”

Men Are Pulling Even With Women

Most of us tend to think of surgery purely for cosmetic purposes as a largely female pursuit, and in years past, this may have been the case. But in recent times, the numbers show that a rapidly growing segment of this market—$14 billion annually as of 2014—is professional men. According to the American Society for Aesthetic Plastic Surgery, between 1997 and 2014, there was a 273 percent increase in the number of men seeking cosmetic procedures, with a 43 percent increase just in the last five years of that period. A large part of the reason, says Dr. Steinbrech (him again), is that they view cosmetic surgery as a career investment. “Men are at the top of their career, and they feel young and confident,” said Steinbrech. “But they’re worried they don’t look it.” Although the huge demand for cosmetic procedures may seem absurd to some, the same techniques involved in tucks and lifts must first be mastered before going on to accomplish the near-miracles that we’ll talk about next.

Full Face Transplants Are Increasingly Feasible

In 2012, Baltimore plastic surgeon Eduardo Rodriguez performed the most extensive full face transplant ever done on Richard Norris, who had attempted suicide in 1997 via shotgun to the face. Needless to say, it was perhaps the most intensive and complex plastic surgery procedure ever performed up to that time. Only a few similar attempts had been made before then. The earliest—a partial face transplant—succeeded in 2006. Norris’s procedure also succeeded. Although his appearance is a bit odd and he must take drugs to keep his immune system at half power for the rest of his life, the fact that his new face is functional given his injury is nothing short of astounding. Rodriguez has since repeated his success. In 2015, he gave a new face to firefighter Patrick Hardison, whose original visage had been completely obliterated in a fire. The results are shockingly good, with Dr. Rodriguez commenting, “Tremendous advances in medicine have occurred, tremendous advances in innovation and technology that allow us to do this procedure reliably in today’s day and age.” Although three deaths have occurred due to complications—a relatively small number given the acknowledged riskiness of the procedure—full or partial face transplants have been successfully performed on over 30 patients as of mid-2016.


10 Ancient Civilizations That History Forgot MICHAEL VAN DUISEN

Much like Isaac Newton imagined when he gave his famous “shoulders of giants” quote, our modern civilizations owe a great deal to those which came before us. While examples like the Sumerians or Egyptians are deeply ingrained in nearly everyone’s minds, there are a number of other civilizations which have been largely forgotten. Here are 10 of them.

Hattian Civilization

Photo credit: Bhushan Kotakar

The Hattians were a civilization which inhabited the area of present-day Anatolia, Turkey, from the 26th century to around the 18th century B.C. Believed to be the earliest urban settlers of the area, their existence can be traced to 24th-century B.C. Akkadian cuneiform tablets. Most archaeologists believe that they were indigenous to the area preceding the more famous Hittite civilization, which arrived in the 23rd century B.C. The two cultures slowly merged together, with the Hittites adopting a variety of Hattian religious beliefs and practices. Many of the largest Hittite settlements, such as Alaca Hoyuk and Hattusa, are believed to have originally been Hattian. While they had their own spoken language, no evidence of a written form of the Hatti language has ever been found. It’s likely that they were multilingual, perhaps to facilitate trade with their Assyrian partners. In fact, most of what we know about the Hattians comes from the widespread adoption of their culture by the Hittites. Their population probably existed as a majority for decades—if not centuries—while they were under the aristocratic rule of the Hittites, before they eventually faded away into obscurity.

Zapotec Civilization

Photo credit: Rod Waddington

While most people are familiar with the Aztecs and the Maya of Mesoamerica, the people known as the Zapotec remain relatively obscure. Among the first people in the area to use agricultural and writing systems, they also built one of the earliest recognized cities in North America—Monte Alban. Founded in the fifth century B.C., the city was home to a maximum of 25,000 citizens and lasted for over 1,200 years. In Monte Alban, a privileged class made up of priests, warriors, and artists ruled over the lower classes.

Like many of the civilizations of Mesoamerica, the Zapotecs subjugated the surrounding areas through a mix of warfare, diplomacy, and tribute. The sudden downfall of their culture seems to have had no clear cause, and their largest city was mostly left intact, though it was eventually ruined by years of abandonment. Some scholars believe that a failure of their economic system may have pushed the Zapotecs to find work elsewhere. The rest of the population grouped together into various city-states, which proceeded to fight each other (as well as outside forces) until they were no more.

Vinca Civilization

Europe’s biggest prehistoric civilization, the Vinca, existed for nearly 1,500 years. Beginning in the 55th century B.C., they occupied land throughout Serbia and Romania. Named after a present-day village near the Danube River, where the first discoveries were made in the 20th century, the Vinca were a metal-working people, perhaps even the world’s first civilization to use copper (they also excavated the first mine in Europe). Though the Vinca people had no officially recognized form of writing, examples of proto-writing, symbols which don’t actually express language, have been found on various stone tablets which date as far back as 4000 B.C. In addition, they were artistic and fond of children; archaeologists have found various toys, such as animals and rattles, buried among the other artifacts. They were also extremely organized—the houses of the Vinca civilization had specific locations for trash, and the dead were all buried in a central location.

Hurrian Civilization

Photo credit: Rama

Another civilization which influenced the Hittites was the Hurrian people, who lived throughout the Middle East during the second millennium B.C. It’s probable that they were around even earlier than that: Personal and place names written in the Hurrian language were found in Mesopotamian records dating back to the third millennium B.C. Unfortunately, very few artifacts of their civilization exist; most of what we know about them comes from the writings of other cultures, including the Hittites, Sumerians, and Egyptians.

One of their largest cities is known as Urkesh and is located in northeastern Syria. Urkesh is also where the earliest known text in Hurrian, a stone tablet and statue known as the Louvre lion, was found. Although the Hurrians were long believed to have been mainly nomadic, scholars now think they may have had a much bigger impact than previously thought, mostly due to the way their language differed from other Semitic and Indo-European tongues. However, by the end of the second millennium B.C., nearly all ethnic traces of the Hurrians had disappeared, with only their influence on the Hittites left behind.


Nok Civilization

Photo credit: Ji-Elle

Named after the area in Nigeria in which artifacts of their culture were first discovered, the Nok civilization flourished during the first millennium B.C. before fading into obscurity in the second century A.D. Some theories posit that the overexploitation of natural resources played a large role in the population’s decline. Whatever the case, scholars believe that they played an important role in the development of other cultures in the area, such as the Yoruba and Benin peoples. Perhaps the best-known examples of their artistic nature are the terra-cotta figures which have been found throughout the area. They were also the earliest known Africans to have smelted iron, though it’s believed that it was introduced to them through another culture, perhaps the Carthaginians. The reason for this assumption is that no evidence for copper smelting has ever been found, which was a precursor to an iron age in nearly every other civilization. Although they’re believed to be one of the earliest African civilizations, evidence of their existence has been slow to come to light because modern-day Nigeria is a notoriously difficult place to study.

Punt Civilization

A popular trading partner with ancient Egypt, the land of Punt (pronounced “poont”) was famous for producing incense, ebony, and gold. Scholars differ on where they believe the civilization was, with a range from South Africa all the way up the coast to the Middle East. Even though the Egyptians wrote extensively on the land and its people, they never bothered to actually say where it was. A lot of our knowledge of Punt comes from the reign of Hatshepsut, the famed female pharaoh who ruled Egypt during the 15th century B.C. Reliefs in her mortuary temple contain information on a rather large trade expedition to Punt, as well as more specific details, like pictures of beehive-shaped houses on stilts. A scene showing Hatshepsut receiving wondrous gifts from the exotic land is also carved into the temple walls. Unfortunately, no actual archaeological evidence showing the location of Punt has ever been found, although there have been numerous Egyptian artifacts inscribed with the civilization’s name, giving scholars hope that Punt might one day be unearthed.

Norte Chico Civilization

Photo credit: Sharon odb

Arising during the third millennium B.C. and lasting for over 1,200 years, the Norte Chico civilization dominated South America as the oldest sophisticated culture on the continent. Named for the region of present-day Peru which they occupied, they had 20 major cities, with advanced architecture and agriculture making up a large portion of their settlements. They also developed intricate irrigation systems, a level of sophistication which was unheard of in the Americas at that time. Artifacts recognizable as religious symbols have been found throughout the area, especially near the stone pyramids for which the Norte Chico civilization is famous. There is some debate over whether or not they qualify as a civilization, as well as what that term even means. Usually, indicators like a form of art and a sense of urbanization are key, but the Norte Chico civilization possessed neither of these. Whatever the case, there is no denying that they were an influence on later South American cultures, such as the Chavin civilization, which began a few hundred years after the fall of the Norte Chicos.


Elamite Civilization

Photo credit: dynamosquito

Although their name for themselves was Haltam, the name “Elam” comes from the Hebraic transcription of the word. The Elamite civilization consisted mostly of land inside present-day Iran, along with a small portion of Iraq. One of the earliest civilizations, it was founded sometime in the third millennium B.C. and is by far the oldest in all of Iran. Situated along the borders of Sumer and Akkad, the land of Elam was similar to its neighbors, although its language was altogether unique. Although they lasted as an independent kingdom for at least a millennium, if not longer, very little is known about them because Elamite scribes were not concerned with documenting their mythology, literature, or any scientific advancements. Writing was mostly seen as a way to honor the king or perform administrative duties. Due to this fact, they made a rather small impact on the development of future civilizations, especially when compared to the Egyptians and Sumerians.

Dilmun Civilization

An important trading civilization in its heyday, Dilmun encompassed an area consisting of present-day Bahrain, Kuwait, and parts of Saudi Arabia. Although very little concrete evidence has been found as of yet, scholars believe that a few sites, namely Saar and Qal’at al-Bahrain, are ancient settlements of the Dilmun people. Saar is still being investigated, but a large number of the artifacts that have already been found there date to the third millennium B.C., lending credence to the theory that it was built by the Dilmun civilization. Dilmun was a major commercial player in its day, with control over the Persian Gulf trading lanes and a communication network that reached as far away as Turkey.


Numerous water springs flow all across the area, which researchers believe may have led to the legend of Bahrain being the Biblical Garden of Eden. In addition, Enki, the Sumerian god of wisdom, was said to have lived in the underground springs. Described as “the place where the sun rises,” Dilmun played a large role in Sumerian mythology; according to legend, Dilmun was the place where Utnapishtim was taken to live for eternity.

Harappan Civilization

Also known as the Indus Valley Civilization, the Harappans were a group of people who lived in parts of present-day Pakistan and India. Realizing that planning cities in advance would be a good idea, they built urban areas that were second to none; unfortunately, due to what scientists believe to have been a massive, centuries-long drought, their culture slowly declined, never to rise again. This is currently nothing more than a theory, but it helps explain other cultural declines in the area as well. Beginning sometime in the 25th century B.C., the Harappans also developed their own writing system, a script with nearly 500 different characters which has not been completely deciphered even today. Their most noteworthy artifacts are seals, usually made of soapstone, which depict various animals and mythical creatures. Harappa and Mohenjo-Daro are the two largest Harappan sites, with the latter labeled a UNESCO World Heritage Site. When it collapsed, the ruins of the Harappan civilization provided a template for the various other cultures which sprang up after it.



10 Tragic Events That Created Iconic Pieces Of Pop Culture NATE YUNGMAN

The world can be a terrible place, but there are always things to cheer us up. Amazingly, they sometimes overlap. A great number of the things we turn to every day to make us happy were built on terrible tragedies and torture. In fact, some of the most iconic pieces of pop culture were created in surprisingly dark circumstances. Next time you belt out “Dancing Queen” or tinker with LEGOs, remember that your joy could not exist without someone else’s pain.

Nuclear Fear Inspired “Do You Hear What I Hear?”

“Do You Hear What I Hear?” is one of those Christmas songs that seems like it has been around since the birth of Christ. In reality, it’s only as old as the “Monster Mash.” This calming and simple song was not written solely to commemorate the religious holiday. It initially served as a call for peace in the middle of a nuclear stalemate. Noel Regney and Gloria Shayne Baker wrote the song as a response to the Cuban Missile Crisis and possible Armageddon. For 13 days, the world was filled with existential dread. While they were in the studio, the producer took a break from recording to turn on the radio to see if World War III had started. Regney and Baker needed to take their minds off the situation in Cuba, so they went for a walk. Along the way, the couple saw two mothers pushing their babies in strollers. The songwriters were so struck by this moment of innocence that they wrote the opening line right then: “Said the night wind to the little lamb.” The song became a Christian classic, but it’s a universal message for people all over the world to put aside their differences. Regney and Baker’s paranoia appears in the song’s lyrics, and now you’ll be one of the few who understands it. Though many interpret the line “a star, a star, dancing in the night with a tail as big as a kite” as a reference to the star above Bethlehem, the line hints at the nuclear missiles that inspired the song. The fear in the song still causes Baker and Regney to cry when they sing it today.[1]

A Cult Created the Super Bowl Halftime Show

Few things cause more devotion than the Super Bowl. Fans go to such ridiculous extremes to root for their respective teams that the fanaticism is cultlike. That is why it is so appropriate that the most celebrated football game of the year incorporated a real cult into the middle of the game for years.


For the first three decades, the Super Bowl halftime show was mostly done by local marching bands or jazz legends singing classics. The only modern band allowed to sing was Up with People. Their songs called for world peace and utopianism. That viewpoint was rooted in a controversial religious movement called Moral Re-Armament (MRA), under whose wing Up with People was founded to stop the spread of liberal counterculture forces in the late 1960s. That is why this group of square-jawed folkies was receiving funds from companies as varied as Exxon, Halliburton, Pfizer, and General Electric. The conservative philosophy limited many of the members’ rights. Every facet of their lives was controlled by the MRA. The MRA forced the group to exercise for hours each day to the point of exhaustion. Gay members of the group were frequently beaten. Sex of any sort was strictly forbidden. Those who engaged in the activity were abandoned in random cities while touring.[2] Despite the toxic culture of the group, Up with People performed at the Super Bowl four times. Their productions were nonsensical and over the top. By 1986, NFL commissioner Pete Rozelle had grown tired of the shtick. As these shows were so hollow and campy, the NFL turned to popular musicians to supply the theatrics. In 1991, New Kids on the Block became the first contemporary artists to headline the halftime show. Michael Jackson’s performance in 1993 turned the Super Bowl halftime into a major phenomenon, thus paving the way for wardrobe malfunctions and “Left Shark” decades later.

Stephen Colbert Became A Comedian Because His Family Died

Photo credit: businessinsider.com

Whether working on The Daily Show, The Colbert Report, or his current stint as the host of The Late Show, Stephen Colbert is one of the most popular comedians of the past few


decades. He may have brought a lot of joy to millions of people, but his personal life is filled with heartbreak. Colbert got his start in comedy as a way of handling the grief of losing his father and two brothers on the same day. On September 11, 1974, Eastern Air Lines Flight 212 encountered thick fog. The pilots had been distracted by a conversation when they were caught off guard by the fog. As they were unaware of their altitude, the plane crashed. Seventy-two of the 82 people aboard died immediately. Three others died later from their injuries. The youngest of 11 children, Stephen was the only one still living with his parents at the time of the crash. At only 10 years old, Stephen helped his mother get through her grief. Driving back from the funeral, he noticed that one of his sisters was laughing so hard that she fell over in her seat. In that moment, Stephen understood the ability of comedy to drive out all despair. His interest in comedy was spurred on by listening to some of the albums left behind by his brothers. For the next eight years, Stephen dedicated his life to comedy. He has had some success with it recently, too.[3]

Nazi Experimentation Birthed ABBA

ABBA is a lighthearted, fun, beloved disco band. None of that describes the Nazis, but one of the four members of ABBA would not exist if it wasn’t for the Nazi occupation of Norway. Starting in 1940, the Nazis implemented a program where the army was ordered to conceive as many children as possible with Aryan Norwegian women. This practice resulted in thousands of births, including Anni-Frid Lyngstad (better known as “Frida”). Many of these children were taken from their parents and forced into reeducation rooms. In these old familiar rooms, children would play, eat, and live under the guidance of Nazi soldiers. As the Third Reich invaded other countries, the program eventually devolved into wholesale kidnapping. The kidnapped children were placed into orphanages where they couldn’t escape if they wanted to. If they failed to be racially pure, they were executed. Following the war, these children were returned to their communities. Many of the mothers and babies were ostracized due to their Nazi relationships. When Frida was young and sweet and only 17 months old, she and her mother were driven out of town.[4] When she arrived in Sweden, Frida still felt isolated and without a real home. By 1971, she had started dating Benny Andersson. He asked her to join his new band, ABBA, which included Agnetha Faltskog and Bjorn Ulvaeus. In the end, ABBA conquered Europe and America better than the Nazis ever could.

LEGO Rebuilt The Founder’s Life Brick By Brick

LEGOs (as they’re colloquially known in the US) are such a quintessential symbol of childhood that the company has sold over 400 billion of their namesake bricks. The company’s success has allowed them to branch off into equally popular video games,


movies, and even theme parks. However, anybody who’s accidentally stepped on one in the night already knows how much pain LEGO can cause. Behind the facade of childhood fun, the early history of LEGO was driven by a series of disasters that plagued the founder. In the early 1900s, Ole Kirk Christiansen was simply his village’s carpenter. At his workshop, he produced furniture such as ladders and stools. That business ended in 1924 when his son accidentally set some wood chips in the shop on fire. The fire spread and destroyed Christiansen’s workshop and the family’s house. Even though he was practically broke and now homeless, Christiansen decided to continue on. He soon encountered two more obstacles.[5] In 1929, the collapse of the US stock market triggered a global depression. Then, in 1932, Christiansen’s wife died. Economic and personal disasters made him fire most of his staff and reduce the size of his operation. His social worker argued that the business should no longer focus on furniture. Instead, he should turn to toys. They required less wood, were cheaper to manufacture, and could potentially cheer up Christiansen. For many years, Christiansen barely made a profit, even declaring bankruptcy before his brothers bailed him out. Following the German occupation of Denmark in the 1940s, Christiansen’s factory burned again. Due to limited resources, he was forced to switch from wood to plastic. This allowed Christiansen to mass-produce toys like never before, including a series of interconnecting bricks.

Robert Kennedy’s Assassination Changed Hip-Hop

The tragic and untimely death of Robert Kennedy had tremendous consequences for the politics of the 1960s. However, the event’s real legacy may be the roundabout way that his death influenced hip-hop. In the late 1960s, Michael Viner joined Robert Kennedy’s campaign as an aide. There, he met famed football player Rosey Grier, who was working security for Bobby Kennedy. Grier was the one who wrestled the gun out of Sirhan Sirhan’s hand during the assassination. Grier and Viner had joined the campaign with hopes of working alongside Kennedy in Washington. When Kennedy died in California, Grier and Viner stayed there. No longer employed, the two men went into the movie business. Grier starred in and Viner produced the soundtrack for a forgettable B movie, The Thing with Two Heads. On the soundtrack, they had a minor hit with “Bongo Rock.” Building from that success, Viner formed the group “Incredible Bongo Band.” As a fan of surf rock, he made the band record a cover of “Apache” by Bert Weedon. The Incredible Bongo Band’s version of “Apache” would go on to be known as “the national anthem of hip-hop.” Popularized by DJ Kool Herc, “Apache” was the go-


to song for his block parties. It was the first song that “Grand Wizzard” Theodore scratched, pioneering turntablism.[6] By doing so, he turned it into the sound of the genre. Hundreds of artists—including Afrika Bambaataa, Grandmaster Flash, Nas, Kanye West, LL Cool J, and even MC Hammer—would go on to sample it.

The Chestburster Scene In Alien Killed Its Creator

Photo credit: The Guardian

It is one of the most disturbing twists in all of cinema. The kindhearted Kane sits down to eat. While he is minding his own business, he suddenly feels a terrible pain in his side. His body ruptures. Eventually, his bubbling insides kill him at far too young an age. That is the horrific story of Kane and the man who came up with the scene, Dan O’Bannon.

O’Bannon wrote the screenplay for Alien. For the famous chestburster scene, he drew upon his own struggle with Crohn’s disease. Chris Foss and O’Bannon were eating fast food one day. As a result of the disease, O’Bannon had digestive problems. He described the experience as a little beast inside him. In later conversations with Alien’s designer, H.R. Giger, O’Bannon said that he wished his abdominal pain could just leave through his stomach. These two ideas were synthesized into one of the most shocking scenes in movie history.[7] In a cruel irony, O’Bannon suffered the same fate as Kane. O’Bannon’s stomach condition went undiagnosed for a long time. Though he experienced stomach troubles chronically throughout his life, he did not seek proper medical assistance until it was too late. In 2009, O’Bannon succumbed to Crohn’s disease. He was only 63.


The Lord Of The Rings Exists Due To Two World Wars

Photo credit: thedailybeast.com

The Lord of the Rings is not really a fantasy story. The ultimate tale of good versus evil, the corrupting nature of authority, and the destructive force of power may feature orcs, wizards, and elves, but it is still grounded in J.R.R. Tolkien’s personal struggles. The beloved trilogy is a thinly veiled allegory for the unprecedented destruction of World War I. There are direct connections between World War I and the books. For example, Gandalf’s famous exclamation of “You shall not pass” is a rewording of the battle cry at Verdun, “They shall not pass.” Other elements are more tangential. Samwise Gamgee was inspired by Tolkien’s companionship with fellow soldiers. Tolkien’s friendships in the trenches were a major reason that the stories were finally written. To keep his spirits up, he entrusted an early version of Middle-earth to three former school friends and soldiers. After two of those men died in the Battle of the Somme, Tolkien felt forced to finish his story in their honor. The grief did not sustain Tolkien’s vision for long. His motivation only returned thanks to another world war. During World War II, it was time for Tolkien’s son to serve England. Christopher Tolkien was stationed in South Africa. J.R.R. sent fragments of the stories to Christopher to stave off boredom.[8] No matter how personal Tolkien’s manuscripts were, there was still no guarantee of success. Initially, he wanted to publish the trilogy as one sprawling epic novel. The text would have been more than 1,000 pages, seriously hurting any chance of it becoming a phenomenon. As it was so thick, it would have been equally expensive. With the book appearing in the aftermath of World War II and the economy in shambles, it was unlikely that people would have had much disposable income to plop down for a hobbit’s epic quest. Due to a paper shortage following World War II, the publisher insisted that the book be divided into thirds. This ensured that the books would be cheaper and thus more popular.


Darth Vader Is Luke Skywalker’s Dad Because Of Cancer

Photo credit: empireonline.com

Though the quote is often misremembered, Darth Vader telling Luke Skywalker that he is Luke’s father is one of the most iconic moments in any movie. From a certain point of view, the line changes the entire direction of the franchise. Without this twist, Vader’s sacrifice and redemption in Return of the Jedi would not be nearly as impactful. It is such an essential part of the movies that one would assume that this had been the plan from the beginning. In reality, the line was scribbled as a last-minute change because the original scriptwriter had just died. Following the success of Star Wars, George Lucas decided that he didn’t want to write the sequel. He passed the responsibility to Leigh Brackett. She only had a few months left to live due to a terminal case of cancer, but she was able to churn out a script before succumbing to the disease. Her direction for the movie was very different from what George Lucas could have predicted.[9] She conceived Vader as the owner of a steel castle protected by demons, gargoyles, and a lava moat. Later films in the franchise would incorporate some of these ideas. But at the time, Lucas thought the castle was ridiculous. With Leigh Brackett’s passing, he rewrote the draft himself. Revising the script, Lucas first came up with many of the notable scenes from the movie, including Han getting frozen in carbonite and the character of Boba Fett. Of all the changes, the most important idea was to turn Star Wars into a galactic game of family feud.

Freddy Krueger Is Based On A Bizarre True Story


Photo credit: comicbook.com

Wes Craven’s A Nightmare on Elm Street is arguably the greatest horror movie of all time, not because it debuted one of cinema’s most fearsome characters in Freddy Krueger, but because it turned the real deaths of untold Southeast Asian men into a blockbuster.

Wes Craven drew inspiration from many sources, including a childhood bully, a disfigured homeless man whom Craven happened to see, and the song “Dream Weaver.” But the darkest was an article that he read in the Los Angeles Times. In the wake of the Cambodian genocide, many people fled Asia to settle in California. With an ocean now separating them from the violence, the physical dangers were gone. But they still brought with them the emotional weight of what they had seen. The mental stress manifested itself as terrible nightmares. These nightmares were so traumatic that even people in perfect health would die in their sleep. To prevent any further distress, people would go days without sleeping. When they finally crashed from their exhaustion, they would have one more scream and then die.[10] The bizarre trend was known as Sudden Unexpected Death Syndrome (SUDS). In Los Angeles, three refugees who escaped the Khmer Rouge died in this manner. The deaths were sensationalized in the local paper. Back in mainland Asia, the deaths from the disorder were too common to be publicized. Between 1982 and 1990, 230 people died from SUDS in Thailand alone. Freddy Krueger could only have dreamed of killing that many.


10 Common Articles Of Clothing And Their Origins GEORGE WILSON

Of the fundamental physiological needs of humans, clothing is interesting, as it bridges the gap between necessity and identity. On one hand, clothing provides a layer of protection from the immediate environment; it can keep you warm, shield you from the Sun, or camouflage you from predators. On the other hand, clothing can help form an individual’s identity, such as wearing a uniform, cultural or ceremonial garb, or something unique and meaningful to that individual. Therefore, while important for survival, clothing has allowed people to identify themselves through style and fashion. Despite the differences in identity worldwide, however, certain items of clothing have become global mainstays. This list will examine the most common modern items of clothing and how they have developed into what they are today.

Pants

Nothing feels better after a long day at work than peeling off this confining leg wear, which begs the question: What’s with all the pants? The short answer is that they provided a military advantage. It is much easier to ride horses while wearing pants than it is to do so in robes or togas (sorry, Greeks and Romans). The first recorded usage of pants was during the sixth century BC by Greek geographers who were noting the leg wear of Central Asian and Persian horse riders.[1] They scoffed at these early trousers, saying that only barbarians would wear such clothing. Like the Greeks, the Romans initially rejected them, but they ultimately found their effectiveness and practicality


overwhelming. Eventually, Europe was taken over by pants-wearing knights and the noble elite. Pants in Europe during the 15th century became more and more ornate, with big, puffy bits that tightly cinched around the knees and connected to socks. Luckily, this style faded out as the popular working class began to wear more practical pants. Finally, during the 19th century, the modern idea of pants really developed thanks to the stylings of the eldest son of Queen Victoria, Edward VII. Today, pants help to form the image of a working individual who can be ready for any type of action at a moment’s notice. So, while marauding the land on horseback is not your typical activity, the ability to move both legs independently and not have to worry about exposing oneself is well worth the extra steps it takes to relieve oneself.

Socks

Socks have been around for quite some time. Most believe that socks were developed out of animal hides during the Stone Age to protect one’s feet. There are also references to socks made from animal hair during the eighth century BC. The Romans in the second century AD were using pieces of leather to wrap around their feet and legs; however, they soon developed what was known as udones, which were fitted to a specific person’s foot. The oldest known surviving socks come from ancient Egypt and are dated to between the third and sixth centuries AD. Oddly enough, these socks were meant to be worn with sandals! It was certainly a time when practicality trumped fashion. During the fifth century AD, the pious people of Europe wore socks as a symbol of purity, and it was not until 500 years later that socks became a status symbol. Developed in conjunction with pants, socks lengthened and connected at the breech


around the knees when pants were the ornate, ballooning symbols of nobility. Socks finally reached critical mass with the advent of the stocking frame knitting machine, created by William Lee in 1589. It is said that the machine was created because Lee was in love with a woman who was too preoccupied with knitting to notice him.[2] Since then, socks have been developed in mass quantities, and the added use of stretchier materials has allowed them to be worn by anyone.

Sunglasses

Sunglasses can be one of the top symbols of status and style. They can also be vital in environments with lots of sunlight and bright lights. It is believed that the Inuit in prehistoric times used flattened ivory lenses to protect their eyes from the Sun.[3] The next recorded use of sunglasses was during Roman times, when Emperor Nero would watch gladiator fights through emerald green gems. Sunglasses were also used around the 12th century by Chinese judges. The smoky quartz glasses offered no aid to vision but served to conceal expressions that might give away the judges’ decisions and judgments. Sunglasses also began to make their appearance throughout the world in the 12th century, and their first recorded use in a painting was in one done by Tommaso da Modena in 1352. Until the early 20th century, most changes in sunglasses revolved around developments in prescription sunglasses and correcting vision problems. In 1929, however, Sam Foster began mass-producing and selling sunglasses to Americans in Atlantic City, New Jersey. It was around this time that movie stars began to wear them to protect themselves from the blinding light of cameras and the limelight. The military eventually developed sunglasses for their pilots during the 1930s. Sunglasses made a huge impact in World War II, when Ray-Ban used lenses from the new Polaroid camera to develop polarized anti-glare lenses for pilots. Sunglasses have not changed much


since then, other than to block out the haters.

Baseball Cap

When it comes to headgear, nothing seems more standard and natural than a baseball cap. While originally an American phenomenon, the popularity of baseball caps has exploded across the world and societal classes. It can be worn as a fashion statement, to identify one’s loyalty to a sports team, or to block out the glare of the Sun and hold one’s hair out of their face while working. For these reasons, it has been referred to as the “Common Man’s Crown,” and it is no wonder why baseball caps are worn by almost everybody. The New York Knickerbockers introduced their baseball uniform to the world in 1849, and it featured brimmed straw hats. Other baseball teams followed suit with their own headgear. It wasn’t until 1954, when New Era made their 59Fifty model hat, that the modern-day baseball cap was born.[4] (This model is still worn today by MLB players.) While popular among baseball players, it was considered crass and weird to wear baseball caps off the field until the 1970s. Again, celebrities paved the way for wearing a previously niche article of clothing in everyday life when Tom Selleck donned his Detroit Tigers hat in the television show Magnum P.I. Other celebrities such as Spike Lee introduced other demographics to the style of baseball caps, making it a truly cross-cultural symbol of tribalism and utilitarianism. The overall design and simplicity of the baseball cap is sure to stay for some time, as any deviation usually tends to err on the wacky side. (I’m looking at you, propeller caps.)


Business Suit (Lounge Suit)

This symbol of corporate servitude and conformity actually stemmed from a more rebellious upbringing. Originally referred to as a lounge suit, predecessors of the business suit appeared during the 1600s under the rule of Charles II in the courts of Britain. After an outbreak of the plague, Charles II ordered nobles to begin dressing in more uniform and practical tunics and breeches in more neutral and dark colors. This clothing eventually evolved, with the help of tailoring, into the morning suit, or tuxedo, which was actually considered to be the least formal outfit still deemed formal. As for the development of the business suit itself, its origins remain a mystery, but what is known is that it started appearing in the mid-19th century as a way for the elite to dress down and the working class to dress up. The easiness of the wear and the high level of style led to the popularity of suits with every man from cab drivers to celebrities. Interestingly enough, suits were also seen as a form of rebellion during the 1930s and ’40s. The Zoot Suit Riots of 1943 were a series of brouhahas between black and Latino men and servicemen stationed in Los Angeles during World War II. The riots were named after the zoot suits worn by the black and Latino men, who took to wearing oversized, shoulder-padded coats and very baggy pants.[5] As is the case now, seeing groups of men in suits approach you was a cause for concern back then, albeit probably for much different reasons.

Hoodies

While suits have transitioned from rebellious icon to a symbol of corporate conformity, the hooded sweatshirt, or hoodie, has remained an icon for the rebellious underground. While hoods have existed throughout history, Champion Products claims to have created the hoodie in the 1930s.[6] It was originally designed for laborers and athletes working in harsh environmental conditions, but the garment eventually made the


jump to personal wear when high school athletes began giving their hoodies to their girlfriends. During the mid-1970s, the hoodie began to gain its underground identity on the streets among muggers and graffiti artists in an attempt to conceal their identities and maintain a low profile. The hoodie was also iconic in the film Rocky and helped to add to that lower-class “us vs. the world” aesthetic. Since then, the hoodie has been adopted by other groups like skaters, punks, rappers, and street artists. The connecting thread among these groups has been their use of limited means to express themselves, along with their complicated relationship with law enforcement. More recently, the controversy surrounding the shooting of Trayvon Martin hinged largely on the supposedly sinister appearance his hoodie gave him. As a result, many people have donned hoodies to show their support for Trayvon Martin’s cause, even if it came as a violation of a dress code. Despite all of this, the hoodie is still appropriate for warmth and everyday use. Denis Wilson of The New York Times probably best typified the hoodie with an analogy to Rocky Balboa himself: “Rocky Balboa is beloved as much for his average-Joe, big-lug appeal as for his bone crushing and face pounding. And sometimes a hoodie is just soft and warm.”

Bras

There’s no question that women today have much more freedom than their historical counterparts, and much of that freedom can be seen in the rise in popularity of the bra. The predecessors of bras first appeared in ancient Greece as a wrap that women would wear around their chest. In the 1500s, corsets became the staple for upper- and middle-class women catering to the standards of beauty. Despite physicians blaming corsets for the multitude of health risks associated with them, it took a world war to end their prominence. Since corset frames were typically made of metal, the US War Industries Board requested in 1917 that women refrain from purchasing them in


order to use the metal for military supplies. This opened up a niche for the bra to gain popularity and freed up enough metal to build two battleships! The modern bra was first patented in 1914 by Caresse Crosby. Crosby was dismayed to find, before a debutante ball, that her restrictive corset was poking through her gown and, with the help of her maid, fashioned the first modern bra by sewing together two handkerchiefs with pink ribbon.[7] Women were amazed at how freely she danced and moved and offered to buy her bras. She started a business for her backless brassieres. However, the business was short-lived, and she eventually sold her patent to the Warner Bros. Corset Company for $1,500. Early on, bras were fashioned from stretchable material and came in only one size, but during the 1930s, many developments, including elastic bands, cup sizes, and padded cups, helped to make bras more wearable. Throughout the years, there have been other innovations, but the bra itself has remained largely unchanged. As of now, about 95 percent of women in the Western world wear bras, but with recent trends such as #FreeTheNipple, the future popularity of the bra is questionable.

Boxers And Briefs

Chances are that you’ve heard the age-old, tell-all question, “Boxers or briefs?” You may be surprised, however, to find that this question is only as old as the early 20th century, when the two items were first invented. Loincloths, braies, codpieces, and knee-length drawers were the predecessors to boxers and briefs, which weren’t even invented until the 1920s and 1930s.[8] Interestingly enough, boxers came first, in 1925, when Jacob Golomb, the founder of Everlast, replaced the original leather-belted shorts worn by boxers with


ones that featured an elastic waistband. Then, in 1934, Arthur Kneibler invented tighty whities after receiving a postcard depicting a man in a bikini-style bathing suit. The bathing suit inspired him to create a snug, legless undergarment with a Y-shaped fly. This garment was called the Jockey and sold extremely well compared to its pugilistic counterpart, which was criticized for having so little support for men’s goods. Boxers and briefs really came into their own during the 1970s and 1980s, when designers like Calvin Klein began to make underwear more of a statement you would show off rather than hide under your pants. As a result, Joe Boxer jumped on the bandwagon and into the world’s eye with boxers that had US $100 bills printed on them. The Secret Service found that these boxers violated counterfeit laws and seized them from the company. Following the ideology that no publicity is bad publicity, Joe Boxer made this into a lighthearted issue, showing that boxers were the fun alternative to stuffy briefs. Underwear also has a further use outside of protecting a man’s valuables: It can be used to indicate the valuables of a nation. In 2008, former Federal Reserve chairman Alan Greenspan explained that a good indicator of economic health is the state of the men’s underwear industry. His reasoning was that men will usually wear ratty, old underwear until they are physically impossible to put on, and thus, replacing their underwear is done more out of luxury than necessity. Therefore, if there is a drop in the economy, one of the first things men will stop purchasing is new underwear. This was exemplified during the recession in 2008; men’s underwear sales fell 12 percent that year.

Tank Top

With its sleeveless design and thin material, the tank top is the optimal piece of clothing for hot environments. Go to any warm location, and you are bound to see multiple


men and women wearing it. Its name leads one to believe that it originated from the military, but tank tops actually stemmed from the growing freedoms of women in the early 1900s. In 1912, the Olympic Games in Stockholm added women’s swimming to the competition. Twenty-seven women donned swimsuits with a top that looked much like today’s standard tank top, which afforded them the movement needed to swim competitively. With this exposure, rebellious women began to practice various forms of immodesty, and this swimsuit was one of their forms of rebellion. The swimsuit was referred to as a tank suit because back in the day, swimming pools were actually called swimming tanks. Tank tops gained popularity in the public eye when men began to wear them in films. The garment was usually associated with the villains in movies, who were often depicted being physically abusive to their wives. This is why Americans began to colloquially refer to them as “wife-beaters.”[9] It wasn’t until the 1970s that tank tops became a standard item of clothing, however, and their evolution into the 1980s reached global proportions when the German army began selling surplus tank tops known as the Bundeswehr Tank Top. Tank tops have continued through to the modern day and are still one of the most common items of clothing you’ll find when out in warmer climates.

T-Shirts

Arguably the most popular article of clothing in the modern day, T-shirts have expanded to include various styles, designs, and cuts while crossing cultural and socioeconomic boundaries. The T-shirt has modest beginnings that stem from workers who modified their long johns into two pieces so that they could be worn in warmer weather. The top half was modified by the Cooper Underwear Company in 1904, and the “bachelor undershirt” was created.[10] The bachelor undershirt was a simple pullover shirt with


no buttons or safety pins and thus did not require any sewing know-how to own and maintain. Shortly after this, the US Navy adopted the undershirt as a part of the uniform, as they were employing many young bachelors who knew little to nothing about sewing. The Army also adopted the undershirt after seeing its success in the Navy. The first known mention of the T-shirt was in F. Scott Fitzgerald’s novel This Side of Paradise as something the main character takes with him to school. As we have seen with other items of clothing, sports stepped in to advance the design of the T-shirt. The University of Southern California made a request to Jockey International Inc. in 1932 to create a T-shirt for their football players to wear under their padding. Thus, the crewneck was born. Up until this point, T-shirts were popular as an undergarment, but it wasn’t until soldiers in World War II returned home and began wearing them casually that the popularity of the T-shirt as outerwear began to take root. Marlon Brando’s portrayal of Stanley Kowalski in A Streetcar Named Desire furthered the popularity of the garment. Companies and businesses were not far behind in realizing the economic potential of placing their logos and designs on these shirts. Nowadays, modern T-shirts serve as easily wearable, fundamental articles of clothing with enough variation in their designs to make them as unique as each person who pulls one over their head.


10 Body Parts That Are Secretly Awesome SAM WHYTE

Some body parts get all the attention, whether it’s the famous essentials like the heart, brain, and liver or the beauty of smiles or athletic musculature. However, there is a whole world of phenomenal body parts that deserve some more attention. These unsung anatomical heroes might not be the most eye-catching, but they’re why you don’t walk into walls, choke every time you eat, or simply keel over dead while you’re reading this article, among other things. Here is a list of ten of the most underappreciated, interesting, and important parts of the human body.

Vestibular System

Ever wondered how you know where your head is in space? How you don’t get dizzy every time you nod or tilt your head? Or why you can’t walk in a straight line after spinning in a circle for a long time? The answer is the vestibular system (VS), a minuscule, complex setup composed of three semicircular canals and two chambers in each inner ear. The VS sits behind your eardrum, just next to the cochlea. The semicircular canals are three round tubes filled with liquid, which lie in different planes, enabling sensation of movement in all directions. There are special areas called maculae (not to be confused with the maculae in the retinas) at the end of the tube loops which are covered with sensory hairs. On top of the hairs is a jelly-like substance with tiny weights in it called otoliths. When you move your head, the semicircular canals and maculae move, but the fluid and jelly lag behind. This lag bends the sensory hairs and sends a message to your brain about the direction your head is moving. When you stop moving (or accelerating) and keep your head in a particular spot, the effect of gravity on the weighted jelly tells your brain where you are in space. So, what happens when we spin in a circle and get dizzy? Ask a friend to spin in a tight circle, either on their feet or in an office chair, for over 30 seconds and then suddenly stop and try to focus on a fixed point. They will feel dizzy and struggle to walk in a straight line, and if you look closely, you will see their eyes flicking from side to side (a phenomenon called nystagmus). This happens because your VS has stopped moving, but the fluid inside the loops has enough momentum to keep moving. This tells your brain you are spinning, but your eyes and cerebellum don’t agree, so you feel completely off-balance, and your vision is distorted.[1]


Kneecaps

If you have ever fallen on your knees or had that sickening feeling of sliding a chair under a desk and colliding with an unfortunately placed table leg, you’ve probably been grateful for their protection. However, kneecaps are much more than built-in, rudimentary kneepads! It’s all a matter of leverage. The main function of the kneecap, technically called the patella, is extension of the knee (straightening the leg). The kneecap is tethered to the shinbone (tibia) by a strong tendon, and the top of the kneecap is connected to a major muscle in the quadriceps group. Your “quads” are a group of four muscles, hence the name. The patella increases the effective force with which the knee can extend by 33 to 50 percent due to the increased leverage around the joint.[2]
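As a rough sketch of where those percentages come from (the centimeter figures below are illustrative assumptions, not measurements from this article), the knee's straightening torque is simply the muscle force multiplied by its lever arm:

\tau = F \times d

\frac{F \times 4.0\ \text{cm}}{F \times 3.0\ \text{cm}} \approx 1.33 \qquad \frac{F \times 4.5\ \text{cm}}{F \times 3.0\ \text{cm}} = 1.50

In other words, if the patella holds the quadriceps tendon roughly 4 to 4.5 centimeters from the knee's center of rotation instead of about 3 centimeters, the same muscle pull delivers about 33 to 50 percent more torque, which is where figures in that range come from.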

Cerebrospinal Fluid

Amid all the flesh, blood, and guts in the human body is this beautiful, crystal-clear fluid. Cerebrospinal fluid (CSF) is produced in ventricles deep within the brain and circulates around the brain and spinal cord. CSF has many functions, including protection, as it provides an area of shock absorption for the brain when the skull is hit or shaken. It also works to provide nutrients and clear waste from the brain and spinal cord in a similar way to blood in other parts of the body. The CSF is produced and absorbed in an exquisite balance to maintain the correct pressure to surround and support the central nervous system (brain and spinal cord). Doctors sample CSF by performing a procedure called a lumbar puncture—inserting


a needle into the spinal canal, below where the spinal cord ends, and collecting some of the fluid.[3] It can be used to identify people who have an infection (such as meningitis), a bleed around the brain (hemorrhagic stroke), and other conditions.

Uterus

Most women are not particularly fond of their uterus, as it is often a source of pain or problems, but it deserves a prized place on this list. The most obviously remarkable feature of the uterus is its ability to expand from approximately the size of a woman’s fist to fill most of the abdomen and some of the thorax during pregnancy and contain a full-grown fetus, placenta, and amniotic fluid. The proliferative capacity of the uterus is unrivaled in the human body. The muscular function of the uterus is also truly unique. Most people are familiar with the pain and power of contractions during labor (which are in themselves a remarkable feat of physiology), but a less well-known muscular function occurs directly after birth. After the placenta detaches from the inside wall of the uterus, there is a huge risk of bleeding (postpartum hemorrhage), as multiple large blood vessels are exposed. If that happened on your arm or leg, what would you do? Apply pressure. The uterus applies pressure to itself! Straight after delivery of a baby and placenta, a surge of hormones causes intense contraction of the uterus, which compresses the blood vessels and helps them heal and close.[4]

Valves


Most of us are grateful for our sphincters (or should be), but what about our valves? The cardiovascular system is essentially plumbing, and one-way valves keep things flowing in the right direction. We have four very strong pumps (the heart) which work in coordination to pump blood in a figure eight to the lungs to exchange gas and then to the rest of the body, supplying nutrients, removing waste, and keeping everything in balance. Blood is pumped out of your heart into arteries, which expand and contract as the heart pumps. This is why you can feel a pressure wave in them, your “pulse.” As blood moves away from your heart, arteries branch into smaller and smaller vessels until they pass through extremely fine tubes called capillaries that are only a cell wide. This is when exchange happens between blood and the tissues it supplies. Blood needs to move slowly here and no longer has a pulse due to the large surface area of the microscopic capillaries. On the way back to the heart, blood travels in veins, which converge into larger and larger vessels. However, there is not a lot of pressure driving blood back to the heart, and most of the blood needs to overcome gravity to return. To deal with this, veins have one-way valves which keep blood flowing in the right direction. Sometimes you can see valves in people’s arms, particularly when you have a tourniquet on for a blood test; they look like little knobbles along an otherwise straight vein. There are also four essential one-way valves within the heart. Each of the four pumping chambers in the heart has a one-way valve which snaps shut when it contracts to prevent blood from being pumped out in the wrong direction. The chambers in your heart work in pairs, and it is the sound of these valves snapping shut during the


pumping action that you hear as the two “lub-dub” heart sounds. If there is anything wrong with how the valves work, you can hear added heart sounds, and the pump will work less effectively.[5]

Lens

If you’ve ever had glasses fitted, you know how arduous the process is to find exactly the right lens to correct your vision. Much like the lenses in glasses, you have lenses within your eyes. They are transparent, convex structures that bend light to focus images onto the back of your eyeball, the retina, which sends the information to your brain to be interpreted as vision.[6] Unlike glass or polycarbonate lenses, our anatomical lenses are elastic and able to change their shape to focus on objects at all different distances. As we age, the lens gradually loses elasticity. This is why most people require glasses to assist with reading as they get older; the lens is less able to recoil or “bounce back” into its thickest form, which is required for near vision. Glasses help by bending the light more before it enters the eye.
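For a rough sense of the numbers (these are typical textbook estimates, not figures from this article), the extra focusing power needed for near vision can be expressed in diopters, the same unit used for glasses prescriptions:

P_{\text{extra}} \approx \frac{1}{d_{\text{near}}} = \frac{1}{0.25\ \text{m}} = 4\ \text{D}

Focusing on a page 25 centimeters away takes about 4 diopters of added power. If an aging lens can only contribute about 1 diopter of that on its own, reading glasses of roughly +3 diopters make up the shortfall, which is why common reading glasses tend to fall in the +1 to +3 range.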

Ciliary Muscle

How exactly do our lenses manage to change shape? This is achieved by the ciliary muscle, a rim of muscle around the lens which contracts and relaxes to make the lens thicker or thinner.[7] This, in turn, bends beams of light entering the eye more or less, to keep images in focus. This muscle movement, known as accommodation, is one of the most sophisticated motor functions in the body. Indeed, our eyes are among the most complex organs in our bodies.


Epiglottis

Anatomically, our trachea is in front of our esophagus, so every time we swallow, our food or water needs to pass over our windpipe and into our food pipe. If this action is not coordinated, we choke. The epiglottis is a flap of elastic cartilage which projects from the top of the larynx (the top part of the windpipe). When you swallow, the larynx is pulled upward. This is why you can see people’s throats move up and down when they swallow. The “Adam’s apple” is a prominence of cartilage in the larynx which makes this action more obvious in males. When the larynx is pulled upward, the epiglottis is folded over the entrance to the windpipe so that food and water pass over it, into the esophagus. This is also why, in first aid, it is important to lie an unconscious person on their side in the recovery position when appropriate: it keeps their airway open and allows any water or secretions to drain out of the mouth instead of into the airway.[8]

Diaphragm

The diaphragm is a large area of fibrous and muscle tissue which separates the abdominal and thoracic cavities, and when it twitches, we get hiccups. Although the rib cage expands and contracts, the diaphragm is the main muscle responsible for breathing. When relaxed, the diaphragm is dome-shaped, curving up into the thoracic cavity. When it contracts, the muscle flattens, increasing the intrathoracic volume and creating a sucking action, drawing air into the lungs as they expand. The diaphragm also helps to regulate pressure in the chest and abdomen


when vomiting, coughing, urinating, and passing stool. When you look at an X-ray of the chest, the diaphragm is higher on the right than the left, due to the location of the liver.[9] Every time you breathe, all your abdominal contents below the diaphragm move slightly as you inhale and exhale.

Skin

It’s the largest organ in the body, and although it’s one of the more highly recognized body parts on this list, its importance is often overlooked. The skin has six primary roles; if any of them stopped working, you would get very sick or even die.[10] Firstly, skin provides a barrier against physical, thermal, chemical, and radiation sources of potential trauma encountered in daily life. Also, skin regulates your body temperature. As annoying as we find sweating to be, it is actually essential to maintaining our normal physiology and is also involved in another primary function: maintaining the body’s fluid and electrolyte balance. Skin also has multiple immune functions, acting as both a physical and immunological barrier against infection and allergic triggers. Metabolic functions of the skin include the production of vitamin D and other proteins that cells need to work. Finally, the skin is the most diverse sensory organ in the body, capable of sensing heat, cold, light and firm pressure, pain, and vibrations.


10 Foods We Eat That May Lead To Poisoning Or Death SHARON SEALS

Many consumers take for granted the safety of the foods found at local grocery stores. We assume that they would never deliberately sell us toxic products. We also believe that commonly encountered ingredients could never be harmful. The truth is that we eat many mainstream products in our daily lives that could lead to poisoning or even death. Here are 10 of these surprising foods and spices.

Cinnamon

Cinnamon comes in two forms: “regular” and “true.” Ceylon is “true” cinnamon, and cassia is the “regular” alternative sold by most grocers. Ceylon is often pricier, so most people are eating the cassia alternative. While cinnamon does have many benefits, it can also be a contributing factor to certain health issues. For example, cassia cinnamon contains a compound called coumarin. Small doses are not harmful and may even produce health benefits. But studies on coumarin have shown that substantial intake may lead to an increased risk of liver damage and possibly cancer.[1] Substantial use over an extended period is the concern of most experts. Due to the elevated risk associated with high consumption, they recommend ingesting cinnamon in moderation, especially for those with liver problems. Anyone with a liver condition should be very wary of cinnamon, as it could worsen the situation. For individuals in this category, it may be wise to avoid this spice.


Mushrooms

Mushrooms are a delicacy that come in many shapes, forms, and sizes. Ranging from super cheap to ultra expensive, these little fungi are found in a multitude of dishes. Generally speaking, the fresher, the better—because no one wants to eat a slimy and moldy mushroom. However, every once in a while, these tasty morsels may sit too long with broken plastic wrapping or get canned improperly. The result can be slimy mold on the skin and growth of the bacterium Clostridium botulinum. C. botulinum is found in the intestinal tracts of animals and can be left behind on fresh produce to grow under the perfect conditions. Moldy mushrooms may be an indicator of this deadly bacterium, which produces botulinum toxin, a neurotoxin that prevents the nervous system from working correctly. Also known as Botox, this toxin is often used in cosmetic procedures. While small, controlled doses and injections are not usually toxic, a large intake of improperly stored or canned mushrooms can lead to muscle paralysis and difficulty breathing.[2]

Potatoes

Potatoes and a variety of other vegetables are members of the toxic nightshade family. Despite the deadly connection, these starchy vegetables are usually very safe to eat. Greening potatoes are another story, though. We usually dismiss a greenish hue on potatoes as chlorophyll due to light exposure. However, consumers should be wary. This coloring may also indicate signs of damage that could mean a rise to dangerous levels of a toxic glycoalkaloid called solanine.[3] In foods like potatoes, solanine content is rarely an issue. But if high levels of this toxin are ingested at once, it can be harmful to the body. For anyone who eats substantial amounts of these tainted spuds or has sensitivities to nightshade family


members, a reaction can cause everything from headaches to gastrointestinal problems. As a result, it is wise to avoid green potatoes, especially in large quantities. Anyone thought to have allergies may also want to reconsider before including them in their diet. To be on the safe side, be choosy when buying potatoes from the store and cut away any green parts. If an area still tastes bitter after peeling, it may be safest not to eat it.

Nutmeg

Nutmeg is a universal spice used in everything from sweets to curries. It is also often used in medicine around the world to treat nausea, diarrhea, and other stomach issues. In earlier years, it was even known as an anesthetic in dentistry. For anyone allergic to nutmeg, it is also an unpleasant hallucinogen. Nutmeg contains myristicin. In substantial doses, or for people with allergies, myristicin can be deadly when ingested. Overdoses of this toxin can contribute to many unpleasant side effects that are also known as acute nutmeg poisoning. The symptoms may include hallucinations, drowsiness, delirium, and even unconsciousness.[4] Nutmeg can create a “peyote-like” high, but the aftereffects are said to be very unpleasant. As a result, most people use nutmeg for its taste instead of as a recreational drug. Anyone thought to be sensitive to nutmeg should ask about the ingredients in homemade products to ensure that there is no substantial use of the spice. This is especially relevant around the holiday season.


Alfalfa Sprouts

These tasty little greens are often added to salads, soups, and even burgers. For many nutritionists, alfalfa sprouts also make their lists of “superfoods.” However, eating them raw has raised a few health concerns. For one thing, alfalfa sprouts are likely to become contaminated with E. coli. If the grower and the consumer take proper precautions, though, this shouldn’t be too much of an issue. Many store-bought veggies are just as likely to pose a similar risk. However, the most significant concern with alfalfa sprouts is that they contain a toxin called L-canavanine. This nonprotein amino acid naturally occurs in many plant species to provide a defense against insects. But it can also cause severe responses in people with autoimmune conditions.[5] Studies on animals with autoimmune tendencies have shown that consuming vegetables containing L-canavanine caused an increase in flare-ups of their conditions. Some people have even found it to be a contributing factor to the development of specific diseases like lupus. Specialists are still conducting studies in humans to nail down the exact connections between raw sprouts and autoimmune problems. For now, though, they recommend that anyone with a compromised immune system steer clear of these little sprouts.

Cassava

Cassava is another starchy vegetable not commonly seen in American kitchens. However, many people around the world use this strange root in their cooking.


People eat cassava in many different forms. Unfortunately, it contains a deadly toxin known as linamarin. Few individuals realize that cassava is fatal if improperly prepared. Linamarin is like sugar in its makeup and structure. When cassava is ingested raw, the human body converts the linamarin to the deadly poison cyanide. Chemical companies use cyanide to create fertilizers, pesticides, and fumigants, and it has even been used as a potent chemical weapon. When prepared correctly, the cyanide is no longer present in the cassava root. If it is not adequately cooked, a meal of cassava can turn into a story with an unfortunate ending.[6] For those preparing or trying cassava, know that it is a very healthy and filling food that is eaten regularly without issues around the world. Just remember that it can be deadly, too, so you should ensure that it is correctly prepared.

Mangoes

The mango plant is a part of a genus that belongs to the Anacardiaceae family. This family produces fruits called drupes, which are known for their fleshy outsides and stony insides. Blackberries, cashews, and mangoes are all in this category of tasty treats. Unfortunately, sumac and poison ivy are also members. A few plants in the Anacardiaceae family produce a substance known as urushiol—the white, sticky substance that oozes from the mango rind. Allergies to urushiol are not an issue for the majority of the population, but anyone with a sensitivity to it will break out in a blister-like rash. Many individuals only experience these problems, known as mango itch, when dealing with the skin of the fruit. Wearing gloves when peeling makes these effects easy to avoid. For those with a hypersensitivity, contact with the rind, leaves, and flesh can lead to rashes and even anaphylactic shock. If you experience symptoms of mango itch, avoid dealing with the skin and overeating this raw, tasty treat in the future.[7]


Sweet Potatoes

This potato alternative isn’t actually a potato at all but a member of the bindweed or morning glory family. Many people enjoy this food as a traditional holiday favorite, and some prefer to eat it for its health benefits. Although sweet potatoes contain many vitamins and nutrients, consumers need to be wary of the potential health hazards from mold growth. Due to many different storage and age factors, mold can grow on the skin of sweet potatoes. This specific mold can cause hepatotoxicity when ingested. The deadliness of sweet potato mold was mainly discovered thanks to cattle herds. In more than one reported case, herds were found to be suffering from unexplained respiratory issues. These issues were eventually traced back to moldy sweet potatoes.[8] This toxic fungal growth is also unsafe for human consumption. Although most people recognize moldy food and tend to throw it away, a piece or two may sometimes slip through the cracks. As this mold may cause hepatotoxicity, it is wise to check the skin of sweet potatoes thoroughly. Discard the tuber if there are any doubts about the state of the peel.

Red Kidney Beans

Many favor red kidney beans for their use in tacos, chili, soups, and more. With these beans found in most stores and rarely seen as harmful, few people realize that proper bean preparation is essential. Consuming raw kidney beans can be fatal. Beans contain phytohaemagglutinin, a natural toxin found in many legumes. The amount of this lectin found in most of these foods is not very toxic, but it is highest in raw red kidney beans. The USDA refers to this compound as kidney bean lectin. Cooking and soaking break down this lectin, but it is still present if the legumes are undercooked. As little as a handful of undercooked beans can induce a reaction and


cause food poisoning to occur.[9] To prevent issues, follow the cooking directions on the packaging. Soak beans overnight, and always cook for the recommended amount of time at the proper temperature.

Quail

Although quail isn’t commonly found on everyone’s dinner plate, many people still enjoy hunting for and eating this delicacy. Quail themselves do not present any potentially harmful issues. What may cause problems is what these birds have been eating. Quail are small scavenging fowl that consume seeds, various grains, and random insects. During migration, they cover long distances and add other foods to their diet, including hemlock. Hemlock is highly toxic to most animals, but quail show resistance to the plant and appear to eat it without adverse effects. Unfortunately, humans do not share this trait. As a result, quail poisoning (aka coturnism) occurs when a person eats one or more of these tainted fowl.[10]

Reports of coturnism have appeared throughout history, though the connection to quail was rarely made. Many quail connoisseurs don’t realize that they could be eating tainted meat. Suspected cases often describe dinner guests suffering from vomiting, muscle soreness, and pain. These symptoms are hard to pinpoint, but many experts have linked them to eating tainted birds. Coturnism is rarely seen, but it is well documented enough that it should be recognized in the culinary community. If tainted quail is eaten in small amounts, a person may experience nothing more than indigestion. However, the unlucky few who consume too much of the fouled fowl can suffer permanent damage to the nervous system and other parts of the body. In the worst-case scenario, coturnism can lead to coma and even death. As a result, experts warn us to be wary of quail in migration mode.


10 Diseases That Prevent Other Diseases ASHLEY HOPKINS

Genetic disorders are passed down from generation to generation. Sometimes, only one parent passes down the faulty gene, which creates carriers of a genetic disease. Carriers of certain genetic disorders have been shown to be more resistant to particular viral or infectious diseases. Although many genetic disorders can be very harmful, there can be some benefits to either being a carrier or showing full symptoms of a disease. Similarly, infection by certain pathogens will sometimes grant the sufferer resistance to other illnesses down the road. The following diseases have been shown to promote some degree of resistance against other illnesses. Some of the viruses mentioned remain incurable, and studying the factors that grant resistance to these pathogens can help researchers develop more effective treatments. So here are ten diseases that prevent other diseases.

Sickle-Cell And Malaria

Photo credit: Ute Frevert, Margaret Shear

People who carry the sickle-cell gene have been shown to be more resistant to malaria. Sickle-cell is a condition in which red blood cells are misshapen, becoming crescent-shaped and more prone to clumping. According to the Centers for Disease Control and Prevention, the sickle-cell trait provides about 60 percent protection against dying from malaria. This is why areas with high rates of malaria transmission (Central and South America, Africa, Asia, and the Indo-Pacific region) also have large numbers of sickle-cell carriers.

The protection comes down to haem, a component of hemoglobin. In carriers, free haem stimulates the enzyme haem oxygenase-1, which breaks haem down and releases carbon monoxide into the blood in the process. That carbon monoxide appears to play a critical role in preventing the most severe effects of malaria.[1] A group of scientists tested this in mice and observed exactly these results.

Tay-Sachs And Tuberculosis

Photo credit: Yale Rosen

Tay-Sachs carriers have shown signs that their Tay-Sachs gene protects against Mycobacterium tuberculosis, which, you guessed it, causes tuberculosis. Tay-Sachs disease destroys neurons in the brain and spinal cord and is more common among Ashkenazi Jews, probably because this population was historically isolated and rarely married outside the group. There is a documented correlation between the widespread Tay-Sachs gene and tuberculosis in this particular population.

Tay-Sachs carriers produce a particular subunit of the enzyme hexosaminidase. This subunit is closely associated with the prevention of tuberculosis because it destroys the Mycobacterium and makes other bacteria on the cells’ surfaces less active.[2] So, despite the increased incidence of tuberculosis among Ashkenazi Jews, there are fewer deaths from the disease.

Cystic Fibrosis And Cholera


Photo credit: Ronald Taylor, Tom Kirn, Louisa Howard

Carriers of cystic fibrosis have been shown to survive infection with Vibrio cholerae, the bacterium that causes cholera. Cystic fibrosis clogs channels in the respiratory system with thick mucus. The mucus builds up in the lungs and creates a breeding ground for bacteria. It also affects the digestive system by blocking the enzymes that digest food in the small intestine. Carriers, however, do not experience the effects of this disease, and they may not experience the full effects of cholera, either.

Cholera is deadly because it can cause the patient to lose about 19 liters (5 gal) of water a day, ultimately leading to fatal dehydration. Cystic fibrosis blocks chloride channels, keeping fluids in. As a result, carriers of the cystic fibrosis gene who are infected with cholera lose only about half as much fluid. That limited secretion is still enough to flush the cholera toxins from the intestines without causing life-threatening dehydration.[3] So, just one cystic fibrosis gene can prevent the lethal effects of cholera by preventing the dehydration associated with it.

Cystic Fibrosis And Tuberculosis

According to New Scientist, cystic fibrosis does protect against cholera, but cholera doesn’t kill enough people to explain the prevalence of the cystic fibrosis gene. Tuberculosis might. Between 1600 and 1900, about 20 percent of deaths in Europe were caused by tuberculosis. If carrying the gene helped people survive the disease, carriers would have lived to maturity and passed on their genes, which would explain why the gene is so common. Those who inherit two copies of the cystic fibrosis gene die before being able to pass on their DNA, and the same was true for many people who contracted tuberculosis.


However, those who have only one copy of the gene show some resistance to tuberculosis, which is why the gene remains prevalent among Europeans and those of European descent. The cystic fibrosis gene should have died out long ago, yet it has lasted for thousands of years, so it must offer some benefit.[4] That benefit is thought to be resistance to tuberculosis.

Cowpox And Smallpox

Photo credit: George Henry Fox

Cowpox, a viral skin infection, is essentially a milder relative of smallpox. Although cowpox isn’t pleasant to contract, the human body usually stops the infection on its own after a certain period of time, so the infection itself is not lethal. Cowpox can prevent smallpox because the two viruses are so closely related. By fighting off cowpox, the immune system develops defenses that also recognize smallpox, so when the deadlier virus is introduced, it is much easier for the body to prevent severe effects. Famously, Edward Jenner used cowpox to create the smallpox vaccine in the late 1700s.[5]

Phenylketonuria And Mycotic Abortions

According to an online study, “Physicians observed that women who were PKU [phenylketonuria] carriers had a much lower than average incidence of miscarriages.”[6] PKU is a genetic disease in which the body cannot make a working version of the enzyme that breaks down phenylalanine. As a result, phenylalanine builds up whenever the patient eats a large amount of protein, and the buildup can be lethal.

Although PKU may cause significant health issues, carriers have an advantage when it comes to protecting themselves against mycotic abortions, pregnancy losses caused by fungal infection. This matters most in Scotland and Ireland, where the damp climate is a prime environment for fungi, some of which can cause mycotic abortions. Phenylalanine, the very substance that causes the health issues in PKU patients, counteracts the major toxin produced by most of the fungi behind these spontaneous abortions. Since PKU carriers have elevated phenylalanine, they are better able to fend off infectious fungi and protect their unborn offspring.

Myasthenia Gravis And Rabies

Photo credit: Centers for Disease Control and Prevention

There is a correlation between myasthenia gravis and protection from rabies. Myasthenia gravis is a muscular disease in which the voluntary muscles become weak and easily fatigued. This is caused by faulty connections between the nervous system and the muscles. Rabies, meanwhile, enters the nervous system most readily through the skeletal muscles, probably because the virus is usually transmitted through animal bites.

Since rabies is usually introduced into muscle tissue by a bite, those with myasthenia gravis are much less susceptible to it: their faulty connections between muscles and nerves make it very difficult for the virus to reach the nervous system in the first place.[7] Although the muscles are not the only entry point rabies has into the central nervous system, they are a significant entry point for the peripheral nervous system. Blocking that route can prevent the infection or slow it down until the patient can seek medical attention.

Niemann-Pick Disease And Ebola

Photo credit: Daniel Bausch, Division of Viral and Rickettsial Diseases, National Center for Infectious Diseases, CDC

Niemann-Pick is a disease in which cholesterol abnormally accumulates inside lysosomes. The cholesterol builds up because of a shortage of a specific protein, NPC1, which normally transports cholesterol out of the lysosomes. The NPC1 protein has also been shown to be central to the Ebola infection process. The Ebola virus has been documented to infect the fibroblasts of Niemann-Pick patients poorly, while it fared much better in fibroblasts where NPC1 was abundant. Basically, without NPC1, the Ebola virus cannot infect cells efficiently, so people with Niemann-Pick disease are far harder to infect.[8]

Niemann-Pick Disease And Marburg

Similar to the case with Ebola, Niemann-Pick disease promotes resistance to Marburg. Marburg is a filovirus, like Ebola, and has a high mortality rate.[9] It causes hemorrhages and severe shock syndrome and is mostly fatal among humans and nonhuman primates. Much like with Ebola, Niemann-Pick patients resist Marburg because they lack NPC1, the protein filoviruses rely on to reproduce and spread. A virus that cannot reproduce is far less deadly, so it is much easier for patients with Niemann-Pick disease to fight off Marburg.


Congenital Disorder Of Glycosylation 2b And Viral Infections

Congenital disorder of glycosylation 2b (CDG-IIb) has been shown to protect against viral infections such as HIV, influenza, herpes, and hepatitis C. This extremely rare disease confers resistance because of a “defective mannosyloligosaccharide glucosidase (MOGS), which is the initial enzyme in the processing phase of N-linked oligosaccharides.”[10] This basically means that glycoprotein synthesis cannot function properly. Viruses depend on proper cell glycosylation to reproduce, and because CDG-IIb patients do not have proper glycosylation, these viruses cannot sustain themselves. Studies show that people with CDG-IIb responded normally to nonreplicating viruses but did not respond to live vaccines made from glycosylation-dependent viruses. MOGS inhibitors likewise prevent enveloped viruses from replicating in infected cells, which means the viruses are unable to spread.


10 Amazing New Food Innovations That Will Make Your Mouth Water GREGORY MYERS

Food and drink are some of the most essential parts of life—we could not live without the sustenance they provide. Although some people in the world are truly starving and would eat almost anything, people in most industrialized nations have made something of a hobby out of eating. We spend millions on new designs and innovations to create foods and cooking techniques that the world has never seen before. Here are 10 that will make your mouth water.

A Swiss Chocolate-Maker Has Perfected Its Formula For A New ‘Ruby’ Chocolate

Photo credit: The Guardian

For the longest time, we have been stuck with nearly the same varieties of chocolate, although you won’t hear most people complaining about that. Of course, there is the delicious white chocolate. Most varieties of milk or dark chocolate are fairly similar—they just have more actual cocoa in them.

Then the Swiss chocolate-maker Barry Callebaut came up with a formula for an entirely new treat called “ruby chocolate.” This variety has a pinkish-red hue and, although it does have some sweetness, it also carries a bit of sourness that you don’t usually expect from chocolate. It has already seen widespread testing in Japan and South Korea and will be hitting shelves in the UK in the form of a special KitKat on April 16, 2018.[1] The new treat is supposed to have less of the usual cocoa taste without any crazy genetic modification. It is created from an existing type of cocoa bean that is processed before fermentation, a method the company keeps as a closely guarded trade secret. Perhaps if ruby chocolate catches on, Callebaut will start licensing the secret to the big-name players.

Plant-Based Burgers That Taste And Even ‘Bleed’ Like A Real Meat Patty

Photo credit: impossiblefoods.com

After adopting their new eating habits, many vegans miss the taste of foods they were once used to—American comfort foods that most of us could not imagine doing without. Vegans try to fill this void with plant-based substitutions, but most agree that a typical veggie burger really does not mimic the experience of a proper burger with a real meat patty. Enter a small group of vegan scientists who wanted that experience and hoped to convert meat eaters by giving them a proper substitute. Several years ago, they founded Impossible Foods, a Silicon Valley company, to make the perfect fake burger.

The company uses complicated food science to mimic the taste, texture, and entire experience of eating a burger with a real meat patty. Multiple plant products go into the formula, but beet juice is the key to making it seem like the burger actually bleeds.[2] They say that the real key to their success was something called heme, a building block of life found in both meat and plants, which helped them imitate the taste and texture appropriately. There is already limited testing of the Impossible Burger in New York, San Francisco, and Los Angeles. You may also get lucky and find it locally, because Impossible Foods is trying to get as many restaurants as possible to try out their product.

Grapes That Taste Just Like Cotton Candy And Are All Natural

Photo credit: npr.org

Thanks to crossbreeding and genetically modified organisms (GMOs), we now have hybrid fruits and vegetables with flavors you would never imagine finding in nature. Some people avoid these foods because they are worried about GMOs, even though scientists say they are not dangerous. Anyone worried about GMOs can breathe easy when it comes to this tasty treat, though. According to the farmer who invented them, cotton candy grapes were designed by carefully crossbreeding various species of wild grapes until the desired flavor was achieved. No genetic engineering or laboratory trickery was involved. Cotton candy grapes were conceived, cultivated, and ultimately perfected by a lone farmer who had a really good idea and worked slowly but surely on his new creation. They can be found in most grocery stores today and, due to their novelty, tend to cost a bit more than traditional grapes.[3] However, most people who have tried them feel that the taste is well worth the price. In blind taste tests, people who have no idea what they are eating still identify these grapes as tasting like cotton candy.


Ice Cream That Is Made Right In Front Of You Using Liquid Nitrogen

A few years ago, a husband-and-wife team of engineers wowed the sharks on ABC’s Shark Tank when they used liquid nitrogen to make ice cream right in front of everyone. All but one of the sharks enjoyed the delicious treats and were impressed by both the wow factor and the safety of the process.

However, the sharks were not impressed enough to offer the inventors a deal. Although they liked the idea, they felt the business wasn’t where it needed to be. The company wanted to pursue franchising, and the sharks did not agree with that business plan. Although the sharks didn’t invest, the company founders have continued on their journey. Sub Zero Ice Cream has slowly started franchising throughout the country and wants to make this exciting new experience a global brand.[4] The new method wows children and adults alike. It also removes the need for a freezer, which means less of a carbon footprint and lower energy costs for the franchisee. The process involves using liquid nitrogen to quickly freeze whatever flavor ingredients you want. Then you just stir it properly as it freezes to get that desired creamy consistency.

Edible Water Orbs That Can Replace Plastic Bottles And Are Entirely Biodegradable

Not that long ago, in relative terms, bottled water was a rare commodity used mainly by people in rural or other areas without proper access to potable water. Then, during the 1990s, bottled water became a huge trend, and now landfills all over the world are overflowing with plastic bottles. Plastic takes a long time to break down, and the amount of trash created by these containers is practically apocalyptic, even though many advocates have pushed for people to buy reusable bottles.

A company called Skipping Rocks Lab has come up with a product called Ooho, which they believe can revolutionize how we drink water on the go. To reduce the amount of waste around the world, they created a stable, flavorless orb from which you can drink water and which you can eat afterward.[5] Made from an algae-based material, it is biodegradable if you decide to throw it away instead of eating it. The creators believe that, with the right distribution, these orbs could replace plastic water bottles entirely and greatly reduce the garbage created by our massive plastic consumption.

The Anti-Griddle Is Expensive, But It Allows For Incredible And Speedy Frozen Creations


Photo credit: instructables.com

One of the more fascinating new pieces of culinary equipment is the anti-griddle. It was originally dreamed up by Grant Achatz, a guest judge on Top Chef, who used it in his Chicago restaurant. After people started to notice his invention, he worked with Philip Preston, an expert on new culinary devices, to make the anti-griddle available for the global mass market. This device lets you flash-freeze or semi-freeze foods almost instantly.

Unfortunately, with basic models clocking in at around $1,500 each, the anti-griddle is mostly found in restaurants with bigger budgets, and most people cooking at home will consider it an overly expensive luxury. However, for those who do want to try one out, the folks at Instructables have your back. They provide a step-by-step blueprint with pictures for building a functioning DIY version of the anti-griddle for about $15. If you are handy with tools and enjoy making things, you could be like the crazy chefs on TV for less than 20 bucks.[6]

Cricket Flour Helps Ease People Into A Valuable New Food Source While Tasting Delicious

As food sources become scarcer around the world, people are increasingly worried about how we are going to feed everyone. The world already struggles with hunger in many regions, and some think that climate change will make things noticeably worse within the next few decades. There is also concern that the beef industry has a gigantic carbon footprint and may not be entirely sustainable in the future. To combat these worries, some scientists have gone to disgusting lengths: they are trying to get us to eat bugs.

Once you get past the ick factor, eating bugs is not a bad idea. They are high in protein, there are tons of them around, and they are relatively easy to breed. Most people are grossed out by the idea, but entrepreneurs have found a way to get people started. They have created a flour from crickets, which helps remove the disgust factor.[7] This flour has been used in many successful products, from chips to protein bars. No one knows if it will catch on in a major way in the Western world, but it would be a great food source. As flour used for baking, it may help people get past their initial reservations about eating bugs.

Once Only For Snobby Chefs, Sous Vide Is Becoming Increasingly Mainstream

Photo credit: marthastewart.com

In the last several years, one of the most talked-about innovations in food has been the sous vide cooking trend. It involves sealing food inside plastic bags and then submerging the bags in temperature-controlled water so the food cooks slowly and consistently with little babysitting or work needed in between.

Many chefs and restaurants have started using the method because it allows them to do a lot of things at once without paying much attention, and it produces nearly perfect results every time. Quite a few people consider this method of cooking incredibly pretentious, and most home cooks are convinced that they can’t do it without a huge investment. Immersion circulators and special thermometers for the process can set you back thousands once you have a full set, but Martha Stewart has your back. She explains how to sous vide at home on the cheap.[8] You just need basic cooking thermometers, plastic Ziploc bags, and the proper knowledge. It may be easier and neater to use fancy equipment, but you can do the process just fine at home without it.

The Trend To Eat Black Ice Cream Made With Activated Charcoal Is Dangerous For Some

Photo credit: lamag.com

Recently on social media, black ice cream has become a big trend. It started as a backlash to the recent unicorn trend, especially after the debacle in which Starbucks gave several of their baristas nervous breakdowns.

The first shop to sell this creation was the Little Damage ice cream shop in Los Angeles, California, but the idea soon took off. You cannot copyright a recipe, and before long, people discovered that the ice cream was made using activated charcoal. Specialty shops everywhere started doing it. Although the trend is delicious and harmless for most people, some individuals should be cautious before they enjoy this new treat. You may have heard that grapefruit can interact badly with certain medicines. Well, ice cream made with activated charcoal can cause similar problems.[9] The reason is that activated charcoal is a potent binder: it can soak up medications in the digestive tract before the body absorbs them, rendering them ineffective. It’s not just a problem for people who are ill. The activated charcoal in black ice cream can also affect people who take vitamins or birth control pills.


Deboned Baby Back Rib Steaks That Aren’t A Mishmash

Al “Bubba” Baker is a former NFL player who spent 13 seasons as a defensive lineman for several teams, ending his respectable career with the Cleveland Browns in 1990. However, like many former NFL players, he left sports and found himself bored and wondering what to do for a new career. Bubba really liked ribs, but his wife didn’t; they were too messy. He wanted her to be able to enjoy ribs like he did, but he didn’t want her to have to accept a Frankenstein “rib” creation like the stuff you get from McDonald’s for a limited time. So, he started working on a process, which he patented, to take the bones out of a whole rib while keeping the meat intact.

He had only $154,000 in total sales when he went on Shark Tank, but the sharks were incredibly impressed by him, his product, and his patent on a food process. Ultimately, he made a deal with Daymond John, who helped him meet some important executives at the parent companies of Hardee’s and Carl’s Jr.[10] They made a multimillion-dollar deal to buy his boneless ribs for their fast-food restaurants around the country, and his sales leaped to $16 million in three years. Baker’s simple dream of helping his wife enjoy one of his favorite foods has made him a multimillionaire whose family is now set for life. And if the idea of a rib sandwich with a whole, authentic boneless rib sounds mouthwatering to you, just make your way over to a Carl’s Jr. or a Hardee’s restaurant, where sandwiches with the boneless ribs are still on the menu.


10 Fascinating Facts About M&M’S MARK LEE

The candy that “melts in your mouth, not in your hand.” We all recognize that famous tagline for M&M’S. But there is so much more to these chocolate treats than whether they’ll melt in your mouth or your hand. Here, we’ll explore their fascinating roles in the entertainment industry, the White House, World War II, and more. So grab a handful, and we’ll treat you to the sweet history of M&M’S chocolate candies.

E.T. Phone Home

When Steven Spielberg was in the early stages of making E.T. the Extra-Terrestrial, he reached out to Mars (the company that owns the M&M’S brand) and asked permission to use their product in the film. They declined, which resulted in Spielberg turning to Reese’s Pieces. Shortly after the movie’s release, sales of Reese’s Pieces tripled in just two weeks.[1] Mars justified the bad decision by stating that they believed “E.T. was ugly and would scare children.” To think that the company was so close to earning a permanent spot in 1980s nostalgia. The decision is still baffling to this day. Spielberg was a huge name even back then, having already directed Raiders of the Lost Ark, Close Encounters of the Third Kind, and Jaws. How on Earth could Mars have doubted him?

Blue M&M’S

The blue M&M’S were introduced in 1995 to replace the tan ones. The new color was put to a public vote, with purple, blue, and pink as the candidates, and blue won with 54 percent of the vote. Over 10 million people voted over the course of the allotted two months. When blue was chosen, the Empire State Building was lit up in blue to celebrate.[2] The New York skyscraper is often lit in different colors to mark certain holidays: the building is red on Valentine’s Day and yellow and white for Easter. M&M’S have frequently been associated with American patriotism, so it seems only appropriate to see the product associated with a landmark in one of the country’s greatest cities.

Van Halen


Photo credit: vhnd.com

While on tour in 1982, Van Halen requested a bowl of M&M’S in their dressing room but specified that there be “absolutely no brown ones.” While this may sound like the exploits of a group of divas, it was actually a test to see if the venues were listening to their requests and reading every word of their contracts.

Due to the long list of requirements involved in running a safe concert, David Lee Roth wanted to ensure that every request was adhered to. He hid the “no brown [M&M’S]” clause within the contract so that he had an easy way of telling whether corners had been cut.[3] It is often speculated that had this request been ignored, Van Halen would have had the legal right to cancel the concert at any point, even at the last minute. Can you imagine how upsetting it would be if you had traveled to the arena, booked accommodations, and found a decent standing spot, only to be informed that the concert was canceled because a brown M&M’S candy was found at the bottom of a snack bowl?

The Creators

Although M&M’S are one of the most popular chocolate products, most people do not know that M&M stands for “Mars and Murrie.” The candies were named for Forrest Mars Sr., son of the Mars Company founder, and Bruce Murrie, son of Hershey executive William Murrie; Bruce held a 20 percent stake in the product. It is rather interesting how the sons of two higher-ups in the US chocolate industry combined their efforts to put their own mark on the chocolate business.

It all began when Forrest fell out with his father and moved away from him and the company.[4] During the Spanish Civil War, Forrest was amazed to find soldiers eating chocolate in the summer. Back home, the lack of proper air conditioning meant that chocolate often melted in the summer heat, causing a drop in sales. Forrest Mars saw a gap in the market. After observing the soldiers eating their candy, he copied the concept of a hard sugar shell around a piece of chocolate, which kept the chocolate from melting. Upon his return to the United States, he and Bruce Murrie teamed up and brought the chocolate to market.

Seinfeld

E.T. wasn’t the only time Mars turned down an opportunity for M&M’S to be part of the entertainment industry. A Season 4 episode of Seinfeld showed Jerry and Kramer observing a surgical operation. Kramer is eating a pack of Junior Mints when he accidentally knocks one of them from the observation deck into the patient’s open body. They remain silent as the operation ends and the patient is closed up with the mint inside.[5]

This scene gave Junior Mints a lot of publicity. In fact, the episode essentially ends on a commercial: when offered a Junior Mint by Kramer, the doctor states that they can be “very refreshing.” Declining the chance for their product to be used in this episode is yet another example of Mars failing to trust others with the M&M’S brand. “What are you eating?” Jerry asks in the episode. Kramer replies, “Junior Mints. You want one?” That could have been you, M&M’S! Funnily enough, over the credits, there is a stand-up bit in which Jerry riffs on the idea that the colors of M&M’S seem completely different when you are younger. It appears as though Seinfeld ultimately won.

World War II

Photo credit: confectionerynews.com

After Forrest Mars Sr. and Bruce Murrie invented M&M’S, the chocolate was sold exclusively to the US military. It was a convenient snack food that did not melt easily, even in intense conditions, and it was easy to transport.


There appear to have been two reasons for the decision to supply the US military exclusively. Patriotism was a huge deal because the US had just entered the war. The new situation also brought rationing of ingredients such as sugar and chocolate, so to ensure that M&M’S could keep being produced, Mars and Murrie approached the US military, and the rest is history. After the war ended, the soldiers returned home with a love of the product. From there, M&M’S were distributed to the general public and quickly became one of the top chocolate products in America. The company continues to donate to the troops, giving $750,000 in 2016 alone to celebrate the 75th anniversary of Mars and Murrie creating the product.[6]

The White House

Photo credit: refinery29.com

M&M’S are the official presidential candy in the US. Prior to President Ronald Reagan, the most common presidential giveaways were matchboxes and cigarettes, often handed out on Air Force One or to visitors at the White House. However, due to Reagan’s sweet tooth, the cigarettes were eventually replaced with boxes of M&M’S at his request. The tradition continues to this day, with the current president’s signature printed below the presidential seal on the box.[7]

Yellow M&M’S


Photo credit: guacamoley.com

You know those commercials where oversized red and yellow M&M’S candies banter for 30 seconds? Well, one of those voices sounds so familiar because the yellow M&M’S candy is voiced by J.K. Simmons. The actor made his mark on the movie industry with roles such as J. Jonah Jameson in Sam Raimi’s Spider-Man trilogy and, more recently, Whiplash, for which Simmons received the Academy Award for Best Supporting Actor.[8]

As for the yellow M&M’S candy, Simmons has voiced the character for over 20 years, having taken up the mantle in 1996. Prior to Simmons, the yellow M&M’S candy was voiced by none other than Golden Globe winner John Goodman. What is it about the yellow M&M’S candy that attracts such star quality? Of course, yellow is not the only one voiced by a renowned actor. The red M&M’S candy is currently voiced by Billy West, the actor behind Futurama’s Fry.

Honey, Honey

After feeding on waste from a processing plant that had handled the remains of discarded M&M’S, a group of honeybees began to produce honey that matched the colors of the candies. Jars of blue and green honey were collected from the hives in northeastern France.[9] Unfortunately, the honey could not be sold due to industry regulations. But still, it makes for an interesting art piece! This was not the first time that oddly colored honey has been retrieved from a hive. In 2010, a group of bees in Brooklyn began to feed off the supplies at Dell’s Maraschino Cherries Company, which resulted in their honey being a bright shade of red.


The head of the New York City Beekeepers Association explained why honeybees are often drawn to artificial flavorings: “Bees will forage from any sweet liquid in their flight path for up to [5 kilometers (3 mi)].”

The Scream

Photo credit: artnews.com

When Edvard Munch’s The Scream was stolen from the Munch Museum in 2004, Mars stepped up and offered a reward for its return. However, it was not a monetary reward: in 2006, the company offered two million M&M’S in exchange for the painting. Mars had recently launched a campaign for their dark chocolate M&M’S, and it was these that they put up as the reward. The incentive worked, but the chocolate was not given to the convict who revealed the location of the stolen art.[10]

Instead, the cash equivalent of two million M&M’S (about $26,000) was donated to the Munch Museum. A spokesperson for Mars explained, “We’d never give a reward to a convicted criminal.”



