Over the years, humans have built up a collection of stock answers to medical queries that have become so ingrained in our collective psyche that they are now regarded as facts. Here we will bust five frequently repeated myths.
Old wives’ tales and superstitions have become part of the fabric of human understanding.
Today, with the endless information that the Internet has to offer, questions can be answered at the click of a mouse.
This, you might think, would spell the end of scientific and medical misinformation, but the sheer quantity of information that is now available is so bewildering that “common knowledge” has been left largely in place.
Most people are too busy to fact-check details that don’t directly impact their lives.
Here, we will briefly run through five medical “facts” that most people have taken for granted since they were children.
Firstly, we will tackle the pervasive rumor that waking a sleepwalker is a terrible idea.
Sleepwalking can be an unsettling event for the person doing the walking and anyone who happens to witness the event. Somnambulism, as it is also called, occurs in the deepest part of sleep, normally a few hours after onset.
Affecting an estimated 1-15 percent of the general population, sleepwalking is surprisingly prevalent, particularly among children.
It is common knowledge that waking a sleepwalker can give them a heart attack or put them in a coma. However, according to the National Sleep Foundation, the reverse is, in fact, true: it is dangerous not to wake a sleepwalker.
Waking a sleepwalker might confuse them, but not waking them might leave them free to fall down the stairs, smash a glass, or get in their car and take a drive (worse things have happened). That said, waking a sleepwalker can occasionally be dangerous for the person doing the waking – somnambulists have been known to act violently.
Where possible, simply guiding the sleeping wanderer back to bed is the best option. But if the sleepwalker defies being shepherded, this is how the National Sleep Foundation suggests waking them:
“Use loud, sharp noises (from a safe distance) to wake up the person […] This will most likely startle the sleepwalker, but it’s better than shaking the person in close range because that might trigger the sleepwalker to feel attacked and lash out and hurt you.”
They go on to remind us that the individual is likely to be “confused, disoriented, and scared,” so it is best to explain gently that they have been sleepwalking.
Never go swimming on a full stomach – wait at least 1 hour – otherwise you face cramps and potential drowning. That statement is repeated so often that it has been indelibly marked as “true.”
The basis of the myth is that, after eating, blood flows to the stomach to aid digestion. This leaves less blood for the muscles to use while swimming, causing them to cramp painfully.
When asked whether there is any truth to this old wives’ tale, Dr. Roshini Rajapaksa, a gastroenterologist at the New York University School of Medicine, said that if one was to swim incredibly strenuously, minor cramps might occur.
However, the standard swimmer has nothing to worry about, and drowning because of a cramp is even less likely.
A report by the American Red Cross Scientific Advisory Council reviewed multiple relevant studies and consulted a number of experts in the field; it concluded:
“There is no correlation between eating and drowning or near-drowning events.”
A note of caution: if alcohol is involved in the pre-swimming meal, the likelihood of drowning is certainly increased.
If you’ve ever glanced at the thin veins on your wrist, you could be forgiven for thinking that the blood within them is blue. We are taught this from an early age: deoxygenated blood is blue, and once the lungs have furnished it with oxygen, it is red.
However, when we cut ourselves, the blood is always red. This, we have been told, is because the blood is oxygenated as soon as it touches the air.
Despite the way things appear, none of the above is true. Blood is never blue. When it is deoxygenated, it is a deep shade of red, and, once oxygenated, it is cherry red.
So why do the veins look blue? It’s actually a rather complex answer that involves at least four factors:
- The way in which the skin scatters and absorbs light is complicated. Because the skin is made of numerous compounds with a variety of optical properties, the way that light travels through it, or bounces off it, is difficult to predict.
- Blood’s oxygenation state affects the way that light is absorbed. When it is deoxygenated, its absorption coefficient is altered.
- The depth and diameter of the blood vessels have an effect. For instance, smaller vessels near the surface appear red, whereas a larger vessel, at the same depth, will look bluer.
- The way in which humans perceive color also plays a role.
So, why veins look blue is a very simple question with a very complicated answer.
Another blood-based misconception is that the iron within the hemoglobin gives blood its red color. In fact, it is hemoglobin’s interaction with other molecules, such as porphyrin, that produces the redness.
Most people will be familiar with the classic “tongue map,” where the sections of the tongue responsible for detecting sweet, sour, bitter, and salty tastes are described. This theory is taught widely at schools and is considered to be a fact by most people.
However, in reality, we taste different flavors using taste buds spread across all parts of the tongue.
It is true that certain areas of the tongue are more sensitive to certain flavors – for instance, sweet or sour – but the differences are small, vary between individuals, and are nowhere near as neat as the famous tongue map suggests.
Also, many of us were taught that there are just four primary tastes: bitter, sour, salty, and sweet. In fact, there is a fifth – umami – a savory taste.
If children attend a birthday party and consume copious amounts of sugary drinks and snacks, their energy levels skyrocket, and they bounce off every available wall.
Contrary to popular belief, there is no scientific evidence that sugar increases children’s energy levels.
The theory has been put to the test by at least a dozen double-blind, randomized controlled trials in which the children, researchers, and parents were all unaware of the conditions.
Even studies carried out with children who had attention deficit hyperactivity disorder (ADHD), or who were deemed “sensitive” to sugar, came to the same conclusion. It seems that parents’ perception of their child’s behavior is partly to blame; added to that is the fact that the children have been at a party where they’ve had oodles of fun.
A meta-analysis of 16 trials concluded:
“The meta-analytic synthesis of the studies to date found that sugar does not affect the behavior or cognitive performance of children. The strong belief of parents may be due to expectancy and common association.”
That certainly flies in the face of common knowledge.
Although the five examples given above are only the tip of the iceberg, one must wonder what other “facts” have been taken for granted that just aren’t so.