There are few topics that cause more intense feelings of revulsion than cannibalism. The consumption of another human’s flesh is abhorrent, vile, and — to Western sensibilities — morally wrong. However, is cannibalism bad for your health?
Although the knee-jerk reaction to eating human flesh is strong, the actual morality and ethics behind those feelings aren’t as simple as they first appear.
Cannibalism occurs in many species and has been a part of human culture for thousands of years.
In some cultures, cannibalism involved eating parts of one’s enemies in order to take on their strength. In other tribes, the consumption of human flesh had a more ritual significance.
In desperate times, people have fallen back on cannibalism to survive; for instance, there are reports of cannibalism emerging from North Korea's famine in 2013, the siege of Leningrad in the early 1940s, and China's "Great Leap Forward" in the late 1950s and early 1960s.
In Europe, from the 14th century up until the early 18th century, human body parts were knowingly sold and purchased as medications, particularly bones, blood, and fat. Even priests and royalty routinely consumed human body products in an effort to stave off anything from headaches to epilepsy, and from nosebleeds to gout.
In some cultures, once a loved one has died, parts of them are consumed so that they, quite literally, become a part of you. To "civilized" minds, this might seem disturbing, but to those who practice these rituals, it is a way of honoring the dead and keeping them close.
Once we start to strip away cannibalism's ability to make us instantly recoil, we see that our feelings aren't quite as clear-cut as they seem. For instance, many of us bite and swallow our fingernails, and some women eat their placenta after giving birth. The lines are, perhaps, slightly more blurred than our initial reaction might suggest.
For the purpose of this article, we do not need to wade into the interplay between instinctive gut feelings and cold, hard logic. Here, we will focus on the negative health ramifications of cannibalism.
In most civilizations, cannibalism is the last port of call, used only if the alternative is certain death. But what are the potential health consequences of eating one’s neighbor, if any?
Although it seems “wrong,” the good news is, consuming cooked human flesh is no more dangerous than eating the cooked flesh of other animals. The same goes for the majority of the human body; the health implications are similar to that of eating any large omnivore.
However, there is one organ that should be avoided at all cost: the brain.
Kuru is an invariably fatal, transmissible spongiform encephalopathy; it is a prion-based disease similar to BSE (bovine spongiform encephalopathy), which is also known as mad cow disease.
Prion diseases are associated with the accumulation of a misfolded form of a glycoprotein known as prion protein (PrP) in the brain. PrP occurs naturally, particularly in the nervous system, and its functions in health are not yet fully understood. However, PrP is known to play a role in a number of diseases, including Alzheimer's disease.
The Fore people are the only population who have experienced a documented epidemic of kuru and, at its peak in the 1950s, it was the leading cause of death in women among the Fore and their nearest neighbors.
The word “kuru” comes from the Fore language and means “to shake.” Kuru is also known as “laughing sickness” because of the pathologic bursts of laughter that patients would display.
The first report of kuru to reach Western ears came from Australian administrators who were exploring the area:
“The first sign of impending death is a general debility which is followed by general weakness and inability to stand. The victim retires to her house. She is able to take a little nourishment but suffers from violent shivering. The next stage is that the victim lies down in the house and cannot take nourishment, and death eventually ensues.”
W. T. Brown
At its peak, 2 percent of all deaths in the Fore villages were due to kuru. The disease predominantly struck down females and children; in fact, some villages became almost entirely devoid of women.
This gender difference in the disease appears to have occurred for a couple of reasons. Fore men believed that, during times of conflict, consuming human flesh weakened them, so women and children more commonly ate the deceased.
Also, it was predominantly the women and children who were responsible for cleaning the bodies, leaving them at an increased risk of infection via any open wounds.
Kuru has a long incubation period where there are no symptoms. This asymptomatic period often lasts 5–20 years, but, in some cases, it can drag on for more than 50 years.
Once symptoms do appear, they are both physiological and neurological and are often split into three phases.

The first, ambulant stage:

- joint pain
- loss of balance
- deterioration of speech
- decreased muscle control

The second, sedentary stage:

- inability to walk
- loss of muscle coordination
- severe tremors
- emotional instability, including depression with outbursts of uncontrollable laughter

The third, terminal stage:

- inability to sit without support
- virtually no muscle coordination
- inability to speak
- difficulty swallowing
- unresponsiveness to surroundings
- ulcerations with pus and necrosis (tissue death).
Generally, the patient will die between 3 months and 2 years from the onset of symptoms. Death usually occurs due to pneumonia or infected pressure sores.
Thankfully, kuru has almost entirely disappeared. During the 1950s, Australian colonial law enforcement and Christian missionaries helped reduce the funerary cannibalism of the Fore people.
Once the practice was stamped out, or significantly reduced, the prion could no longer spread between members of the tribe. The last victim of the disease is thought to have died in 2005.
Although kuru is never likely to be a major health issue for the majority of humanity, the outbreak has proven useful to medical researchers. The relatively recent concerns around BSE and Creutzfeldt-Jakob disease have spawned a resurgence of interest in kuru.
Kuru remains the only known epidemic of a human prion disease. By understanding this disease and how it works, treatments might be designed to prevent, or at least reduce, the chances of future neurological prion-based epidemics.