Monday, October 10, 2011

The Mind-Body Interaction in Disease : 7. New Approaches to Treatment

By : Esther M. Sternberg and Philip W. Gold

IMMUNE CELLS: Altered genetic activity in immune cells is one effect of cortisol. The cortisol receptors in immune cells are folded and bound to large “heat-shock” proteins. When cortisol enters a cell and binds to its receptor, the protein is displaced and the receptor unfolds. The receptor then binds to DNA in the nucleus, changing the cell’s transcription of messenger RNA (mRNA) and production of proteins. (Other molecules, called c-fos and c-jun, bind with the receptor and confer more specificity on its action.) The proteins leave the cell and directly affect cytokine and lymphocyte production.

For centuries, taking the cure at a mountain sanatorium or a hot-springs spa was the only available treatment for many chronic diseases. New understanding of the communication between the brain and the immune system provides a physiological explanation of why such cures sometimes worked. Disruption of this communication network leads to an increase in susceptibility to disease and can worsen the course of the illness. Restoration of this communication system, whether through pharmacological agents or the relaxing effects of a spa, can be the first step on the road to recovery.

A corollary of these findings is that psychoactive drugs may in some cases be used to treat inflammatory diseases, and drugs that affect the immune system may be useful in treating some psychiatric disorders. There is growing evidence that our view of ourselves and others, our style of handling stresses, as well as our genetic makeup, can affect activities of the immune system. Similarly, there is good evidence that diseases associated with chronic inflammation significantly affect one’s mood or level of anxiety. Finally, these findings suggest that the classification of illnesses into medical and psychiatric specialties, and the boundaries that have demarcated mind and body, are artificial.



* * *

ESTHER M. STERNBERG and PHILIP W. GOLD carry out their research on stress and immune systems at the National Institute of Mental Health, where Sternberg is chief of the section on neuroendocrinology and behavior and Gold is chief of the clinical neuroendocrinology branch. Sternberg received her M.D. from McGill University.
Her work on the mechanisms and molecular basis of neuroimmune communication has led to a growing recognition of the importance of the mind-body interaction. She also is an authority on the L-tryptophan eosinophilia-myalgia syndrome, which reached almost epidemic proportions in 1989. Prior to joining the NIMH in 1974, Gold received his medical training at Duke University and Harvard University. Gold and his group were among the first to introduce data implicating corticotropin-releasing hormone and its related hormones in the pathophysiology of melancholic and atypical depression and in the mechanisms of action of antidepressant drugs.

The Mind-Body Interaction in Disease : 6. Stress and Illness

By : Esther M. Sternberg and Philip W. Gold

In the past, the association between an inflammatory disease and stress was considered by doctors to be secondary to the chronic pain and debilitation of the disease. The recent discovery of the common underpinning of the immune and stress responses may provide an explanation of why a patient can be susceptible to both inflammatory disease and depression. The hormonal dysregulation that underlies both inflammatory disease and depression can lead to either illness, depending on whether the perturbing stimulus is pro-inflammatory or psychologically stressful. That may explain why the waxing and waning of depression in arthritic patients does not always coincide with inflammatory flare-ups.

The popular belief that stress exacerbates inflammatory illness and that relaxation or removal of stress ameliorates it may indeed have a basis in fact. The interactions of the stress and immune systems and the hormonal responses they have in common could explain how conscious attempts to tone down responsivity to stress could affect immune responses.

IMMUNE SIGNALS TO THE BRAIN via the bloodstream can occur directly or indirectly. Immune cells such as monocytes, a type of white blood cell, produce a chemical messenger called interleukin-1 (IL-1), which ordinarily will not pass through the blood-brain barrier. But certain cerebral blood vessels contain leaky junctions, which allow IL-1 molecules to pass into the brain. There they can activate the HPA axis and other neural systems. IL-1 also binds to receptors on the endothelial cells that line cerebral blood vessels. This binding can cause enzymes in the cells to produce nitric oxide or prostaglandins, which diffuse into the brain and act directly on neurons.

How much of the responsivity to stress is genetically determined and how much can be consciously controlled is not known. The set point of the stress response is to some extent genetically determined. An event that is physiologically highly stressful to one individual may be much less so to another, depending on each person’s genetic tendency to hormonal reactivity. The degree to which stress could precipitate or exacerbate inflammatory disease would then depend both on the intensity of the stressful stimulus and on the set point of the stress system.
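
One way to picture this two-factor idea is with a toy threshold model. The sketch below is purely illustrative (the functional form, the numbers and the threshold are invented for this example, not taken from the article), but it captures the claim that whether an event registers as highly stressful depends on both the stimulus and the individual set point.

```python
# Toy threshold model of the two-factor idea above. The functional
# form and all numbers are invented for illustration; this is not a
# model from the article.

def stress_response(stimulus: float, set_point: float) -> float:
    """Evoked hormonal response: the same stimulus evokes a larger
    response in an individual with a more reactive (higher) set point."""
    return stimulus * set_point

def is_highly_stressful(stimulus: float, set_point: float,
                        threshold: float = 1.0) -> bool:
    """An event counts as 'highly stressful' once the evoked
    response crosses some threshold."""
    return stress_response(stimulus, set_point) > threshold

# The same event can be highly stressful to one person and not another:
for set_point in (0.5, 1.0, 2.0):  # low, average, high reactivity
    print(f"set point {set_point}:",
          is_highly_stressful(stimulus=0.8, set_point=set_point))
```

Only the hyper-reactive individual crosses the threshold for this stimulus, which is the sense in which both the intensity of the stressor and the set point of the stress system matter.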

Psychological stress can affect an individual’s susceptibility to infectious diseases. The regulation of the immune system by the neurohormonal stress system provides a biological basis for understanding how stress might affect these diseases. There is evidence that stress does affect human immune responses to viruses and bacteria. In studies with volunteers given a standard dose of the common cold virus (rhinovirus), individuals who are simultaneously exposed to stress show more viral particles and produce more mucus than do nonstressed individuals. Medical students receiving hepatitis vaccination during their final exams do not develop full protection against hepatitis. These findings have important implications for the scheduling of vaccinations. People who are vaccinated during periods of stress might be less likely to develop full antibody protection.

Animal studies provide further evidence that stress affects the course and severity of viral illness, bacterial disease and septic shock. Stress in mice worsens the severity of influenza infection and affects both the HPA axis and the sympathetic nervous system. Animal studies suggest that neuroendocrine mechanisms could play a similar role in infections with other viruses, including HIV, and provide a mechanism for understanding clinical observations that stress may exacerbate the course of AIDS. Stress increases the susceptibility of mice to infection with mycobacteria, the bacteria that cause tuberculosis. It has been shown that an intact HPA axis protects rats against the lethal septic effects of salmonella bacteria. Finally, new understanding of interactions of the immune and stress responses can help explain the puzzling observation that classic psychological conditioning of animals can influence their immune responses. For example, working with rats, Robert Ader and Nicholas Cohen of the University of Rochester paired saccharin-flavored water with an immunosuppressive drug. Eventually the saccharin alone produced a decrease in immune function similar to that of the drug.

Stress is not only personal but is perceived through the prism of interactions with other persons. Social interactions can either add to or lessen psychological stress and similarly affect our hormonal responses to it, which in turn can alter immune responses. Thus, the social psychological stresses that we experience can affect our susceptibility to inflammatory and infectious diseases and the course of a disease. For instance, studies have shown that persons exposed to chronic social stresses for more than two months have increased susceptibility to the common cold.

Other studies have shown that the immune responses of long-term caregivers, such as spouses of Alzheimer’s patients, become blunted. Immune responses in unhappily married and divorcing couples are also blunted. Often the wife has a feeling of helplessness and experiences the greatest amount of stress. In such a scenario, studies have found that the levels of stress hormones are elevated and immune responses usually are lowered in the wife but not in the husband.

On the other hand, a positive supportive environment of extensive social networks or group psychotherapy can enhance immune response and resistance to disease—even cancer. Women with breast cancer, for instance, who receive strong, positive social support during their illness have significantly longer life spans than women without such support.

Next : New Approaches to Treatment

The Mind-Body Interaction in Disease : 5. CRH and Depression

By : Esther M. Sternberg and Philip W. Gold

Although the role of the stress response in inflammatory disease in humans is more difficult to prove, there is a growing amount of evidence that a wide variety of such diseases are associated with impairment of the HPA axis and lower levels of CRH secretion, which ultimately results in a hyperactive immune system. Furthermore, patients with a mood disorder called atypical depression also have a blunted stress response and impaired CRH function, which leads to lethargy, fatigue, increased sleep and increased feeding that often produces weight gain.

Patients with other illnesses characterized by lethargy and fatigue, such as chronic fatigue syndrome, fibromyalgia and seasonal affective disorder (SAD), exhibit features of both depression and a hyperactive immune system. A person with chronic fatigue syndrome classically manifests debilitating lethargy or fatigue lasting six months or longer with no demonstrable medical cause, as well as feverishness, aches in joints and muscles, allergic symptoms and higher levels of antibodies to a variety of viral antigens (including Epstein-Barr virus).

Patients with fibromyalgia suffer from muscle aches, joint pains and sleep abnormalities, symptoms similar to early, mild rheumatoid arthritis. Both these illnesses are associated with a profound fatigue like that in atypical depression. SAD, which usually occurs in winter, is typified by lethargy, fatigue, increased food intake and increased sleep. Many of its symptoms are similar to those of atypical depression.

CRH, the Locus Ceruleus and the Sympathetic Nervous System
HYPOTHALAMIC CRH produces changes important to adaptation to stress and inflammation in ways other than inducing cortisol release from the adrenal glands. Pathways from CRH-secreting neurons in the hypothalamus extend to the locus ceruleus in the brain stem. Separate pathways from other hypothalamic neurons to the brain stem influence sympathetic nervous system activity, which modulates inflammatory responses as well as regulating metabolic and cardiovascular activities. Stimulation by CRH of the locus ceruleus produces protective behaviors such as arousal and fear (red indicates stimulation, blue inhibition). The locus ceruleus, in turn, provides feedback to the hypothalamus for continued production of CRH and also acts on the sympathetic nervous system. Self-inhibitory feedback keeps the activities of CRH and the locus ceruleus under control.

A deficiency of CRH could contribute to lethargy in patients with chronic fatigue syndrome. Injection of CRH into patients with chronic fatigue syndrome causes a delayed and blunted ACTH secretion by the HPA axis. That same response is also seen in patients whose hypothalamus has been injured or who have a tumor. Also, fatigue and hyperactivity of the immune response are associated with cortisol deficiency, which occurs when CRH secretion decreases. The hormone levels and responses in patients with fatigue syndromes suggest—but do not prove—that their HPA-axis functions are impaired, resulting in a decrease in CRH and cortisol secretion and an increase in immune system activity. Together these findings suggest that human illness characterized by fatigue and hyperimmunity could possibly be treated by drugs that mimic CRH actions in the brain.

In contrast, the classic form of depression, melancholia, actually is not a state of inactivation and suppression of thought and feeling; rather it presents as an organized state of anxiety. The anxiety of melancholia is chiefly about the self. Melancholic patients feel impoverished and defective and often express hopelessness about the prospects for their unworthy selves in either love or work. The anxious hyperarousal of melancholic patients also manifests as a pervasive sense of vulnerability, and melancholic patients often interpret relatively neutral cues as harbingers of abandonment or embarrassment.

Melancholic patients also show behavioral alterations suggestive of physiological hyperarousal. They characteristically suffer from insomnia (usually early-morning awakening) and experience inhibition of eating, sexual activity and menstruation. One of the most widely found biological abnormalities in patients with melancholia is sustained hypersecretion of cortisol.

Many studies have been conducted on patients with major depression to determine whether the excessive level of cortisol associated with depression correlates with suppressed immune responses. Some have found a correlation between hypercortisolism and immunosuppression; others have not. Because depression can have a variety of mental and biochemical causes, only some depressed patients may be immunosuppressed.

The excessive secretion of cortisol in melancholic patients is the result predominantly of hypersecretion of CRH, caused by a defect in or above the hypothalamus. Thus, the clinical and biochemical manifestations of melancholia reflect a generalized stress response that has escaped the usual counterregulation, remaining, as it were, stuck in the “on” position.

The effects of tricyclic antidepressant drugs on components of the stress response support the concept that melancholia is associated with a chronic stress response. In rats, regular, but not acute, administration of the tricyclic antidepressant imipramine significantly lowers the levels of CRH precursors in the hypothalamus. Imipramine given for two months to healthy persons with normal cortisol levels causes a gradual and sustained decrease in CRH secretion and other HPA-axis functions, indicating that down-regulation of important components of the stress response is an intrinsic effect of imipramine.

Depression is also associated with inflammatory disease. About 20 percent of patients with rheumatoid arthritis develop clinical depression at some point during the course of their arthritic disease. A questionnaire commonly used by clinicians to diagnose depression contains about a dozen questions that are almost always answered affirmatively by patients with arthritis.

Next : Stress and Illness

The Mind-Body Interaction in Disease : 4. The Immune System’s Signals

By : Esther M. Sternberg and Philip W. Gold

The immune response is an elegant and finely tuned cascade of cellular events aimed at ridding the body of foreign substances, bacteria and viruses.

One of the major discoveries of contemporary immunology is that white blood cells produce small proteins that indirectly coordinate the responses of other parts of the immune system to pathogens. For example, the protein interleukin-1 (IL-1) is made by a type of white blood cell called a monocyte or macrophage. IL-1 stimulates another type of white blood cell, the lymphocyte, to produce interleukin-2 (IL-2), which in turn induces lymphocytes to develop into mature immune cells. Some mature lymphocytes, called plasma cells, make antibodies that fight infection, whereas others, the cytotoxic lymphocytes, kill viruses directly. Other interleukins mediate the activation of immune cells that are involved in allergic reactions.

The interleukins were originally named to reflect what was considered to be their primary function: communication between (“inter-”) the white blood cells (“leukins”). But it is now known that interleukins also act as chemical signals between immune cells and many other types of cells and organs, including parts of the brain, and so a new name—“cytokine”—has been coined. Cytokines are biological molecules that cells use to communicate. Each cytokine is a distinct protein molecule, encoded by a separate gene, that targets a particular cell type. A cytokine can either stimulate or inhibit a response depending on the presence of other cytokines or other stimuli and the current state of metabolic activity. This flexibility allows the immune system to take the most appropriate actions to stabilize the local cellular environment and to maintain homeostasis.

Cytokines from the body’s immune system can send signals to the brain in several ways. Ordinarily, a “blood-brain barrier” shields the central nervous system from potentially dangerous molecules in the bloodstream. During inflammation or illness, however, this barrier becomes more permeable, and cytokines may be carried across into the brain with nutrients from the blood. Certain cytokines, on the other hand, readily pass through at any time. But cytokines do not have to cross the blood-brain barrier to exert their effects. Cytokines made in the lining of blood vessels in the brain can stimulate the release of secondary chemical signals in the brain tissue around the blood vessels.

Cytokines can also signal the brain via direct nerve routes, such as the vagus nerve, which innervates the heart, stomach, small intestine and other organs of the abdominal cavity. Injection of IL-1 into the abdominal cavity activates the nucleus of the tractus solitarius, the principal region of the brain stem for receipt of visceral sensory signals. Cutting the vagus nerve blocks activation of the tractus nucleus by IL-1. Sending signals along nerve routes is the most rapid mechanism—on the order of milliseconds—by which cytokines signal the brain.

Activation of the brain by cytokines from the peripheral parts of the body induces behaviors of the stress response, such as anxiety and cautious avoidance, that keep the affected individual out of harm’s way until full healing occurs. Anyone who has experienced lethargy and excess sleepiness during an illness will recognize this set of characteristic responses as “sickness behavior.”

Hypothalamus-Pituitary-Adrenal (HPA) Axis
THE HPA AXIS is a central component of the brain’s neuroendocrine response to stress. The hypothalamus, when stimulated, secretes corticotropin-releasing hormone (CRH) into the hypophyseal portal system, which supplies blood to the anterior pituitary. CRH stimulates the pituitary (red arrows show stimulatory pathways) to secrete adrenocorticotropin hormone (ACTH) into the bloodstream. ACTH causes the adrenal glands to release cortisol, the classic stress hormone that arouses the body to meet a challenging situation. But cortisol then modulates the stress response (blue arrows indicate inhibitory effects) by acting on the hypothalamus to inhibit the continued release of CRH. Also a potent immunoregulator, cortisol acts on many parts of the immune system to prevent it from overreacting and harming healthy cells and tissue.
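
The wiring in this caption (CRH stimulates ACTH, ACTH stimulates cortisol, and cortisol inhibits further CRH release) is a classic negative feedback loop. The following minimal simulation sketch shows that qualitative behavior; the equations and rate constants are invented for illustration and are not physiological values from the article.

```python
# Minimal sketch of the HPA axis as a negative feedback loop.
# All rate constants are invented for illustration only.

def simulate_hpa(stress, steps=500, dt=0.1, feedback=True):
    crh = acth = cortisol = 0.0
    for _ in range(steps):
        # CRH release is driven by stress; cortisol feedback damps it.
        drive = stress / (1.0 + cortisol) if feedback else stress
        d_crh = drive - 0.5 * crh
        d_acth = crh - 0.5 * acth            # pituitary responds to CRH
        d_cortisol = acth - 0.3 * cortisol   # adrenals respond to ACTH
        crh += dt * d_crh
        acth += dt * d_acth
        cortisol += dt * d_cortisol
    return crh, acth, cortisol

# With the inhibitory arm intact, cortisol settles at a modest level.
print("with feedback:   ", simulate_hpa(stress=1.0))
# Without it, the loop runs at a much higher level.
print("without feedback:", simulate_hpa(stress=1.0, feedback=False))
```

The point is only qualitative: with the inhibitory arm intact the hormone levels settle to a bounded steady state, while removing it lets the loop run unchecked, loosely analogous to the stress response described later as stuck in the “on” position.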

Neurons and nonneuronal brain cells also produce cytokines of their own. Cytokines in the brain regulate nerve cell growth and death, and they also can be recruited by the immune system to stimulate the release of CRH. The IL-1 cytokine system in the brain is currently the best understood—all its components have been identified, including receptors and a naturally occurring antagonist that binds to IL-1 receptors without activating them. The anatomical and cellular locations of this IL-1 circuitry are being mapped out in detail, and this new knowledge will aid researchers in designing drugs that block or enhance the actions of such circuits and the functions they regulate.

Excessive amounts of cytokines in the brain can be toxic to nerves. In genetically engineered mice, transplanted genes that overexpress cytokines produce neurotoxic effects. Some of the neurological symptoms of AIDS in humans also may be caused by overexpression of certain cytokines in the brain. High levels of IL-1 and other cytokines have been found in the brain tissue of patients living with AIDS, concentrated in areas around the giant macrophages that invade the patients’ brain tissue.

Any disruption of communication between the brain and the immune system leads to greater susceptibility to inflammatory disease and, frequently, to increased severity of the immune complications. For instance, animals whose brain-immune communications have been disrupted (through surgery or drugs) are highly liable to lethal complications of inflammatory diseases and infectious diseases.


INTERACTION OF THE BRAIN AND THE IMMUNE SYSTEM: The brain and the immune system can either stimulate (red arrows) or inhibit (blue arrows) each other. Immune cells produce cytokines (chemical signals) that stimulate the hypothalamus through the bloodstream or via nerves elsewhere in the body. The hormone CRH, produced in the hypothalamus, activates the HPA axis. The release of cortisol tunes down the immune system. CRH, acting on the brain stem, stimulates the sympathetic nervous system, which innervates immune organs and regulates inflammatory responses throughout the body. Disruption of these communications in any way leads to greater susceptibility to disease and immune complications.

Susceptibility to inflammatory disease associated with a genetically impaired stress response can be found across species—in rats, mice, chickens and, though the evidence is less direct, humans. For instance, the Lewis strain of rat is naturally prone to many inflammatory diseases because of a severe impairment of its HPA axis, which greatly diminishes CRH secretion in response to stress. In contrast, the hyperresponsive HPA axis of the Fischer strain of rat provides it with a strong resistance to inflammatory disease.

Evidence of a causal link between an impaired stress response and susceptibility to inflammatory disease comes from pharmacological and surgical studies. Pharmacological intervention such as treatment with a drug that blocks cortisol receptors enhances autoimmune inflammatory disease. Injecting low doses of cortisol into disease-susceptible rats enhances their resistance to inflammation. Strong evidence comes from surgical intervention. Removal of the pituitary gland or the adrenal glands from rats that normally are resistant to inflammatory disease renders them highly susceptible. Further proof comes from studies in which the transplantation of hypothalamic tissue from disease-resistant rats into the brain of susceptible rats dramatically improves their resistance to peripheral inflammation.

These animal studies demonstrate that disruption of the brain’s stress response heightens the body’s inflammatory response to disease, and that reconstitution of the stress response reduces susceptibility to inflammation. One implication of these findings is that disruption of the brain-immune communication system by inflammatory, toxic or infectious agents could contribute to some of the variations in the course of the immune system’s inflammatory response.

Next : CRH and Depression
Sunday, October 9, 2011

The Mind-Body Interaction in Disease : 3. Cross Communication

By : Esther M. Sternberg and Philip W. Gold

Both systems also rely on chemical mediators for communication. Electrical signals along nerve pathways, for instance, are converted to chemical signals at the synapses between neurons. The chemical messengers produced by immune cells communicate not only with other parts of the immune system but also with the brain and nerves, and chemicals released by nerve cells can act as signals to immune cells. Hormones from the body travel to the brain in the bloodstream, and the brain itself makes hormones. Indeed, the brain is perhaps the most prolific endocrine organ in the body and produces many hormones that act both on the brain and on tissues throughout the body.

A key hormone shared by the central nervous and immune systems is corticotropin-releasing hormone (CRH); produced in the hypothalamus and several other brain regions, it unites the stress and immune responses. The hypothalamus releases CRH into a specialized bloodstream circuit that conveys the hormone to the pituitary gland, which is just beneath the brain. CRH causes the pituitary to release adrenocorticotropin hormone (ACTH) into the bloodstream, which in turn stimulates the adrenal glands to produce cortisol, the best-known hormone of the stress response.

Cortisol is a steroid hormone that increases the rate and strength of heart contractions, sensitizes blood vessels to the actions of norepinephrine (an adrenaline-like hormone) and affects many metabolic functions—actions that help to prepare the body to meet a stressful situation. In addition, cortisol is a potent immunoregulator and anti-inflammatory agent. It plays a crucial role in preventing the immune system from overreacting to injuries and damaging tissues. Furthermore, cortisol inhibits the release of CRH by the hypothalamus—a simple feedback loop that keeps this component of the stress response under control. Thus, CRH and cortisol directly link the body’s brain-regulated stress response and its immune response.

CRH-secreting neurons of the hypothalamus send fibers to regions in the brain stem that help to regulate the sympathetic nervous system, as well as to another brain stem area called the locus ceruleus. The sympathetic nervous system, which mobilizes the body during stress, also innervates immune organs, such as the thymus, lymph nodes and spleen, and helps to control inflammatory responses throughout the body. Stimulation of the locus ceruleus leads to behavioral arousal, fear and enhanced vigilance.

Perhaps even more important for the induction of fear-related behaviors is the amygdala, where inputs from the sensory regions of the brain are charged as stressful or not. CRH-secreting neurons in the central nucleus of the amygdala send fibers to the hypothalamus and the locus ceruleus, as well as to other parts of the brain stem. These CRH-secreting neurons are targets of messengers released by immune cells during an immune response. By recruiting the CRH-secreting neurons, the immune signals not only activate cortisol-mediated restraint of the immune response but also induce behaviors that assist in recovery from illness or injury. CRH-secreting neurons also have connections with hypothalamic regions that regulate food intake and reproductive behavior. In addition, there are other hormonal and nerve systems, such as the thyroid, growth and female sex hormones, and the sympathomedullary pathways, that influence brain–immune system interactions.

Next : The Immune System’s Signals

The Mind-Body Interaction in Disease : 2. Anatomy of the Stress and Immune Systems

by : Esther M. Sternberg and Philip W. Gold


When homeostasis is disturbed or threatened, a repertoire of molecular, cellular and behavioral responses comes into play. These responses attempt to counteract the disturbing forces in order to reestablish a steady state. They can be specific to the foreign invader or a particular stress, or they can be generalized and nonspecific when the threat to homeostasis exceeds a certain threshold. The adaptive responses may themselves turn into stressors capable of producing disease. We are just beginning to understand the many ways in which the brain and the immune system are interdependent, how they help to regulate and counterregulate each other and how they themselves can malfunction and produce disease.

The stress response promotes physiological and behavioral changes that enhance survival in threatening or taxing situations. For instance, when we are facing a potentially life-threatening situation, the brain’s stress response goes into action to enhance our focused attention, our fear and our fight-or-flight response, while simultaneously inhibiting behaviors, such as feeding, sex and sleep, that might lessen the chance of immediate survival. The stress response, however, must be regulated to be neither excessive nor suboptimal; otherwise, disorders of arousal, thought and feeling emerge.

The immune system’s job is to bar foreign pathogens from the body and to recognize and destroy those that penetrate its shield. The immune system also must neutralize potentially dangerous toxins, facilitate repair of damaged or worn tissues, and dispose of abnormal cells. Its responses are so powerful that they require constant regulation to ensure that they are neither excessive nor indiscriminate and yet remain effective. When the immune system escapes regulation, autoimmune and inflammatory diseases or immune deficiency syndromes result.

The immune and central nervous systems appear, at first glance, to be organized in very different ways. The brain is usually regarded as a centralized command center, sending and receiving electrical signals along fixed pathways, much like a telephone network. In contrast, the immune system is decentralized, and its organs (spleen, lymph nodes, thymus and bone marrow) are located throughout the body. The classical view is that the immune system communicates by releasing immune cells into the bloodstream that float, like boats, to new locations to deliver their messages or to perform other functions. The central nervous and immune systems, however, are in fact more similar than different in their modes of receiving, recognizing and integrating signals from the external environment and in their structural design for accomplishing these tasks. Both the central nervous system and the immune system possess “sensory” elements, which receive information from the environment and other parts of the body, and “motor” elements, which carry out an appropriate response.

Next : Cross Communication

The Mind-Body Interaction in Disease : 1. Preview

By : Esther M. Sternberg and Philip W. Gold

The belief that the mind plays an important role in physical illness goes back to the earliest days of medicine. From the time of the ancient Greeks to the beginning of the 20th century, it was generally accepted by both physician and patient that the mind can affect the course of illness, and it seemed natural to apply this concept in medical treatments of disease. After the discovery of antibiotics, a new assumption arose that treatment of infectious or inflammatory disease requires only the elimination of the foreign organism or agent that triggers the illness. In the rush to discover new antibiotics and drugs that cure specific infections and diseases, the fact that the body’s own responses can influence susceptibility to disease and its course was largely ignored by medical researchers.

It is ironic that research into infectious and inflammatory disease first led 20th-century medicine to reject the idea that the mind influences physical illness, and now research in the same field—including the work of our laboratory and of our collaborators at the National Institutes of Health—is proving the contrary. New molecular and pharmacological tools have made it possible for us to identify the intricate network that exists between the immune system and the brain, a network that allows the two systems to signal each other continuously and rapidly. Chemicals produced by immune cells signal the brain, and the brain in turn sends chemical signals to restrain the immune system. These same chemical signals also affect behavior and the response to stress. Disruption of this communication network in any way, whether inherited or through drugs, toxic substances or surgery, exacerbates the diseases that these systems guard against: infectious, inflammatory, autoimmune and associated mood disorders.

The clinical significance of these findings is likely to prove profound. They hold the promise of extending the range of therapeutic treatments available for various disorders, as drugs previously known to work primarily for nervous system problems are shown to be effective against immune maladies, and vice versa. They also help to substantiate the popularly held impression (still discounted in some medical circles) that our state of mind can influence how well we resist or recover from infectious or inflammatory diseases.

Immune response can be altered at the cellular level by stress hormones.

The brain’s stress response system is activated in threatening situations. The immune system responds automatically to pathogens and foreign molecules. These two response systems are the body’s principal means for maintaining an internal steady state called homeostasis. A substantial proportion of human cellular machinery is dedicated to maintaining it.

Next : Anatomy of the Stress and Immune Systems

The Persistent Mystery of Our Selves

By : John Rennie

Master detective Hercule Poirot, the hero of many an Agatha Christie novel, boasted repeatedly about the power of “the little gray cells” in his head to solve the toughest mysteries. For philosophers, writers and other thinkers, however, those little gray cells have been the greatest mystery of all. How do a couple of pounds of spongy, electrically active tissue give rise to a psychological essence?

How do we emerge from the neural thicket? Empirical scientists may be relative newcomers to this investigation (unlike the philosophers, they’ve been on the case for only a few hundred years), but they have taken long strides forward in that short time. In this special issue of Scientific American, some of the leading figures in neuroscience and in psychology discuss how much is now known about the nature of consciousness, memory, emotions, creativity, dreams and other mental phenomena. Their answers suggest that some of these mysteries may be largely solved within our lifetimes—even if new ones are posed in the process.

But treat these articles as you would any good detective story: don’t turn right to the end for the answers. Half the fun is in tracing the deductions.

To be conscious that we are perceiving or thinking is to be conscious of our own existence. 
—Aristotle

Memory is the cabinet of imagination, the treasury of reason, the registry of conscience, and the council chamber of thought. 
—St. Basil

The whole machinery of our intelligence, our general ideas and laws, fixed and external objects, principles, persons, and gods, are so many symbolic, algebraic expressions. 
—George Santayana

I have a prodigious quantity of mind; it takes me as much as a week sometimes to make it up.
—Mark Twain

The Evolution of Human Birth : 6. Growing Bigger Brains

By : Karen R. Rosenberg and Wenda R. Trevathan

IF BIPEDALISM ALONE did not introduce into the process of childbirth enough difficulty for mothers to benefit from assistance, then the expanding size of the hominid brain certainly did. The most significant expansion in adult and infant brain size evolved subsequent to the australopithecines, particularly in the genus Homo. Fossil remains of the pelvis of early Homo are quite rare, and the best-preserved specimen, the 1.6-million-year-old Nariokotome fossil from Kenya, is an adolescent often referred to as Turkana Boy. Researchers have estimated that the boy’s adult relatives probably had brains about twice as large as those of australopithecines but still only two thirds the size of modern human brains.
BABY BORN FACING FORWARD makes it possible for a monkey mother to reach down and carefully guide the infant out of the birth canal. She can also wipe mucus from the baby’s face to assist its breathing.


By reconstructing the shape of the boy’s pelvis from fragments, Christopher B. Ruff of Johns Hopkins University and Alan Walker of Pennsylvania State University have estimated what he would have looked like had he reached adulthood. Using predictable differences between male and female pelvises in more recent hominid species, they could also infer what a female of that species would have looked like and could estimate the shape of the birth canal. That shape turns out to be a flattened oval similar to that of the australopithecines. Based on these reconstructions, the researchers determined that Turkana Boy’s kin probably had a birth mechanism like that seen in australopithecines.

In recent years, scientists have been testing an important hypothesis that follows from Ruff and Walker’s assertion: the pelvic anatomy of early Homo may have limited the growth of the human brain until the evolutionary point at which the birth canal expanded enough to allow a larger infant head to pass. This assertion implies that bigger brains and roomier pelvises were linked from an evolutionary perspective. Individuals who displayed both characteristics were more successful at giving birth to offspring who survived to pass on the traits. These changes in pelvic anatomy, accompanied by assisted birth, may have allowed the dramatic increase in human brain size that took place from two million to 100,000 years ago.

Fossils that span the past 300,000 years of human evolution support the connection between the expansion of brain size and changes in pelvic anatomy. In the past 20 years, scientists have uncovered three pelvic fossils of archaic Homo sapiens: a male from Sima de los Huesos in Sierra Atapuerca, Spain (more than 200,000 years old); a female from Jinniushan, China (280,000 years old); and the male Kebara Neandertal—which is also an archaic H. sapiens—from Israel (about 60,000 years old). These specimens all have the twisted pelvic openings characteristic of modern humans, which suggests that their large-brained babies would most likely have had to rotate the head and shoulders within the birth canal and would thus have emerged facing away from the mother—a major challenge that human mothers face in delivering their babies safely.

The triple challenge of big-brained infants, a pelvis designed for walking upright, and a rotational delivery in which the baby emerges facing backward is not merely a contemporary circumstance. For this reason, we suggest that natural selection long ago favored the behavior of seeking assistance during birth because such help compensated for these difficulties. Mothers probably did not seek assistance solely because they predicted the risk that childbirth poses, however. Pain, fear and anxiety more likely drove their desire for companionship and security.

Psychiatrists have argued that natural selection might have favored such emotions—also common during illness and injury—because they led individuals who experienced them to seek the protection of companions, which would have given them a better chance of surviving [see “Evolution and the Origins of Disease,” by Randolph M. Nesse and George C. Williams; Scientific American, November 1998]. The offspring of the survivors would then also have an enhanced tendency to experience such emotions during times of pain or disease. Taking into consideration the evolutionary advantage that fear and anxiety impart, it is no surprise that women commonly experience these emotions during labor and delivery.

Modern women giving birth have a dual evolutionary legacy: the need for physical as well as emotional support. When Sophia Pedro gave birth in a tree surrounded by raging floodwaters, she may have had both kinds of assistance. In an interview several months after her helicopter rescue, she told reporters that her mother-in-law, who was also in the tree, helped her during delivery. Desire for this kind of support, it appears, may well be as ancient as humanity itself.

* * *

KAREN R. ROSENBERG and WENDA R. TREVATHAN bring different perspectives to the study of human birth. Rosenberg, a paleoanthropologist at the University of Delaware, specializes in pelvic morphology and has studied hominid fossils from Europe, Israel, China and South Africa. About 15 years ago she began studying the pelvis as a way to reconstruct the evolution of the birth process. That’s when she met Trevathan, a biological anthropologist at New Mexico State University, whose particular interests include childbirth, maternal behavior, sexuality, menopause and evolutionary medicine. Both authors have experienced birth firsthand: Rosenberg has two daughters, and Trevathan is trained as a midwife.

The Evolution of Human Birth : 5. Childbirth across Cultures

By : Karen R. Rosenberg and Wenda R. Trevathan

SQUATTING is one of the most typical positions for women to give birth in non-Western cultures.
 
THE COMPLICATED CONFIGURATION of the human birth canal is such that laboring women and their babies benefit—by lower rates of mortality, injury and anxiety—from the assistance of others. This evolutionary reality helps to explain why attended birth is a near-universal feature of human cultures. Individual women throughout history have given birth alone in certain circumstances, of course. But much more common is the attendance of familiar friends and relatives, most of whom are women. (Men may be variously forbidden, tolerated, welcomed or even required at birth.) In Western societies, where women usually give birth in the presence of strangers, recent research on birth practices has also shown that a doula—a person who provides social and emotional support to a woman in labor—reduces the rate of complications.

In many societies, a woman may not be recognized as an adult until she has had a baby. The preferred location of the delivery is often specified, as are the positions that the laboring women assume. The typical expectation in Western culture is that women should give birth lying flat on their backs on a bed, but in the rest of the world the most prevalent position for the delivery is upright—sitting, squatting or, in some cases, standing.

Next : Growing Bigger Brains

The Evolution of Human Birth : 4. Walking on Two Legs

By : Karen R. Rosenberg and Wenda R. Trevathan

IN MODERN HUMANS, both bipedalism and enlarged brains constrain birth in important ways, but the first fundamental shift away from a nonhuman primate way of birth came about because of bipedalism alone. This unique way of walking appeared in early human ancestors of the genus Australopithecus at least four million years ago [see “Evolution of Human Walking,” by C. Owen Lovejoy; Scientific American, November 1988]. Despite their upright posture, australopithecines typically stood no more than four feet tall, and their brains were not much bigger than those of living chimpanzees. Recent evidence has called into question which of the several australopithecine species were part of the lineage that led to Homo. Understanding the way any of them gave birth is still important, however, because walking on two legs would have constricted the maximum size of the pelvis and birth canal in similar ways among related species.

The anatomy of the female pelvis from this time period is well known from two complete fossils. Anthropologists unearthed the first (known as Sts 14 and presumed to be 2.5 million years old) in Sterkfontein, a site in the Transvaal region of South Africa. The second is best known as Lucy, a fossil discovered in the Hadar region of Ethiopia and dated at just over three million years old. Based on these specimens and on estimates of newborns’ head size, C. Owen Lovejoy of Kent State University and Robert G. Tague of Louisiana State University concluded in the mid-1980s that birth in early hominids was unlike that known for any living species of primate.

The shape of the australopithecine birth canal is a flattened oval with the greatest dimension from side to side at both the entrance and exit. This shape appears to require a birth pattern different from that of monkeys, apes or modern humans. The head would not have rotated within the birth canal, but we think that in order for the shoulders to fit through, the baby might have had to turn its head once it emerged. In other words, if the baby’s head entered the birth canal facing the side of the mother’s body, its shoulders would have been oriented in a line from the mother’s belly to her back. This starting position would have meant that the shoulders probably also had to turn sideways to squeeze through the birth canal.

This simple rotation could have introduced a kind of difficulty in australopithecine deliveries that no other known primate species had ever experienced. Depending on which way the baby’s shoulders turned, its head could have exited the birth canal facing either forward or backward relative to the mother. Because the australopithecine birth canal is a symmetrical opening of unchanging shape, the baby could have just as easily turned its shoulders toward the front or back of its body, giving it about a 50–50 chance of emerging in the easier, face-forward position. If the infant were born facing backward, the australopithecine mother—like modern human mothers—may well have benefited from some kind of assistance.

Next : Childbirth across Cultures

The Evolution of Human Birth : 3. Assisted Birth

By : Karen R. Rosenberg and Wenda R. Trevathan

OF COURSE, OUR ANCESTORS and even women today can and do give birth alone successfully. Many fictional accounts portray stalwart peasant women giving birth alone in the fields, perhaps most famously in the novel The Good Earth, by Pearl S. Buck. Such images give the impression that delivering babies is easy. But anthropologists who have studied childbirth in cultures around the world report that these perceptions are highly romanticized and that human birth is seldom easy and rarely unattended. Today virtually all women in all societies seek assistance at delivery. Even among the !Kung of southern Africa’s Kalahari Desert—who are well known for viewing solitary birth as a cultural ideal—women do not usually manage to give birth alone until they have first delivered several babies at births attended by their mothers, sisters or other women. So, though rare exceptions do exist, assisted birth comes close to being a universal custom in human cultures.


Knowing this—and believing that this practice is driven by the difficulty and risk that accompany human birth—we began to think that midwifery is not unique to contemporary humans but instead has its roots deep in our ancestry. Our analysis of the birth process throughout human evolution has led us to suggest that the practice of midwifery might have appeared as early as five million years ago, when bipedalism constricted the size and shape of the pelvis and birth canal.

A behavior pattern as complex as midwifery obviously does not fossilize, but pelvic bones do. The tight fit between the infant’s head and the mother’s birth canal in humans means that the mechanism of birth can be reconstructed if we know the relative sizes of each. Pelvic anatomy is now fairly well known from most time periods in the human fossil record, and we can estimate infant brain and skull size based on our extensive knowledge of adult skull sizes. (The delicate skulls of infants are not commonly found preserved until the point when humans began to bury their dead about 100,000 years ago.) Knowing the size and shape of the skulls and pelvises has also helped us and other researchers to understand whether infants were born facing forward or backward relative to their mothers—in turn revealing how challenging the birth might have been.

Next : Walking on Two Legs

The Evolution of Human Birth : 2. Tight Squeeze

By : Karen R. Rosenberg and Wenda R. Trevathan

TO TEST OUR THEORY that the practice of assisted birth may have been around for millennia, we considered first what scientists know about the way a primate baby fits through the mother’s birth canal. Viewed from above, the infant’s head is basically an oval, longest from the forehead to the back of the head and narrowest from ear to ear. Conveniently, the birth canal—the bony opening in the pelvis through which the baby must travel to get from the uterus to the outside world—is also an oval shape. The challenge of birth for many primates is that the size of the infant’s head is close to the size of that opening.

For humans, this tight squeeze is complicated by the birth canal’s not being a constant shape in cross section. The entrance of the birth canal, where the baby begins its journey, is widest from side to side relative to the mother’s body. Midway through, however, this orientation shifts 90 degrees, and the long axis of the oval extends from the front of the mother’s body to her back. This means that the human infant must negotiate a series of turns as it works its way through the birth canal so that the two parts of its body with the largest dimensions—the head and the shoulders—are always aligned with the largest dimension of the birth canal [see illustration below].

BABY BORN FACING BACKWARD, with the back of its head against the mother’s pubic bones, makes it difficult for a human female to guide the infant from the birth canal—the opening in the mother’s pelvis—without assistance.

To understand the birth process from the mother’s point of view, imagine you are about to give birth. The baby is most likely upside down, facing your side, when its head enters the birth canal. Midway through the canal, however, it must turn to face your back, and the back of its head is pressed against your pubic bones. At that time, its shoulders are oriented side to side. When the baby exits your body, it is still facing backward, but it will turn its head slightly to the side. This rotation helps to turn the baby’s shoulders so that they can also fit between your pubic bones and tailbone. To appreciate the close correspondence of the maternal and fetal dimensions, consider that the average pelvic opening in human females is 13 centimeters at its largest diameter and 10 centimeters at its smallest. The average infant head is 10 centimeters from front to back, and the shoulders are 12 centimeters across. This journey through a passageway of changing cross-sectional shape makes human birth difficult and risky for the vast majority of mothers and babies.
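
The arithmetic behind that tight fit is worth making explicit. The toy check below uses only the rounded averages quoted above, comparing each fetal dimension with each axis of the canal; it is an illustration of the geometry, not an obstetric model.

```python
# Clearance check using the rounded averages quoted above:
# pelvic opening ~13 cm on its long axis, ~10 cm on its short axis;
# infant head ~10 cm front to back; shoulders ~12 cm across.
CANAL_LONG, CANAL_SHORT = 13.0, 10.0   # cm
PARTS = {"head (front-to-back)": 10.0, "shoulders (side-to-side)": 12.0}

for name, size in PARTS.items():
    print(f"{name}: margin along long axis {CANAL_LONG - size:+.0f} cm, "
          f"along short axis {CANAL_SHORT - size:+.0f} cm")
```

The shoulders have negative clearance on the canal’s short axis, and the head only just grazes it, so both parts must track the long axis. Because the long axis itself rotates 90 degrees between entrance and exit, the baby must turn as it descends, which is the series of turns described above.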


If we retreat far enough back along the family tree of human ancestors, we would eventually reach a point where birth was not so difficult. Although humans are more closely related to apes genetically, monkeys may present a better model for birth in prehuman primates. One line of reasoning to support this assertion is as follows: Of the primate fossils discovered from the time before the first known hominids, one possible remote ancestor is Proconsul, a primate fossil dated to about 25 million years ago. This tailless creature probably looked like an ape, but its skeleton suggests that it moved more like a monkey. Its pelvis, too, was more monkeylike. The heads of modern monkey infants are typically about 98 percent the diameter of the mother’s birth canal—a situation more comparable with that of humans than that of chimps, whose birth canals are relatively spacious. Despite the monkey infant’s tight squeeze, its entrance into the world is less challenging than that of a human baby. In contrast to the twisted birth canal of modern humans, monkeys’ birth canals maintain the same cross-sectional shape from entrance to exit. The longest diameter of this oval shape is oriented front to back, and the broadest part of the oval is against the mother’s back. A monkey infant enters the birth canal headfirst, with the broad back of its skull against the roomy back of the mother’s pelvis and tailbone. That means the baby monkey emerges from the birth canal face forward—in other words, facing the same direction as the mother.

Firsthand observations of monkey deliveries have revealed a great advantage in babies’ being born facing forward. Monkeys give birth squatting on their hind legs or crouching on all fours. As the infant is born, the mother reaches down to guide it out of the birth canal and toward her nipples. In many cases, she also wipes mucus from the baby’s mouth and nose to aid its breathing. Infants are strong enough at birth to take part in their own deliveries. Once their hands are free, they can grab their mother’s body and pull themselves out.

If human babies were also born face forward, their mothers would have a much easier time. Instead the evolutionary modifications of the human pelvis that enabled hominids to walk upright necessitate that most infants exit the birth canal with the back of their heads against the pubic bones, facing in the opposite direction from the mother (in a position obstetricians call “occiput anterior”). For this reason, it is difficult for the laboring human mother—whether squatting, sitting, or lying on her back—to reach down and guide the baby as it emerges. This configuration also greatly inhibits the mother’s ability to clear a breathing passage for the infant, to remove the umbilical cord from around its neck or even to lift the baby up to her breast. If she tries to accelerate the delivery by grabbing the baby and guiding it from the birth canal, she risks bending its back awkwardly against the natural curve of its spine. Pulling on a newborn at this angle risks injury to its spinal cord, nerves and muscles.

For contemporary humans, the response to these challenges is to seek assistance during labor and delivery. Whether a technology-oriented professional, a lay midwife or a family member who is familiar with the birth process, the assistant can help the human mother do all the things the monkey mother does by herself. The assistant can also compensate for the limited motor abilities of the relatively helpless human infant. The advantages of even simple forms of assistance have reduced maternal and infant mortality throughout history.

Next : Assisted Birth

The Evolution of Human Birth : 1. Preview

By : Karen R. Rosenberg and Wenda R. Trevathan

The difficulties of childbirth have probably challenged humans and their ancestors for millions of years—which means that the modern custom of seeking assistance during delivery may have similarly ancient roots.

GIVING BIRTH IN THE TREETOPS is not the normal human way of doing things, but that is exactly what Sophia Pedro was forced to do during the height of the floods that ravaged southern Mozambique in March 2000. Pedro had survived for four days perched high above the raging floodwaters that killed more than 700 people in the region. The day after her delivery, television broadcasts and newspapers all over the world featured images of Pedro and her newborn child being plucked from the tree during a dramatic helicopter rescue. Treetop delivery rooms are unusual for humans but not for other primate species. For millions of years, primates have secluded themselves in treetops or bushes to give birth. Human beings are the only primate species that regularly seeks assistance during labor and delivery. So when and why did our female ancestors abandon their unassisted and solitary habit? The answers lie in the difficult and risky nature of human birth.

Many women know from experience that pushing a baby through the birth canal is no easy task. It’s the price we pay for our large brains and intelligence: humans have exceptionally big heads relative to the size of their bodies. Those who have delved deeper into the subject know that the opening in the human pelvis through which the baby must pass is limited in size by our upright posture. But only recently have anthropologists begun to realize that the complex twists and turns that human babies make as they travel through the birth canal have troubled humans and their ancestors for at least 100,000 years. Fossil clues also indicate that anatomy, not just our social nature, has led human mothers—in contrast to our closest primate relatives and almost all other mammals—to ask for help during childbirth. Indeed, this practice of seeking assistance may have been in place when the earliest members of our genus, Homo, emerged and may possibly date back to five million years ago, when our ancestors first began to walk upright on a regular basis.

Next : Tight Squeeze

Once Were Cannibals : 4. Understanding Cannibalism

By : Tim D. White

IT REMAINS MUCH more challenging to establish why cannibalism took place than to establish that it did. People usually eat because they are hungry, and most prehistoric cannibals were therefore probably hungry. But discerning more than that—such as whether the taste of human flesh was pleasing or whether cannibalism
presented a way to get through the lean times or a satisfying way to get rid of outsiders—requires knowledge not yet available to archaeologists. Even in the case of the Anasazi, who have been well studied, it is impossible to determine whether cannibalism resulted from starvation or was rooted in religious beliefs,
or was some combination of these and other things. What is becoming clear through the refinement of the science of archaeology, however, is that cannibalism is part of our collective past.




ETHNOHISTORICAL REPORTS of cannibalism have been recorded for centuries in many corners of the globe. Although some involve well-documented accounts by eyewitnesses—such as the Donner Party expedition—other accounts by explorers, missionaries, travelers and soldiers often lack credibility. For example, these two artists’ portraits depict cannibalism catalyzed by starvation in China in the late 1800s and a European view of cannibalism in the New World (based on a woodcut from 1497). Such ethnohistorical accounts do not carry the weight of archaeological and forensic evidence. They may, however, serve as rich sources of testable hypotheses, guiding future archaeological excavations.


* * *


TIM D. WHITE is co-director of the Laboratory for Human Evolutionary Studies of the Museum of Vertebrate Zoology at the University of California, Berkeley. He is also a professor in Berkeley’s department of integrative biology and a member of the National Academy of Sciences.
White co-directs the Middle Awash research project in Ethiopia. His research interests are human paleontology, Paleolithic archaeology, and the interpretation of bone modification in contexts ranging from prehistoric archaeology to contemporary forensic situations.

- 0 comments

Once Were Cannibals : 3. Early European Cannibals

By : Tim D. White

THE MOST IMPORTANT paleoanthropological site in Europe lies in northern Spain, in the foothills of the
Sierra de Atapuerca. The oldest known section so far, the Gran Dolina, is currently under excavation. The team working there has recovered evidence of occupation some 800,000 years ago by what may prove to be a new species of human ancestor, H. antecessor. The hominid bones were discovered in one horizon of the cave’s sediment, intermingled with stone tools and the remains of prehistoric game animals such as deer, bison and rhinoceros. The hominid remains consist of 92 fragments from six individuals. They bear unmistakable traces of butchery with stone tools, including the skinning and removal of flesh and the processing of the braincase and the long bones for marrow. This pattern of butchery matches that seen on the nearby animal bones, providing the earliest evidence of hominid cannibalism.

Cannibalism among Europe’s much younger Neandertals—who lived between 35,000 and 150,000 years ago—has been debated since the late 1800s, when the great Croatian paleoanthropologist Dragutin Gorjanović-Kramberger found the broken, cut-marked and scattered remains of more than 20 Neandertals
entombed in the sands of the Krapina rock-shelter. Unfortunately, these soft fossil bones were roughly extracted (by today’s standards) and then covered with thick layers of preservative, which obscured evidence of processing and made interpretation exceedingly difficult. Some workers believe that the Krapina bones show clear signs of cannibalism; others have attributed the patterns of damage to rocks falling from the cave’s ceiling, to carnivore chewing or to some form of burial. But recent analysis of the bones from Krapina and from another Croatian cave, Vindija—which has younger Neandertal and animal remains—indicates that
cannibalism was practiced at both sites.

In the past few years, yet another site has offered evidence. On the banks of the Rhône River in southeastern France, Alban Defleur of the University of the Mediterranean at Marseilles has been excavating the cave of Moula-Guercy for more than a decade. Neandertals occupied this small cave 100,000 years ago. In one layer the team unearthed the remains of at least six Neandertals, ranging in age from six years to adult. Defleur’s meticulous excavation and recovery standards have yielded data every bit the equivalent of a modern forensic crime scene investigation. Each fragment of fauna and Neandertal bone, each macrobotanical clue, each stone tool has been precisely plotted three-dimensionally. This care has allowed an understanding of how the bones were spread around a hearth that has been cold for 1,000 centuries.

HAMMERING: It is clear from the archaeological record that meat—fat or muscle or other tissue—on the bone was not the only part of the body that was consumed. Braincases were broken open, and marrow was often removed from long bones. In these two examples, stone hammers split the upper arm bones lengthwise, exposing the marrow.

Microscopic analysis of the Neandertal bone fragments and the faunal remains has led to the same conclusion
that Spanish workers at the Gran Dolina site have drawn: cannibalism was practiced by some Paleolithic Europeans. Determining how often it was practiced and under what conditions represents a far more difficult challenge. Nevertheless, the frequency is striking. We know of just one very early European site with hominid remains, and those were cannibalized. The two Croatian Neandertal sites are separated by hundreds of generations, yet analyses suggest that cannibalism was practiced at both. And recently a Neandertal site in France was shown to support the same interpretation. These findings are built on exacting standards of evidence. Because of this, most paleoanthropologists these days are asking, “Why cannibalism?” rather
than “Was this cannibalism?”

Similarly, discoveries at much younger sites in the American Southwest have altered the way anthropologists think of Anasazi culture in this area. Corn agriculturists have inhabited the Four Corners region for centuries, building their pueblos and spectacular cliff dwellings and leaving one of the richest and most fine-grained archaeological records on earth. Christy G. Turner II of Arizona State University conducted pioneering
work on unusual sets of broken and burned human skeletal remains from Anasazi sites in Arizona, New
Mexico and Colorado in the 1960s and 1970s. He saw a pattern suggestive of cannibalism: site after site containing human remains with the telltale signs. Yet little in the history of the area’s more recent Puebloan peoples suggested that cannibalism was a widespread practice, and some modern tribes who claim descent
from the Anasazi have found the idea disturbing.

The vast majority of Anasazi burials involve whole, articulated skeletons frequently accompanied by decorated ceramic vessels that have become a favorite target of pot hunters in this area. But, as Turner recorded, several dozen sites had fragmented, often burned human remains, and a larger pattern began to emerge. Over the past three decades the total number of human bone specimens from these sites has grown to tens of thousands, representing dozens of individuals spread across 800 years of prehistory and tens of thousands of square kilometers of the American Southwest. The assemblage that I analyzed in 1992 from an Anasazi site in the Mancos Canyon of southwestern Colorado, for instance, contained 2,106 pieces of bone from at least 29 Native American men, women and children.

These assemblages have been found in settlements ranging from small pueblos to large towns and were often contemporaneous with the abandonment of the dwellings. The bones frequently show evidence of roasting before the flesh was removed. They invariably indicate that people extracted the brain and cracked the limb bones for marrow after removing the muscle tissue. And some of the long bone splinters even show end polishing, a phenomenon associated with cooking in ceramic vessels. The bone fragments from Mancos revealed modifications that matched the marks left by Anasazi processing of game animals such as deer and bighorn sheep. The osteological evidence clearly demonstrated that humans were skinned and roasted, their muscles cut away, their joints severed, their long bones broken on anvils with hammerstones, their spongy bones crushed and the fragments circulated in ceramic vessels. But articles outlining the results have proved controversial. Opposition has sometimes seemed motivated more by politics than by science. Many practicing anthropologists believe that scientific findings should defer to social sensitivities. For such anthropologists, cannibalism is so culturally delicate, so politically incorrect, that they find any evidence for it impossible
to swallow.

The most compelling evidence in support of human cannibalism at the various Anasazi sites was published in
2000 by Richard A. Marlar of the University of Colorado School of Medicine and his colleagues. The workers excavated three Anasazi pit dwellings dating to approximately A.D. 1150 at a site called Cowboy Wash near Mesa Verde in southwestern Colorado. The same pattern of findings that had been documented
at other sites, such as Mancos, was present: disarticulated, broken, scattered human bones in nonburial contexts. Excellent preservation, careful excavation and thoughtful sampling provided a chemical dimension to the analysis and, finally, direct evidence of human cannibalism. Marlar and his colleagues discovered residues of human myoglobin—a protein present in heart and skeletal muscle—on a ceramic vessel, suggesting that
human flesh had been cooked in the pot. An unburned human coprolite, or ancient feces, found in the fireplace of one of the abandoned dwellings also tested positive for human myoglobin. Thus, osteological, archaeological and biochemical data indicate that prehistoric cannibalism occurred at Cowboy Wash.
The biochemical data for processing and consumption of human tissue offer strong additional support for numerous osteological and archaeological findings across the Southwest.

Next : Understanding Cannibalism
- 0 comments

Once Were Cannibals : 2. Standards of Evidence

By : Tim D. White

IT WOULD BE HELPFUL if we could turn to modern-day cannibals with our questions, but such opportunities have largely disappeared. So today’s study of this intriguing behavior must be accomplished
through a historical science. Archaeology has therefore become the primary means of investigating the existence and extent of human cannibalism.

One of the challenges facing archaeologists, however, is the amazing variety of ways in which people dispose of their dead. Bodies may be buried, burned, placed on scaffolding, set adrift, put in tree trunks or fed to scavengers. Bones may be disinterred, washed, painted, buried in bundles or scattered on stones. In parts of Tibet, future archaeologists will have difficulty recognizing any mortuary practice at all. There most corpses
are dismembered and fed to vultures and other carnivores. The bones are then collected, ground into powder, mixed with barley and flour and again fed to vultures. Given the various fates of bones and bodies, distinguishing cannibalism from other mortuary practices can be quite tricky.

Scientists have thus set the standard for recognizing ancient cannibalism very high. They confirm the activity
when the processing patterns seen on human remains match those seen on the bones of other animals consumed for food. Archaeologists have long argued for such a comparison between human and faunal remains at a site. They reason that damage to animal bones and their arrangement can clearly show that
the animals had been slaughtered and eaten for food. And when human remains are unearthed in similar cultural contexts, with similar patterns of damage, discard and preservation, they may reasonably be interpreted as evidence of cannibalism.

When one mammal eats another, it usually leaves a record of its activities in the form of modifications to the consumed animal’s skeleton. During life, varying amounts of soft tissue, much of it with nutritive value, cover mammalian bones. When the tissue is removed and prepared, the bones often retain a record of this processing in the form of gnawing marks and fractures. When humans eat other animals, however, they
mark bones with more than just their teeth. They process carcasses with tools of stone or metal. In so doing, they leave imprints of their presence and actions in the form of scars on the bones. These same imprints can be seen on butchered human skeletal remains.

The key to recognizing human cannibalism is to identify the patterns of processing—that is, the cut marks, hammering damage, fractures or burns seen on the remains—as well as the survival of different bones and parts of bones. Nutritionally valuable tissues, such as brains and marrow, reside within the bones and can be removed only with forceful hammering—and such forced entry leaves revealing patterns of bone damage. When human bones from archaeological sites show patterns of damage uniquely linked to butchery by other humans, the inference of cannibalism is strengthened. Judging which patterns are consistent with dietary butchery can be based on the associated archaeological record—particularly the nonhuman food-animal
remains discovered in sites formed by the same culture—and checked against predictions embedded in ethnohistorical accounts.

CHOPPING: Hack marks visible on the left side of this fragment of a human tibia are testament to the removal of muscle and tendon. Tools were also used to make finer slices, to remove tissue or to sever heads from bodies. Archaeologists have to be careful in their interpretations, however, because humans process their dead in many ways; not all slice or hack marks indicate cannibalism.

This comparative system of determining cannibalism emphasizes multiple lines of osteological damage and contextual evidence. And, as noted earlier, it sets the standard for recognizing cannibalism very high. Under this approach, the presence of cut marks on bones would not by itself be considered evidence of cannibalism: an American Civil War cemetery would contain skeletal remains with cut marks made by swords and bayonets, and medical school cadavers are dissected and their bones cut-marked. With the threshold set so conservatively, most instances of past cannibalism will necessarily go unrecognized. A practice from Papua New Guinea, where cannibalism was recorded ethnographically, illustrates this point. There the skulls of the deceased were carefully cleaned and the brains removed. The dry, mostly intact skulls were then handled extensively, often creating a polish on their projecting parts. They were sometimes painted and even mounted on poles for display and worship. Soft tissue, including brain matter, was eaten at the beginning of this process; thus, the practice would be identified as ritual cannibalism. If such skulls were encountered in an archaeological context without modern informants describing the cannibalism, they would not constitute direct evidence for cannibalism under the stringent criteria that my colleagues and I advocate.

Nevertheless, adoption of these standards of evidence has led us to some clear determinations in other, older situations. The best indication of prehistoric cannibalism now comes from the archaeological record of the American Southwest, where archaeologists have interpreted dozens of assemblages of human remains. Compelling evidence has also been found in Neolithic and Bronze Age Europe. Even Europe’s earliest hominid site has yielded convincing evidence of cannibalism.

Next : Early European Cannibals
- 0 comments

Once Were Cannibals : 1. Preview

By : Tim D. White

New scientific evidence is now bringing to light the truth about cannibalism. It has become obvious that long before the invention of metals, before Egypt’s pyramids were built, before the origins of agriculture, before the explosion of Upper Paleolithic cave art, cannibalism could be found among many different peoples—as well as among many of our ancestors. Broken and scattered human bones, in some cases thousands of them, have been discovered from the prehistoric pueblos of the American Southwest to the islands of the Pacific. The osteologists and archaeologists studying these ancient occurrences are using increasingly sophisticated analytical tools and methods. In the past several years, the results of their studies have finally provided convincing evidence of prehistoric cannibalism.

Human cannibalism has long intrigued anthropologists, and they have worked for decades to classify the phenomenon. Some divide the behavior according to the affiliation of the consumed. Thus, endocannibalism refers to the consumption of individuals within a group, exocannibalism indicates the consumption of outsiders, and autocannibalism covers everything from nail biting to torture-induced self-consumption. In addition, anthropologists have come up with classifications to describe perceived or known motivations. Survival cannibalism is driven by starvation. Historically documented cases include the Donner Party—whose members were trapped during the harsh winter of 1846–47 in the Sierra Nevada—and people marooned in the Andes or the Arctic with no other food. In contrast, ritual cannibalism occurs when members of a family or community consume their dead during funerary rites in order to inherit their qualities or honor their memory. And pathological cannibalism is generally reserved for criminals who consume their victims or, more often, for fictional characters such as Hannibal Lecter in The Silence of the Lambs.

Despite these distinctions, however, most anthropologists simply equate the term “cannibalism” with the regular, culturally encouraged consumption of human flesh. In the age of ethnographic exploration—which lasted from the time of Greek historian Herodotus in about 400 B.C. to the early 20th century—the non-Western world and its inhabitants were scrutinized by travelers, missionaries, military personnel and anthropologists. These observers told tales of human cannibalism in different places, from Mesoamerica to the Pacific islands to central Africa.

Controversy has often accompanied these claims. Anthropologists participated in only the last few waves of these cultural contacts—those that began in the late 1800s. As a result, many of the historical accounts of cannibalism have come to be viewed skeptically.

In 1979 anthropologist William Arens of the State University of New York at Stony Brook extended this theme by reviewing the ethnographic record of cannibalism in his book The Man-Eating Myth. Arens concluded that accounts of cannibalism among people from the Aztec to the Maori to the Zulu were either false or inadequately documented. His skeptical assertion has subsequently been seriously questioned, yet he nonetheless succeeded in identifying a significant gulf between these stories and evidence of cannibalism: “Anthropology has not maintained the usual standards of documentation and intellectual rigor expected when other topics are being considered. Instead, it has chosen uncritically to lend its support to the collective representations and thinly disguised prejudices of western culture about others.”

The anthropologists whom Arens was criticizing had not limited themselves to contemporary peoples. Some had projected their prejudices even more deeply—into the archaeological record. Interpretations of cannibalism inevitably followed any discoveries of prehistoric remains. In 1871 American author Mark Twain weighed in on the subject in an essay later published in Life as I Find It: “Here is a pile of bones of primeval man and beast all mixed together, with no more damning evidence that the man ate the bears than that the bears ate the man—yet paleontology holds a coroner’s inquest in the fifth geologic period on an ‘unpleasantness’ which transpired in the quaternary, and calmly lays it on the MAN, and then adds to it what purports to be evidence of CANNIBALISM. I ask the candid reader, Does not this look like taking advantage of a gentleman who has been dead two million years....”


In the century after Twain’s remarks, archaeologists and physical anthropologists described the hominids Australopithecus africanus, Homo erectus and H. neanderthalensis as cannibalistic. According to some views, human prehistory from about three million years ago until very recently was rife with cannibalism. But in the early 1980s an important critical assessment of these conclusions appeared. Archaeologist Lewis Binford’s book Bones: Ancient Men and Modern Myths argued that claims for early hominid cannibalism were unsound. He built on the work of other prehistorians concerned with the composition, context and modifications of Paleolithic bone assemblages. Binford emphasized the need to draw accurate inferences about past behaviors by grounding knowledge of the past on experiment and observation in the present. His influential work coupled skepticism with a plea for methodological rigor in studies of prehistoric cannibalism.

Next : Standards of Evidence
- 0 comments

Once We Were Not Alone : 4. Competing Scenarios

By : Ian Tattersall

IN ALL THESE WAYS, early Upper Paleolithic people contrasted dramatically with the Neandertals. Some Neandertals in Europe seem to have picked up new ways of doing things from the arriving H. sapiens, but we have no direct clues as to the nature of the interaction between the two species. In light of the Neandertals’
rapid disappearance and of the appalling subsequent record of H. sapiens, though, we can reasonably surmise
that such interactions were rarely happy for the former. Certainly the repeated pattern found at archaeological
sites is one of short-term replacement, and there is no convincing biological evidence of any intermixing of peoples in Europe.

In the Levant, the coexistence ceased—after about 60,000 years or so—right about the time that Upper Paleolithic–like tools began to appear. About 40,000 years ago the Neandertals of the Levant
yielded to a presumably culturally rich H. sapiens, just as their European counterparts had.

The key to the difference between the European and the Levantine scenarios lies, most probably, in the emergence of modern cognition—which, it is reasonable to assume, is equivalent to the advent
of symbolic thought. Business had continued more or less as usual right through the appearance of modern bone structure, and only later, with the acquisition of fully modern behavior patterns, did H. sapiens become completely intolerant of competition from its nearest—and, evidently, not its dearest—co-inhabitors.

To understand how this change in sensibility occurred, we have to recall certain things about the evolutionary process. First, as in this case, all innovations must necessarily arise within preexisting species—for where else can they do so? Second, many novelties arise as “exaptations,” features acquired in one context before (often long before) being coopted in a different one. For example, hominids possessed essentially modern vocal tracts for hundreds of thousands of years before the behavioral record gives us any reason to believe that they employed the articulate speech that the peculiar form of this tract permits.

And finally, it is important to bear in mind the phenomenon of emergence—the notion that a chance coincidence gives rise to something totally unexpected. The classic scientific example in this regard is water, whose properties are wholly unpredicted by those of hydrogen and oxygen atoms alone. If we combine these various observations, we can see that, profound as the consequences of achieving symbolic thought may have been, the process whereby it came about was unexceptional.

We have no idea at present how the modern human brain converts a mass of electrical and chemical discharges into what we experience as consciousness. We do know, however, that somehow our lineage passed to symbolic thought from some nonsymbolic precursor state. The only plausible possibility is that
with the arrival of anatomically modern H. sapiens, existing exaptations were fortuitously linked by a relatively minor genetic innovation to create an unprecedented potential.

Yet even in principle this deduced scenario cannot be the full story, because anatomically modern humans behaved archaically for a long time before adopting modern behaviors. That discrepancy may be the result of the late appearance of some key hardwired innovation not reflected in the skeleton, which is all that fossilizes. But this seems unlikely, because it would have necessitated a wholesale Old World–wide replacement
of hominid populations in a very short time, something for which there is no evidence.

It is much more likely that the modern human capacity was born at—or close to—the origin of H. sapiens, as an ability that lay fallow until it was activated by a cultural stimulus of some kind. If sufficiently advantageous, this behavioral novelty could then have spread rapidly by cultural contact among populations
that already had the potential to acquire it. No population replacement would have been necessary to spread the capability worldwide.

It is impossible to be sure what this innovation might have been, but the best current bet is that it was the invention of language. For language is not simply the medium by which we express our ideas and experiences to one another. Rather it is fundamental to the thought process itself. It involves categorizing and naming
objects and sensations in the outer and inner worlds and making associations between resulting mental symbols.

It is, in effect, impossible for us to conceive of thought (as we are familiar with it) in the absence of language, and it is the ability to form mental symbols that is the fount of our creativity. Only when we are able to create such symbols can we recombine them and ask such questions as “What if...?”

We do not know exactly how language might have emerged in one local population of H. sapiens, although linguists have speculated widely. But we do know that a creature armed with symbolic skills is a formidable competitor—and not necessarily an entirely rational one, as the rest of the living world, including H. neanderthalensis, has discovered to its cost.

* * *

Ian Tattersall was born in England and raised in East Africa. He is a curator in the department of anthropology at the American Museum of Natural History. His books include Becoming Human: Evolution and Human Uniqueness (Harvest Books, 1999) and The Last Neanderthal: The Rise, Success, and Mysterious Extinction of Our Closest Human Relatives (Westview Press, 1999, revised). 
