Sunday, October 9, 2011

Once Were Cannibals: 2. Standards of Evidence

By: Tim D. White

IT WOULD BE HELPFUL if we could turn to modern-day cannibals with our questions, but such opportunities have largely disappeared. So today’s study of this intriguing behavior must be accomplished through a historical science. Archaeology has therefore become the primary means of investigating the existence and extent of human cannibalism.

One of the challenges facing archaeologists, however, is the amazing variety of ways in which people dispose of their dead. Bodies may be buried, burned, placed on scaffolding, set adrift, put in tree trunks or fed to scavengers. Bones may be disinterred, washed, painted, buried in bundles or scattered on stones. In parts of Tibet, future archaeologists will have difficulty recognizing any mortuary practice at all. There, most corpses are dismembered and fed to vultures and other carnivores. The bones are then collected, ground into powder, mixed with barley and flour and again fed to vultures. Given the various fates of bones and bodies, distinguishing cannibalism from other mortuary practices can be quite tricky.

Scientists have thus set the standard for recognizing ancient cannibalism very high. They confirm the activity when the processing patterns seen on human remains match those seen on the bones of other animals consumed for food. Archaeologists have long argued for such a comparison between human and faunal remains at a site. They reason that damage to animal bones and their arrangement can clearly show that the animals had been slaughtered and eaten for food. And when human remains are unearthed in similar cultural contexts, with similar patterns of damage, discard and preservation, they may reasonably be interpreted as evidence of cannibalism.

When one mammal eats another, it usually leaves a record of its activities in the form of modifications to the consumed animal’s skeleton. During life, varying amounts of soft tissue, much of it with nutritive value, cover mammalian bones. When the tissue is removed and prepared, the bones often retain a record of this processing in the form of gnawing marks and fractures. When humans eat other animals, however, they mark bones with more than just their teeth. They process carcasses with tools of stone or metal. In so doing, they leave imprints of their presence and actions in the form of scars on the bones. These same imprints can be seen on butchered human skeletal remains.

The key to recognizing human cannibalism is to identify the patterns of processing—that is, the cut marks, hammering damage, fractures or burns seen on the remains—as well as the survival of different bones and parts of bones. Nutritionally valuable tissues, such as brains and marrow, reside within the bones and can be removed only with forceful hammering—and such forced entry leaves revealing patterns of bone damage. When human bones from archaeological sites show patterns of damage uniquely linked to butchery by other humans, the inference of cannibalism is strengthened. Judging which patterns are consistent with dietary butchery can be based on the associated archaeological record—particularly the nonhuman food-animal remains discovered in sites formed by the same culture—and checked against predictions embedded in ethnohistorical accounts.
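To make the logic of this comparative standard concrete, here is a minimal sketch in Python of how the multiple-signal test might be encoded. It is illustrative only: the signal names, the threshold, and the function consistent_with_cannibalism are hypothetical simplifications of mine, not part of the archaeological method described in this article.

# A minimal, illustrative sketch (editor's assumptions, not the author's
# method): cannibalism is inferred only when the damage signature on
# human remains matches the signature on butchered food-animal remains
# from the same cultural context, across several independent signals.

FOOD_PROCESSING_SIGNALS = {
    "cut_marks",          # defleshing with stone or metal tools
    "percussion_damage",  # hammering to open skulls and marrow cavities
    "burning",            # cooking damage
    "spiral_fracture",    # fresh-bone breakage
    "midden_discard",     # discarded with kitchen refuse, not buried
}

def consistent_with_cannibalism(human_signals, faunal_signals, min_shared=4):
    """True only if human remains share at least min_shared processing
    signals with food-animal remains from the same site."""
    shared = human_signals & faunal_signals & FOOD_PROCESSING_SIGNALS
    return len(shared) >= min_shared

# Hypothetical assemblage: human bones damaged like deer bones from the
# same deposit meet the standard...
deer = set(FOOD_PROCESSING_SIGNALS)
human = {"cut_marks", "percussion_damage", "burning",
         "spiral_fracture", "midden_discard"}
print(consistent_with_cannibalism(human, deer))          # True

# ...but cut marks alone (battlefield trauma, dissection) do not.
print(consistent_with_cannibalism({"cut_marks"}, deer))  # False

The design point the sketch captures is the conservatism of the standard: a single shared signal, such as cut marks alone, never clears the threshold.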

CHOPPING: Hack marks visible on the left side of this fragment of a human tibia are testament to the removal of muscle and tendon. Tools were also used to make finer slices, to remove tissue or to sever heads from bodies. Archaeologists have to be careful in their interpretations, however, because humans process their dead in many ways; not all slice or hack marks indicate cannibalism.

This comparative system of determining cannibalism emphasizes multiple lines of osteological damage and contextual evidence. And, as noted earlier, it sets the standard for recognizing cannibalism very high. With this approach, for instance, the presence of cut marks on bones would not by itself be considered evidence of cannibalism: an American Civil War cemetery would contain skeletal remains with cut marks made by swords and bayonets, and medical school cadavers are dissected and their bones cut-marked. With the threshold set so conservatively, most instances of past cannibalism will necessarily go unrecognized.

A practice from Papua New Guinea, where cannibalism was recorded ethnographically, illustrates this point. There the skulls of the deceased were carefully cleaned and the brains removed. The dry, mostly intact skulls were then handled extensively, often creating a polish on their projecting parts. They were sometimes painted and even mounted on poles for display and worship. Soft tissue, including brain matter, was eaten at the beginning of this process; thus, the practice would be identified as ritual cannibalism. Yet if such skulls were encountered in an archaeological context without modern informants describing the cannibalism, they would not constitute direct evidence for cannibalism under the stringent criteria that my colleagues and I advocate.

Nevertheless, adoption of these standards of evidence has led us to some clear determinations in other, older situations. The best indication of prehistoric cannibalism now comes from the archaeological record of the American Southwest, where archaeologists have interpreted dozens of assemblages of human remains. Compelling evidence has also been found in Neolithic and Bronze Age Europe. Even Europe’s earliest hominid site has yielded convincing evidence of cannibalism.

Next: Early European Cannibals
