Todd Seavey
Which 2006 film depicts a sexy vampire, skilled at swordplay, gunplay, and/or martial arts, stuck in the middle of a war over half-vampire hybrids?
Trick question—four films this year do: in order of their release, BloodRayne, Underworld: Evolution, UltraViolet, and, coming at the end of the year, Perfect Creature.
The vampire novels of Anne Rice and the comic-book-inspired Blade movies were no doubt big motivators for the current crop—but the particular emphasis all four films place on hybrids, and on the possibility of war over genetic purity, suggests that something more than old-fashioned vampirism makes these films resonate with early-twenty-first-century audiences (or rather with the producers who green-lit the projects).
In a culture increasingly defined by our ability to mix and match at will—blending pirated elements of old songs to create “mash-ups,” blending DNA from different plants and animals, blending elements of various subcultures, eras, and ethnic groups—vampires have clearly ceased to be villains and have become one more exciting style to adopt. Sure, dressing like a vampire is fun for the goths, and rooting for vampires (they kill people, but they have such lovely angst) is a blast for Rice fans—but a true mix-and-match era should hold out the promise of actually being a vampire.
And that’s where genetic engineering comes in.
Each of these four films deals with the possibility of mixing vampire and human DNA, usually placing a part-vampire heroine or hero at the center. As a result, the vampire villains in the films are engaged not so much in their traditional business of killing lots of humans by sucking their blood as in the newer business of maintaining the genetic purity of the vampire race against intermingling with humans. Before our eyes, the default mission of vampires in popular lore is changing: instead of cannibals, they are now, in essence, white supremacists.
The details vary from film to film, of course. In BloodRayne (which is based on a videogame), the title character, Rayne (played by Kristanna Loken, the striking female Terminator from Terminator 3: Rise of the Machines), is a “dhampir,” half-human, half-vampire, and involved with an organization seeking to maintain the balance between vampires and humans. Rayne initially kills humans but later vows to restrain her bloodlust and use her Matrix-like fighting skills to hunt down Kagan, the vampire king who killed her mother before her eyes.
In Underworld: Evolution (which some charge is a rip-off of the popular World of Darkness role-playing games), the main character, Selene (played by Kate Beckinsale), is a vampire who later becomes a vampire-human hybrid and is in love with a man who is a vampire-werewolf hybrid. The two of them are trapped at the center of a war between vampires and a human paramilitary society bent on defeating them. Selene hopes that artificial, manufactured blood will make it unnecessary for vampires to feed on humans but first must use her Matrix-like fighting skills to contend with the political fallout from her killing of Viktor, the vampire king who murdered her family.
In UltraViolet (which is closely modeled on Japanese anime), the main character, Violet (played by model/actress Milla Jovovich), is a rebel soldier fighting to defend a subculture of humans who have been turned into genetically-engineered pseudo-vampires called hemophages, meant to be the perfect warriors but now seen as a threat to normal humans. Violet uses her Matrix-like fighting skills to hunt down the government official who wants to destroy all the hemophages and kill the nine-year-old boy Violet is protecting. (Note: UltraViolet is not to be confused with Ultraviolet, the unrelated British TV series about a covert government team trying to prevent a war between humans and vampires.)
In Perfect Creature (scheduled for release at year’s end), the main character, Silus (played by Mission: Impossible II’s Dougray Scott), is a vampire who partners with a human cop in a world where vampires and humans coexist in a delicate balance and vampires are seen as the next stage in human evolution. Silus must use his detective skills against one of his own race when a flu epidemic, caused by genetic engineering experiments, devastates the human population and one vampire breaks the truce by beginning to prey on humans.
If the movies were all just drawing on traditional vampire legends, there would be no reason to see a trend here. But it seems that some unspoken consensus—a fairly logical one—has arisen that vampirism, in this modern age, ought to be seen as a manageable disease and a manageable political problem. Once domesticated, of course, vampirism then becomes a lifestyle option, and a potentially fashionable one to boot. The bad guys, then, are the people who try to keep our heroes caged in the old tribal categories—vampires who still think of themselves as aristocrats with literal bloodlines to protect or humans who still think that extermination is the only solution to the vampire problem.
Fair enough. It’s encouraging, really, that even in fiction a certain inexorable dialectic moves toward peace, mutual tolerance, and freedom of choice. In the process, though, for good or ill, we have basically neutered the vampire legend, which—like stories of werewolves and zombies—had its roots in an ancient, primal fear: anxiety about what to do with ambiguous cases that teetered on the border of life and death.
That, at least, is what the archaeological evidence suggests. Some of that evidence is summarized in the book The Buried Soul by British archaeologist Timothy Taylor. Taylor’s book might as easily have been titled A Pre-History of Death, in the fashion of his earlier book, A Pre-History of Sex. Refusing to adhere to the cultural relativism long fashionable in archaeology and anthropology, Taylor frankly addresses the ancients’ often-horrifying bloodlust and argues that if we are to understand them, we should assume the ancients were as excited by violence and sex as we are, not that their now-shocking rituals—widespread cannibalism as a default funeral rite, gang-rape, child sacrifice—can simply be treated as psychologically interchangeable with any modern practice: say, burning joss sticks, playing Taps, or rubbing rosary beads.
Something as drastic as, for instance, the mass slaughter of thousands of children in a single ceremony atop a blood-soaked temple pyramid by the Aztecs must have stirred up terrible passions in ancient hearts, as it would in us. Grant this and we may then be a step closer to understanding why and how the Aztecs came to do such things.
It might be comforting to the modern, relativist mind—albeit not to the Aztec priests’ victims—to suppose that the children sacrificed atop the pyramids were somehow happy with their roles in society, in a way that we simply cannot grasp from our modern perspective. But as Taylor explains, the evidence suggests otherwise. Indeed, the evidence suggests that Aztec children went to their deaths feeling as miserable and frightened as twenty-first-century children would:
Wherever in the Mexica territories they came from, all the small children who had been born under a particular day-sign and who, at birth, had a double cowlick of hair, were sold into temple kindergartens by their mothers. Chosen to become “bloodied flowers of maize,” they were brought up by priests who eventually revealed to them exactly how they were going to meet their death, not by excision of the heart [in the fashion of other victims of the Aztecs], but by having their throats cut. The children had to recognize the horror of their own impending death because their tears and wailing were essential to the success of the ritual . . . The Mexica did not make any pretense that their victims died “happy”; they overtly recognized human emotional negatives, terror, and extreme distress, as a central part of what they deemed necessary for their rituals.
As Taylor briefly concludes, whatever the hypocrisies and flaws of European Catholicism, one cannot really lament that it has displaced Aztec culture, even if one laments the bloodshed and disease that accompanied imperialist expansion into the New World.
There has been an effort to pretty up the ancients and today’s more primitive cultures, though—to gloss over or excuse such phenomena as cannibalism in Fiji in recent centuries or ritual child murders by South African witch doctors as recently as the 1990s. Taylor notes that some in South Africa believe to this day that, for instance, smearing fat from children on the tires of their taxicabs will make their businesses more prosperous; even some anthropologists within Zulu culture, such as H. Ngubane, defend ritual child murders, comparing them to abortion and euthanasia—examples, she says, of sacrifice for the greater good.
A closer, unflinching look at one practice considered particularly taboo in the industrialized world—cannibalism—reveals both how the vampire myth may have arisen and how we moderns keep finding ways to conceal things too horrible to face.
Taylor points to one book, The Man-Eating Myth by anthropologist William Arens, published in 1979, as a striking example of how easily the cultural-relativist impulse can obscure the ugly facts of human history. Arens argued that cannibalism was not a routine part of life in any recorded human culture. As proof for this claim, Arens pointed not to the real practices of primitives around the world but to the imperialistic and stereotyping attitudes of the moderns and Westerners who had sometimes hastily labeled other cultures cannibalistic. Arens shamed guilt-prone Western intellectuals and, almost overnight, it became conventional wisdom that there had been almost no cannibalism in human history.
It is amazing that this revisionist view could take hold so quickly, though, given that as recently as the 1960s anthropologists such as Pierre Clastres wrote in great detail about living with cannibals—in his case, in Paraguay. Taylor quotes Clastres’s matter-of-fact report that the communally cannibalistic Atchei people of Paraguay “do not roast very young children for the simple reason that there would not be enough to go around. But when they are boiled in water with tangy [hearts of young pindo palm], everyone can get a normal helping.”
This whole matter would be a marginal academic squabble were it not for the fact that, as Taylor describes in great detail, it appears that cannibalism may in fact have been the most common means of disposing of the dead until approximately 30,000 years ago. A significant portion of the human bones recovered from before that time show the clear marks of butchering—not in the metaphorical sense but in the literal sense of separating meat from bone in preparation for consumption. We are all likely descended from cannibals, and as recently as the nineteenth century, Western colonial governments sometimes struggled to suppress vestiges of traditional cannibalism among subject peoples.
What interests Taylor, having dispensed with Arens’s countermyth, is how a practice once as widespread as cannibalism came to be replaced by the burial rituals with which we are more familiar (and nowadays far, far more comfortable). As he notes, there are multiple reasons that cannibalism might seem logical and appealing to people—such as simply not wanting to waste meat—if they had no long-standing cultural taboo against the practice. But since humanity has long shown a great anxiety and uncertainty about the exact relationship between dead bodies and the personalities (often beloved) who once animated those bodies, it is unlikely that cannibalism, however routine, was ever an emotionless affair, devoid of symbolism or meaning, at least not since humans evolved sufficient brain capacity to worry and wonder about such things. It is likely, then, that there were beliefs among our prehistoric ancestors—as among modern cannibals—about powerful spiritual energies residing in human meat, energies that could be reclaimed and reincorporated into the tribe through eating.
Done properly, the ancients believed, such eating made the deceased individual once more part of the life cycle of the community. Done improperly, though, it might result in the deceased’s unhappy spirit lingering to cause mischief—or even in the spirit attempting to reanimate the corpse. The period between death and dismemberment of the corpse has generally been regarded as a confusing and in some sense “spiritually high-risk” period by all cultures, a time when the proper forms must be observed lest havoc be unleashed. Interestingly, notes Taylor, almost every culture has some variation on the myth of the zombie—the dead body that comes back to eat the living instead of rotting in the ground or, we now surmise, instead of being eaten by the tribe, as was the “natural order” of things.
Zombies, vampires, werewolves—all the mythical undead are in some sense cousins, all rooted in our ancient anxiety about the proper disposal of corpses. Schlocky films like Van Helsing and stylish ones like Underworld: Evolution were not the first tales to link vampires and werewolves: ancient Europeans often blurred the distinction between the two types of creature, while other cultures depicted dead, improperly buried humans becoming leopards, bears, and other animals, trudging off angrily into the night to hunt instead of resting in their graves. (Though Taylor does not mention it, there is even speculation that the convention of using tombstones may have begun not to mark gravesites but to weigh down the dead and prevent them from rising to menace the living.)
Why the switch over the millennia, though, from eating our dead to burying them? Like so many other developments in human history, it may have been a side effect of rising wealth and changes in social hierarchy. There is ample evidence of ancient chieftains and generals being afforded lavish burials, and it is possible that for a time burial was for the great while cannibalism was for deceased commoners. Kings, indeed, were often buried with still-living or ritually sacrificed slaves and members of the royal court, sometimes accompanied by weaponry, horses, and treasure. Eventually, virtually everyone wanted burial.
Complicating the picture, perhaps, are the so-called bog bodies that have been found across northern Europe, almost perfectly preserved for centuries by natural chemical processes in peat bogs. At first, archaeologists believed that the bog bodies were a relatively random sampling of ancients who had stumbled into the bogs and injured their heads or drowned. Eventually, archaeologists began to notice that an oddly high portion of the bodies were in some way deformed or seriously diseased—and that others appeared to have been ritually abused with a severity suggesting that they had been regarded by the community as particularly evil or dangerous at the time of death.
Taylor theorizes that the bogs were neither accidental resting places nor conventional graveyards. Rather, at a time thousands of years ago when Europeans had, as it were, one foot in each of two different versions of the afterlife—reincorporation into the living via cannibalism or departure of the soul for the hallowed, otherworldly dwelling place of select ancestors—the bog, with its preservative properties, represented a sort of limbo between worlds for spirits too dangerous or warped to be trusted either among the living or among the honored dead. Freaks, criminals, sexual transgressors—and there are clues that instances of all of these types may be found in the bogs—could be kept from infecting living members of the tribe if they were not eaten, and could be kept from sullying the afterlife if they were interred in the decomposition-preventing goop of the bogs. The marginalized souls were meant to remain trapped in their brutalized yet intact bodies for eternity.
If we assume that ancient humans’ psyches were roughly like our own, we can ask common-sense questions when confronted with a ritually abused body—beaten, repeatedly stabbed, then drowned: questions such as “What could have driven the ancients to such ferocity?” It is precisely by asking such loaded questions that archaeologists like Taylor are able to gain some insight into the barbaric but crudely coherent cosmologies of these people. Dismissing the violence of the bog bodies’ ends as merely “different” from our own methods of burial would mean missing the insights the bog bodies offer—which would be a final, relativistic insult to these ancient victims. It would also mean we had failed to read the harsh message their killers intended to leave us.
The borderland between life and death was a site of terrible anxiety for the ancients (as it is for us), and brutality was one way of proving mastery over that borderland, protecting the entire tribe from uncertainty.
In one of the most striking parts of The Buried Soul, Taylor reprints at length the account given by an ancient scholar, Ibn Fadlan (who hailed from what is now Iraq), of a Viking chieftain’s funeral he witnessed near northern Russia around 921 A.D.—a funeral during which one slave-girl was given special treatment:
When the man whom I mentioned earlier died, they said to his slave-girls, “Who shall die with him?” and one of them said, “I shall” [and it is forbidden to take back the offer] . . . So they brought her to the ship [which would be lit as a funeral pyre] and she removed two bracelets that she was wearing, handing them to the woman called the “Angel of Death,” the one who was to kill her . . . The crone grabbed hold of her head and dragged her into the pavilion, entering it at the same time. The men began to bang their shields with the sticks so that her screams could not be heard and so terrify the other slave-girls, who would not, then, seek to die with their masters.
Six men entered the pavilion and all had intercourse with the slave-girl. They laid her down beside her master and two of them took hold of her feet, two her hands. The crone called the “Angel of Death” placed a rope around her neck in such a way that the ends crossed one another and handed it to two [of the men] to pull on it. She advanced with a broad-bladed dagger and began to thrust it in and out between her ribs, now here, now there, while the two men throttled her with the rope until she died . . .
The ship is then set afire and the remains of both the chieftain and the slave-girl turned to ash. Taylor theorizes that much as ritual child murders are believed by some peoples to unleash great mystical energy—the as-yet-undepleted potential of the child, you might say—the brutal treatment of the slave-girl was believed by the Rus Vikings to attract and transfix the spirit of their dead chieftain, keeping him rooted in the funeral boat (in a fashion similar to the transfixing of the bog spirits, but temporary) until the fire could destroy his body, ensuring his passage to the land of ancestors. Disturbingly, the Vikings may not even have believed that the sacrificed slave-girl was capable of accompanying a great man into the afterworld, though it was plainly her hope that this was possible that helped convince her to participate in the deadly ritual.
Again, some modern commentators on Ibn Fadlan’s account have treated the entire ritual as if it were, if not innocent, at least on a par with the myriad and essentially interchangeable funeral customs of other peoples, including us moderns. But as Taylor says, to pretend that the slave-girl was a contented participant in the customs of her people is to ignore Ibn Fadlan’s description of her screams and the deliberately deceitful efforts of her killers to drown out the sound in order to avoid alerting others who might one day share the same fate. Even to call the girl’s initial choice to participate in the ceremony voluntary would be perverse, argues Taylor, given her status as a slave. A death chosen by someone with such limited options—and facilitated by a powerful Viking alcohol, which was served to the girl frequently from the time of her decision to the time of her death—can hardly be seen as a free expression of faith, however alien that faith may be. Something darker than quaint ritual shaped our past.
I’m reminded of a high school friend of mine, Chuck Blake, who became a computer research scientist and then a stock analyst and keeps a tape of a trashy 1975 film called Mondo Magic as a piece of evidence to be deployed against people who try to convince him that modernity and science are not in any meaningful way superior to “primitive” cultures. Mondo Magic consists of documentary snippets of appalling customs from all over the non-Western world—such as “female circumcision,” the eating of the skulls of witch doctors, and ritual eyeball-plucking—that you’ll just have to see for yourself someday. While my friend periodically made his case to West-bashers, I found myself at left-wing Brown University in the late 80s, surrounded by cultural relativists, anti-imperialists, and more than a few people convinced that we should abandon this whole enterprise called industrial civilization in favor of nomadic berry-scrounging or, at best, life on an impoverished weaving collective in Honduras. I thought that modern, Western lifestyles were superior to the alternatives depicted in Mondo Magic, but I felt some liberalism-induced guilt about such thoughts. I was torn, you might say, between two forms of squeamishness: revulsion at biology-related horrors and wariness about my own judgmental-Westerner tendencies. A very ancient desire to recoil from that which “reads” as death and disease combines with a modern desire to avoid all untidiness and conflict.
Taylor argues that both the indigenous defense of mystery and the modern, antiseptic institutions of science are engaged in the creation of “visceral insulation”: whether through sacred rituals or via the use of body bags, drawn curtains, and basement morgues, we all try to separate ourselves from direct confrontation with the evidence of death and decay, particularly from confrontation with the confusing, disorienting cases, the in-between cases. Miscarriages are far less often talked about than births and funerals. Dead bodies that have not yet begun to decay are treated with a mixture of respect and fear.
Taylor opens his book with an account of the bizarre childhood experience that both insulated him from death and left him fascinated by it for life. He was kept away from the funeral of his grandfather, as was common for British children in the mid-twentieth century. What was uncommon was his mother’s repeated claim that he had been responsible for his grandfather’s death: his grandfather had suffered a heart attack weeks or months after chasing a young and mischievous Taylor around the house. Taylor says he knows now, rationally, that he did not cause his grandfather’s death but nonetheless grew up thinking of himself as in some sense a murderer, well into his twenties. Toward the end of the book, Taylor recounts one other, even stranger, event from his own life that mingled trauma with insulation from trauma: during a period of extreme academic stress when he was a young adult, he engaged in a single episode of self-mutilation, making artfully shallow cuts over much of his skin, then trying to hide the bleeding from worried friends, who took him to a hospital. He now sees this episode, which he says has not been repeated, as akin to the ritual scarification common during mourning periods among some primitive peoples: the deliberate creation of small, localized pains to shock one’s system into forgetting the larger, unbearable trauma of a loved one’s death or the near-ruin of an academic career.
In fact, Taylor’s professional interest in the dead is itself symptomatic, in a way, of visceral insulation. As he explains near the close of the book, death-related professions are all part of the necessary and ongoing process of the specialization of labor:
Visceral insulation did not arise overnight. As humans spread out around the world, a vast spectrum of cultural types was brought into being . . . Within civilizations, dirty and distressing jobs were delegated to people who could become habituated to them. The viscerally immersed specializations of slaughterers, tanners, butchers, embalmers, grave-diggers, and refuse collectors freed others to become insulated enough to specialize in the arts and sciences. Without visceral insulation there would have been no Johann Sebastian Bach and no Marie Curie.
Not everyone is cut out for the job of looking at the violent, morbid underbelly of existence. Most humans throughout most of history kept the horror at bay with magic and ceremony, with those traditions eventually leading, in most cultures, to careful burial or cremation of the dead. For most people in the developed world, death is kept hidden by hospital doors and morticians’ arts. For the intellectuals, there is the pretense that nothing more horrible than our antiseptic daily existences really exists in the wider world, nor in the deeper reaches of the subconscious, nor in the ancient past that is unearthed in archaeological digs.
All this leaves us with a big open question, though. Should we retain, even treasure, our capacity to be horrified by things that bridge life and death, things that mingle two different forms of life, and things that appear misshapen and unnatural? Or should we be striving to overcome this primitive instinct? Clearly, it can both impede and aid analysis. Bioethicist Leon Kass, an advisor to President Bush, has argued against certain forms of genetic engineering and cloning on the basis of “the wisdom of repugnance”—the almost inarticulable sense that something just isn’t as it was meant to be. Ironically, that capacity for horror that has for so long made us recoil from the ways of the ancient world is now being invoked to make us recoil from the imagined future.
There is something of a culture war on between those like Kass who think we’ve gone too far in altering nature—biotech, stem cell research, the quest for earthly immortality—and those like me, in my professional capacity as a writer-editor for a science-promoting non-profit, who hope that biotech and the mix-and-match culture that surrounds it will yield longer lives, hardier crops, healthier babies, better medicines, and advances not yet dreamed of.
In this war, my side has clearly now recruited the vampires. Or rather, the symbolism of vampires, so deep-rooted in our psyches that it may trace back to our earliest prehistoric death rituals, has been inverted, no longer serving as a warning about tampering with the boundaries of life and death but instead serving as a hip, sexy model for a new form of predator-mimicking coolness.
On balance, I think this is a good thing. The world could do with less fear and more choice. I welcome Rayne, Selene, Violet, and Silus to my side and look forward to a future of hybrids and metamorphs, not to mention clones, cyborgs, and artificial computer intelligences.
But I recognize, better than some, that we’ll be leaving behind an ultraconservative aristocrat who was a great symbol in his own right and a fine piece of pop culture art: Count Dracula. Vlad, old friend, you just don’t scare us anymore. Indeed, we’re making plans to be like you, at least in our imaginations and when the mood strikes us, and without all the ethical problems raised by killing innocent people—we might drink only tiny amounts of blood without people noticing, like Anne Rice’s vampires; stick to feeding on animals, like Blade; consume artificial blood, like Selene of the Underworld. In any case, we’d work something out, make it a manageable disease, come up with a political solution.
But I promise to remember that for all our gains, something has been lost here. You were a link, Vlad, to the most frightening nightmares of our earliest myth-making days on Earth, resonating in a profound way that Selene and Violet and Rayne, for all their obvious charms, never will. Rest, at last, in peace.