On Friday, I was less shocked than aggravated to hear the judge presiding over the Anna Nicole Smith body case -- which at this point has begun to take on a distinctly Burke-and-Hare feel -- compare the now noisomely deceased former person to Shakespeare's Ophelia. "She's a complex person," the judge said. Matt Lauer, in his metacommentary on the media's coverage of the story, cited greed, treachery, and "a climax more dramatic than an episode of Law and Order" as the similarities. His interviewee, Dr. Keith Ablow, said that the case "certainly does have the elements of a Shakespearean drama."
I realize that my instinct to rush to Shakespeare's defense is that of the intellectual elitist who forgets that a fair amount of the best work ever to appear in the English language did so between bear-baitings and amidst an unsavory collection of prostitutes and venereal diseases. Nevertheless, I weep to think that the 21st-century answer to Shakespeare might actually be Anna Nicole Smith. As my students make ever clearer, however, my hopes to beat the Lauers and Ablows of the world must ultimately come to naught. The best I can hope for, then, is to correct them.
Anna Nicole Smith has precious little in common with Ophelia. If comparisons to Hamlet must be drawn, it seems clear that she is much more like Gertrude -- a rich widow in a dubious relationship with a man seeking money and power, and in an equally dubious relationship with a suicidal son who didn't get on with said money-and-power-seeking man. How the presiding judge in the case could overlook these obvious parallels is beyond me.
Stepping away from Shakespeare and focusing attention on the baby, I find the story to be rather entertainingly Dickensian. An orphaned child? Unknown paternity? A mother of questionable morals dead before her time? A contested will? A fortune in the balance? Bleak House, anyone? Great Expectations? Oliver Twist?
What say you? If we must defile what canon-ballers call Literature with comparisons to the tabloid tawdries of our own time, should we not take pains to make them accurate? What text do you think this coffin-load of unfortunates digs up?
This is the title of the most recent episode of Studio 60, a show to which I'll give another season's worth of Monday nights if the network will. Certainly it pales in comparison to Aaron Sorkin's early work on The West Wing, and there have been more than a few stumbles in the arc so far, but tonight's episode was, I think, a step in the right direction.
I do so love it when a show waxes literary (especially now that "Are You Smarter than a Fifth Grader" threatens to melt my TV screen with its stultifying stupidity, and after having seen just today adult humans replying "Asia" and "Amsterdam" when asked to name a country besides "America" that begins with "a"). The episode title, Matt tells his four-person writing team, refers to Samuel Taylor Coleridge's semi-somnambulant composition of "Kubla Khan."
The author thus continued for about three hours in a profound sleep, at least of the external senses, during which time he has the most vivid confidence that he could not have composed less than from two to three hundred lines; if that indeed can be called composition in which all the images rose up before him as things.
Matt is stuck, you see, and seeks a 4:00 AM miracle to unstick him. He gets it in the return of Harriet from the clutches of a rival suitor. I am not, generally speaking, a romantic, and even less a romanticist, but there was something deeply appealing to me about Sorkin's use of what even to initiates must have been a fairly esoteric literary reference. I find the continuity reassuring; Sorkin and the world he writes share a bibliography, at least in part, with the one that shapes my own consciousness. It quite takes me back to my life before the Academy, when I delved into literature to find truth rather than arguments -- flesh and blood instead of wheels and metal.
I realize that it's just a TV show and not (yet) a very good one, but tonight it reminded me, however accidentally, that literature can still be something that happens to us. It does not have to be a thing upon which we merely act. Tonight I hope for restlessness!
The research remains in its early stages, but it is moving along quickly enough to bring up the old scientific stumbling-block of ethics. What will the technology be used for? Whose hands are the wrong hands? Can it be used to find out how many more blades Gillette will add to their razors before the madness ends?
The article brings up your Big Brother, Minority Report scenarios, and suggests that scientists are well aware of them. I am no Luddite, and I believe that advances in technology have done more good than harm (the jury remains out on the internet), but I was particularly struck by the following snippet:
The use of brain scanners to judge whether people are likely to commit crimes is a contentious issue that society should tackle now, according to Prof Haynes. "We see the danger that this might become compulsory one day, but we have to be aware that if we prohibit it, we are also denying people who aren't going to commit any crime the possibility of proving their innocence."

It's probably safe to assume that Prof. Haynes didn't mean it quite the way it reads out of context. But it certainly sounds as though I will be somehow obliged to prove my innocence before I have done anything. One wonders precisely what I've done to get myself into that situation beyond looking shifty-eyed and making wisecracks like the one above about voting Republican. In any case, it seems that in the future, happily, those of us who don't intend to commit any crimes won't be denied the opportunity to prove it.
Here's another quotation that gave me pause:
"Do we want to become a 'Minority Report' society where we're preventing crimes that might not happen?" she [Barbara Sahakian, a professor of neuro-psychology at Cambridge University] asked. "For some of these techniques, it's just a matter of time. It is just another new technology that society has to come to terms with and use for the good, but we should discuss and debate it now because what we don't want is for it to leak into use in court willy nilly without people having thought about the consequences."

I find this less than reassuring. The quotation suggests that there are scientists, and then there is society; the former will explore the frontiers of science, the latter will have to deal with the consequences and make the choices. Scientists -- not that I know many of them -- don't seem to be much in the habit of letting go once they've caught on to something. It's just a matter of time. It will be society's job, though, to use the technology for good instead of evil. Not to be an alarmist, but when has a technology that had the potential to do harm been developed and then done no harm? It looks to me like we're in for yet another of those uncomfortable trade-offs.
I bring this up because I have recently had my first face-to-face encounters with proper scientists in a graduate seminar I am auditing on Aesthetics and Cognitive Science. In our first session, someone brought up the possibility of determining a "minimum set of conditions" necessary to evoke an aesthetic response or experience. After we have worked our way through Burke, Hogarth, Kant, et al., we will move into neuroscience and psychophysics. The people in the room don't appear to me to be any of your MKUltra, Manchurian Candidate, Orwellian overlord types, but as of yet there has been no discussion about the consequences of discovery. Imagine what happens if we manage to define the set of conditions. Do we get art, or poetry, or novels, by algorithm? Computer authors? To what will the lowest common denominator reduce us all? How soon before the GOP and DNC latch on and start bringing out a more perfectly manipulative political ad? What will Gillette do with the knowledge?
The quest to convince (or manipulate) audiences has been the subject of philosophic and scientific discourse at least since the days of Longinus. Thus far, though, the quest has remained more philosophy than science; trial and error, precedent, data collected after the fact. Ad campaigns for people and products still fail. Those with the power and will to do "evil" have not had the benefit of a complete defined set of procedures based on the hard-wiring of the human brain. I am not necessarily suggesting that the course I am now taking will lead there. I am ignorant enough about psychophysics and cognitive science to speculate wildly.
Am I sitting in a room full of future Oppenheimers? What draws the line between proceeding cautiously and choosing not to proceed at all?
Yes, I am slightly chuffed by this. But in the words of an archery instructor from my early adolescence upon the occasion of my hitting a perfect bullseye ("centre gold, American bratling," she would say): "Once is luck. Twice is skill."
One would think that publication would have resulted in my having some measure of confidence in my academic fortitude. If anything, I have less now than ever. My experience so far of every triumph I have had in academia is that it diminishes in size and significance immediately after achievement. Alps on alps arise.
I am only as good as whatever I'm doing next. When, I wonder, if ever, does one get beyond the prelude?
I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description [hard-core pornography]; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it.

A pretty fair assessment, I'd say, and now that I've mentioned pornography perhaps my gracious reader will follow me to the real point of my inquiry: when does what we might normally call plagiarism, or intellectual property violation, or just flat-out idea-pilfering achieve sufficient "originality" to transcend accusations of hackery? When does the borrowing talent become the stealing genius? What does "the pickpurse of another's wit," to use Sidney's term, need to do to what he lifts in order to gain the "legitimacy" of homage? In attempting to answer these questions, I find myself deferring again and again to the judgment of Justice Stewart.
Last night, I went with M to see Pan's Labyrinth. Setting aside for the moment that the Powers-That-Be have once again deemed Americans too dumb to understand the original title (we don't know our fauns from our fawns, you see), I quite liked it. This, though, is beside the point. The point is the number of similarities between this film and other texts (many of which M pointed out, and some of which, I think, got in the way of her enjoyment). To wit (spoiler alert!):
- The Lion, the Witch, and the Wardrobe: In the fantasy world, the first creature Lucy encounters is a faun, Mr. Tumnus.
- The Shining: Capitán Vidal duplicates Jack's half-conscious, half-crazed pursuit of a small child through a labyrinth.
- Harry Potter and the Chamber of Secrets: A book given to Ofelia by the faun has blank pages upon which writing and images magically appear, much like Tom Riddle's diary.
- The Lord of the Rings: The Two Towers: That faun has some seriously Entish qualities.
- Beetlejuice: Ofelia uses a piece of chalk to draw a door that opens into another world, as do Alec Baldwin and Geena Davis.
- Almost every movie with a torture scene: Show the instruments first and do a little speech before we begin!
I think that, as with literature, there is a language of film that depends to some extent on both cultural literacy and cultural capital. Intertextuality demonstrates awareness and knowledge of textual history and links the new text to the old in the mind of the reader. The text places itself in history by presenting strains of the appropriate generic ancestors. Alternately, or perhaps simultaneously, it provides an outlet for Bloom's omnipresent Anxiety of Influence. If you can't escape Stanley Kubrick, then subsume him; pay homage. The labyrinth pursuit, though similar to the scene in The Shining, takes place in a different context and occurs as an organic part of the story; therefore, I would say this is an instance of emulation rather than imitation. The matter of the chalk-drawn doorways, on the other hand, strikes me as the reverse. Del Toro uses the device wholesale, for almost the same purpose as Burton, and with no significant difference in execution.
Again, this is not to say he should not have done so. A "new" text must have "old" elements within it as an aid to categorization and comprehension. Originality and genius, amongst other things, must depend in some part upon a ratio of old and new -- a demonstration of inheritance and mutation rather than mere cloning. I use such evolutionary terminology not by accident, though that usage may be problematic. Adaptation is, I think, as crucial to the survival of a text or genre as it is to a species.
For my part, I think Del Toro got the mixture right -- but only just.