Quick Look: “Dealer” by John Martyn (1977)

August 29, 2011

My goodness, it has been a crazy couple of months in my extra-blog life, and there is not much indication that things are going to quiet down anytime soon.  The promised post about the brave anti-terror dog is indeed in the works; in the meantime, please enjoy the first—okay, let’s call it the second—in a new series of brief (yes!) posts that will pretty much attempt to say only one thing ABOUT only one thing.  There’s a ton of distance between an 8,000-word commando raid on a pop hit and “Martin shared a link on your wall,” right?  And SOME of that territory’s gotta be worth checking out.

Thus, please enjoy with my compliments the following:

Here is a fruit fallen from a rather peculiar branch of the pop-music tree: John Martyn in 1977, performing “Dealer,” the first track from his soon-to-be-released LP One World.  Martyn began his career as an English folk and blues artist in the mold of Davey Graham, but later moved away from the clear diction and crisp acoustics of traditional folk in the direction of jazz and dub; the One World studio sessions followed a transformative encounter with the justly fabled Lee “Scratch” Perry and a brief interlude as a session player in Jamaica.  While Martyn had been using Echoplex tape delay for years as an occasional component of his sound, by the mid-70s it had become central to his live performances, helping to yield the constantly-accreting cascade of notes we hear in much of his output from this period.  His experiences in Jamaica, we can imagine, suggested even broader avenues for exploration: I think the subtractive logic of dub is pretty apparent, for instance, in “Small Hours,” the last track on One World, where the Echoplex and a volume pedal serve to elide the sound of Martyn’s attack on his strings, thereby separating the guitar’s sound from its source.  (I think this solo performance—from Reading University in 1978—is extraordinary, and with all due respect to Steve Winwood’s Moog noodling, I prefer it to the album version.)
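
(A quick aside for the technically inclined: here is a minimal sketch, in Python, of the two tricks just described.  It makes no claim to recreate Martyn’s actual rig; the delay time, feedback amount, and swell length below are illustrative guesses, and the “note” is just a decaying sine wave standing in for a plucked string.)

# A toy model, not Martyn's settings: a feedback delay line (the digital
# analogue of Echoplex tape delay) plus a volume-swell envelope that hides
# the attack of each note.
import numpy as np

SR = 44100  # sample rate, Hz

def feedback_delay(x, delay_s=0.35, feedback=0.6, mix=0.5):
    """Echo the signal against itself: each repeat is re-recorded along
    with new input, so notes accrete into a cascade instead of echoing once."""
    d = int(delay_s * SR)
    buf = np.zeros(len(x) + d)  # the 'tape loop'
    y = np.empty_like(x)
    for i in range(len(x)):
        wet = buf[i]                        # what the playback head hears
        buf[i + d] = x[i] + feedback * wet  # record input plus old echo
        y[i] = (1 - mix) * x[i] + mix * wet
    return y

def attack_swell(x, swell_s=0.15):
    """Volume-pedal gesture: fade each note in from silence, eliding the
    percussive transient that identifies the sound as a plucked string."""
    n = min(int(swell_s * SR), len(x))
    env = np.ones(len(x))
    env[:n] = np.linspace(0.0, 1.0, n)
    return x * env

# One plucked-string-ish note: a 220 Hz sine with a quick decay.
t = np.arange(int(1.5 * SR)) / SR
note = np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)

processed = feedback_delay(attack_swell(note))

The only point of the sketch is that the feedback term keeps folding the echo back into the recording, which is where the accretion comes from, while the swell envelope discards the transient that would otherwise tell your ear “guitar.”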

While creatively fertile, 1977 was a dark time for John Martyn personally, and it was about to get darker: by the end of the decade he’d be divorced, and his already pronounced proclivity for alcohol and substance use and abuse would rapidly expand and escalate.  In retrospect, “Dealer” comes off a little like the view from the apex of the rollercoaster: the last clear glimpse of where things are going and what’s about to happen.

What strikes me as most remarkable about “Dealer” is how it can’t or won’t settle on how literally it’s meant to be about a vendor of narcotics.  If it’s not literal, then what is it a metaphor for, exactly?  I’ll bet you can think of several answers, and I’ll bet they’re all correct.  A slightly weaker but fascinating performance of the same song from a year later—the coaster now on its way down, picking up speed—makes it clear that the indictment that “Dealer” intends to hand down is pretty broad: Martyn is contemptuous toward his audience, almost combative, and the audience seems amused by this.  It’s clear that the encounter is tainted by bad faith, but it’s difficult, maybe impossible, to determine who the sucker is, who’s taking advantage of whom.

I think the second verse of “Dealer”—shifted to third in the 1978 performance, with a few pronouns tellingly shuffled—is particularly sharp and true: a corrosive blast of self- and other-loathing aimed at anyone who earns a living selling a product that people want, but don’t need.  That, needless to say, implicates popular music, and implicates Martyn himself:

They tell me that they dig my shit
so I sell it to them cheap.
They bring their scales and check the deal,
’cos they’re scared that I might cheat.
Well I’m just a spit and polish
on a fat man’s shiny shoe.
Well I think I hate them for it
and I think they hate me too.

Oh Abbottabad we are leaving you now

July 4, 2011

Okay, so . . . Osama bin Laden.  Not gonna miss the dude, frankly.

It’s been a little over two months now since Bin Laden got himself assassinated by U.S. Navy SEALs in Abbottabad, Pakistan.  My spouse and I made it an early night on Sunday, May 1, and as such we were unaware until the following morning that we’d been sleeping in a post-Osama world.

I didn’t chart my reaction to the news very rigorously.  I remember being a little surprised at exactly where the guy had turned up (nice neighborhood!) and otherwise just sort of generally relieved—relieved less that Bin Laden was no longer a threat than that the raid that killed him wasn’t a total fiasco, as it might well have been.  Mine was not a put-on-an-American-flag-cape-and-climb-up-a-tree type of reaction, or even a woohoo-Facebook-status-update kind of deal.  I felt neither more nor less safe, neither more nor less “confident in the direction of the country” as the pollsters like to say.  I guess I’d characterize myself as satisfied.

Now, I don’t like to think of myself as someone who happily receives news of extrajudicial killings paid for by my tax dollars . . . but there you have it, gang.  The best argument I can offer in my defense is the hope that Bin Laden’s assassination has marked the beginning of the end of a very bad time—not only the military engagement in Afghanistan, but an entire decade of U.S. foreign policy conducted in the manner of the wounded Polyphemus, blinded and drunk.  If our only post-9/11 retributive options were 1) to invade and occupy two sizable Asian countries—one of which had absolutely nothing to do with the 2001 attacks—with nearly half a million coalition troops and 2) to assassinate a bunch of suspicious individuals via special-forces hit squad and flying robot, then I’ll take Option Two, thanks.  (I hope you don’t need me to point out that, strictly speaking, these were NOT our only two options.)  Taken together, two recent policy statements—one by President Obama regarding a post-Bin-Laden troop drawdown in Afghanistan, and one by homeland-security wonk John O. Brennan announcing Obama’s hip new counterterrorism strategy—clearly indicate that this is how our nation’s dirty business will be conducted in the future . . . which is how we used to say “going forward.”  (In a small masterpiece of grammatical hedging, Brennan’s speech promises that the administration “will be mindful that if our nation is threatened, our best offense won’t always [!] be deploying large armies abroad but delivering targeted, surgical pressure to the groups that threaten us.”)

While the success of “Operation Neptune Spear”—I’m just gonna go ahead and [sic] that—has no doubt inspired the institutions charged with counterterrorism and counterinsurgency to try stuff like this even more often, it has also removed a major justification for such covert programs.  Additionally, unlike the other approximately 12,000 extrajudicial assassinations and arrests carried out by the United States during the past calendar year—and who knows how many more in the course of our nation’s mostly unspoken-of history—this one got a ton of press, which helps to make the practice visible as a policy and a strategy, instead of just as a thing that happens.  If this analysis strikes you as rather disappointingly glib and pragmatic, well, it strikes me that way too.  What can I say?  Sometimes the only way out is through.

Many people, it seems, had rather different and more emphatic reactions than I did to the news of Bin Laden’s death.  A significant number evidently celebrated the event as they might have a local team’s championship victory, i.e. by dancing and cheering in public spaces, cracking open beers, and/or having sex.  Another significant group observed the occasion by commenting on the unseemliness of the celebrations of the first group, and suggesting that this behavior was at best an unwise way for us to present ourselves to the world, at worst an indication of a damning flaw in the American character, or even in the human character.  (Among the best of these latter folks was Mike Meginnis, blogging at the journal Uncanny Valley, who connected the desire to celebrate Bin Laden’s death with kitsch; I have some quibbles with this connection . . . but I’ll get to that.)  At the time, I didn’t share any of these sentiments or concerns.  This is maybe a little embarrassing, but the three major reactions I can recall having had while reading the news in the days following Bin Laden’s killing were these:

1)  Abbottabad seems like a weird name for a city in Pakistan.

2)  OMG, the race is like SO ON to be the first person to write a horrid jingoistic children’s book about the brave anti-terror dog that took part in the raid!

3)  Geronimo?  Seriously?  We’re really going there?

Embarrassing or not, I’d like to spend a little more time over the next few weeks kicking around all three of these reactions.  Thus, I bring you part one of three: Abbottabad.

Seems like a really nice place: mild weather, picturesque hills, etc.  Based on some very rough projections from available demographic data, I’m imagining it as about the size of Pittsburgh.  It was evidently a stop on the Silk Road—or one of the Silk Roads, at any rate—and, as we all know by now, it’s presently the site of “Pakistan’s West Point.”

Behind the weird name, there is indeed a story.  The city was established in 1853 by Major James Abbott, following the annexation of the Punjab by the British East India Company in 1849.  (This episode of global history is widely known, of course, but not commented on as often as it ought to be, so I’ll spell that out: for about a hundred years, from the mid-Eighteenth to the mid-Nineteenth Century, most of the Indian Subcontinent was controlled by a corporation.)  Major Abbott—later General Sir James Abbott, Knight Commander of the Order of the Bath—was an English soldier, secret agent, administrator, adventurer, and writer, described by his superior Henry Lawrence (quoted in a charming obituary pasted into a copy of Abbott’s best-known work and inadvertently scanned by Google) as

made of the stuff of the true knight-errant, gentle as a girl in thought, word, and deed, overflowing with warm affection, and ready at all times to sacrifice himself for his country or his friend.  He is at the same time a brave, scientific, and energetic soldier, with a peculiar power of attracting others, especially Asiatics, to his person.

Abbott was one of Lawrence’s “young men,” a group of British East India Company operatives sent as “advisors” to the Sikh Empire after the First Anglo-Sikh War, essentially to gather intelligence and to keep the Punjab pacified; he was instrumental in enabling the eventual British annexation of India’s Northwest Frontier.  Earlier in his career, Abbott had travelled throughout Central Asia—to Uzbekistan, Kyrgyzstan, Afghanistan, and Russia—intriguing with and against agents of the Russian Empire as a participant in what came to be known popularly (thanks to Rudyard Kipling) as The Great Game.

The Great Game is one of those fun episodes that seems to presage an improbably large portion of the history that followed it.  Like the Cold War, it presents the spectacle of two global powers doing ostensible battle—mostly through proxies, by means of exacerbating and exploiting ethnic and religious conflicts—in theaters of war that neither calls home.  (Also like the Cold War, it looks in retrospect less like two nations fighting each other than like two empires bent on devouring the rest of the world, competing to exploit its resources more quickly and efficiently.)  In the Great Game, the field of play was Central Asia, particularly Afghanistan—a region which of course came to feature prominently in late episodes of the Cold War, as well as in more recent events.  The Great Game also seems to have foreshadowed other more abstract conflicts: that of megacorporations versus nation-states, for instance, and that of Western neoliberalism versus Islamism and tribalism.  Even the phrase “The Great Game” has displayed an increasing propensity to slip the bonds of specific historical circumstance and become general verbal shorthand for covert action on a global scale.

Therefore, it’s not much of a stretch to suggest that James Abbott is among the very few guys with a plausible claim on having definitively steered the course of world history.  Did I mention he was also a poet?  He totally was!  Check out this little gem, composed in 1853, on the occasion of the author’s departure from the outpost that had come to bear his name:

ABBOTTABAD

I remember the day when I first came here
And smelt the sweet Abbottabad air
The trees and ground covered with snow
Gave us indeed a brilliant show
To me the place seemed like a dream
And far ran a lonesome stream
The wind hissed as if welcoming us
The pine swayed creating a lot of fuss
And the tiny cuckoo sang it away
A song very melodious and gay
I adored the place from the first sight
And was happy that my coming here was right
And eight good years here passed very soon
And we leave you perhaps on a sunny noon
Oh Abbottabad we are leaving you now
To your natural beauty do I bow
Perhaps your winds [sic] sound will never reach my ear
My gift for you is a few sad tears
I bid you farewell with a heavy heart
Never from my mind will your memories thwart

One of the things you may have noticed about this poem is that it COMPLETELY sucks: metrics sloppy, syntax twisted to force clunky rhymes, punctuation absent, words repeated carelessly—and then there’s the whole logical fallacy of the two opening lines, because, dude, the place is NAMED AFTER YOU, so it can’t have been called “Abbottabad” when you first . . . oh, never mind.

A catalogue of this poem’s technical shortcomings, however, does not fully—or even mostly—explain why it’s such a piece of crap.  It’s not only badly executed, but also badly conceived: bereft of any particularizing detail about either the departing speaker’s circumstances or the place he’s leaving, this is pretty much the most generic farewell poem imaginable.  It could be applied to just about anybody leaving any nonurban locale anywhere between the subtropics and the Arctic and Antarctic Circles.  It’s entirely possible that Abbott wrote these lines while overcome with genuine sorrow at leaving his namesake cantonment; it seems more likely that he just figured the occasion would benefit from some verse.  But neither of these motives—not sincere emotion, nor social necessity—in itself provides sufficient material for writing a halfway decent poem.

Believe it or not, this IS going to have something to do with the death of Osama bin Laden.

I should probably historicize my critique a little: Abbott’s literary missteps probably seem more blatant to a modern reader than they would have back in the day.  Among the courses my spouse currently teaches are surveys in reading poetry; she’s recently added “Abbottabad” to a list of really lame poems she uses to explain why syllabi tend to pass over certain eras in silence and haste—and also to demonstrate what cosmopolitan Anglophone modernist poets like Stein, Pound, and Eliot would later be writing in opposition to.  In 1853, the British Empire was conspicuously light on rigorous and effective poets: Tennyson’s freedom had been compromised by his hiring-on as Poet Laureate, Elizabeth Barrett Browning’s perceived sphere of authority was constrained by her gender, nobody was yet paying much attention to Robert Browning or Matthew Arnold, and William Wordsworth—Tennyson’s Poet Laureate predecessor—was three years in the grave.

The evil that men do lives after them, and Wordsworth may actually be the key figure in explaining why Abbott’s cultured contemporaries might have accepted “Abbottabad” as being worth even the teensiest, weensiest damn.  As you probably know, Wordsworth and his much cooler buddy Samuel Taylor Coleridge burst onto the scene in 1798 with a collection of poems called Lyrical Ballads, which set out (according to Wordsworth’s famous preface to the 1802 edition),

to chuse incidents and situations from common life, and to relate or describe them [. . .] in a selection of language really used by men; [. . .] to throw over them a certain colouring of imagination, whereby ordinary things should be presented to the mind in an unusual way; and, further, and above all, to make these incidents and situations interesting by tracing in them [. . .] the primary laws of our nature: chiefly, as far as regards the manner in which we associate ideas in a state of excitement.

Wordsworth’s and Coleridge’s aim was to make English poetry—which in their not-unjustified view had grown elitist, stylized, calcified, and smug—accessible to and conversant with the experience of common folks.  Which, fine: this needed doing.  But their prescriptions—which elevated forthrightness over wit, the individual over society, simplicity over complexity, and emotion over technique—have proved to possess some unpleasant side effects.

Coleridge’s work often tended toward the bizarre and sensational—cursed wandering sailors, druggy Orientalist fantasias, hot lesbian vampires—and often sought to achieve psychological insight by way of freaky supernatural dread.  On the whole it looks rather sillier than Wordsworth’s output does, but also seems to have worn better over time—maybe because it doesn’t purport to be rooted in anybody’s authentic embodied experience, and therefore doesn’t overstep its authority.  (Coleridge, who coined the phrase “willing suspension of disbelief,” is a total whiz on how writing goes about earning authority over readers.)  Rereading Wordsworth’s preface—which explains that the poems in Lyrical Ballads take “low and rustic life” as their subject and as the source of their language because “in that condition of life our elementary feelings co-exist in a state of greater simplicity” and “the passions of men are incorporated with the beautiful and permanent forms of nature”—I am struck by how closely his arguments match the uncritical assertions of a particularly bad-news brand of populist conservatism: both maintain that passion is more trustworthy than erudition, that country folk live simpler lives than city folk do (and therefore have a better claim on moral and philosophical clarity), and that human character proceeds directly from nature (and is therefore always essentially the same, once removed from the perversions of culture).

Assertions like these HAVE to be made uncritically, of course, because they have no basis in fact, and can’t survive objective scrutiny.  Though he calls Lyrical Ballads an “experiment” in his preface, Wordsworth’s project isn’t rigorous, and the extent to which he himself buys into what he’s peddling isn’t clear: when he wrote it, he was ostensibly a political radical cheering on the French Revolution and opposing urbanization, industrialization, and the monarchy; in less than a decade, however, he would become an avowed reactionary nationalist—a role he’d inhabit plausibly enough to be named Poet Laureate in 1843.  (Cynical and/or fatuous contempt for consistency and logic is another quality I can’t help but associate with populist conservatism.)

The problem here, I think, is pretty obvious: Wordsworth’s flattering conception of the agricultural class—sincere though it may have been—these days comes off as presumptuous, self-serving, and disrespectful.  And here’s the thing: this flaw didn’t make Lyrical Ballads any less popular or influential.  In fact, it made it MORE influential—and more useful, at least in certain quarters—by suggesting and legitimating an approach to verse that was easy to write, easy to read, and easy to digest.  I have no doubt that Wordsworth genuinely sought a way to jolt English poetry from its sclerotic state; unfortunately, replacing high-flown versification with plain language just resulted in the establishment of a new standard poetic diction, folksier in tone but no less amenable to vacuity.  Wordsworth’s goal of presenting “ordinary things” in “an unusual way” is totally solid: this is where just about all avant-gardes start, with a desire to wake people up and make them critically aware of their situations.  But almost right away, we find ourselves in trouble again.  Dig:

1)  Almost by definition, cultural apparatus that propagate works of art do not share that art’s aim of disrupting the status quo; in fact, they always depend to some degree on the status quo, in sort of the same way that the pharmaceutical industry depends on sick people.

2)  One of the cultural apparatus’ favorite tricks—one that’s performed automatically, without anybody having to think about it—is to defuse radical works of art by promoting other works that are imitative of them: superficially similar, but less overtly challenging.  This imitation has the effect of making the derivative works seem novel and cutting-edge—owing to their resemblance to the uncompromised original—while at the same time being far more accessible to a casual audience.  Furthermore—and this is the best part—the success of the imitative works has the added effect of making the original work of art easier for that same casual audience to consume with comfort: instead of being received as a confounding and alienating indictment of that audience’s entire way of life and system of values, it can now be understood as a thing that’s, y’know, kind of like those other things.  (The Situationist International called this trick recuperation.)

The real problem with English poetry in the Nineteenth Century—and maybe with all art, in every century—wasn’t the calcification of its rhetoric, exactly.  Rather, it was the powerful tendency of dominant culture to refresh itself by devouring and digesting every work of art produced in opposition to it, and regurgitating that art as something that actually reinforces it.  Therefore any renewal based solely on updating language can at best be a temporary fix.  Wordsworth’s principled objections to the culture of his time led him toward certain subjects and gestures; these subjects and gestures got imitated and standardized as techniques; then, once readers learned to spot the techniques, they used them to define—and effectively to defang—a genre: English Romantic Poetry.

So that’s the big picture.  The practical effect of this phenomenon was that after Romanticism reintroduced earnestness and emotional directness to English poetry, that poetry started to become sentimental.  What I mean by “sentimental” is pretty much what Oscar Wilde meant in De Profundis—though the context of Wilde’s remarks was significant, and very personal.  “A sentimentalist,” wrote the imprisoned Wilde to his erstwhile lover Bosie Douglas,

is simply one who desires to have the luxury of an emotion without paying for it. [. . .] You think that one can have one’s emotions for nothing.  One cannot.  Even the finest and most self-satisfying emotions have to be paid for.  Strangely enough, that is what makes them fine.  The intellectual and emotional life of ordinary people is a very contemptible affair.  Just as they borrow their ideas from a sort of circulating library of thought—the Zeitgeist of an age that has no soul—and send them back soiled at the end of each week, so they always try to get their emotions on credit, and refuse to pay the bill when it comes in. [. . .] And remember that the sentimentalist is always a cynic at heart.  Indeed, sentimentality is merely the bank holiday of cynicism.  And delightful as cynicism is from its intellectual side, now that it has left the Tub for the Club, it never can be more than the perfect philosophy for a man who has no soul.  It has its social value, and to an artist all modes of expression are interesting, but in itself it is a poor affair, for to the true cynic nothing is ever revealed.

The key thing to get here is that sentimentality of the kind that Wilde deplores is a) borrowed and b) unearned.  Rather than being rooted in an individual’s response to a particular situation—whether depicted or experienced first-hand—sentimentality involves a response that’s rehearsed and performed.  Instead of requiring any close attention to or sympathetic understanding of the specific circumstances, sentimentality provides a canned social script that efficiently circumvents attention and understanding while reassuring us that we are indeed attentive, understanding people; i.e. we convince ourselves that we’ve responded sensitively when in fact we’ve ignored the circumstances in favor of focusing on our own capacity for, and facility with, emotion.

We can build on Wilde’s indictment with the useful definitions of sentimentality provided by I. A. Richards, writing rather more impersonally in his 1929 book Practical Criticism.  In trying to explain what people mean when they complain that something is sentimental, Richards identifies three subspecies: quantitative sentimentality (“A response is sentimental when it is too great for the occasion”), qualitative sentimentality (“A crude emotion, as opposed to a refined emotion, can be set off by all manner of situations [. . . p]oems which are very ‘moving’ may be negligible or bad”), and a third, somewhat trickier variety:

Sentiments [. . .] are the result of our past interest in the object.  For this reason they are apt to persist even when our present interest in the object is changed.  For example, a schoolmaster that we discover in later life to have been always a quite unimportant and negligible person may still retain something of his power to overawe us.  Again the object itself may change, yet our sentiment towards it not as it was but as it is may so much remain the same that it becomes inappropriate.  For example, we may go on living in a certain house although increase in motor traffic has made life there almost insupportable.  Conversely, though the object is just what it was, our sentiment towards it may completely change through a strange and little understood influence from other sentiments of later growth.  The best example is the pathetic and terrible change that can too often be observed in the sentiments entertained towards the War by men who suffered from it and hated it to the extremist [sic] degree while it was raging.  After only ten years they sometimes seem to feel that after all it was “not so bad,” and a Brigadier-General recently told a gathering of Comrades of the Great War that they “must agree that it was the happiest time of their lives.” [. . .] A response is sentimental when, either through the overpersistence of tendencies or through the interaction of sentiments, it is inappropriate to the situation that calls it forth.  It becomes inappropriate, as a rule, either by confining itself to one aspect only of the many that the situation can present, or by substituting for it a factitious, illusory situation that may, in extreme cases, have hardly anything in common with it.

Richards finishes his treatment of the topic with the important observation that although we tend to associate sentimentality with an excess of emotion, the real problem is often exactly the opposite:

Most, if not all, sentimental fixations and distortions of feeling are the result of inhibitions, and often when we discuss sentimentality we are looking at the wrong side of the picture. If a man can only think of his childhood as a lost heaven it is probably because he is afraid to think of its other aspects. And those who contrive to look back to the War as “a good time” are probably busy dodging certain other memories.

If the task of a work of art, as the young Wordsworth suggested, is to present ordinary things in an unusual way with the aim of making the audience more alert to and engaged with the experience of existing in the world, then the major challenge that art faces is the fact that people don’t actually WANT to be alert and engaged—at least not for more than a couple of hours at a time, in specific social settings.  Such heightened sensitivity swiftly becomes a real pain in the ass: the sort of thing that’s likely to cause us to miss deadlines on quarterly reports and forget to pick the kids up from daycare.  We do, however, want to feel the emotional intensity that comes with being alert and engaged—to borrow, as Wilde might say, feelings that we have not earned—and we never find a shortage of lenders.  This is the secret to sentimental art’s success; I think we can all appreciate the appeal.  And provided we’re able to recognize this stuff for what it is when we’re consuming it (which isn’t always easy), I don’t really see that it deserves to be stamped out, or campaigned against.  It’s not particularly valuable, but neither does it do a tremendous amount of harm.  It may be dishonest—to no one more than itself—but it isn’t deceitful.

“Abbottabad,” however, is another story.  It isn’t sentimental, exactly, although it contains sentimentality.  It’s characterized less by its lack of self-consciousness than by its deliberate omission of key context: specifically any reference to what Abbottabad actually is (a British cantonment in the recently-annexed Punjab) or to what Abbott himself is actually doing there (conducting a counterinsurgency campaign to pacify the local population).  These are pretty important details, and we can be pretty sure they weren’t omitted by accident—but this is not to say Abbott’s omission of them was deceitful.  Power doesn’t often deceive; it doesn’t need to.  Instead of making persuasive statements at variance with reality, power determines reality.  “Abbottabad” erases the insurgency that Abbott and his men had already suppressed: as described in the poem, the Punjab isn’t a region in conflict, but rather a civilized outpost of the British Empire, where gentlemen write heartfelt poems in observance of significant occasions.  In its cloying banality, “Abbottabad” is precisely an assertion of its author’s total control: aside from the flat statement that “coming here was right,” the poem advances no arguments, because there’s nothing to argue.  Move along, it says; there’s nothing to see here.

As Richards’ analysis suggests, sentimentality has a political dimension, and therefore a political application.  When a work of art—maybe we should just call it a cultural product—operates by taking deliberate advantage of its audience’s sentiments in order to recuperate dissent and reinforce an established social order, then something pernicious is afoot.  “Abbottabad,” therefore, is something rather more troublesome than sentimental verse: “Abbottabad” is kitsch.

I hope to get into what kitsch means, exactly—and how kitsch has and hasn’t manifested itself in our national reaction to the killing of Osama bin Laden—in my next post, which will feature as its special guest star Cairo, the fearless anti-terror dog.  Until then . . . happy Independence Day!

Spring cleaning

April 4, 2011

Hello!  So how’s daylight saving time been treating you?

A ways down the front page you may have noticed—nestled between Norman Rockwell and Eddie and the Cruisers—a little six-month gap.  New Strategies for Invisibility is hoping not to repeat that phenomenon anytime soon.  About it I will say only that I have been working on a couple of non-bloggy projects, about which I hope I’ll have occasion to report more in the months to come.  For now, I plan to resume testing your patience with more regularity.

It’s been a little under a year since the nice folks at MAKE published the essay of mine that shares its name with this blog; I figure now’s as good a time as any for me to post that essay here.  Although the writing that appears in this space doesn’t always resemble it very closely, I’ve often relied on the essay as a kind of blaze while bumbling my way through other stuff.  It feels like it belongs here.  Thus, interested parties may now locate it by clicking the Thesis tab above.

In other news:

The awesome Greying Ghost Press recently published a limited-edition chapbook (is that redundant?) by my talented and accomplished spouse-person Kathleen Rooney.  Titled After Robinson Has Gone, the poems in the chapbook are inspired by the life and work of poet, painter, filmmaker, critic, jazz musician, and all-around midcentury cultural superhero Weldon Kees, who vanished in spectacular fashion in 1955.  (In fact, and quite by coincidence, the details of his disappearance are weirdly similar to those of the fictional Eddie Wilson’s in Eddie and the Cruisers.)  The chapbooks are individually numbered; each has a unique cover made from an old movie poster.  Greying Ghost only made a hundred of these, and considering labor and materials, they’re pretty much giving them away.  Pick one up if you can, keep it someplace safe, then flip it after my spouse wins the National Book Award and put your kids through college on the proceeds.  Is that some financial planning, or what?  I normally charge for that kind of advice.

(Yet more spouse-related goings-on: Kathleen is presently guesting on Harriet, the Poetry Foundation’s blog, where every month—but particularly April—is National Poetry Month.)

Speaking of the MAKE essay . . . some readers will perhaps recall that it was accompanied in print by an illustration done by my friend Carrie Scanga, who in addition to being a creature of pure goodness is an extraordinarily inventive and skilled visual artist in a variety of forms and materials.  Carrie did the cover for K’s first book of poetry, and also for the first book released by Rose Metal Press, and she and her work have been hovering like a benevolent quasi-angelic presence over the creative goings-on in my and K’s household for so long that I’m pleased to now have another occasion to sing her praises.  If you are near St. Louis or can get there prior to April 24, please rearrange your affairs in order to visit her show Breathe, which is up at the Craft Alliance Grand Center.  Go on my behalf, as it looks unlikely that I’ll be able to make it.  Even encountered indirectly—by way of internet traces, and my previous familiarity with her stuff—Breathe looks to be brave and generous and extraordinarily attentive to the fleeting textures of our common embodied lives, and in these senses seems representative of Carrie’s entire project.  Your consciousness will be enriched by increased exposure to it.

Other loose ends: in my last post I meant to express a little more affection for the kids at WLUW, the student radio station at Loyola University Chicago, but I couldn’t work it in.  Of the two college-radio stations whose signals I drive through on my way home from work—the other one being WNUR—it’s the one I generally enjoy more, if only because its deejays seem sincerely and dorkily enthusiastic about what they’re playing.  I love me some dorks.

Speaking of college-radio deejays—whose annealed and rarefied sensibilities keep them constantly at the silk-hankie-slicing katana-edge of underground culture—have y’all seen the video for the latest Ke$ha joint, “Blow?”  (That was a joke, son.)

Let’s all just take a moment to process what we’ve just watched.  Okay?  Okay.

So . . . did you read that thing I wrote a while back about “TiK ToK?”  About how I think it’s, like, basically kind of evil?  And how its success may be a symptom of the complete systemic failure of American democracy?  That thing?

Yeah, well, I totally stand by that.  But, see, here at New Strategies for Invisibility, we get no satisfaction from acting like a bunch of haters.  While “TiK ToK” is without question an atrocity that I’d like to see excised like a tumor from our collective cultural brain, I have said all along that Kesha Rose Sebert seems like a basically nice kid with a good head on her shoulders, and I have been sort of sincerely hoping that at some point she’d do something, y’know, good.

I’m not sure if the “Blow” video qualifies, but I will cop to being entirely entertained by it.  It seems like everybody involved had a great time making it, which earns a ton of goodwill from me.  (One of the things I hated about “TiK ToK” was its lack of genuine playfulness and self-indulgence; this seems to contain healthy quantities of both.)  The overall vibe suggests a video project made by a bunch of smart, internet-savvy high school seniors with no higher priority than amusing themselves—and who also for some reason have a good production designer and some decent CGI at their disposal.  The end result seems rather like a James Bond parody directed by Jean Cocteau, and suggests not only that Ke$ha will be with us for a while yet—which I think by now we’ve all intuited—but that we might not be entirely sorry for this.

Can I make a suggestion?  Real quick.  Three words:  American Idol judge.  I’m just saying.

Oh, and I should add: I owe my awareness of the “Blow” video—although what you read here might suggest otherwise, I do not spend a great deal of time monitoring Ke$ha’s activities—to Tim Jones-Yelvington, by way of Facebook.  Has everyone been keeping up with Tim’s recent adventures?  If you haven’t been, you ought to be; suffice to say that Mary Hamilton’s oft-quoted observation that Tim is “the Lady Gaga of the Chicago lit scene”—while never less than dead-on—has become rather more true since she made it.  It’s increasingly easy to imagine Tim’s evolving project as the logical next step in a sequence that runs from Bowie to Madonna to Gaga and beyond.  (Where Bowie’s costumes and theater were designed to create slippage between the pop star’s mask and the face behind it, and where Madonna launched a thousand dissertations by embracing that role of pop-star-as-floating-signifier, and Lady Gaga has seemingly READ some of those dissertations and plugged their contents back into her own pop project, Tim is actually using pop forms to DO theory—which is more fun than it sounds like it might be.)

K and I were fortunate enough to be in the audience on November 3, 2010 at the recreation room event at which Tim “came out” as a multiplatform media phenomenon and debuted his Lit Diva Extraordinaire project.  In much the same way that literally millions of people claim they were at Woodstock, in much the same way that tens of thousands will tell you they saw the last Sex Pistols show at Winterland, in much the same way that back when I was living in Austin it seemed like every third person in the city swore they were at Liberty Lunch that night in 1994 when Oasis encored with “I Am the Walrus,” and, dude, they knew right then that those guys were gonna be huge, man, huge—people will one day tell such untruths about their presence at that November 3 rec room show.  I am not completely kidding about this.  And I am telling you right now: K and I were there.  And now we are Tim Jones-Yelvingtoning down the Sequined Way.  You should join us.  Better late than never.

An open letter to two WLUW student deejays trying to figure out which Bruce Springsteen song “Keep the Car Running” by Arcade Fire reminds them of

March 6, 2011

The short answer is: probably this one . . .


. . . but I’m gonna venture to guess that “Dancing in the Dark” is not the song you’re really thinking of.

There is, I suspect, a human tendency—chalk it up to efficiency, I guess—to credit major pop-cultural heroes with greater and more direct influence than they actually possess.  The somewhat counterintuitive fact of the matter is that these big names are often just too freaking good to be really useful to the artists who follow them: they’re too accomplished or innovative or sui generis to be productively borrowed from, too successful at their projects to suggest avenues for further exploration.

And this itself, of course, is not an original observation:  Harold Bloom argued back in 1973 that the influence of predecessors is something that artists (okay, he was writing specifically about poets, but still) must overcome as much as, or more than, they draw upon it: it’s an obstacle as well as a resource.  Bloom catalogues a bunch of approaches and methods by which folks can and have overcome the influence of their major inspirations, a process which he says involves the misprision—or misreading—of significant works.  Failure to deliberately misinterpret your predecessors, Bloom says, means your creative output is too faithful and too obviously derivative to contribute much of anything to the ongoing cultural conversation; it will be “weak,” i.e. less than or equal to the sum of its all-too-easily-recognizable parts.

What I don’t love about Bloom’s formulation is its implicit suggestion that the majority of cultural heavy lifting is always done by a handful of heroic figures: that in any particular historical moment it’s always a very small number of artists who move the game forward, and who are themselves always succeeded by another small group that manages to overcome its paralysis by getting its great predecessors’ achievements purposely and compellingly wrong.  Meanwhile, Bloom accords the plurality of people producing art at any given time the status of mere spectators, supernumeraries, poseurs, parasites.

I just don’t buy this as an accurate description of how culture actually works.  It occurs to me—as it has no doubt occurred to a lot of people—that another way to engage productively with your bigshot predecessors is to rip them off indirectly, specifically by approaching them through the output of their weak imitators: through work that is too obviously derivative to qualify as original, or which attempts a fusion of incompatible elements that doesn’t quite come off (q.v. the infamous woman-fish combo that Horace warns against), or which focuses on great works’ idiosyncrasies and pursues them down self-indulgent dead ends and obsessional culs-de-sac.  With all due respect to, like, Beethoven or whomever, mighty oaks do not tend to spring up without some nice rich humified soil to take root in.  We need a model of cultural production that accounts for the contributions of the entire ecosystem, right down to the grubs and molds.

Pop music in particular—dependent as it tends to be on collaborative effort and a bunch of constantly-obsolescing technologies—is advanced less by its towering geniuses than by a ton of toiling hobbyists, flameouts, and also-rans who regularly arc across the public consciousness with one really compelling idea and then vanish forever, or who worry a single peculiar notion in obscurity until their motivation finally gutters.  Sure, I’m talking about the kinds of phenomena that, for instance, Brian Eno allegedly identified occurring around the first Velvet Underground record (i.e. the almost-nobody-heard-it-but-they-all-started-bands phenomenon)—but I’m also talking about stuff that’s not underappreciated, that doesn’t earn or deserve a cult following, that just shows up and delivers its payload and disappears over the horizon: cheap trash, novelty acts, even some stuff that’s just really, legitimately bad.  In popular music, a sweeping vision like Bruce Springsteen’s—which expands and challenges everybody’s sense of what pop can and ought to do—cannot indisputably be assigned a greater value than a single instance of a particular beat perfectly matched to a particular riff:

And if, my young deejay friends, you’ll meditate for a moment on the Romantics’ “What I Like About You,” and you’ll consider (as others certainly have) how its basic rhythmic template might have been used to pump a little adrenal exuberance into the brooding blue-collar streetscapes of Springsteen’s early-80s oeuvre, I think you will arrive at the same conclusion I have—namely, that THIS is the song “Keep the Car Running” is actually reminding us of:

This is about as far removed as you can get from the major heroic figures of the 1980s and still remain inside the confines of what can be called popular music: a one-off hit concealed behind multiple scrims, with origins both circumstantially obscure and deliberately obscured.  John Cafferty and the Beaver Brown Band were—let’s count the strikes, shall we?—a Narragansett, RI act with a terrible name and no evident aspirations beyond being unacknowledged understudies to Bruce Springsteen and the E Street Band (which at the time was probably not a bad way to earn a living).  Here’s the crazy thing, though: when their turn in the national limelight came, it actually required their invisibility.  John Cafferty wrote “On the Dark Side” as the signature song of the soon-to-be-cult 1983 film Eddie and the Cruisers, which presented it as the eponymous band’s breakthrough hit: a real song by a fake group.  At one point in my suburban-Houston childhood I had in my possession a cassette full of songs I had taped off the radio—you kids are too young to remember this practice; we’d typically do it to pass idle evenings prior to stoking the potbellied stove and turning down the wicks on the gas lamps—and “On the Dark Side” was among these songs.  I’d dutifully printed its title on the folded cardstock insert, along with the name of its artist as I’d understood the deejay to give it: Michael Paré.  Paré, of course, was the actor who played the Cruisers’ lead singer in the movie.  In roughly this manner were John Cafferty and the Beaver Brown Band obscured within, and by, their own solitary hit: a song from a movie that is itself about a singer who scores one big hit and then literally vanishes.

This might be quickly written off as just another instance of Morissettean irony; the actual circumstances are a little more convoluted.  When it came out, Eddie and the Cruisers was pretty much a total flop; it slid off screens within three weeks of its September 1983 release date.  As its director Martin Davidson recalls (quoted by John Kenneth Muir in The Rock & Roll Film Encyclopedia), he had basically tried to purge the whole sad disaster from his mind when, out of the freaking blue, on the July 4th weekend of the following year, he got a call from some dude at CBS Records.  The guy reported that the film’s soundtrack album had suddenly started flying out of CBS’s warehouses: it would eventually come to be certified as triple-platinum by the Recording Industry Association of America.  What had happened?  Well, evidently Eddie and the Cruisers had entered heavy rotation on cable TV; cable had finally jolted it to cultural life and found it an adoring audience.

At least that’s how the story goes.  I’m not completely satisfied by this account either, given what it omits—namely any discussion of the music actually featured on that hit soundtrack.  Yeah, no doubt cable TV has turned box-office bombs into cult faves—The Beastmaster, anyone? C’mon, who’s with me?—but I’m not inclined to believe that cable sold three million original soundtrack albums without a little help from other cultural forces, any more than I’m apt to believe that seventeen million people watched and loved The Bodyguard.  In the summer of 1984, when folks heard the fictional Eddie singing “On the Dark Side” from their televisions, what exactly were they hearing?

An answer, I think, can be found in another event that occurred at about the same time: Columbia Records released an album called Born in the U.S.A. by an artist named Bruce Springsteen.  It hit retailers’ racks on June 4, 1984—exactly a month after the release of “Dancing in the Dark,” the first single from the album, which was then hastening up the charts; it would reach the top spot on Billboard’s Hot Mainstream Rock Tracks within days, and remain there for six weeks.  It was, therefore, the number-one rock song in America when Martin Davidson got that call from CBS Records with the news that his movie had risen from the grave, borne aloft by its soundtrack.  This is not a coincidence.

According to Muir’s valuable account, when Martin Davidson told his music supervisor to scare up some musicians to serve as the offscreen auditory manifestation of Eddie and the Cruisers, he explained that he was imagining a group that would sound like Dion and the Belmonts by way of the Doors, but that would always remain true to its roots as a New Jersey bar band.  It’s tempting, therefore, to summarize Davidson’s vision as Eddie = Dion + the Doors + the Boss, but that’s not quite right.  Springsteen had already incorporated Dion & the Belmonts and Jim Morrison into his own sound; he didn’t need Davidson’s made-up movie band to do it for him.  (Springsteen has proved no less adept at pastiche than his early-80s Top-40 peers Prince and Madonna, though the Boss’s appropriations have rarely been ironic, and have tended to evoke authenticity more than artifice.  In the present context it’s worth noting that his borrowings from the Doors were more successful for being indirect: double-filtered through Iggy Pop and Suicide; cf. “State Trooper” from Nebraska.)  It’s more accurate, therefore, to characterize Davidson’s vision as a reduction: Eddie = Springsteen minus Dylan, minus Guthrie, minus Morrison . . . the latter Morrison being Van, not Jim, of course.

Reductiveness has its advantages, as Springsteen himself can testify.  By July of 1984 “Dancing in the Dark” had established itself as Springsteen’s biggest chart hit ever; it remains so today.  By the accounts of everyone involved, the song was written to be exactly that: during the Born in the U.S.A. sessions, Springsteen’s manager Jon Landau came to him with the news that the album still lacked a lead single; Springsteen did not receive this news with enthusiasm.  He banged out “Dancing in the Dark” quickly and spitefully, and that speed and spite come through quite clearly in the finished product.  (“Dancing” “went as far in the direction of pop music as I wanted to go,” Springsteen writes in his book Songs, “and probably a little further.”  Eric Alterman quotes Steve Van Zandt—the E Street Band’s self-designated Cardinal-Prefect of the Congregation for the Doctrine of Straight-Up Rock ’n’ Roll—as pretty much saying that the song only happened because he wasn’t around at the time to kill it.)  It remains a bit of an oddball in Springsteen’s output, and not just because it was a huge hit: Max Weinberg’s drums are terse and mechanical, and they, along with Roy Bittan’s plaintive keyboard riff, lend the song what Pandora would call a “synthetic sonority,” something one does not often hear in the Boss’s catalogue, at least not to this extent.  The rhythm is pushed rather than swung, closer to disco or New Wave than to the blues; the arrangement seems entirely of its moment, disengaged from cultural and historical precedents.

Although the lyrics seem deeply personal—Bill Flanagan has remarked on how the first line, “I get up in the evening,” is a signal to listeners that Springsteen, the frequent adapter of personae, is here speaking in his own voice (and tweaking and breaking with the blues tradition, too, by shifting “morning” to “evening” in accordance with his own rock-’n’-roll lifestyle)—they also seem pointedly lacking in focus and commitment.  Indeed, they are about lacking focus and commitment, as perhaps befits the lyrics of a song Springsteen didn’t really want to write.  Right off the bat, the song’s narrator tells us that he “ain’t got nothin’ to say;” he’s just tired and bored with himself, he’s sick of sittin’ ’round here tryin’ to write this hit—er, this book.  The placement of “Dancing in the Dark” on Born in the U.S.A.—track eleven of twelve—also reflects some ambivalence on the artist’s part: the song is not the introduction Springsteen wanted to offer guests at his album’s front door, but is rather more akin to a late-night lampshade-on-the-head moment as the festivities are starting to break up.

Needless to say, “Dancing in the Dark” DID serve as an introduction—not to the album, but to Springsteen himself, for millions of folks who’d never heard of him before or who’d never paid that much attention.  The song went down easy, and it successfully primed much of the listening public for the material that was to follow.  Over half the songs on Born in the U.S.A. eventually hit the Top Ten, and many of these—the tense and brooding “I’m on Fire,” the acerbic and heavily narrative “Glory Days,” the glum and conflicted “My Hometown,” the indignantly anthemic title track—ain’t exactly bubblegum, hooky though they may be.  Still, plenty of the new fans won over by “Dancing in the Dark” did indeed prove willing to go where the Boss wanted to take them.

Plenty of them also didn’t—which is not to say they weren’t willing to go somewhere with him.  Much has been made of the ways in which Born in the U.S.A. was misinterpreted and misappropriated by conservatives, and while no doubt some of these misappropriations were opportunistic and dishonest, others were fairly innocent: touching and creepy in approximately equal measure, symptomatic of a peculiarly Reaganite capacity to ignore clear evidence in the interest of a good narrative and to presume concord without any reasonable basis for doing so.  George Will, for instance—an incongruously bowtied and earplugged presence at one of the Boss’s marathon concerts in the summer of ’84—saw the huge American flags, saw the disproportionately white and working-class audience, saw the overtly masculine and hetero singer grinning and belting out what sounded like triumphant fight songs, and he must have figured, perhaps not entirely unreasonably, How can this guy NOT be on our team?  To Will, Springsteen’s fans looked like exactly the folks who’d crossed historical party lines to land Reagan in the White House, and who were about to vote again to keep him there.  And Will—cautious enough to claim Springsteen for conservatism while disavowing any knowledge of the artist’s own politics—was not wrong about those fans.

Through various public statements, Springsteen immediately began to push back against what he regarded as politicians’ and pundits’ misreadings of his songbook—but this could be only so effective given the work itself, which to its credit partakes of an entirely different sort of discourse than does a typical election-season exchange of fire: it’s ambivalent and complex, evoking legacies of pride and disappointment, burdens of social coercion and individual responsibility, and the competing pulls of virtue and duty and impulse and desire, while declining to draw bright lines between any of these.  Complex things are by necessity easy to misread; as a result, Springsteen soon found himself contending with the biggest ideological disconnect between a performing artist and a ticket-buying audience this side of Barbra Streisand.

Now, I’d always sort of figured that this ideological disconnect came about due to Born in the U.S.A.’s title track, which is pretty easy to take as prideful and bellicose rather than anguished and aggrieved—particularly if you want to hear it that way, which plenty of people clearly did.  It’s worth recalling that in May of 1985, Sylvester Stallone—who hadn’t scored an unambiguous box-office hit doing something other than playing Rocky Balboa since, well, ever—managed to extend his lease on superstardom for another decade essentially by adapting the conservative misreading of “Born in the U.S.A.” to the silver screen.  Rambo: First Blood Part II (which had the chutzpah to rewrite not only Springsteen but a half-century of global history AND the movie it’s supposed to be a sequel to) depicts a world where the sufferings of America’s Vietnam combat veterans have been caused not by a lack of decent blue-collar civilian jobs and access to appropriate social services—nor by the, y’know, actual experience of war—but rather by a bunch of mendacious and cowardly bureaucrats.  Hell, as a matter of fact (the film seems to suggest) we ought to give our boys another crack at it—this time without all that high-minded best-and-brightest John F. Kennedy claptrap—and by god they’ll get the job done this time.  (Pretty much everybody Rambo kills in Vietnam is conspicuously not Vietnamese, i.e. not somebody with an understandable interest in defending home and family from foreign adventurers: Rambo’s major adversaries are all Soviet spetsnaz guys.  Suffice to say that the film does not spend a ton of time pondering the validity of the domino theory.)  Sadly, there can be no question that misreadings of Born in the U.S.A. helped make Stallone’s blockbuster film possible—misreadings of which in turn helped make the 1991 Gulf War possible, misreadings of which in turn helped make possible the invasions and occupations of Afghanistan and Iraq.

So . . . that’s not too cool.  But now that you kids have brought up this whole Arcade Fire issue, it suddenly occurs to me that I have perhaps been too hard on “Born in the U.S.A.” all these years, or at least that I’ve been asking it to shoulder an unfair share of blame for being conscripted by policies it meant to critique.  I think what cracked the door to the large-scale misreading of “Born in the U.S.A.” was, in fact, “Dancing in the Dark”—the song that initially seized everybody’s attention, and yet didn’t require anyone to have much of an opinion about it; the song that allowed America to get comfortable with Springsteen and to feel like they pretty much knew where he was coming from.  That comfort level actually made it much harder to listen attentively to and to parse the singles that followed it onto the radio.  I’m not going to try to argue that “Dancing in the Dark” is a failure—had it never been released, I’m not sure the E Street Band would, for instance, be playing Super Bowl halftime shows—but I DO think it inflicted permanent harm on Springsteen’s overall project in a way that can’t ever really be repaired or undone.

So what’s wrong with “Dancing in the Dark?”  Well, nothing: plenty of really great pop singles—probably the majority of them—work pretty much the same way that it does, and I wouldn’t want that to be otherwise.  Problems only crop up when the artist who records the pop single doesn’t really want to be regarded as a pop act, which proved to be the case here.  Most of Springsteen’s best songs are designed to reward close critical attention: they want you to consider whether the singer is speaking in his own voice or the voice of a character, and, if the latter, what that character’s circumstances might be; as we said earlier, the perspectives they open for the listener on these circumstances tend to be complex and ambivalent.  These songs function, in other words, as fictions in the proper sense (i.e. not simply in the sense that they’re “made up”).

“Dancing in the Dark” is a pretty good song, but it’s NOT complex, and it’s not ambivalent; instead it’s calculatedly ambiguous, evoking the specific textures of the narrator’s existence less than the obscure gravitational pull of latent offscreen possibility beckoning from the margins of day-to-day life.  “There’s something happening somewhere,” the narrator tells us; this is the same somewhere that haunts an entire American songbook of yearning, from “Over the Rainbow” on down the line.  Springsteen uses this kind of thrilling ambiguity all the time—I tear into the guts of something in the night; last night I met this guy and I’m gonna do a little favor for him; there’s a darkness on the edge of town; I guess there’s just a meanness in this world—but he rarely employs it in so pure a form as he does here.  “Dancing in the Dark” has depth, sure, but it’s also really simple: it’s brainstem music, no less so than “What I Like About You.”  Its topography is less that of verisimilar, mirror-on-the-high-road fiction than the misty nocturnal landscapes of myth; it evokes oceans of human mystery, but the closer you look at it, the less it actually discloses.

My point here, basically, is this: what “Dancing in the Dark” somewhat incautiously succeeded in doing upon its release in the spring of 1984 is conjuring among the record-buying public a vision of a new American pop hero—a cool, brooding exemplar of self-involved masculine subjectivity in the classic mold of Elvis Presley and/or Bob Dylan, Marlon Brando and/or Steve McQueen—whom Bruce Springsteen then gracefully declined to embody, or at least declined to limit himself to embodying.  In Springsteen’s mind the song may have been little more than what Dave Hickey might call a term paper, but its directness still evoked an iconic protagonist—a restless, hungry void—who cut a very attractive figure for many listeners.  (And this seems about right; at least one novelist would later set out to capture the character of the 1980s through a protagonist who is also a restless, hungry void.)  Unfortunately for those listeners, the remainder of Born in the U.S.A. doesn’t include any repeat appearances by this guy; its other songs are by turns too fraught, too specific, too menacing, or too droll, leaving the “Dancing”-smitten audience with nowhere to go for another round of urgent romantic emptiness—nowhere, that is, until they heard John Cafferty’s voice coming out of their TV sets, synched up with Michael Paré’s mouth.

“On the Dark Side” happily delivered on what Born in the U.S.A. withheld—and, perhaps more importantly, it also avoided the kinds of complications that Springsteen’s other songs insisted on delivering.  People who wanted to consume “Dancing in the Dark” as a pure pop artifact tended to get distracted by a need to situate it in the context of Springsteen’s entire project and body of work; with “On the Dark Side,” however, that kind of effort was not only unnecessary but impossible: the singer who performed it a) had mysteriously vanished and b) wasn’t a real person anyway.  Consequently there was no need to reconcile it with anything.  “It seems more real today,” Muir quotes Davidson as saying.  “Now if [people] hear ‘On the Dark Side,’ they say, ‘I remember that, that really was number one.’  But it was number one twenty years ago, not forty years ago.  The fiction has become a reality.”  As Cafferty’s song itself assures us in its opening line, the dark side is calling now, nothing is real; if you’re looking for a contemporary lyric that really captures the character of its era—and I’m not just talking about “the Eighties,” but rather a period that begins roughly when the Federal Reserve takes over the national economy in October ’79 and ends in, oh, let’s say September of ’01—you could certainly do worse than this one.

Thus, through their contribution to the Eddie and the Cruisers soundtrack, John Cafferty and the Beaver Brown Band went from being locally-known musicians obscured by their weak but faithful reading of Springsteen to nationally-unknown musicians obscured by somebody else’s deliberate misreading of Springsteen.  “On the Dark Side” made its way onto the national airwaves as a perfect solution to the Born in the U.S.A. problem: it was a pure hit with no artist, unburdened by any connection to the real world.  It may be no better than the sum of its parts (or the difference of its exclusions), but the very modesty of its ambition means that it’s pretty much free of its influences’ baggage; it’s a perfectly portable piece of pop, as straightforward and standardized as a screwdriver, readily available for the use of anyone who needs it.

Although they experienced their suburban-Houston childhood some ten years after I did my own, I feel certain that Win and Will Butler also heard “On the Dark Side” on the radio from time to time while growing up, along with the various hits from Born in the U.S.A.  Years later, as they and the other members of Arcade Fire worked on the song that would become “Keep the Car Running,” perhaps they were briefly beset by a moment of anxiety of a type that I have to guess many songwriters encounter after coming up with a great hook: This is awesome, I imagine them thinking, but are we ripping somebody off here?  I imagine them listening with care to their own song—sounds a little Springsteeny, huh?—then reviewing mental catalogues of influences, obsessions, and heroes living and dead, and finding, to their probable relief, no matches.

Let me be clear: I’m not looking to call Arcade Fire out for subliminally borrowing from “On the Dark Side.”  Neither am I here to argue that the genetic similarity of “Keep the Car Running” to MOR soundtrack fare in any way diminishes what I think is a pretty good song.  I just think it’s interesting to consider how Arcade Fire might have been able to use John Cafferty and the Beaver Brown Band—whom I will not refer to as one of Arcade Fire’s influences, any more than I will refer to the sandwich I ate for lunch as one of my internal organs—to engage productively and indirectly with the Boss.  If you’re a fan who understands why Springsteen is a great songwriter, as I believe the Arcade Fire kids are, then you’re going to approach him with too much reverence to ransack his songbook and steal what you need.  If, on the other hand, you happen upon somebody else’s approximation of Springsteen, then you’re probably going to think: I see what these guys are aiming at, and I see what they’re missing, and I’m pretty sure I could do better than this.  In such a manner does the football of art move down the field of cultural production.

Because, hey, let’s take a quick look at what’s going on in “Keep the Car Running.”  Its desperate and giddy urgency, its sense of flight from some unnamed or unnamable coercive force, its nocturnal setting and its automotive theme—these all seem very Bruce Springsteen.  Not much else about the song does, though: there’s no distinct persona narrating it, and Springsteen’s trademark rooted and gritty specificity is also nowhere in evidence.  These are exactly the omissions that defined “Dancing in the Dark”—and exactly the alterations to the basic Springsteen template that yielded “On the Dark Side.”  But while “Dancing in the Dark” made these omissions out of impatient, almost accidental candor, and while “On the Dark Side” was essentially a movie prop—a myth made to order, the audio equivalent of an empty façade on a studio backlot—“Keep the Car Running” takes them as a starting point for something more artful and deliberate.

Although it lacks the overtly fictional elements you might find, say, in a song from Nebraska—i.e. characters, setting, backstory, etc.—Arcade Fire still manages to goose “Keep the Car Running” with a surprising degree of plot-level suspense: it’s a car chase in search of an action film.  (In this it borrows yet another 1980s pop music device, namely the weird tradition of songs that claim situational drama yet contain little or no actual narrative: though it’s too detailed and specific to be typical, “Life During Wartime” may be the granddaddy here, with its DeLillo-esque evocation of floating-signifier domestic terrorism; “Love Vigilantes” is probably a little too conventionally fictional to qualify.  The representative examples are probably goofily portentous MOR hits like “In the Air Tonight” and “Silent Running;” I have no theory to explain why the post-Peter-Gabriel Genesis lineup would be so fond of running this particular play.)  “Keep the Car Running” also ducks fiction’s conventional requirements by announcing itself as a dream song in its first line; this has a bracketing effect functionally similar to the presentation of “On the Dark Side” as a hit by a made-up artist.  As the song unmoors itself from references to everyday experience it becomes more stylized, more emotional and abstract, closer to the realm of fable or myth; this rhetoric is reinforced by Arcade Fire’s use of horns, strings, bouzouki, and hurdy-gurdy, folk instruments that are practically prehistoric, never mind pre-rock.

In the realm of pop, myth has a number of uses and misuses.  In the best circumstances, it allows artists to sweep aside the complications of verisimilitude to address fundamental things, and also provides a metaphorical language for talking about them.  On the whole, Arcade Fire is doing this pretty successfully in “Keep the Car Running.”  Sure, there are some unclimbable mountains and unswimmable rivers that do nothing but assert that we’ve entered a realm of quasi-Taoist mystery, as well as a few lines (“same place animals go when they die”) that are evocative but don’t actually evoke much of anything.  (This is all still quite a bit less silly than a song that informs us—and that only informs us—that a woman has stepped from the darkness and made the narrator feel crazy and mean, while bringing him to the realization that nothing is real.)

Still, there are some moments in “Keep the Car Running” where Arcade Fire do seem to have their hooks in something significant—not Springsteen’s sought-after “something happening somewhere,” nor quite his sinister “meanness in this world,” but something bad and difficult to apprehend, something bound up with language itself.  The city through which the narrator flees frustrates him by changing its name; we get the sense that perhaps, as in a fairy tale, learning its true name might permit him to escape it.  Meanwhile, the men who pursue the narrator know his name—he has told it to them—and we get the sense that their power comes from this knowledge, but also that there are limits to this power, and that its balance stands to be reversed.  “There’s a fear I keep so deep,” Win Butler sings.  “Knew its name since before I could speak.”  His and his bandmates’ voices then name that fear; its name is not a word.

The more Arcade Fire I hear, the more it seems like myth is intrinsic to their working methods—which I suppose makes sense, given the Butlers’ own suburban origins and their recent focus on suburban milieus.  In myth, events are ruled by fate rather than by accident; myth’s concept of time is cyclical (every night my dream’s the same / same old city with a different name) rather than sequential (got in a little hometown jam / so they put a rifle in my hand / sent me off to a foreign land / to go and kill the yellow man).  Myth, then, is the opposite of history.  Suburbs are always designed with the goals of preventing accident and escaping history; ergo suburbs inevitably suggest themselves as mythic landscapes.  Arcade Fire seems to have interesting things to say about the suburbs; it remains to be seen whether they can continue to do this inside the mythic language the suburbs gave them.

This is, I hope, something Butler, Butler, Chassagne & Co. will continue to get better at.  They can certainly look to Springsteen’s career for hints on how to do it effectively, and also for examples of missteps they might seek to avoid.  For a few years now, journalists have suggested that the band is ill at ease with its success; I can only imagine that their recent Grammy win might amplify that.  The concern, evidently, is that as their audiences have grown, the band’s perceived capacity to really connect with them has shrunk.  Arcade Fire, it seems, is anxious about being misread.  We can only assume their friend and mentor Bruce Springsteen has assured them that this concern is indeed justified—has warned them how quickly your use of myth can turn into myth’s use of you.

Norman Rockwell: The Movie!

August 25, 2010

Our next scheduled post—on Gold Diggers of 1933—has been delayed so New Strategies for Invisibility can take a shot at Deborah Solomon.

You saw her feature on Norman Rockwell in the Sunday arts section of the NYT last month, right?  The article’s ostensible occasion was the opening of a Rockwell exhibition that’s up now at the Smithsonian American Art Museum, the contents of which are drawn from the private collections of filmmakers George Lucas and Steven Spielberg.  The Sunday on which Solomon’s article ran was Independence Day—and if there’s anything more American than Norman Rockwell, it’s the combined luxury-good purchasing power wielded by George Lucas and Steven Spielberg!  God bless the U.S.A.!

Deborah Solomon, as you are no doubt aware, is responsible for the brief, punchy, often confrontational, always entirely implausible “Questions for” feature that runs weekly in the New York Times Magazine.  “Questions for” is notable chiefly for its adaptation to print of the subtractive compositional techniques of reality television, i.e. the editing-down of long stretches of unscripted human interaction into episodes of maximal drama and minimal nuance.  I have been reading it for years with growing irritation and am now convinced—in keeping with today’s theme, and to quote Jon Stewart out of context—that it is hurting America.

But let’s not talk about “Questions for.”  Let’s talk about Norman Rockwell—about whom, the internet informs me, Solomon is presently completing a biography.  (We can add to the magic confluence of Rockwell, Spielberg, Lucas, and July 4th the NYT’s apparently inexhaustible willingness to let prominent contributors use their features to drum up interest in their own projects.)  It’s hardly crazy for Solomon to be working this beat—she has previously published biographies of Jackson Pollock and Joseph Cornell, and was for many years an art critic for that dynamo of the avant-garde, the Wall Street Journal—so I am willing to exercise some charity and deference here with respect to her qualifications.

And honestly, as a piece of newspaper journalism, her piece isn’t a total waste of time.  If you ignore Solomon’s lame attempts to suggest inter-filmmaker resentment between Lucas and Spielberg—force of habit, eh, Debz?—she’s pretty perceptive about the values implicit in Rockwell’s work, and its place in the culture.

Oh, but then there’s this:

As beloved as [Rockwell] was by the public, he suffered the slings of critical derision, especially in the ’50s.  The dominant art movements of that era—Abstract Expressionism, Beat poetry and hard bop jazz—devalued craftsmanship in favor of improvisation and the raw, unmediated gesture.  Against this backdrop Rockwell was accused of purveying an artificial and squeaky-clean view of America, which remains a criticism of him today.

It is true that his work, for the most part, does not acknowledge social hardships or injustice.  It does not offer a sustained meditation on heartbreak or death.  Yet why should it?  Idealization has been a reputable tradition in art at least since the days when the Greeks put up the Parthenon, and Rockwell’s work is no more unrealistic than that of countless art-history legends, like Mondrian, whose geometric compositions exemplify an ideal of harmony and calm, or Watteau, who invented the genre of the fête galante.  Rockwell perfected a style of painting that might be called the American Ideal.  Instead of taking place in lush European gardens, his playful gatherings are in a diner on Main Street.

This is not so much completely as it is exactly wrong.  It is NOT true that Rockwell’s work “does not acknowledge social hardships or injustice.”  It just doesn’t depict them as insurmountable—provided we can approach them with sympathy for the perspective of our adversaries, and our adversaries can do the same.  This may not be likely, but it is certainly possible, and Rockwell demonstrates this possibility by not only depicting but encouraging it—and creating a rhetorical space in which it can occur—through his popular illustrations.

The suggestion that Rockwell’s work is inferior to the raw and gestural output of his modernist contemporaries is specious, given that he begins and proceeds with completely different assumptions and aims.  No, Rockwell doesn’t “offer a sustained meditation on heartbreak or death”—because heartbreak and death are by definition solitary, and sustained meditation on them presupposes a laudatory interest in the subjective experience of the heroic individual.  Rockwell has no such interest.  His work is comic, in the old Greek sense: it’s concerned with individuals only to the extent that communities are made up of them.

A comic sensibility and an “idealistic” worldview are not remotely the same thing.  (I’m not even sure they can comfortably coexist in the same consciousness, since comedy depends on instances of gears not meshing.)  The first mistake that Solomon makes here—presumably as her editors are tapping their feet and glancing at their watches—is not so much to mischaracterize Rockwell’s work as to misapply her terminology: she’s conflating “idealistic” in its everyday sense (i.e. referring to those among us who won’t accommodate our lofty principles to the experience of actually living in the world) with “idealist” in its philosophical sense (i.e. referring, in epistemology, to the argument that we can never have certain knowledge of the external world but only the contents of our own minds, or, in metaphysics, to the argument that the external world has no absolute existence at all).

I suspect that Solomon realized as she was typing that she’d drifted across the center stripe, and I guess she deserves some semi-respectful tip of the hat for just putting the pedal down and going for it: viz. her unapologetically loopy comparison of Rockwell to Piet Mondrian, an honest-to-god theosophical idealist who sought in his work to distill the visible world into primary colors and right-angled lines.  Her reference to Antoine Watteau is more defensible, though still wrong: while Rockwell’s lightness and his stop-motion evocation of fleeting moments do recall Watteau—and there are parallels (probably misleading ones) to be drawn between Rockwell’s emergence as a brand and Watteau’s establishment of the genre of the fête galante—fundamentally they are two very different artists.


Watteau is rarely narrative in the conventional sense.  His canvases of frolicking aristocrats and costumed entertainers tend to give us the impression that we’re peeking through a summer haze to glimpse a titillating story-in-progress, but the key to their effectiveness and their appeal is that we can never quite figure out what the hell’s going on.  In Rockwell, what’s going on may not be immediately apparent, but we are damn sure meant to figure it out—and the longer we study each of his images, the more detail emerges to add nuance to their implied narratives.  Watteau’s paintings invite us into an imaginary world: diffuse and muted, languorous and ephemeral—akin, maybe, to the “floating world” depicted in Japanese prints—and in any event distinct from the quotidian realm in which we all live and strive.  Rockwell’s illustrations, on the other hand, locate us imaginatively IN the everyday world.  They depend for their effectiveness on the cultural competency of their viewers: our ability and our willingness to catch the artist’s references and to appreciate his implications.  For most artists working in a recognized genre—whether we’re talking about directors shooting post-Hitchcock slasher films, or novelists writing post-Tolkien fantasy, or painters producing post-Watteau fêtes galantes—the big advantage of genre is that it allows you (and your audience) to bypass a bunch of issues related to verisimilitude and “realism.”  That’s not at all what Rockwell is up to.  Contrary to Solomon’s suggestion, Rockwell is not generic but rhetorical: he engages with verisimilitude head-on, advocating, demonstrating, arguing for a particular kind of perceptiveness.  If we can look at the world with the same sympathy and suspended judgment with which we look at his illustrations, Rockwell seems to suggest, our attention stands to receive the same rewards.

Key to Rockwell’s skill as a narrative illustrator is his ability to portray spaces that are less architectural or theatrical than social: he excels at conveying a sense of people thinking, interacting, regarding each other.  This is not something that Watteau does, or seems very much interested in doing.  When I think about Rockwell’s canonical influences—worth thinking about, since art-historical allusions crop up regularly in his work—my brain tends to gravitate toward the hyperreal images of the Mannerists, particularly those of Paolo Veronese, whose fondness for depicting startling perspectives of distorted figures in unholdable poses Rockwell seems to share, along with a gift for suggesting complex multi-character narratives in a single frozen instant through expression, attitude, and gaze.

I also think of Rockwell in relation to a distinct lineage of painters that begins with Caravaggio: Frans Hals, Rembrandt, La Tour, Joseph Wright of Derby . . . guys that David Hockney would identify as “optical” painters, i.e. painters who either used optical devices (lenses, curved mirrors, cameras obscura and lucida) to produce their work, or else tried to emulate the effects of such devices.

Hockney’s speculations about these painters’ methods are controversial.  I find them pretty persuasive myself, and anyway compelling as hell.  About Rockwell’s own use of optics, however, there is no question: he famously based his paintings on carefully-composed photographs that he’d trace onto his canvases with the aid of a Balopticon projector, a process that helps account for their distinctive sharpness.  For me, Rockwell’s reliance on photographs strongly recalls some of Hockney’s assertions about Caravaggio, who left behind a bunch of paintings but not a single sketch, who was accused by his contemporaries of being unable to paint without models present, and who has recently been alleged to have used not only optical devices but primitive chemical fixatives to capture projected images on his canvases.

The part of Hockney’s theory that’s most interesting to me (and almost certainly to him too) is not its “gotcha” aspect, i.e. the issue of which Old Masters used optics and which didn’t—a gossipy debate that has flickered intermittently over the past decade through the pages of various haute-bourgeois magazines in a kind of leisure-class parallel to the pro-sports doping scandals.  Rather, it’s the account his theory offers of how a particular style of 2D representation came not only to dominate the Western pictorial tradition but also to be universally and uncritically accepted as the most accurate kind of 2D representation available to us.  Hockney points out that we don’t remotely see the moving, blurry, peripherally-glimpsed, selectively-focused-upon world around us in the same way that a camera lens does; in a famous 1984 interview with the New Yorker he derisively characterized photography as “looking at the world from the point of view of a paralyzed cyclops—for a split second.”

I don’t really want to unpack Hockney’s entire argument here.  I will say that it’s worthwhile in a general sense to think about what kinds of visual experience photographs are good and not so good at capturing, and also about how photos—and images that resemble them—go about convincing us of their integrity, veracity, and authority.  More specifically, and in the present context, it’s interesting to ask: why did Norman Rockwell paint the way he did?  Why did his illustrations look the way they looked, and why did he use his particular methods to achieve that look?  Why was he a painter in the first place, instead of, say, a photographer?  Or—not to put too fine a point on it—why didn’t the magazine editors who made him rich just hire themselves a photographer instead?

Partly, sure, there’s the somewhat vulgar but still undeniable appeal of Rockwell’s full-on, holy-cow virtuosity: the dumb satisfaction—known to generations of Yngwie Malmsteen fans—of watching somebody take aim at something and just hit the living crap out of it.  On a technical level, dude was just a scary good painter.  (And as Hockney is always at pains to point out, the camera is not a shortcut for the painters who use it; it just introduces a whole new batch of technical challenges.)  But virtuosity is not Rockwell’s whole appeal, nor even the bulk of it.  When we call something “Rockwellesque,” we don’t mean that it’s extremely sharply rendered, or adroitly executed.  We mean something else.

To make an obvious but still important point, the difference between a photograph and a representational painting is that the painting contains no accidents.  Because every mark has been made laboriously by the painter, each must be assumed to be intentional, and therefore potentially relevant to our interpretation of the image.  Imagine if you will a photographic print hanging alongside a painting done after it which reproduces it so precisely that viewers must stare hard at the surface of each to determine which is which.  In terms of what they represent, their content is exactly the same.  But we don’t look at them the same way—and they aren’t saying the same thing.

What is significant in this distinction is not so much the images’ content but their grammar.  To be more specific (and jargonistic), the difference is analogous to grammatical mood: it lies not in what the images are saying, but in how—and why—they are saying it.  Photographs document, memorialize, and evoke particular acts of human perception; therefore they tend to address us in the declarative mood: they tell us, foremost, that they represent actual phenomena, i.e. objects that were present and/or events that were occurring in physical space at the instant the shutter was opened.  Paintings work in approximately the opposite way: because we know that what they show us has been filtered through the painter’s eye and mind and brush, we also know that we have no independent access to whatever real-world phenomena (if indeed there were any) the painter set out to represent, and that the factual basis for the representation is unavailable to us.  Therefore paintings speak to us in moods other than the declarative: subjunctive, imprecative, optative, inferential, mirative, speculative, hypothetical.  They always provide more reliable evidence about the subjectivity of their creators than they do about the phenomena they seem to depict.


When painting and photography begin to converge—when paintings conceal their makers’ brushstrokes and decisions, and when photos become more synthetic and controlled—then things start to get interesting, grammatically speaking.  In such cases the images engage actively with our expectations, trying to anticipate and to get in front of what we already know and think and feel about whatever it is they show us.  In doing so, their speech becomes compromised, modulated, dynamic: the images reason and argue with us; they persuade, warn, seduce, cajole, and deceive us; they mock, joke, suggest, and wonder.  Rather than allowing us to bypass issues of verisimilitude, images like these put those issues squarely in our faces, and insist that we consider whether the world they represent is the same world we inhabit.  These are the kinds of images that Norman Rockwell produced.  When I say that Rockwell is always rhetorical, this is what I mean.

Although this kind of visual rhetoric becomes easy to spot when we look at painting and photography in relation to each other, I think it’s important to note that it predates the birth of chemical photography (though not that of modern optics) by kind of a lot.  As Dave Hickey points out in his great essay on Robert Mapplethorpe in The Invisible Dragon (“Nothing Like the Son”), Caravaggio was already working this way at the turn of the 17th Century: rather than awing and overwhelming his audience through enormous scale and startling special effects, his ecclesiastical paintings present the mysteries and the miracles they depict as tactile, intimate, and natural—not as cataclysms disrupting the texture of ordinary life, but as possibilities latent in the everyday.

“Just as Christ opens his wound to Saint Thomas,” Hickey writes,

Caravaggio (presuming to persuade us from our own doubt and lack of faith) opens the scene to us, in naturalistic detail.  And we, challenged and repelled by the artist’s characterization of us as incredulous unbelievers (and guilty in the secret knowledge that, indeed, we are), must respond with honor, with trust, by believing—and not, like Thomas, our eyes.  (To look is to doubt.)  To free ourselves from guilt, and from Caravaggio’s presumption of our incredulity, we must transcend the gaze, see with our hearts, and acquiesce to the gorgeous authority of the image, extending our penitential love and trust to Christ, to the Word, to the painting, and, ultimately, to Caravaggio himself.

Pretty cool trick—if it’s 1602, that is, and you’re Caravaggio, better equipped than anybody else in Europe to spring that kind of image on your unsuspecting, visually innocent audience.  If, on the other hand, it’s 1943 and you’re Norman Rockwell, then your circumstances are rather different: advances in photographic and print technology have made photo-oriented magazines like Life and Look and National Geographic commercially viable and increasingly popular, and American readers have grown accustomed to seeing content accompanied by photos instead of illustrations.  Suddenly it seems less of a given that editors will keep sending work your way.

(Being stylistically out-of-date wasn’t Rockwell’s only reason to be nervous in ’43; the editorial page of his cash cow, the Saturday Evening Post—which printed his famous Four Freedoms series after the Office of War Information had initially passed on them—had consistently opposed both the New Deal and American involvement in World War II.  Since the entire purpose of Rockwell’s Four Freedoms was to promote the sale of war bonds by illustrating principles articulated in FDR’s 1941 State of the Union, one imagines that this made for some lively meetings in the Post’s editorial offices.)

In a few fairly pointed passages in his book Secret Knowledge, David Hockney draws a connection between “lens-based” images (e.g. optically “realistic” painting, sculpture, photography, and film) and the great despots of the mid-20th Century, who he says demanded a stock of such images “to consolidate their power.”  This is an important observation, but it implies that lens-based images are inherently antithetical to freedom, which I think overstates the case.  Sure, images like these ARE potentially dangerous, if only because they’re politically operative—but this is just another way of saying that they’re rhetorical.  A painting like Caravaggio’s The Incredulity of Saint Thomas (above) certainly DOES serve to consolidate the power of the Church that subsidized its creation, in much the same way (as Hickey demonstrates) that Mapplethorpe’s sleek and elegant porn valorizes submission of a very different kind (and to no established order).  But Rockwell—who paints not at the cutting edge of image-making technology but in an avowedly outmoded style, with the evident aim of encouraging not capitulation to authority and/or uncritical jingoism, but only an expansion of positive liberty—is up to something else.

To help bring out this contrast, let’s take a quick look at a couple of Rosies:

That, obviously, is Rockwell on the bottom, and J. Howard Miller’s iconic badass on the top.  Context accounts for some of the differences: Miller’s was a 1942 factory poster for Westinghouse, while Rockwell’s was a 1943 Saturday Evening Post cover.  Miller meant his audience to look at his poster and see themselves; as such, his image is skimpy on particularizing detail.  It’s direct, stylized, almost cartoonish; you can imagine him producing it without the use of a model, never mind a projected photograph.  Its purpose is straightforward, and so is its rhetoric: the image reassures and inspires workers via appealing example, aiming to defuse any cognitive dissonance arising from the idea that assembly line work is unfeminine.  (Its clarity and simplicity also make it portable, arguably more potent today, repurposed as a campy feminist signifier, than it ever was as war-effort propaganda.)

Rockwell’s Rosie is targeted at exactly the same cognitive dissonance, but in an entirely different set of brains: those of the Post’s bourgeois readers, who hadn’t yet necessarily decided how they felt about the war, never mind the sweeping cultural changes it brought about—and who certainly weren’t sending their daughters into the factories.  Rockwell doesn’t set out, as Miller does, to nip in the bud anyone’s discomfort about nice young ladies wielding pneumatic hammers.  In fact, this discomfort is what the image uses for fuel, what makes it work.

Unlike Miller’s maid of Westinghouse, Rockwell’s Rosie neither offers nor requires encouragement: she’s in her element, unmistakably working-class.  Odds are good that she had a factory job before the war, and she damn well expects to keep punching the clock after the boys have come home.  Rather than addressing social worries about women headed for the assembly lines, Rockwell draws our attention to women who have already been there, making them visible, then elevating them as contemporary icons—partly in jest, but mostly not.  None of the image’s intertextual jokes—the billowing flag, the Mein Kampf footrest, the halo floating above the face-shield, the compositional allusion to Michelangelo’s Isaiah—are made at Rosie’s expense, or at anybody’s, really, except for maybe Hitler’s.  Instead they serve to inch the image away from “realism” in the direction of fantasy and gentle parody, a move that creates a safe, non-confrontational rhetorical space in which the Post’s subscribers can encounter this imposing alien being.  While Miller sets out to naturalize his Rosie—to insist that there’s nothing weird about ladies in factories—Rockwell acknowledges the strangeness of his image, foregrounds it, amplifies it, and then reassures his audience that it’s totally okay, that this is different but not bad, and that big girls with rivet guns are part of what makes this crazy nation of ours so freaking great.

As Richard Halpern writes in his Rockwell book,

Rockwell’s Rosie is such a compelling performance because it is such an ambiguous one.  Rockwell participates in the [Office of War Information] propaganda campaign without entirely subordinating himself to it.  It is not that he shrinks from his imposed task; rather, his very enthusiasm pushes him to produce something unexpected.  He gives even more than he is asked, and that “more” complicates and ennobles the image.  His Rosie thus sends an officially sanctioned message without being contained by it.

The addition of this complicating and ennobling “even more” is Rockwell’s signature move, something we see all over his work.  It’s what gives his rhetoric its distinctive flavor, and is also, I would argue, what makes him a major American artist.  The move basically works like this: Rockwell zeroes in on an instance where some set of codified social norms is scraping up against some other such set.  Then he depicts the scene—in a style that suggests the alert and impersonal objectivity of a photograph—from a perspective that privileges neither set of norms.  The resulting image suggests and demonstrates that these contradictory norms can tolerate each other, can peacefully and even productively coexist.  Once you start noticing this signature move (something the conservatives who idolize these images seem incapable of doing) it becomes apparent in a hurry that Rockwell’s real subject is never the norms themselves, but rather the all-but-invisible liberal-democratic society in which these encounters occur, and which makes them possible in the first place.

The key thing to catch here is that Rockwell’s images deliver this message without making any discernable effort to convince, without any recourse to debate-club tactics geared solely toward scoring easy points.  His images never reduce or simplify for the sake of argument; they never pump us up with myths or fascinate us with superhuman iconography.  Instead they go about their work honestly and in good faith, deftly conjuring—by means of their perceptive xenophilia, their documentarian preciseness, and a profligate surplus of imaginative detail (which all together constitute the “even more” that Halpern identifies)—a vivid sense of a world distinct from but directly adjacent to the one we inhabit, a world that could be constructed here through nothing more than a collective effort of will and materials we already have on hand.

What I’m trying to say is that Rockwell’s visual rhetoric is not propagandistic, as many of his detractors claim; neither is it idealistic, as Deborah Solomon suggests in the NYT.  Rather, it is very specifically fictional.  Fiction (like obscenity) is one of those concepts that everybody thinks they understand—I know it when I see it!—but then has a hell of a time actually defining when push comes to shove.  Fiction just means stuff that’s not true, right?  Well, no, not exactly.  Fiction stakes no fast claims on “truth” or “reality;” it just asks its audience to set such considerations aside and roll along with whatever it has to say, deep into wildernesses of grammatical mood.  Fiction’s primary aim is not to get its audience to think (although the audience probably will think) nor to feel (ditto) but rather, in a broad sense, to imagine.  If a particular fiction works on us, our experience is not necessarily one of being convinced, or emotionally moved, but rather of being transported.  Successful fiction leaves us with the feeling that—although the movie ends after the last frame, the book after the last page, the painting at the edges of its canvas—the invented world that it has put into our heads somehow just keeps going.  (This is no less true if the invented world is, say, contemporary Baltimore than it is if it’s the entirely fanciful Kingdom of Florin.)  If, as Bismarck said, “politics is the art of the possible,” then fiction is far more politically efficacious than any overtly political discourse could ever be, since expanding the scope of what’s possible is fiction’s bread and butter.

Once we’ve identified Rockwell’s aims and methods as fictional, it becomes clear why his images look the way they do.  Although Rockwell is pursuing his goals at an advanced level, his project is basically an extension of the traditional task of a certain type of commercial illustrator: one who specialized in producing naturalistic images to accompany narrative texts, very often prose fiction.  (Or maybe advertising, which is a subset of fiction.)  These images’ object was to provide imaginative access to the worlds these texts described—worlds in which pirates chase galleons off the Spanish Main, worlds in which shirts stay wrinkle-free without ironing—by being convincing, which typically meant depicting key episodes with near-photographic clarity and richness of incidental detail, such that readers could imagine themselves as eyewitnesses.  (This is, of course, a mass-market upgrade of the same tricks that Caravaggio built his career on.)

In this sense, Rockwell can be placed in a long line of American illustrators that includes Winslow Homer, Thomas Eakins, Howard Pyle, Frederic Remington, Frank E. Schoonover, and N. C. Wyeth, all of whom labored to evoke mythic and alien settings: the open sea, the American West, feudal Britain.  (Some of their near-contemporaries among European academic painters—Delacroix, Gérôme, Alma-Tadema—specialized in classical and oriental scenes executed in a similarly detailed fashion, to comparable ends.  Good rule of thumb: anytime you see an image that looks like a photograph, but isn’t, you are probably looking at somebody’s fantasy of something.)  Here it’s useful to contrast the working methods of these illustrators with those of another group that specialized in caricature, i.e. the stylized and exaggerated images that accompanied news and topical commentary, featuring reductions and simplifications intended to reassure readers that complex issues were within their grasp.  Rockwell’s achievement, of course, involved something akin to a synthesis of these approaches: like the caricaturists, he laid claim to the “real world” as his subject—but instead of offering analysis about it, or making sweeping proclamations, he focused on capturing the quotidian experience of living in that world, and made that experience seem as intense and as outlandish as the jousts and cattle drives and battles at sea that were his colleagues’ stock-in-trade.

If Rockwell seems old-fashioned—which for a long time he has—it’s due not so much to the irrelevance of his sensibilities as it is to the irrelevance of his methods.  It’s not hard to find images that work like Rockwell’s do (many a New Yorker cartoon, for instance, sings in the same bemused and affable key), but good luck finding images that look like his.  If Rockwell has been out of fashion since the Sixties, then Pyle and Schoonover and N. C. Wyeth are at this point pretty much esoterica.  These artists’ slide into irrelevance has closely tracked the decline of the kinds of stories they used to illustrate—by which I mean prose fiction in general, but also a particular type of prose fiction, the type that revels unselfconsciously and expansively in its own exoticism and limitless capacity for invention, a type that is disinclined to spike its wonder with any irony.  I’m talking about geeky fiction.  It is surely significant that about the only places you’re likely to see the work of the aforementioned illustrators emulated are the covers of sci-fi and fantasy novels; these days even romance publishers seem sheepish about their long dalliance with Fabio, more apt to go the demure flowers-embossed-on-the-cover route instead.

But let’s not be too hasty here.  Intemperate lust for tales of high adventure and weird fantasy didn’t just evaporate from American life at about the time JFK was shot.  Sure, geeks had it rough through most of the cool and callous Seventies, with little save Star Trek in syndication to sustain them, but they bided their time in underground exile, regrouped, and bounced back with a vengeance in about—oh, gosh, let’s see—May of 1977.

Looking back, it seems clear enough that the principal mission of George Lucas’s life—a mission at which he has basically succeeded—has been to achieve a seamless fusion of painted and photographic images.  It seems difficult to overstate the degree to which the products of Lucasfilm, Ltd. (along with those of Industrial Light & Magic and Pixar, respectively its current and former subsidiaries) have altered the big-budget filmmaking process, making it not only possible but routine for directors and cinematographers to marry lens-based and purely synthesized footage into something that audiences will buy as an entirely plausible evocation of another world.

Fan lore identifies a screening of abstract experimental films that Lucas caught in the mid-1960s as his big teenage a-ha moment, and that story basically checks out: we can certainly see Stan Brakhage in Star Wars, if only manifest in the idea that film is a writable surface that can capture the images of other writable surfaces (an idea that starts to get pretty interesting, and lucrative, once you have access to a bluescreen).  But by this point we should ALSO be able to recognize the perverseness of Brakhage’s painted-upon celluloid as a reverse analogue of another perverseness surely encountered by Lucas in his childhood, namely that of the commercial illustrators who adorned the pages of the adventure stories he loved—whose key working methods often involved staging elaborate photographs of models in costume, then projecting those photographs onto blank surfaces and repainting them in minute detail.  Boiled down to their essences, Brakhage’s goal and the goal of these illustrators were the same: to keep ahead of audience expectations by playing a shell game with reality and artifice, accident and intention; to pass off as organic and autonomous what has actually been laboriously constructed; in short, to help us trick ourselves.  It seems to me that Lucas’s gifts as technician and storyteller—though vastly amplified by his epoch-making technological vision—are roughly comparable to those of Howard Pyle or N. C. Wyeth: what he has contributed to the image-making toolbox will endure long after his films—which, by and large, are quite bad—have all been forgotten.

Lucas’s friend, collaborator, and fellow Rockwell-gatherer Steven Spielberg, on the other hand, is by a wide margin the most significant American filmmaker of his generation.  In her NYT piece, Deborah Solomon alludes to the fact that although Lucas owns many more Rockwells than Spielberg does, Spielberg owns more good ones.  “He paid more,” Lucas explains.  Okay, maybe.  Then again, it could be that Spielberg has a clearer understanding of how Rockwell actually works—and a better sense of what (aside from their unabashed sentimentality) distinguishes his images from those of the other naturalistic commercial illustrators we discussed above.

I sort of suspect that the latter may be the case, because a lot of Rockwell’s notable traits and signature moves—which at this point have pretty much vanished from the realm of static visual art—pop up in Spielberg’s films all the time.  One aspect of this influence is purely technical, and widely shared: even as early as the Thirties, plenty of filmmakers were looking to Rockwell (who was himself looking back to Pyle, Homer, La Tour, Rembrandt, Caravaggio, et al.) for ideas on how to effectively light and block and frame their shots.  What Spielberg seems to have picked up better than most are the narrative (rather than purely aesthetic) functions of Rockwell’s techniques; he seems to aspire not only to compose his shots like Rockwell’s images, but also to edit them in such a way as to suggest the experience of looking at a Rockwell.  Since film can’t provide the luxury of studying an image until all of its telling detail has emerged, Spielberg has learned to deftly guide our attention exactly where he wants it to be, and to do so in such a way that we feel like we got there on our own.  He’ll typically accomplish this by means of efficient reaction takes—broadly acted, but carefully chosen—which is exactly how Rockwell’s narratives work.  As a result of this approach, Spielberg’s most effective scenes often feel like animated Rockwells, like Rockwells sprung to life and into motion.

(It’s interesting to note—and I wish I could take credit for this observation, but I heard it a few years ago from the artist Adam Frelin—that the balance of indebtedness seems to have tipped back lately in the direction of the art world: Spielberg’s films are obvious and acknowledged points of reference for Gregory Crewdson and many other contemporary photo artists—and it’s not too tough to find evidence of his influence in other places you wouldn’t necessarily think to look, like the mid-career work of Cindy Sherman and Jeff Wall.)

Technical similarities notwithstanding, Spielberg’s principal debt to Rockwell isn’t aesthetic or narrative.  It’s ideological—and, of course, it’s rhetorical.  More than just about anybody else I can think of, Spielberg (at his best) shows us the world through Rockwell’s lens: he displays the same fascination with and affection for otherness that we find in Rockwell, and he too seeks to embody and encourage toward such otherness an attitude of openness, generosity, and gentle humor.  Spielberg is one of the all-time great portrayers of friendship—not of longstanding friendships in the bromantical or Sex-&-the-City veins, but rather of unexpected, often fleeting comradeship between entirely dissimilar individuals.  The cinematic gold standard here is probably the geopolitically fraught relationship in Casablanca between Rick Blaine and Capt. Louis Renault, but Spielberg has made some admirable additions to the field: think of Roy Scheider’s transplanted big-city cop and Richard Dreyfuss’s dorky marine biologist in Jaws, or Liam Neeson’s Nazi industrialist and Ben Kingsley’s Jewish accountant in Schindler’s List, or even the brief exchange between Tom Hanks and Matt Damon in Saving Private Ryan.  Think most of all, however, of Close Encounters of the Third Kind and E.T., Spielberg’s two most personal films, as jaw-dropping a pair of fantasies as you’re ever likely to see about escaping the prison of the alienated self through sympathetic contact with a radically alien other.

Spielberg’s achievements, like Rockwell’s, are crowned with a couple of asterisks—probably best regarded as caveats, not dealbreakers, but still worthy of note.  No matter how pure its intentions, a work of art that’s motivated by love of the unfamiliar and the exotic will not tend to encourage understanding of whatever cultures or individuals it depicts.  (Understanding, in fact, spoils all the fun.)  This limitation creates certain dangers that artists ought to guard against, and that they probably won’t: see, for example, the gee-whiz, aw-shucks, devil-may-care racism of Indiana Jones and the Temple of Doom.  (Temple was, of course, a collaborative venture between Spielberg and Lucas; I’m not going to try to argue that Spielberg was blameless in this mess—but still, one of these two guys went on a year later to make The Color Purple, while the other one gave the world Jar Jar Binks.  That’s all I’m saying.)  Rockwell’s track record is actually somewhat better than Spielberg’s on both gender and race—particularly after 1963, when Rockwell finally left the Saturday Evening Post (which had a longstanding prohibition of images of African-Americans, unless they were depicted as servants) and began working for Look.  (His great civil rights images, such as The Problem We All Live With—apt to shock anybody who just knows Rockwell from, like, that Thanksgiving turkey picture—all date from this period.)  Whether Rockwell and Spielberg can legitimately claim the authority to depict whatever and whomever they please is a valid question, and one that ought to be periodically revisited—although I predict that we’re consistently going to find that, yes, they can indeed.  Even so, we’d do well to keep in mind that their rhetorical posture—like any rhetorical posture—obliges them to be silent on certain subjects, even as it frees them to speak eloquently on others.

The other standard charge levied against Rockwell and Spielberg is that their output is kitsch.  How accurate that critique is, and how damning it is, depends, I suppose, on what you mean by kitsch.  If your gripe is that their work is purely commercial, deliberately shallow, and has no goal other than promoting itself and its associated products, then I don’t think either of these artists qualifies as kitsch: both obviously aspire to engage and communicate, not simply to accrue revenue.  If, on the other hand, you regard as kitsch any art that makes an appeal to the sentiments of a broad audience, then Rockwell and Spielberg absolutely qualify.  To argue, however, that their art is not legitimate, or is actually somehow detrimental, seems a bridge too far: that’s a critical position that just doesn’t hold up to scrutiny.  Yet Rockwell and Spielberg have no shortage of detractors who will say exactly this, and who will reflexively condemn the whole of their respective outputs—a position that, I suspect, accords with the unexamined tendency of self-identified elites to favor the niche over the common, to gird themselves with fashion and irony, and to avoid at all costs the potentially humiliating risk of actually espousing some belief from which they might not later be able to retreat.  And this makes me sad—not for Rockwell, who’s dead, or for Spielberg, who’s rich, but for a culture that is afraid to openly value much of anything.  Because here’s the deal, gang: any critical project that has obliged itself to automatically and categorically dismiss ALL positive assertions of value self-destructed a long-ass time ago.  It now amounts to little more than a baleful modernist ghost ship—the very sort that Howard Pyle might have painted—adrift on the teeming sea of cultural production.

In his 1960 autobiography My Adventures as an Illustrator (quoted by David Kamp in an excellent Vanity Fair piece, which I am indebted to Boman Desai for bringing to my attention), Rockwell sets forth his goals and methods with unapologetic directness:

Maybe as I grew up and found that the world wasn’t the perfectly pleasant place I had thought it to be I unconsciously decided that, even if it wasn’t an ideal world, it should be and so painted only the ideal aspects of it [. . .] If there was sadness in this created world of mine, it was a pleasant sadness.  If there were problems, they were humorous problems.  The people in my pictures aren’t mentally ill or deformed.  The situations they get into are commonplace, everyday situations, not the agonizing crises and tangles of life.

We should be cautious as we read this, lest we outsmart ourselves, as Deborah Solomon seems to have done: the fact that Rockwell sets out to paint the ideal aspects of the world doesn’t make him idealistic.  Rather than being unwilling to accommodate his values to the experience of living in a degraded world, all Rockwell does is accommodate: first seeking out lived moments that point the way toward an America that he’d like to see realized, then inventing ways to make those moments portable, signposting them for the rest of us.  Far from a serene and pious idealist, the Rockwell who addresses us through his images is a stubborn Pelagian optimist, offering reassurance and encouragement.  He’s working hard at this, because he knows what’s at stake—knows that the whole shebang could still break either way.

Because Dave Hickey is rapidly emerging as the chief household god of New Strategies for Invisibility—and because I haven’t been able to type a word of this post without looking over my shoulder at him—I’d like to sum up by quoting at some length from “Shining Hours / Forgiving Rhyme,” his deservedly legendary essay on Norman Rockwell.  (I’ve quoted from it before, and I’m sure I will again.)

The people who hate Rockwell [. . .] accuse him of imposing norms and passing judgments, which he never does.  Nor could he ever, since far from being a fascist manipulator, Rockwell is always giving as much as he can to the world he sees.  He portrays those aspects of the embodied social world that exist within the realm of civility, that do not hurt too terribly.  But it is not utopia.

People are regularly out of sync with the world in Rockwell’s pictures, but it is not the end of the world.  People get sick and go to the doctor.  (Remember that!)  Little girls get into fights.  Puppies are lost, and jobs too.  People struggle with their taxes.  Salesmen languish in hotel rooms.  Prom dresses don’t fit.  Tires go flat.  Hearts are broken.  People gossip.  Mom and Dad argue about politics.  Traffic snarls, and bankers are confused about Jackson Pollock.  But the pictures always rhyme—and the faces rhyme and the bodies rhyme as well, in compositions so exquisitely tuned they seem to have always been there—as a good song seems to have been written forever.  The implication, of course, is that these domestic disasters are redeemed by the internal rhymes of civil society and signify the privilege of living in it, which they most certainly do.

You are not supposed to forget this, or forget the pictures either, which you do not.  I can remember three [Saturday Evening] Post covers from my childhood well enough to tell you exactly what they meant to me at the time.  One is a painting of a grandmother and her grandson saying grace in a bus-station restaurant while a crowd of secular travelers look on.  The second depicts an American Dad, in his pajamas, sitting in a modern chair in a suburban living room on a snowy Sunday morning.  He is smoking a cigarette and reading the Sunday Times while Mom and the kids, dressed up in their Sunday best, march sternly across the room behind him on their way to church.  The third depicts a couple of college co-eds changing a tire on their “woody” while a hillbilly, relaxing on the porch of his shack, watches them with bemused interest.  The moral of these pictures: Hey!  People are different!  Get used to it!

Note Hickey’s use of the word “moral” in that last sentence, which I doubt very much was casual or unconsidered.  Rockwell is never moralistic—but his illustrations certainly depict a moral universe, one inhabited by people who are always struggling to figure out where value ought to be located, how they ought to behave.  Rockwell doesn’t preach, doesn’t prescribe, doesn’t pretend to have any answers.  The positive values he espouses are few, and carefully considered, and he broadcasts them like a beacon in every weather: decency, hospitality, tolerance.  Were he with us today, one imagines that he would be rather shocked and upset to hear these values regularly being decried as anti-American—as dangerous, even—when they are by any reasonable measure the cornerstones and necessary preconditions of virtually every worthwhile aspect of American life.  It cheers one, however fleetingly, to think of how a risen-from-the-grave Rockwell would engage with the contemporary cultural landscape (immigration reform, gay marriage, Islamic centers in Lower Manhattan) once he’d set up a studio and hired himself some models and a decent photographer.  It almost hurts to feel the lack of his living eye and hand coaxing us out of our fears and suspicions, trying to awaken us to what a long-dead American president once named—in darker times than these—the better angels of our nature.



“The storm propels him into the future, to which his back is turned [. . .] this storm is what we call Ke$ha.”

April 30, 2010

For those of you keeping tabs on my growing infamy:

The new issue of MAKE: A Chicago Literary Magazine is now available, and it contains an essay of mine called “New Strategies for Invisibility,” the writing of which sort of indirectly gave me the notion to start the blog you are now reading.  (This was back when I expected to focus here on issues of authorship and interpretation, rather than the output of pop stars with dollar-signs in their names, but our beginnings never know our ends.)  Having just now finished reading my contributor’s copy of MAKE, I can highly recommend that you check it out, and not just because I’m in it.  The thing looks great, not only in terms of the visual art it includes—though there’s a bunch of great art, including an illustration for my essay provided by my friend and personal hero Carrie Scanga—but also with respect to its design and production.  As much as I admire journals that present themselves as rare and precious objets d’art, MAKE looks like it actually expects to be read. 

It ought to be.  High points for me: 1) a short essay by Jenny Boully—it’s the source of the text on the issue’s cover—that manages to be both affecting and almost entirely unconventional in its structure; 2) “poetry” by Nate Zoba that consists of a copy of Tough Guys Don’t Dance by Norman Mailer with almost all of the text razored out of it; 3) a frank, masochistic reading of Gone with the Wind by Kate Zambreno; 4) poems by Brandon Downing, some of which are collages in which cut-up text (taken in part from George Meredith’s The Amazing Marriage; I can’t figure out his other sources) is pasted over old Earth Science illustrations to unsettling effect; 5) a short story by Luis Sepúlveda (translated by Paul Grens) that got from me at several points the same head-shaking seriously? reaction as when I read Michael Chabon, or Paul Auster, or Raymond Chandler, and I do not intend this as a criticism; 6) poetry by Dara Wier and Cathy Park Hong and Nick Demske that I will not embarrass myself by trying to describe succinctly.


The issue also features work by people I actually know: a great collaborative story by Lily Hoang and my spouse-person Kathleen Rooney—which takes some surprising liberties with the Book of Ruth—as well as an essay on Bruce Springsteen’s The Rising by Michael Kobre, with whom I studied in grad school.  For my money, Mike is one of the best writers presently working the art-&-democracy beat—certainly a major inspiration for what I’m trying to do here—and I am honored to share a binding with him.


In other news . . . to my almost-entirely-pleasant surprise, my “TiK ToK” post got picked up by Metafilter, and then in rapid succession by the L.A. Weekly blog, by the Hathor Legacy, by Chicago’s own Gapers Block (glancingly, but still), and also by prominent Iowan John Deeth.  Responses were less savage than I might have anticipated.  (I AM a little blown away by the alarm that many, many folks have expressed at the length of the post . . . alarm which in some instances shades past bemusement and even irritation into what sounds like concern, as if I might be using the internet wrong, in much the same way that one might recklessly apply, like, a belt sander or something.  Dude, you are in no danger.)


My thanks to everybody who has contributed to the “TiK ToK” discussion, and thanks too to the folks who were kind enough to bring the following treasures to my attention:


My sis-in-law Megan (via my spouse) mentions a Jezebel post about Ke$ha’s recent SNL appearance; the post’s author notes—not without sympathy—that cracks are starting to appear in Sebert’s popstar armor.  (She also positions Ke$ha in relation to Lady Gaga in a way that’s interesting, if not completely satisfying.  If nothing else, it indicates solidifying consensus that Gaga is the new pop gold standard.)


Megan also reminds us that Ke$ha made a surprise guest appearance on NPR’s Planet Money a while back, offering her professional assessment of a rap video designed to present contrasting Keynesian and Hayekian economic perspectives.  The Planet Money team presents this as a zany collision of disparate worlds; those of us who’ve spent time with “TiK ToK,” of course, understand that Planet Money is EXACTLY where it belongs.  The party don’t really start until Ke$ha walks in at about 10:45, but the front end is worth a listen, too, if only so you’ll understand what the hell is going on.  Quibbles: Alex Blumberg completely misses the joke at the beginning of the “TiK ToK” video—dude, it’s not her house!—and the discussion of the economic implications of the song fades out just as it’s getting good.  Still: cute.


Also cute—from the Princeton Tiger, via C. Dale Young’s blog, via my spouse—is this:

[embedded item from the Princeton Tiger]

Paul Muldoon: too good a sport?  Discuss.


Finally, the intrepid Tim Jones-Yelvington directs our attention to The Rumpus, where Ian Crouch examines the recent country-music ascendancy of Darius Rucker, late of the much-reviled Hootie and the Blowfish.  Crouch basically nails the way contemporary country music works, and the circumstances under which it can traffic lucratively with pop; there are interesting parallels and perpendiculars to be drawn between Rucker’s career and those of Kesha “Love to Make a Country Album Someday” Sebert and Miley “Never Heard a Jay-Z Song” Cyrus . . . but you don’t need me to sketch them for you, I’m sure.


Okay, that’s enough for now.  In the pipe: the long-promised return to The Birds, at least one post on Dave Hickey, and—how long can I resist?—maybe something on Lady Gaga.  But of most pressing concern at the moment is, of course, Gold Diggers of 1933.  I hope to weigh in on that landmark of film within the week.  Contain yourselves!

Speaking of “Party in the U.S.A.” . . .

March 23, 2010

. . . I had this great idea for a post about the song in which I would praise the elegance of its structure, particularly the repeated lines at the ends of the verses, which build up tension ahead of the arrival of the chorus in a manner that recalls for me nothing so much as Jacques Brel’s classic “Ne Me Quitte Pas” (although Brel’s song is even trickier, since the last line of each of ITS verses is also the first line of each chorus, making it ambiguous where one stops and the other starts; Cyndi Lauper exploits this adroitly in her reading of the dodgy but standard Rod McKuen translation; lyrics-wise, Momus’s version is closer to the original) . . . and then I would point out how well matched the lyrics of “Party in the U.S.A.” are to that structure, with the former consisting of an apologia for and explanation of the latter (hey Miley fans, if you feel uneasy about her move away from Nashville-slash-country to L.A.-slash-power-pop, don’t worry, cuz she’s even more uneasy about it than you are—although such a carefully constructed script only works to the extent that the leading lady follows it; see sugarhigh! for more details*) . . . and I would FURTHER argue that in its evocation of social difference ameliorated by common cultural references, the song almost succeeds—despite itself, despite the shortcomings sugarhigh! points out, despite a core slickness I can’t help but suspect of cynicism—in both describing and embodying the “luminous devotion to the possibility of domestic kindness and social accord” that in an earlier post I quoted Dave Hickey as advocating . . . and then I would sum up by saying that the most appealing thing about “Party in the U.S.A.” is its implied suggestion that this possibility of domestic kindness and social accord can best be accomplished not through consciousness-raising in the Enlightenment tradition, but rather through the kind of ecstatic escape from subjectivity and selfhood that the best pop hooks encourage and enable.

However, I came to realize that the song’s key lyric—which I had been hearing as “I’m not in my head, like: yeah”—is actually “I’m noddin’ my head, like: yeah.”

So, uh, never mind.  Go back to whatever you were doing.

*    I still maintain that I hear a different Nelly being borrowed from in “Party” than jane dark does—c’mon, listen to the background licks Cyrus sings starting at 2:37 and tell me I’m wrong—but sugarhigh! is hardly off base in its assertions . . .