Separate but equal

John Irving once said the key to making a movie out of one of his novels is to throw away nine-tenths of the novel. Given the size of a typical Irving book, that seems like a fair ratio. The trick is to keep the right 10 percent.

It’s a common conceit that a big-screen (or small-screen) adaptation of a literary work is going to be inferior, and to an extent I get that. The two media have different aims and different means — books have an unlimited capacity to immerse us in a story, while movies traditionally are expected first and foremost to entertain. I’d point to Steven Spielberg’s “Schindler’s List” as a classic example of cinema’s inability to convey the drama and pathos of the remarkable narrative that unfolds on the printed page. In the hands of a more artful director, it might have worked, but I’d recommend the book to anyone who wants to gain a full appreciation of Oskar Schindler’s story.

That doesn’t make it a rule. This week I caught a screening of an Italian adaptation of Stephen Amidon’s “Human Capital,” at which the suggestion came up that the movie is actually superior to the book. I haven’t read it, so I’ll have to take that assertion at face value, but what struck me about the film was the depth of its characters. That’s a telltale sign of a book as the source material for the script, and I applaud the producers for understanding its value. There’s a decent thriller plot to drive the story along, so a lesser director (I’m looking at you, Spielberg) could’ve easily latched onto suspense and narrative twists at the expense of character study.

Luckily, that didn’t happen, and we care what happens next while caring about the people it happens to. Whether that makes it “better” than the book, I can’t say. I’m more inclined to argue that the best we should hope for is “equal.”

The most “equal” screen adaptation of a book I’ve seen is Lawrence Kasdan’s “The Accidental Tourist,” primarily because of the lead performance by William Hurt. Interestingly, I saw the film first, so I’ll never know how I might have envisioned the Macon Leary put forth in Anne Tyler’s novel. Hurt forever defined the character for me. His puzzled and sometimes pained delivery captures the character’s maddening attempts to stoically ride above life’s turbulence. That came through when I read the book, but was it because Hurt put it there? Like I said, I’ll never know. That’s extraordinary casting and an extraordinary performance in a film that’s worthy of its literary counterpart.

“Wonder Boys” benefits from similar casting brilliance, but for the film version of Michael Chabon’s book, I’d substitute “equal” with “different.” Michael Douglas is the perfect choice for the lead role of befuddled, drifting Professor Grady Tripp, but I found room for my own interpretation of the character when reading the book. Perhaps this is because there are significant differences between how the two versions of the story play out. To me, they were equally enjoyable. In the spirit of Irving’s nine-tenths rule, the filmmakers understood that Tripp is the beating heart of the novel. As long as they stayed true to that character, other differences wouldn’t matter.

The good fight

It is said that war is hell, but popular music historically has been more interested in the premise that war is wrong. While policy makers in this and other countries have found no shortage of causes for which to take up arms, musicians, like most artists, have tended to oppose them. Whether it was the Woodstock generation’s very focused effort to end the U.S.’s involvement in Vietnam or the general peace-loving ethic of a John Lennon or Pete Seeger, the idea was that, whatever the conflict may be, it’s not worth fighting or dying (or killing) for.

But after this country’s ignominious withdrawal from southeast Asia, overt antiwar positions became increasingly regarded as unpatriotic or even traitorous. A love-it-or-leave-it backlash found its way into popular music, most notably through country artists like Charlie Daniels and Lee Greenwood, and a healthy strain of red, white and blue remains woven into the fabric of today’s music. Nuanced songwriters like Bruce Springsteen have shown the ability to engage both camps — hate the war, but support the soldiers fighting it — but in general the view is you’re either with America or against it.

That’s a shame, because artists can and should do better. Conflicts associated with love, marriage, friendship or family are terrifically complicated and have inspired some great music worthy of their depth. While the dynamics of the battlefield would seem to be comparatively simple (kill or be killed), there is music that challenges us to dispense with the rallies (either protest or pep) and look at war through the eyes of the people fighting it. Pink Floyd’s “Us and Them,” for example, points out the absurdity from the perspective of those on the front line:

‘Forward!’ they cried from the rear
And the front rank died
The generals sat, and the lines on the map
Moved from side to side

Then, amid the frenzy of Reagan-era fist-pumping, Dire Straits offers a touching characterization of the bond between soldiers in “Brothers in Arms.” The questionable morality of fighting a war is supplanted by an underlying code of the battlefield:

Through these fields of destruction
Baptisms of fire
I’ve witnessed your suffering
As the battle raged higher
And though they did hurt me so bad
In the fear and alarm
You did not desert me, my brothers in arms

The Decemberists take this bond a step further in “The Soldiering Life,” which dares to suggest an undeniable exhilaration of the battlefield experience in a refrain that’s almost celebratory:

But I
I never felt so much life
Than tonight
Huddled in the trenches
Gazing on the battlefield
Our rifles blaze away
We blaze away

Critics may charge these songs with treading uncomfortable ground. Surely, you can’t adequately address the horrors of war with mundane complaints about incompetent superiors across layers of a government bureaucracy. Or a solemn death on a smoldering field, at peace in the knowledge that our closest comrades haven’t abandoned us. Or jubilation over the amazing display of firepower that can only be experienced at the front. No, these songs won’t support the simple proposition that war is hell when it is, in fact, so much more.

Independence Night

Raise your hand if you’ve ever been invited out for a seemingly casual night of bar trivia, only to look around you and realize everyone else in the room is, well… nuts. These are the people who have committed to memory the periodic table, the line of succession within various 15th century European monarchies and, most frighteningly, the lyrics to every Spice Girls song. They are the ones whose superior snorting rings in your ears as you slink out of the bar without even placing.

Such characters and their disturbing milieu get some long overdue comic and dramatic treatment in “Trivia Night,” a shoestring-budget independent film that made its public debut at the Green Bay Film Festival on March 5. Movies like this are why I love independent cinema. It was produced, written, directed and performed largely by people donating their time and funded through a Kickstarter campaign. How do I know? Two of them were at the screening as they accompany the film through the festival circuit in hopes of earning publicity and, presumably, a distribution deal. How often do you get the chance to watch a movie, then afterwards talk to the people who made it? It took me a minute to realize that one of them, the movie’s primary screenwriter, also played the lead character.

Indie cinema, at least in the Midwest, tends to be regarded by mainstream audiences as something between electric cars and tofu. A number of preconceptions apply, typically involving a snobbish elitist who turns his nose up at the whiz-bang blockbuster or predictable rom-com in favor of something dark and serious, socially conscious, shot in black and white with a handheld camera, and preferably with subtitles. I admit that I have a reflexive aversion to the multiplex, but I also have no affinity for the cinematic equivalent of self-flagellation.

Both lighthearted and insightful, “Trivia Night” delivers a Hollywood-quality production without the tiresome Hollywood formulas. The guy doesn’t get the girl because, would you believe it, the guy isn’t that interested in the girl. Or it’s not really about a guy and a girl. And the ending doesn’t see the guy vanquishing past trivia demons with a glorious victory, nor does he lose his final challenge, instead achieving a sort of stalemate. Let’s see Hollywood screenwriters wrap their heads around that. They’re more concerned with delivering a satisfying conclusion while plucking emotional heartstrings with a lesson learned and maybe some character growth.

Despite its somewhat outlandish premise, “Trivia Night” comes across as genuine, its characters like many people I know. That’s because the producers haven’t (yet) been forced into the Hollywood trap, and the result is a film that, for all the baggage surrounding the “indie” label, is fun to watch.

Meet the new boss

There’s no denying the digital age has revolutionized the production and distribution of music in ways we couldn’t have imagined even 20 years ago. Just when we thought we had some stability with iTunes, streaming services have again upended the industry’s business model, leaving artists and audiences scrambling to navigate a new landscape. Who survives and how they do it will be interesting to watch.

That said, this isn’t the first revolution in music. The earth truly shook, fingers pointed and hands wrung when, as explored in a recent Smithsonian article, Thomas Edison unveiled his phonograph in 1877. In his autobiography “Life,” Keith Richards regards the advent of recording technology as a great social leveler. Before recording, the only way for composers to share their work was to notate it on a written sheet and have professional players perform it live in a public setting. This assumed a certain level of wealth and education, reserving the artistry of music for certain social classes. That changed with the ability to record songs, which allowed people to bypass the concert hall and produce a composition on their own terms. Most importantly, Richards argues, it removed sheet music from the equation. You could learn a guitar lick by listening, not reading, and anyone with talent, “literate” or not, was no longer barred from participating.

That’s all well and good for performers, but what about audiences? Not surprisingly, consumption of music also underwent a seismic shift thanks to a previously unheard-of option: on-demand playback. A new kind of listener emerged — one whose devotion to a work or genre no longer was limited by the availability of performances. If you liked a record, you could listen to it over and over. You could gather its nuances and moods in ways that wouldn’t be possible at a one-off Saturday night concert.

Consequently, a new kind of musical order emerged — one divided into specific genres, e.g. blues, jazz, opera, hillbilly. Because people were able to pick and choose what they listened to and how often they listened to it, winners were separated from losers, with a broad spectrum of losers collecting dust on the shelf as a favored repository of winners commanded heavy rotation on the phonograph. That’s a phenomenon that continues today with the availability of dedicated satellite channels and customized Spotify playlists (with an unexpected consequence; more on that later).

And finally, a new kind of behavior emerged — the act of listening to music alone. This was unsettling to early critics of the phonograph, who saw music appreciation as an essentially social experience. I spend a lot of time listening to music alone — I think of it as something akin to reading — and judging by the number of people I see wearing earbuds at the office, on the bus, or running in the park, I’m not the only one. But I also understand the elevated sensation of a shared musical bond — even if it’s a cover band cranking out “Brown Sugar” as we all bounce and sway in unison and grinningly wail the lyrics at each other. The concert performance hasn’t died — in fact it has seen a renaissance of sorts as musicians realize its income potential vs. the dwindling returns from download sales or streaming royalties. You could even argue recorded music benefits its live cousin because of the familiarity it establishes between artist and audience.

Bottom line: Recorded music wasn’t the end of the world that its critics imagined. Just a different world. And that world is changing again with the prominence of streaming services like Spotify and Pandora, which have already produced a curious effect. The Smithsonian article suggests that listeners, particularly younger ones, are now rejecting the genre-based categories we’ve become accustomed to as the ease of sampling a wider variety of styles lures us out of our comfort zones. That’s a development I would not have predicted, and I think it’s a good one.

What’s a little more troubling is the paltry financial compensation artists get for licensing their works through streaming services. Proponents of this model argue that the exposure those services bring is the true payoff. That’s a quandary artists, labels and the services still need to resolve, but it’s reminiscent of the inequities that existed in the early days of recording. The revolution, it would seem, has come full circle.