Photo power

They say a picture is worth a thousand words, but sometimes it speaks in a single scream.

Afghan photojournalist Massoud Hossaini shot just such a photo in the immediate aftermath of a suicide bombing during a religious festival in Kabul. Rarely have I seen an image so powerful that it immediately brought tears to my eyes, and the Pulitzer Prize Board agreed, awarding Hossaini the prize for breaking news photography.

Hossaini and several of his colleagues are featured in the 2015 documentary “Frame by Frame,” which looks at Afghanistan’s fledgling free press following the 2001 fall of the Taliban, whose regime had prohibited photography of any kind. It’s not an easy job navigating a volatile political and social environment that is one of the most dangerous in the world. Insurgent attacks and government reprisals are common features of the daily news landscape. Female photographers have carved out a niche by focusing their lenses on the nation’s women, taboo subject matter for their male counterparts, but that, too, comes with peril. When Farzana Wahidy brings her camera into a hospital burn ward to investigate the practice of self-immolation by women in the western city of Herat, officials balk at her presence, citing fears of Taliban violence.

The stories in “Frame by Frame” show a journalism community in its infancy, animated by the inherent optimism of a young crop of photographers determined to establish a free and vibrant Afghan press after decades of warfare and repression. True, much of that freedom relies on the dwindling U.S. presence in the region, without which the country could easily fall back under Taliban rule. But these photojournalists’ commitment to their homeland — as a Pulitzer winner, Hossaini could easily leave the country for better, safer work — remains a beacon of hope that Afghanistan can reach its potential for a peaceful, open society.

When a demo will do

Now our luck may have died and our love may be cold but with you forever I’ll stay

Bruce Springsteen’s 1982 album “Nebraska” was a singular phenomenon. The release represented a reversal for Springsteen, both in its dark lyrical themes and its sparse musical arrangement, but in a greater sense that arrangement signaled a stunning break from conventional wisdom: a rejection of traditional studio production techniques.

Springsteen recorded “Nebraska’s” tracks on a simple 4-track cassette recorder in his New Jersey home. The intention was to re-record these cuts with his band for the next studio album, but Springsteen decided to stick with the original demos, which would become the defining sound of a dark and barren album. It was a gutsy move for the Boss, who would quickly resume full-throttle studio productions with 1984’s more commercially viable “Born in the U.S.A.” For one album, listeners were treated to a rarity in rock ‘n’ roll — 40 minutes of a man and his guitar (and harmonica) fully exposed against a stark and depressing lyrical landscape.

I’ve often wondered why albums like “Nebraska” are the exception rather than the rule. True, a demo recording is a first draft of sorts, but you could argue it’s the truest representation of the artist’s vision before a production team refines the song into something more palatable for public consumption. “Nebraska’s” rough edges are its greatest charms, and in many ways they outline the themes the album explores. The uneven mixing and flat instrumental and vocal tones exude a sense of loneliness and distance — and at the same time an intimacy — that fits the mood of the album, which is perhaps why Springsteen chose to go this route. A full album of, say, Britney Spears demos probably wouldn’t work out so well and would be better off in the hands of a skilled producer.

Occasionally I’ll hear demo versions of well-known pop hits that slip out of the vaults, either unintentionally or via official “stripped down” re-releases. Some leave me amazed at the voodoo that studio engineers do so well in turning a humdrum guitar-and-vocal piece into something truly remarkable. But more often than not, I’m disappointed by the intrusion of so much polish at the expense of the heart and soul of a song put forth in the artist’s original recording. When to turn on the studio magic and when to lay off is obviously the artist’s call, in consultation with his or her producers and/or collaborators. But my suspicion is that commercial concerns weigh heavily in these decisions, leaving me to wonder how “Nebraska” would sound had Springsteen not had the audacity, and more importantly the clout, to follow his instincts.

Here’s an example of a highly polished, fully produced recording: John Lennon’s original studio release of the single “Woman,” from 1980’s “Double Fantasy.” The acoustic guitar is softened, presumably to suit the mood of a love song, and the voice is equally sweetened to puppy-love levels better suited to a high school dance.

Now listen to the “stripped down” version re-released in 2010. It’s technically not a demo, but a “remastered” version of the above studio recording that focuses on a simpler arrangement. The guitar isn’t so heavily processed, and the voice, though not always in perfect tune, is pure John Lennon. This is probably more faithful to his original conception of the song, if not the demo he would’ve provided to the producer.

Tuning in

What can art teach us about politics? More than you’d think. Signals offered in creative expression can tell us a lot about the mood in the real world. But you have to be listening.

Clearly I wasn’t, and many of us weren’t, when Donald Trump pulled off his stunning victory in last month’s presidential election. Few pollsters, media analysts or pundits on the left or right saw it coming. Count me among the blindsided. Right up through the early evening of Nov. 8, there was no doubt in my mind that Hillary Clinton would be the next president of the United States.

Well, there was a small blip that, in retrospect, might have been a clue for me. That came in July, when rabble-rousing documentary filmmaker Michael Moore predicted a Trump win in what I dismissed as his typically outlandish political pessimism, designed as a call to arms for complacent liberals. As an unapologetic partisan, Moore blurs the line between art and politics, but I’ve seen enough of his films to acknowledge his unique grasp of blue collar America. “Roger & Me” cataloged the human cost of Michigan’s disappearing industrial economy, a toll most of the media would overlook amid the tech-driven economic revival of the 1990s. While terrorism and security dominated the headlines of the 2000s, “Sicko” called attention to what would become the defining policy debate of Barack Obama’s presidency — health-care reform. From the perspective of 2016 blue collar America, neither economic transformation nor crippling health-care costs have been adequately addressed by leadership. Despite his annoying penchant for spinning the documentary form into screed, Moore has kept an effective finger on the pulse of a disaffected constituency that was likely to buy into Trump’s vision and, as it turned out, was instrumental in delivering crucial swing states for the Republican.

In the aftermath of November’s election, Moore’s prophecy forced me to recalibrate my antennae for this constituency. It’s not an alien one; in fact, it’s quite familiar — generally (but not exclusively) white, less educated, rural and highly vulnerable to the forces of globalization. It’s a group of people with whom I’m well acquainted through the music of John Mellencamp and Bruce Springsteen. Like Moore, both artists embrace liberal politics, and both have an uncanny understanding of the issues affecting this demographic. Listen to “Pink Houses” and you’ll hear the despair of people for whom America represents a failed promise. Given Mellencamp’s political leanings, I’d assumed his descriptions to be grounded in a liberal vision, but listen more closely and those lyrics outline a displaced class of people — high-school-educated industrial laborers and farmers — for whom a Trump presidency is a plausible alternative.

Springsteen has long billed himself as the voice of the working man, and nowhere was that voice more authentic than on his landmark “Born in the U.S.A.” album. The songs document the symptoms, on a human scale, associated with the demise of the manufacturing sector that once fueled the Rust Belt. But what they’re really about is a loss of a way of life. Where do people who took a union job out of high school at the local factory fit into an economy that suddenly requires new skill sets or new education levels? The hard answer, of course, is they don’t. They must adapt, and it’s those who haven’t who are susceptible to the Trump message. You can argue that it’s beyond the power of any president to control or roll back economic transformation, but it’s not a matter of choice for the characters in “Born in the U.S.A.” The world changed, and they feel abandoned. The implicit social bargain of blue collar America has been broken. As Springsteen says in “My Hometown,” “Foreman says these jobs are going, boys, and they ain’t coming back.”

It goes without saying that the clues offered here could never tell the whole story of the 2016 presidential race. There are too many complex forces at play in any election to lazily point to a few movies or songs as definitive social barometers. But after an event that many of us swore we didn’t or couldn’t see coming, let me be the first to admit that I missed — or disregarded — signals hiding in plain sight within pop culture. After all, I’d been hearing them for years. I just wasn’t listening.

Defying convention

There was a time when television news was boring. When stone-faced, gray-haired anchors soberly announced, without any particular flair, the events of the day and then signed off. No commentary. No punditry or analysis. Just the facts.

It was not a particularly lucrative model for the networks. In fact, it was a money loser, tolerated as a public-service obligation attached to their FCC licenses. Johnny Carson and Dick Van Dyke brought in the profits, while the evening news was essentially a charitable write-off.

Today, television news is dominated by highly profitable cable networks that have developed a model presenting news as entertainment through a variety of programming innovations. We see younger, more attractive anchors. The coverage leans toward more sensational topics, while less sensational topics are presented with a sense of urgency (“BREAKING:”) that overhypes the subject. And finally, with 24 hours to fill each day, networks devote an increasing amount of airtime to reflection, analysis and debate. These three activities, particularly the last, have given birth to what I’d call the punditry class — a group of people, typically ex-politicians, military leaders, scholars, authors, minor celebrities and other public figures, who bring whatever expertise they have to the discussion at hand. And watching them hammer away at each other adds a uniquely entertaining flavor to what is otherwise an academic exercise.

The punditry phenomenon didn’t emerge as an epiphany from the offices of brilliant cable news executives. But you can bet they were paying attention when it was so skillfully demonstrated, long before the advent of cable, during the 1968 presidential campaign. It was then that relative news neophyte ABC, in a money-saving move, eschewed gavel-to-gavel coverage of the Republican and Democratic conventions and instead staged debates in what is now the classic left-vs.-right format.

Representing each side were the preeminent political thinkers of their time, William F. Buckley and Gore Vidal, who agreed to a series of televised debates as a way of digesting the convention coverage. As chronicled in the 2015 documentary “Best of Enemies,” the conservative Buckley and the liberal Vidal were intellectual giants, unmatched in any forum until they met each other. Their hubris was self-righteous at best, insufferable at worst, earning them devotees and critics alike who were all too happy to see either man knocked down a peg on national TV.

The premise must’ve seemed tempting for viewers accustomed to the mindless minutiae of a political convention, but the appeal only grew as the personal hostility between Buckley and Vidal became sharply evident. Throughout the summer the debates devolved into shouting matches of a particularly vindictive and, amazingly for such great minds, childish nature. Viewers became spectators, tuning in for knockout zingers rather than a succession of mental thrusts and parries. It all boiled over when Buckley, goaded by Vidal’s “crypto-Nazi” taunt, countered with a vicious tirade punctuated by a homosexual epithet and a physical threat. The stunned network shut down the show, but as the lights came down, the smirking Vidal, presuming himself the winner, quipped to his still-flushed rival that “we gave them their money’s worth.”

And that’s what it was all about. The debates earned ABC derision among its established, “proper” competitors, but the network got the last laugh. The experiment effectively ended gavel-to-gavel coverage of conventions, and plans for the 1972 campaign included enthusiastic nods to the Buckley-Vidal model. It was cheaper, less labor intensive and, most importantly, a ratings winner.

That’s the model cable networks have picked up on and what has since largely dominated television news. Is it a win for the viewers? Few of the pundits on the air today have the intellectual chops of a William F. Buckley or Gore Vidal, and with the substance of the discussion often getting lost in the shouting, the scale tilts heavily towards entertainment at the expense of information. But it’s not boring.

Power to the people

“If your mother says she loves you, check it out.”

I can’t tell you how many times I’ve wanted to punch the editor who, fancying himself a latter-day Lou Grant or Ben Bradlee, delivered this quip. We get it, boss. It’s an important, if dated, maxim by which good journalists operate.

It’s something worth remembering as the furious post-election scramble for moral superiority playing out on social media promulgates information that is misleading, out of context or flat-out false. And any social media user would be wise to consider the benefits of checking it out, if for no other reason than saving themselves some embarrassment at the Thanksgiving dinner table.

Why am I telling you this? Be assured it’s not to lecture anyone. Throughout my career, there have been more occasions than I care to admit when I’ve gotten it wrong. It’s a horrible feeling. Yes, it’s annoying because I have to correct the record, my ego takes a ding and it’s likely to come up in my performance review. But what bothers me more is the damage I’ve done, not only to my news organization’s credibility, but to the aggrieved parties named in the article or headline I’ve edited. It cannot be undone, and it’s a responsibility I take very seriously.

But in a greater sense, I’m speaking to you as fellow journalists. That’s right. In whatever capacity you engage in social media, you are — despite the objections of those sticklers in “traditional” media — as much a journalist as I am. You have the power to retrieve and distribute information that plays no small role in influencing the decisions of the friends, family, colleagues and anyone else who follows you. Through their (hopefully) positive personal association with you, your recommendations can carry more weight than the New York Times or Fox News.

Don’t believe me? Several years ago, I was stunned to hear colleagues relate a new phenomenon of people whose sole source of news and information was their Facebook feed. That’s not to say traditional media wasn’t part of the mix, but their diet was restricted to however they chose to set up the feed. If we weren’t on it, they weren’t reading it. It was a purely anecdotal observation, so I was skeptical. But it got me thinking about the possibilities. Isn’t it plausible that people inform themselves exclusively via social channels?

Why wouldn’t they? The internet offers an incomprehensible volume of information and ideas available to anyone interested in any topic. Thanks to Google, you can find the ones that fit you. And thanks to social media, you can eliminate the ones that don’t. It’s an incredibly seductive place — a place where you pick and choose the information you like, and you never have to be wrong. The temptation to operate at this level becomes evident during highly politicized times such as an election year, when we find opposing sides beating each other over the head with different sets of facts. It’s what we’re seeing play out now as winners and losers from last week’s election desperately try to buttress their cases with supporting documentation scoured from the depths of the internet. For all of our legitimate concerns over proper vetting and verification of source material, this is the new marketplace of ideas. It’s how — and where — people are choosing to engage each other. Social media is leading the discussion, while traditional media, still brooding over how it so badly misread the electorate, plays catch-up. The public forum has migrated, and in the process transformed into myriad microforums.

Let’s be clear: We aren’t going to step back from this experiment and return traditional media to its gatekeeper role. Those days are over. Somewhere in the last decade, the balance shifted irrevocably. Traditional media isn’t going away, but despite what critics from the far left and right say, its capacity to influence public opinion is severely diminished and probably has been for some time. My suggestion for our industry is to stop thinking of ways to fit social media into what we do, and instead work on how the things we do can fit into social media. There will still be a place for our content, but it will be increasingly at the discretion of social media users to share or dismiss as they see fit.

So while news organizations lick their wounds from a brutal election cycle, let’s accept their limitations in an arena they no longer dominate. Conservatives convinced of a perpetual liberal bias can rejoice (although the victory is bittersweet for radio host Charlie Sykes). Liberals who once demanded “power to the people” now have an opportunity to seize it. From any perspective, it’s a seismic shift of Jacksonian proportions. The floodgates have opened. The power once vested in the Fourth Estate now belongs to all of you, my fellow journalists. Use it wisely.

Best Western?

Most people enjoy a good Western for the bygone values the genre represents — good honest men taking on bad guys, Indians, unruly horses and open terrain in a massively satisfying exercise of taming a wild but beautiful realm.

Anyone who has studied the settlement of the American West knows this to be an exaggeration, but it endures as a testament to a simpler, if more violent, society that, for some reason, some people long for. In today’s urbanized, industrialized, globalized world, a shootout at the O.K. Corral and the occasional tangle with Apaches seem oddly comforting. Differences are settled in honorable fashion (poker games, one-on-one fistfights and, of course, the duel on Main Street that commences on the first draw). Ladies are divided into two camps — the Eastern transplants of chaste Victorian purity, and the native Western whores. It’s all so wonderfully uncomplicated, and deliberately unmodern.

Modernity is an important concern when it comes to Westerns. The traditionalist approach would seem to dictate that it be ignored, or at least marginalized. But there are films that acknowledge the greater historical context within which the Western resides. Their stories tend to be more complicated, the conclusions more troubling. But that’s life. To pretend that a continuous, undisturbed settlement of ranches and towns occurred in a vacuum may be a fantasy many of us are willing to indulge, but artistically it’s disingenuous.

“The Good, the Bad and the Ugly” offers glimpses of modernity — in the form of the American Civil War — in an otherwise classically constructed Western. “Butch Cassidy and the Sundance Kid” goes further, posing a plausible end game for a Western archetype — a gang of bank robbers — with irrefutable logic. Butch and Sundance have settled on a lifestyle of raiding and robbing, but how long can this really go on? At some point, the law can’t stand for it, and the two are chased across the West and into South America before meeting their deaths in classic Western fashion — with guns blazing. The message: Their era was over. Were they ever going to settle down and get normal jobs? Of course not. But their hijinks were little more than a futile attempt at stopping the world from changing.

Though set in the Middle Eastern desert during World War I, the 2014 Jordanian film “Theeb” has parallels to the American Western, the intrusion of modernity among them. Bedouin tribes live life on the move. Danger lurks on the trail, and, just like in the Old West, sidearms and rifles are standard equipment. Stranded in the desert together, a boy and the man who killed his brother must learn to cooperate in order to survive. But then comes the historical context. The killer formerly made his living as a guide, taking Muslim pilgrims through dangerous open country, before suddenly finding himself rendered obsolete by the railroad. Now he gets by as a raider. That upheaval, along with the arrival of war via the Ottoman Empire and other foreign interests, raises serious questions about the Bedouins’ ability to continue their traditional lifestyle.

But these are questions traditional Western fans have little interest in considering. They expect the “good honest men” paradigm to define not only a place and time, but a state of mind, one that, like Dickensian London or Grimm’s fairy tales, defies the march of history. Fair enough. It can do one’s heart good to escape to a world we can believe in, even if, when we’re really honest with ourselves, reality tells us different.

‘Faithfully’ yours

One of the perils of achieving “classic rock” status is the brand that carries you — your sound — can also imprison you. Listeners come to expect a certain package that makes you you, and if you deviate, you’re not you. Make sense?

Rare is the classic rock act that’s able to zig when it’s expected to zag. Fleetwood Mac famously did it with its jarring “Tusk” album, the follow-up to the radio-friendly smash “Rumours.” Lou Reed did the same with the near career-killing “Metal Machine Music.” And Bruce Springsteen shook up his rousing, full-throttle catalog with the barren collection of demo cuts that comprised “Nebraska.”

But a band like, say, Journey, isn’t capable of such leaps. In its 1980s heyday, Journey had lots going for it, churning out a mix of rock songs and ballads performed with a deft balance between Neal Schon’s screaming guitar and Jonathan Cain’s keyboard hooks. Tying the two together was arguably the band’s biggest asset — Steve Perry’s soaring vocals. Whether at a school dance or on the boombox behind the gas station counter, it was through Perry that you immediately identified a Journey song as a Journey song. It was the one feature the band could not afford to lose.

But that’s exactly what happened. Between the usual “creative differences” squabbles and Perry’s increasing struggles to keep his vocal range as he aged, Journey reluctantly parted ways with its singer on the expectation that he could be replaced. It was easier said than done. The band muddled through two decades of lineup shuffles and comeback attempts before hitting the jackpot with Filipino cover singer Arnel Pineda.

Even for casual or non-Journey fans, the story is an amazing one, and worthy subject matter for the PBS Independent Lens documentary “Don’t Stop Believin’.” After years of getting by with temporary Perry soundalikes, Schon and Cain scoured YouTube to discover their gem in Pineda, whose vocal resemblance to Perry was uncanny. He successfully auditioned, joined the group on tour and cut a new album with them. With Schon and Cain as sharp as ever on their instruments, Pineda provided the final piece of Journey’s wayback machine, bringing fans as close to 1983 as they’ll ever get.

But, as tends to be the case with Independent Lens documentaries, there’s more to this story. We see how Pineda, a considerably younger and less experienced musician, fits in with the band. Schon and Cain tutor their prized protege with the detailed attention of Henry Higgins to Eliza Doolittle. It’s not so much a partnership between co-equals as a business arrangement. It’s clear from the outset — Pineda will enjoy wealth, fame, even some degree of musical development, but with the ironclad condition, and one he willingly accepts, that he must sound like Steve Perry. If he can’t deliver, night in and night out, he’s done. And surely even Schon and Cain realize that without Pineda — or miraculously discovering yet another Perry soundalike — they’re finished as well.

So while “Don’t Stop Believin'” highlights the degree to which a rock group’s brand controls its identity, the deeper philosophical question is the surrender of one man’s identity to another’s. What’s it like to know that one’s raison d’être is to be someone else? By outward appearances, Pineda seems well adjusted to this reality. It certainly beats the alternative of singing covers at Filipino karaoke bars. But it’s a chilling conclusion that his fortunes — and Journey’s — forever answer to the tune called by Steve Perry.