Familiar but fresh

Here in Wisconsin (and I suspect elsewhere in the United States) cover music, especially live cover music, has a suspect reputation. Our local markets are flooded with bands whose primary talent is presenting other, more famous bands’ original recordings, with pristine fidelity. The idea is, if you’ve never had a chance to see the Rolling Stones or the Beatles, here’s the next best thing. This gets the house a-rockin’, but artistically it leaves something to be desired. It’s simply an amped-up version of what you’d hear on the radio.

But cover music has its charms when there’s interpretation going on. It can be a tightrope act, particularly if the original band or song belongs to the canon of the beloved. If you’re going to take on the Stones or Beatles, you have to bring something new to the mix. No one will do what they did as well as they did it. But approached in the spirit of creativity, a cover version can give us two songs for the price of one. Or in the case of Bob Dylan’s “If Not for You,” three songs — versions by Dylan, George Harrison and Olivia Newton-John that are as unique as the artists themselves.

It’s easier when the acts carry distinctly different backgrounds — musically, culturally or even geographically. Consider Erma Franklin’s brassy, bluesy “Piece of My Heart.” Big Brother and the Holding Company’s Janis Joplin brought her own blues chops to the song, but her guttural screeching and the band’s wailing guitars steer the song into heavy metal territory. Listen to English bands like the Moody Blues taking on Bessie Banks’ “Go Now” or the Stones doing Irma Thomas’ “Time Is on My Side,” and you’ll hear subdued, somewhat refined vocals where their predecessors unleashed full-throated, bruising pipes.

Sometimes nationality isn’t the distinction. The Who found Marvin Gaye’s bouncy “Baby Don’t You Do It” a perfect fit for their manic energy, while Gaye’s fellow North Americans The Band re-imagined the song in their trademark rootsy groove. It’s so interesting to hear the different ways people hear the same music.

And it’s not just music. Cinema has a similar tradition, and while I’m not as studied in film as music, most of the remakes I’ve seen — “Sabrina” comes to mind — stick to the script (pun intended). To be sure, you’ll see cosmetic changes, for instance to update clothing, gender roles and other cultural references from one era to another, but there’s no substantive structural alteration. The essential creative direction and message of the film remain intact.

But this winter, I had the opportunity, on back-to-back frosty December nights, to check out the Norwegian psychological thriller “Insomnia,” along with Christopher Nolan’s American remake. Told either way, it’s a taut whodunit involving a hotshot detective dispatched to the near-Arctic to solve a grisly murder, only to come unglued by insomnia in the land of the midnight sun (in the American version, Alaska).

Again, I noticed some cosmetic variations that I’d attribute to the remake being a Hollywood product. The cold-blooded killing of a dog and suggestions of sex involving teenage schoolgirls are scrubbed from the American film. It makes sense. The intended audience is more multiplex than art house.

What’s most striking is the Europeans’ grasp of subtlety, and it begins with the detective. His background, motives, abilities, even his memories become wrapped in layers of ambiguity as sleeplessness peels away at his reality. His American counterpart, played by Al Pacino, is more clearly defined. He carries his own baggage from the Lower 48, but that backstory comes into focus as the plot moves toward resolution, rendering the confusion of insomnia a secondary concern. Just as sharply drawn is the arch-villain, played by Robin Williams, whose actions and intentions, while twisted, make sense. The film’s most effective display of subtlety lies in the blurring, through a shared guilty past, of the evil and good that Williams and Pacino are meant to represent. In that way it’s an existential film, in which two men struggle to preserve their identities amid the moral fog of endless Arctic daylight. Insomnia doesn’t cause the confusion; it merely brings it to the fore.

The Norwegian original, meanwhile, pays less attention to the protagonist-antagonist relationship in favor of the detective’s internal struggle for sanity. Solving the crime becomes less important as day after sleepless day wears down the veneer of certainty. Stellan Skarsgård plays the role with a characteristic Nordic coolness that vacillates between intriguingly mysterious and frustratingly passionless. It’d be interesting to see how the hot-headed and expressive Pacino would’ve approached the role in the Norwegian production.

And then there’s the ending. (Spoiler alert: Stop here if you plan to watch either film.) That Pacino’s detective finds redemption through death is a common American — especially Hollywood — cinematic resolution. That is to say, neat and tidy. Death settles all scores. It acknowledges the detective’s flawed character, finally free of the dark secrets that threatened to consume him, redeemed through heroically bringing Williams — also dead and perhaps free of his demons as well — to justice. No such luck with Skarsgård, whose transgressions are uncovered but ultimately overlooked by the local authorities, leaving him to return home to his day job, and the audience to return home with more questions than answers. How typically European.

So which film works better? Which approach, which cinematic tradition wins out? That’s obviously not the point. It’s the interpretations that matter and the differences that interest me. They help us understand not only where the filmmakers are coming from — both literally and artistically — but how they perceive their audiences. Without those differences, they’re just copying an original, like a flawless, soulless Rembrandt reproduction. Or in musical terms, a cheesy cover band.


Bringing it home

While I’ve never been an academic, I appreciate the connection scholarly research shares with my chosen profession of journalism. They are cousins in their ability to dispel myths and call basic assumptions into question.

Take sociology, where Matthew Desmond’s “Evicted” is a tutorial on the ability of comprehensive field research to recalibrate our understanding of the world. Journalists who haven’t read his book (and I suspect there aren’t many) would be well advised to do so as a refresher on the finer points of objective observation. Not that Desmond completely succeeds in that objectivity (more on that later), but his few failures also hold lessons for us.

Based on Desmond’s field research project in the largest city of my home state — Milwaukee, Wisconsin — “Evicted” investigates the transient nature of the private low-income housing market. Chapters chronicle the struggles of both black and white folks — the former lodged in the city’s near-north side neighborhoods and the latter in a south-side trailer park — to keep a household in this famously segregated city. Desmond introduces us to a renters’ roster of welfare recipients, single moms and drug addicts who shuffle eternally through the city’s low-income housing stock, along with the landlords who engage them in a tense but familiar dance of lease and eviction.

Along the way we encounter heartbreaking mixes of bad luck and deliberate choices that lead people into drug use, prostitution, petty crime and, ultimately, homelessness. How society views these “losers” in the game of life is important, because our perceptions inform the policies we devise as a response. And that’s where the misconceptions, and Desmond’s ability to counter them with his findings, come into play.

For instance:

  • Rents in the inner city — which features dilapidated housing in crime-ridden neighborhoods — rival those of middle-class areas of the city and suburbs. Yet for a number of reasons that range from a rental applicant’s checkered past to outright racism, those middle-class vacancies remain off limits to inner city residents, and landlords take full advantage of a captive market.
  • Because of this, the inner city rental market can be highly profitable. It is not the low-margin business we might expect. Those who are willing to deal with the considerable headaches that come with low-income tenants can find it worth their while. As the book points out, “the ‘hood is good.”
  • The housing is dilapidated because there’s a disincentive for landlords to make repairs for tenants who are behind on rent and halfway out the door, and there’s an equal disincentive for tenants to make their own improvements to a home from which they may soon be evicted. As a result, they live with backed-up sinks that attract vermin, and wear overcoats around the house to weather the winter months.
  • With the imminent threat of eviction, there’s little incentive for tenants who are behind on payments to catch up. Call them deadbeats, but it makes sense. A $700 gambling windfall is better spent on a security deposit for your next place than any back rent to a landlord who’s going to throw you out anyway.
  • As in many cities, the Milwaukee police’s nuisance-call policy discouraged residents from reporting crimes in their homes — particularly domestic abuse. Under the policy (since changed), multiple police visits to a residence would trigger nuisance-property fines for landlords, who would promptly initiate eviction proceedings. The lesson to tenants: Don’t call 911 for help in an emergency, or risk eviction.
  • South-side trailer park residents are often enticed into leases that “give” them ownership of the trailer. They’re only responsible for renting the space in the park. But the tempting illusion of easy home ownership is just that — an illusion. Because the home belongs to the tenant, the arrangement lets the landlord off the hook for any repairs. The tenant is responsible for maintenance and, in the event of an eviction, removal of the unit. Most can’t afford that and simply abandon the trailer, which falls back into the landlord’s possession.
  • The predominantly white residents of the south side trailer park lived at poverty levels comparable with their black counterparts on the north side, yet they stubbornly clung to bigoted attitudes. Though conditions in the trailer park were undeniably foul, they viewed the impoverished north side with the disdain of suburbanites and lived in palpable fear of ever ending up there.
  • Desmond boldly tested this divide by splitting his field work between the trailer park and the north side, where he found south-siders’ prevailing fear of racial animosity to be exaggerated. If anything, white people, while turning heads in the ‘hood, tended to enjoy a particular immunity from the daily violence that defined life there. Desmond’s black landlord fretted mightily over his safety and took great pains to right any wrongs committed against him.

A common thread weaving through these examples is a remarkably rational thought process going on in low-income circles. We typically don’t think of poor people that way. We associate their status with some sort of dysfunction, a failing of the mental faculties the rest of us have. But to a junkie going through heroin withdrawal, spending money on another hit rather than rent is a reasonable response. Buying your child snazzy new shoes you can’t afford, not only because he needs shoes but because you love him, makes perfect sense. Even the seemingly irrational racial divisions of a city like Milwaukee reflect a common understanding among its residents, poor or otherwise, of where societal norms permit you to live. Such realities are important in redefining the scope of low-income housing challenges, if not poverty itself. You can’t go about proposing solutions if you fundamentally misunderstand the problems, and in that regard, Desmond has done a great service. Once we engage the mechanics of low-income housing as essentially rational, it becomes easier for sociologists to propose solutions and policymakers to implement them.

In his final chapter, Desmond pulls back the curtain for an intriguing mea culpa on his project. The field work entailed what we’d expect — living among the residents, first in the trailer park and then on the north side. But in both cases he acknowledges blurring the line between observation and participation. He’d give one of his research subjects a ride to an apartment showing. He’d help another with packing and moving after an eviction. He’d lend someone a few bucks. He admits possibly crossing the line, but he makes a strong argument that living in a neighborhood with people, sharing a bathroom, meals, the streets — sharing their struggles and being a part of their lives — renders the notion of objectivity an abstract and somewhat useless academic conceit. Fair point.

There are journalists who also test the limits of objectivity — for instance, the “gonzo” practitioners who insert themselves into the story they’re telling, or “solutions-based journalists” who write and report with the purpose of effecting social change. I’ve always believed that a journalist shouldn’t be part of the story, but these methods, while unorthodox, can be instrumental to achieving access, which is instrumental to painting an accurate picture. As a trained sociologist, Desmond seems to be comfortable with his overall conduct, and I can’t argue with the results. His interpersonal relations with sources were essentially about access, without which their stories could not have been told, and our misguided perceptions would have gone on unchecked. And that’s what turns a heady research project about poverty and housing into a book about people and homes.

On the cold side of history

Newsrooms can be fairly casual office environments. It’s common to see colleagues come and go in jeans, sweatshirts, shorts, T-shirts and maybe a baseball cap. Still, I try to maintain a degree of professionalism with dress shirts and khakis, sometimes a sweater and on rare occasions a sports jacket. The idea is you never know who you’ll run into in the office.

But there are days, typically Fridays, when I join the laid-back crowd, strolling into the office with well-worn jeans, a long-sleeve T or henley shirt and comfortable tennis shoes. It’s Friday. Who’s gonna care?

Yet one Friday this autumn, it suddenly mattered. I arrived at my desk to find various elements of a film crew working around the newsroom. I immediately shrank down into my chair to keep myself out of sight lines. With cameras passing left and right, I caught an editor’s attention and quietly got the scoop: It was an NFL Films crew shooting footage for a documentary timed with the 50th anniversary of the “Ice Bowl” NFL championship game in Green Bay.

WATCH: “The Timeline: The Ice Bowl”

As I contemplated an escape route to an empty office, the editor brought the film director by for what I thought would be a cursory introduction. Instead, the director spent several gracious minutes with me and a couple other editors sharing stories in a slight Texas drawl. He was Michael Meredith, son of the NFL and broadcasting legend Don Meredith, who as quarterback of the Dallas Cowboys was at Lambeau Field for that frigid championship game on Dec. 31, 1967. The Packers’ 21-17 victory, generally viewed as one of the greatest games in NFL history, cemented coach Vince Lombardi and quarterback Bart Starr as icons. It’s what put the glory in Green Bay’s glory years.

The game held outsized impact for Dallas, too, and that was the subject of the younger Meredith’s documentary. For him, the project was personal as he sifted through the reverberations of that loss for his father and the Cowboys community. Points of interest:

  • The Ice Bowl was the second heartbreaking championship defeat in the calendar year for Dallas. On Jan. 1, 1967, the Packers held off the Cowboys — this time the host team at the Cotton Bowl — 34-27, with a late Meredith interception sealing the victory. For a team and city desperately struggling to escape the shadows of the JFK assassination, the loss left Dallas fans feeling snake-bitten.
  • While the Ice Bowl is cherished with reverence throughout Wisconsin, the game signaled a changing of the guard. Green Bay was an aging team. Lombardi would soon leave for a short stint in Washington, and Starr would retire not long after. The future belonged to the younger, faster Cowboys. The “star” that Meredith helped put on the NFL map would blossom into the wildly successful “America’s team” that produced two decades of winning, a pair of Super Bowl titles and a lucrative world-wide commercial brand that lives on today. The Packers, meanwhile, languished in mediocrity until the 1990s.
  • For all the beaming pride bestowed on the Packers’ gutsy performance in the Ice Bowl, the offense struggled mightily for long stretches of the game. After jumping out to a 14-0 first-quarter lead, Green Bay did virtually nothing until the final drive. To its credit, Dallas shook off its miserable start and erased the early deficit, but the Cowboys only produced one offensive touchdown, coming on a halfback option pass that caught the Packers’ secondary off guard. The other points came courtesy of a Starr sack/fumble and a field goal set up by a botched Green Bay punt return. In other words, this game was not an offensive showcase. But keep in mind, it was 12 degrees below zero.
  • It’s hard to fault the Packers for envisioning a future coach in Starr. On the opening touchdown, his audible for a slant pass to Boyd Dowler badly fooled the Dallas coverage, leaving Dowler open for the easy score. Starr also called the fabled game-winning sneak. He proposed it during the team’s final timeout, getting Lombardi’s blessing with “run it and let’s get the hell out of here.” It suggests that Starr, not Lombardi, ran the team while it was on the field. Yet, given his own chance at the helm, Starr would go on to lead the Packers through much of the “gory years” of the 1970s and ’80s. There’s more to coaching than play-calling.
  • There’s some question as to whether Packers guard Jerry Kramer made a false start on the Starr sneak. The replay does seem to show early movement by Kramer, whose block paved the way for Starr’s score. Had the penalty been called, the Packers could’ve tried again from 5-1/2 yards out or, with time dwindling and no timeouts, opted for a chip shot field goal, sending the coldest game in the history of the NFL into overtime.

Meredith took both 1967 losses as personal failures, and he retired just a year after the Ice Bowl at the age of 31. He went on to become a long-time broadcaster on “Monday Night Football,” where his folksy charm served him well opposite the wordy Howard Cosell and straight-man Frank Gifford. And while he pursued a successful acting career, he steadfastly avoided roles that cast him as an ex-athlete. He made it clear that when he retired, he was done with football.

The connection between Meredith’s career arc and the game that largely defined it lurks within the subtext of “Timeline.” This is where it gets personal for the director. His father, who died in 2010, remained tight-lipped about the game that arguably broke his spirit. Yet the younger Meredith regrets not asking him about it, and the documentary is largely a product of trying to fill in the gaps.

I couldn’t help but sense that filial mission as we chatted around my desk that Friday in autumn. Michael inherited a healthy dose of his dad’s affable nature, regaling us with some of the tidbits that his father did share with him. How the family had to hire security guards for their home after a Cowboys loss. And conversely, how fans wanted Meredith for governor after a victory. And somewhat scandalously to us in Green Bay, how Lombardi, earlier in the 1960s and hungry for titles, had proposed a trade with the Cowboys — Starr, plus some extras, for Meredith. Who knows how serious Lombardi was, but Starr was essentially unproven at that point, with Meredith likely showing more potential, so it’s possible it was a legitimate offer that Dallas rejected.

Eventually, the younger Meredith — an accomplished film director in his own right — needed to get down to filming, leaving me briefly regretting not telling him the fond memories I have of listening to his dad defuse Cosell on “Monday Night Football.” I worried that my personal association of “Dandy Don” as a TV celebrity — I’m too young to have seen him play — would be dismissive of his distinguished NFL career. I also indulged the journalist’s instinct to refrain from talking and keep listening. That’s always a good call.

As for the day’s filming, I’m happy to report I share screen time with the likes of Jerry Kramer, Dave Robinson, Mel Renfro and Roger Staubach. But to spot me, you’ll need to do a frame-by-frame advance during the half-second when my back appears as Michael Meredith and our editor walk past my desk. Just look for the guy in the rumpled red shirt.


Boring? Just add scoring

Some of the best workplace discussions are those that have nothing to do with work: the so-called “water-cooler” conversation that greases the gears of the white-collar domain.

While office managers may look on with concern over lost productivity, they’re often as guilty as the rest, joining subordinates in Monday-morning-quarterbacking the big game or hashing out the best Christmas movie of all time. Perhaps those managers recognize some benefit to the occasional distraction from the day’s business. Surely it won’t matter if that TPS report is filed a few minutes late.

Our office is no exception, and I found it noteworthy to observe, twice in the last two weeks, the topic of “what is the most boring sport?” luring colleagues within earshot from their respective desks. For an undefined period of time (but we know when it’s over), work duties are set aside as the question becomes top priority. And when the designated BS session inevitably ends with no conclusive finding, we resume our paid duties in the hope that something we said got through. Well, given that this debate has come up twice, it’s apparent that, as with many of our legitimate meetings, nothing was settled and the oral arguments will resume at an appropriate lull in the future.

So I want to be ready. I’ve given the “boring sport” topic more thought than I’d care to admit, digging into what exactly it is about athletic contests that appeals to us. Call it defining the terms.

My theory in short: it comes down to scoring. It’s the essence of sports, the only way to objectively separate winner from loser. For purposes of our office debate, we’re talking about the “big five,” a predominant subset within the realm of organized spectator sports for which game progression and scoring operate independently of each other. Baseball, basketball, football, hockey and soccer all can advance and sometimes even end with or without scoring by either team. Contrast this with tennis or golf, for which scoring plays are integrated into the progression of the match. One can’t happen without the other.

Because of this difference, the big five have widely varying rates of scoring. That’s the crux of our “boring” question. The scoring play represents the ultimate gratification of our emotional investment in the game. How often that happens, and its degree of significance in the contest, largely accounts for our overall interest in the sport.

If I worked at fivethirtyeight.com, I’d have come up with a chart to illustrate the differences in the big five based on scoring frequency (or alternately, scoring ease). But let’s hammer it out in words. On one end of the spectrum is basketball, where teams score on a large percentage of their possessions. Accordingly, the value of a score is diminished. It’s for fans who have a high tolerance for repeated but small slices of gratification, with tons of baskets made, but not much interest in correlating a particular basket with the final outcome.

On the opposite end are soccer and hockey, where scoring a goal is comparatively difficult and therefore rare. One goal is a big deal. Two or three goals can be an insurmountable lead. And tie games are a common occurrence all teams and fans must accept. That’s OK, though. There’s nuanced gratification in a team’s ability to create scoring chances, even when they don’t hit home. Or in playing defense, which functions on the understanding that preventing an opponent from scoring is itself a form of scoring. And when there’s an actual goal? Let’s just say the celebration is more pronounced than when a basketball player sinks a jumper.

And then in the middle are football and baseball. Scoring tends to happen with moderate frequency, but typically not more than 6-7 times per team per game. That means a touchdown or home run can be the difference, but often they aren’t, meaning you may have to watch to the end to be certain. It gives fans a healthy frequency of gratification with enough “droughts” in between to give that gratification meaning.
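For the statistically inclined, that spectrum is easy enough to sketch without a fivethirtyeight staff job. The per-game figures below are my own rough ballparks for scoring events per team (made baskets, runs, touchdowns plus field goals, goals), not official league statistics — the point is only the ordering, not the decimals:

```python
# Back-of-the-envelope version of the chart I never made.
# Values are rough, illustrative estimates of scoring events
# per team per game, NOT official league data.
ROUGH_SCORES_PER_TEAM_PER_GAME = {
    "basketball": 40.0,  # made field goals
    "baseball": 4.5,     # runs
    "football": 4.5,     # touchdowns plus field goals
    "hockey": 3.0,       # goals
    "soccer": 1.3,       # goals
}

def scoring_spectrum(rates):
    """Order the big five from most frequent scoring to least."""
    return sorted(rates, key=rates.get, reverse=True)

print(scoring_spectrum(ROUGH_SCORES_PER_TEAM_PER_GAME))
```

Basketball lands alone at the high-gratification end, soccer at the other, with baseball and football clustered in the middle — which is exactly the spectrum described above.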

So what’s the most interesting sport? I have my own views, but in terms of pure popularity, it appears basketball, with its many, many scoring plays to cheer, might have the brightest future. Repeated gratification, even in small doses, is the driving factor. The NFL and Major League Baseball seem to understand this as they continue to tinker with their rules to favor offenses and ease scoring. You can’t have bat flips and end zone celebrations without home runs and touchdowns. Soccer and hockey, with their well-defined parameters and global audiences, have limited options (and limited appetite) for change and therefore will likely maintain limited appeal here in the U.S., where fans want to see points.

None of this accounts for the other myriad subjective and intangible attractions of athletic contests that capture people’s interest. It might be the sheer speed thrill of downhill skiing, the muscular artistry of gymnastics or the primal brutality of mixed martial arts. We like them… well, because we like them. It’s probably a social phenomenon worthy of discussion, but let’s save that for another day. I should really be getting back to work.

Short and sweet

“I’ll be brief,” said the guest presenter for a film screening I attended several years ago. He followed with a good 20 minutes of exposition on the topic at hand, after which he released us, exhausted, to finally enjoy the movie.

Clearly, some people have a different notion of “brief” than others. As a journalist, I understand the value of brevity in a vocation that demands it. I take great pleasure in paring four sentences down to two and dismissing five colorful but unnecessary words in favor of the essential one.

So imagine the consternation a couple weeks ago when Twitter revealed it was expanding, on a limited test basis, some users’ character limit from 140 to 280. Twitter is a favorite social media tool among journalists because it demands brevity and rewards word efficiency. You want description and clarity, but you also need to get to the point, and fast. It is a practiced art that few of us will ever master, but it’s a lot of fun to try. It forces good writers to be better and bad writers to… well, stick to Facebook.

There are greater problems in the world than Twitter’s character limit, and we’ll individually adjust if the change comes to pass (I suspect many of us will start out by voluntarily restricting ourselves to the original 140). But it’s worth entertaining a discussion on the value of brevity across the creative spectrum.

Ernest Hemingway famously challenged writers to think not only in terms of what they’re saying but what they’re not. Most of us know his famous six-word short story (“For sale: baby shoes, never worn.”), but here’s an excerpt from “The Sun Also Rises” that’s a little less provocative:

I walked away from the cafe. They were sitting at the table. I looked back at them and at the empty tables. There was a waiter sitting at one of the tables with his head in his hands.

That’s an interesting observation of a person (the waiter) we don’t encounter anywhere else in the novel. Is he distraught? Annoyed? Tired? Amid 247 pages of drunken carousing, it’s what Hemingway doesn’t say that captivates the reader.

Let’s look at pop music. In the transition from vinyl to CD (and later digital iterations), recording artists found themselves free from the constraint of five 3-to-4-minute songs, per side, offered by vinyl records (or cassette tapes). As a result, we got 70-to-80-minute CDs and now, virtually unlimited packages of official releases dressed up with alternate takes, outtakes and other assorted recordings. So what’s lost? The need to curate. It’s fascinating to read about decisions that went into the production of vinyl releases, and how musicians and their labels negotiated what made the cut and what had to be left out. When artists are forced to assess and prioritize the quality of their work, you tend to get their best.

Let’s move on to cinema, where the supposed gold standard is the feature-length (90 to 120 minutes) fictional narrative. Yet recently I saw the French short film “Uncanny Valley,” set on a World War I battlefield, that was shot entirely in stop-action photography. It was an interesting, if experimental, tactic that worked well for its 13:30 running time, but there’s no way it could have been sustained for an hour or two. Feature-length is not always the ideal format, yet at the Oscars or Cannes, those are the only films anyone talks about.

Let’s go back to our guest speaker. It’s a conceit in academics that more words equal more information, but anyone who has sat through their share of college lectures knows this isn’t always true. Even in politics, we’re taught that any speech of significance, like the president’s State of the Union, can’t be great unless it’s at least an hour long. But consider that one of the greatest political speeches in our history, Abraham Lincoln’s Gettysburg Address, weighed in at 271 words and was delivered in less than three minutes. That was a man who knew how to get to the point without sacrificing eloquence.

It’s unfortunate that we live in a world that equates “more” with “better.” An all-you-can-eat buffet will always turn more heads than a small plate of fine cheeses, and that view is widely shared among the arts. Still, our guest presenter’s “I’ll be brief” tells me he understood the audience expectation for his opening remarks. The next time we just need to hold him to it.

After further review

At some point I suppose I have to reckon with my lukewarm regard for The Band as a conscious and foolish decision to deprive myself of one of the endearing figures in rock music. The first step is admitting the problem; the second is taking a proactive step. So I did this week by picking up a DVD of the 1978 classic concert film “The Last Waltz,” shot at The Band’s 1976 farewell performance at the Winterland Ballroom in San Francisco. The movie’s mystique, carried by the premise of a beloved group calling it quits at its artistic peak, endures in critical circles as both a filmmaking and musical triumph. That it was shot by Martin Scorsese, who already had one of the best movies ever made to his credit in “Taxi Driver,” only enhanced the mystique.

Was it everything I expected? Sure, I’ll go with that. It was what it was: a finely tuned ensemble of veteran musicians commanding a wide variety of American musical styles with a loaded roster of guest players, captured in action via expert camera work. But much like 1970’s “Woodstock” (which featured a young Scorsese among its crew), it was the pieces rather than the whole that interested me. An unintended guitar lick. Singers harmonizing with careless perfection. A smile (or scowl) shared across the stage between bandmates. A moment of “I’m-too-old-for-this” candor amid the bravado of backstage interviews. These are the slices of life Scorsese captures so well.

Here then are some snippets that stood out, along with some general observations on the film:

1. Scorsese is a master. Yes, we already knew that, but his visual grasp of the concert stage is remarkable. There are a lot of moving parts during a rock show, yet the camera angles and focal points are invariably spot on.

2. A young Ray Liotta could’ve played Robbie Robertson. I kid of course — it’s a documentary, so Robbie makes the perfect Robbie. But it’s worth wondering whether Scorsese’s experience with his lead subject stuck with him when casting Liotta for 1990’s “Goodfellas.”

3. Neil Young has the goofiest grin.

4. Joni Mitchell has the cutest grin.

5. The Band’s good-guy, anti-rock star reputation was for real. It always struck me that they never seemed to have a true leader, a front man through whom the group’s energy was channeled. On the stage you see it: a conspicuous absence of ego in a profession that rewards, even demands, self-promotion. More amazingly, this odd humility rubs off on their musical guests, some of whom bring significant star power to the gig. One after another, top guns like Eric Clapton, Van Morrison and Muddy Waters shuffle onto the stage with shy grins, do their song with The Band, and depart with little more than a wave. No bows. No preening. No adulation. In a look-at-me business, it’s refreshing.

6. That said, I quickly got the sense that Robertson was pulling the strings. He has a low-key manner about him, but on stage and backstage, he holds the center of gravity. It turns out I was onto something — the final credits list Robertson as producer.

7. Robertson is a much better live guitarist than I’d imagined based on Band records I’ve heard.

8. Levon Helm was one of the great singing drummers of rock ’n’ roll. Much like the Eagles’ Don Henley, Helm could rightfully have held the title of lead singer had he not been stashed behind his kit. Not that I’m blaming The Band for doing so — his drum work brings pop to the group’s somewhat slumbering vibe. But his ability to turn a lazy growl into a plaintive wail and back is a unique bonus.

9. Scorsese did humanity a favor by committing the live rendition of “The Night They Drove Old Dixie Down” to celluloid immortality. It reminded me of Crosby, Stills & Nash’s “Suite: Judy Blue Eyes” performance in “Woodstock,” in which Stephen Stills’ spellbinding guitar work made me completely forget that I’ve never much cared for that song. Likewise, I’ve always had mixed feelings about “Dixie” and its sympathetic view toward defeated Southern rebels, but the musical mastery on display — particularly the impassioned delivery from Southerner Helm — can’t be denied.

10. If there was any truth to what I’ve read of Marvin Gaye’s legendary insecurity, he had to be chewing his nails off hearing The Band reconstruct his “Don’t Do It” into a brassy, slow-groove masterpiece. This wasn’t the studio. This was one take, on stage, and they nailed it. That band was tight.

11. Van Morrison… thud. The first blip in the film arises in his inability to read the band’s wind-down sequence to “Caravan.” After some confused warbling at the mic, Van and The Band ultimately bring it home, but the moment was there, and the show lost some of its luster.

12. Bob Dylan… double thud. Remember what I said about lack of ego and preening? Check that. For a pair of songs, along with the ensemble concert finale, Dylan jealously projects himself in front of the entire venture, and for those few minutes, I genuinely lost interest in the film. Unfortunately for Scorsese, that’s how the show ends, leaving me to disappointedly scan back through the DVD for more congenial chapters. Theatergoers in 1978 didn’t have that option.

It’s interesting that a while back I blogged, somewhat carelessly, a list of revered musical acts I’m supposed to like but don’t, and wouldn’t you know it, leading off were none other than Van Morrison, Bob Dylan and The Band. “The Last Waltz” shows in part why the first two earned my scorn, but I now realize The Band, mixed bag that it is for me, is better than that.

I can objectively say The Band’s greatest asset is its musical breadth. It’s difficult to place the group into an identifiable style or genre, and the guest pairings in “The Last Waltz” hammer this point home. Ostensibly a rock ’n’ roll outfit (the movie opens with a title card stating “This film should be played loud!”), The Band effortlessly slips into blues (with Paul Butterfield, Eric Clapton, Muddy Waters), yet shows itself equally comfortable doing folk rock (with Neil Young for “Helpless”), jazz (“Coyote” with Joni Mitchell) and even its own gospel-tinged “The Weight” with the Staple Singers. It may be a chameleon act of sorts, but while most chameleon acts lack a core, that’s not the case with The Band. Their feet are firmly planted in American roots rock (interesting for a group of mostly Canadians), and their catalog constitutes a road map of 20th-century American music — the country, blues, jazz and gospel pieces, and the rock ’n’ roll they combined to create. Perhaps it’s their versatility that causes problems for me, but that’s my problem. “The Last Waltz” gave The Band darling status in perpetuity, and through Scorsese’s cameras, we see a band quite deserving of it.

Guitar hero

There’s a scene in the 2014 documentary “Glen Campbell: I’ll Be Me” that goes something like this:

Campbell, the faded country-crossover musician battling Alzheimer’s disease, is in the doctor’s office for a checkup. The doctor asks him to wiggle his fingers like he’s playing the piano, to which Campbell retorts, “But I’m a guitar player.”

Somehow, through the fog of Alzheimer’s, a lucid, wise-cracking Campbell sparkles with trademark Southern sass, right down to the folksy pronunciation of “GIH-tar.” And then just as quickly, the fog returns.

It was a phenomenon with which I was familiar, having watched my father spend his final years succumbing to Alzheimer’s, only to fire the occasional spark of humor or recollection. For anyone desperately hoping to see the return of the person they knew, it’s easy to be seduced by notions of an unexplained reversal, or perhaps a miraculous misdiagnosis. But my siblings and I knew the diagnosis to be correct, and we also knew there was no reversing the course of this disease.

I was in the doctor’s office with my dad when the diagnosis came. It was matter-of-fact, even cold, considering the weight of such news. Dad took it with puzzled acceptance and his trademark good cheer. I’ll never know whether he really understood the implications. I didn’t ask. It would be devastating to process, and he could be excused for shrugging it off the way he would a poor glucose test. Shortly after the doctor’s visit, we had lunch at his favorite diner, the “dad jokes” flying fast and furious. I wasn’t in a laughing mood, but even then, he scored a couple of direct hits that had me cracking up. For a moment the old Dad sparkled, and for a moment I entertained notions of a reversal or misdiagnosis.

“I’ll Be Me” features an eerily similar doctor’s visit for Campbell, who meets the presentation of MRI scans and medical mumbo jumbo with the too-eager “uh-huhs” and “oh yeahs” of a student in biology class. Like my dad, he didn’t show it, but I suspect he knew what all of it meant. And yet, he responded by launching a last-hurrah tour and agreeing to have filmmakers document it. The burning question is: why?

His reasons weren’t clear to me, but perhaps Campbell wanted to give Alzheimer’s a face. Not just one face, but many faces — the sufferer, his family, friends or anyone else affected by the robbing of a person’s mind in the most brutal and heartless way imaginable. The film doesn’t sugarcoat the increasing difficulties Campbell has with touring, and the accompanying headaches his family navigates in managing their erratic headliner. There are some very uncomfortable moments in “I’ll Be Me,” and I struggled with the family’s insistence on going forward with a tour that reduced this once-great performer to a lost, confused, occasionally angry high-wire circus act robbed of his most important remaining possession — his dignity. I had to take it on faith that the entire venture, while endlessly skirting disaster, was in accordance with Campbell’s wishes.

When Glen Campbell died last week at the age of 81, I immediately thought of “I’ll Be Me.” I was aware of his tremendous musical legacy, beginning with his early days as a Beach Boys fill-in before breaking through with solo hits like “Wichita Lineman” and “Gentle on My Mind.” His defining “Rhinestone Cowboy,” a radio staple during my youth, made him a bona fide pop star, but it wasn’t until later that I discovered what a marvelous GIH-tar player Campbell was. And yet, it was “I’ll Be Me” that gave real meaning to his death. I felt not only grief but relief at the peace that had finally come to him and his family.

Performing artists succeed as pop stars because of their ability to effectively convey common themes. Through their expression of universal emotions, we feel less alone, and that’s why we listen to them. Before I watched “I’ll Be Me,” I stupidly regarded our family’s experience with our dad as somehow unique, a bizarre and embarrassing dysfunction that outsiders couldn’t comprehend. The film was a revelation, and perhaps that answers the “why” that nagged me as I began to recognize so much of my father on the screen. Campbell was conveying — in fact demonstrating — an experience that’s more universal than any of us imagined, and most importantly, true to his chosen vocation, by sharing his pain and loss, he made me feel less alone in mine.