Short and sweet

“I’ll be brief,” said the guest presenter for a film screening I attended several years ago. He followed with a good 20 minutes of exposition on the topic at hand, after which he released us, exhausted, to finally enjoy the movie.

Clearly, some people have a different notion of “brief” than others. As a journalist, I understand the value of brevity in a vocation that demands it. I take great pleasure in paring four sentences down to two and dismissing five colorful but unnecessary words in favor of the essential one.

So imagine my consternation a couple of weeks ago when Twitter revealed it was expanding, on a limited test basis, some users’ character limit from 140 to 280. Twitter is a favorite social media tool among journalists because it demands brevity and rewards word efficiency. You want description and clarity, but you also need to get to the point, and fast. It is a practiced art that few of us will ever master, but it’s a lot of fun to try. It forces good writers to be better and bad writers to… well, stick to Facebook.

There are greater problems in the world than Twitter’s character limit, and we’ll individually adjust if the change comes to pass (I suspect many of us will start out by voluntarily restricting ourselves to the original 140). But it’s worth entertaining a discussion on the value of brevity across the creative spectrum.

Ernest Hemingway famously challenged writers to think not only in terms of what they’re saying but what they’re not. Most of us know his famous six-word short story (“For sale: baby shoes, never worn.”), but here’s an excerpt from “The Sun Also Rises” that’s a little less provocative:

I walked away from the cafe. They were sitting at the table. I looked back at them and at the empty tables. There was a waiter sitting at one of the tables with his head in his hands.

That’s an interesting observation of a person (the waiter) we don’t encounter anywhere else in the novel. Is he distraught? Annoyed? Tired? Amid 247 pages of drunken carousing, it’s what Hemingway doesn’t say that captivates the reader.

Let’s look at pop music. In the transition from vinyl to CD (and later digital iterations), recording artists found themselves free from the constraint of five 3-to-4-minute songs per side imposed by vinyl records (or cassette tapes). As a result, we got 70-to-80-minute CDs and now, virtually unlimited packages of official releases dressed up with alternate takes, outtakes and other assorted recordings. So what’s lost? The need to curate. It’s fascinating to read about the decisions that went into the production of vinyl releases, and how musicians and their labels negotiated what made the cut and what had to be left out. When artists are forced to assess and prioritize the quality of their work, you tend to get their best.

Let’s move on to cinema, where the supposed gold standard is the feature-length (90 to 120 minutes) fictional narrative. Yet recently I saw the French short film “Uncanny Valley,” set on a World War I battlefield and shot entirely in stop-motion photography. It was an interesting, if experimental, tactic that worked well for the film’s 13:30 running time, but there’s no way it could have been sustained for an hour or two. Feature-length is not always the ideal format, yet at the Oscars or Cannes, those are the only films anyone talks about.

Let’s go back to our guest speaker. It’s a conceit in academia that more words equal more information, but anyone who has sat through their share of college lectures knows this isn’t always true. Even in politics, we’re taught that any speech of significance, like the president’s State of the Union, can’t be great unless it’s at least an hour long. But consider that one of the greatest political speeches in our history, Abraham Lincoln’s Gettysburg Address, weighed in at 271 words and was delivered in less than three minutes. That was a man who knew how to get to the point without sacrificing eloquence.

It’s unfortunate that we live in a world that equates “more” with “better.” An all-you-can-eat buffet will always turn more heads than a small plate of fine cheeses, and that view is widely shared across the arts. Still, our guest presenter’s “I’ll be brief” tells me he understood the audience’s expectation for his opening remarks. Next time, we just need to hold him to it.


After further review

At some point I suppose I have to reckon with my lukewarm regard for The Band as a conscious and foolish decision to deprive myself of one of the most endearing figures in rock music. The first step is admitting the problem; the second is doing something about it. So I did this week by picking up a DVD of the 1978 classic concert film “The Last Waltz,” shot at The Band’s 1976 farewell performance at the Winterland Ballroom in San Francisco. The movie’s mystique, carried by the premise of a beloved group calling it quits at its artistic peak, endures in critical circles as both a filmmaking and musical triumph. That it was shot by Martin Scorsese, who already had claim to one of the best movies ever made in “Taxi Driver,” only enhanced the mystique.

Was it everything I expected? Sure, I’ll go with that. It was what it was: a finely tuned ensemble of veteran musicians commanding a wide variety of American musical styles with a loaded roster of guest players, captured in action via expert camera work. But much like 1970’s “Woodstock” (which featured a young Scorsese among its crew), it was the pieces rather than the whole that interested me. An unintended guitar lick. Singers harmonizing with careless perfection. A smile (or scowl) shared across the stage between bandmates. A moment of “I’m-too-old-for-this” candor amid the bravado of backstage interviews. These are the slices of life Scorsese captures so well.

Here then are some snippets that stood out, along with some general observations on the film:

1. Scorsese is a master. Yes, we already knew that, but his visual grasp of the concert stage is remarkable. There are a lot of moving parts during a rock show, yet the camera angles and focal points are invariably spot on.

2. A young Ray Liotta could’ve played Robbie Robertson. I kid, of course — it’s a documentary, so Robbie makes the perfect Robbie. But it’s worth wondering whether Scorsese’s experience with his lead subject stuck with him when casting Liotta for 1990’s “Goodfellas.”

3. Neil Young has the goofiest grin.

4. Joni Mitchell has the cutest grin.

5. The Band’s good-guy, anti-rock star reputation was for real. It always struck me that they never seemed to have a true leader, a front man through whom the group’s energy was channeled. On the stage you see it: a conspicuous absence of ego in a profession that rewards, even demands, self-promotion. More amazingly, this odd humility rubs off on their musical guests, some of whom bring significant star power to the gig. One after another, top guns like Eric Clapton, Van Morrison and Muddy Waters shuffle onto the stage with shy grins, do their song with The Band, and depart with little more than a wave. No bows. No preening. No adulation. In a look-at-me business, it’s refreshing.

6. That said, I quickly got the sense that Robertson was pulling the strings. He has a low-key manner about him, but on stage and backstage, he holds the center of gravity. It turns out I was onto something — the final credits list Robertson as producer.

7. Robertson is a much better live guitarist than I’d imagined based on Band records I’ve heard.

8. Levon Helm was one of the great singing drummers of rock ‘n’ roll. Much like the Eagles’ Don Henley, Helm could rightfully have held the title of lead singer had he not been stashed behind his kit. Not that I’m blaming The Band for doing so — his drum work brings pop to the group’s somewhat slumbering vibe. But his ability to turn a lazy growl into a plaintive wail and back is a unique bonus.

9. Scorsese did humanity a favor by committing the live rendition of “The Night They Drove Old Dixie Down” to celluloid immortality. It reminded me of Crosby, Stills & Nash’s “Suite: Judy Blue Eyes” performance in “Woodstock,” in which Stephen Stills’ spellbinding guitar work made me completely forget that I’ve never much cared for that song. Likewise, I’ve always had mixed feelings about “Dixie” and its sympathetic view of defeated Southern rebels, but the musical mastery on display — particularly the impassioned delivery from Southerner Helm — can’t be denied.

10. If there was any truth to what I’ve read of Marvin Gaye’s legendary insecurity, he had to be chewing his nails off hearing The Band reconstruct his “Don’t Do It” into a brassy, slow-groove masterpiece. This wasn’t the studio. This was one take, on stage, and they nailed it. That band was tight.

11. Van Morrison… thud. The first blip in the film arises in his inability to read the band’s wind-down sequence to “Caravan.” After some confused warbling at the mic, Van and The Band ultimately bring it home, but the damage was done, and the show loses some of its luster.

12. Bob Dylan… double thud. Remember what I said about lack of ego and preening? Check that. For a pair of songs, along with the ensemble concert finale, Dylan jealously projects himself in front of the entire venture, and for those few minutes, I genuinely lost interest in the film. Unfortunately for Scorsese, that’s how the show ends, leaving me to disappointedly scan back through the DVD for more congenial chapters. Theatergoers in 1978 didn’t have that option.

It’s interesting that a while back I blogged, somewhat carelessly, a list of revered musical acts I’m supposed to like but don’t, and wouldn’t you know it, leading off were none other than Van Morrison, Bob Dylan and The Band. “The Last Waltz” shows in part why the first two earned my scorn, but I now realize The Band, mixed bag that it is for me, is better than that.

I can objectively say The Band’s greatest asset is its musical breadth. It’s difficult to place the group into an identifiable style or genre, and the guest pairings in “The Last Waltz” hammer this point home. Ostensibly a rock ‘n’ roll outfit (the movie opens with a title card stating “This film should be played loud!”), The Band effortlessly slips into blues (with Paul Butterfield, Eric Clapton, Muddy Waters), yet shows itself equally comfortable doing folk rock (with Neil Young for “Helpless”), jazz (“Coyote” with Joni Mitchell) and even its own gospel-tinged “The Weight” with the Staple Singers. It may be a chameleon act of sorts, but while most chameleon acts lack a core, that’s not the case with The Band. Their feet are firmly planted in American roots rock (interesting for a group of mostly Canadians), and their catalog constitutes a road map of 20th century American music — the country, blues, jazz and gospel pieces, and the rock ‘n’ roll that binds them together. Perhaps it’s their versatility that causes problems for me, but that’s my problem. “The Last Waltz” gave The Band darling status in perpetuity, and through Scorsese’s cameras, we see a band quite deserving of it.

Guitar hero

There’s a scene in the 2014 documentary “Glen Campbell: I’ll Be Me” that goes something like this:

Campbell, the faded country-crossover musician battling Alzheimer’s disease, is in the doctor’s office for a checkup. The doctor asks him to wiggle his fingers like he’s playing the piano, to which Campbell retorts, “but I’m a guitar player.”

Somehow, through the fog of Alzheimer’s, a lucid, wise-cracking Campbell sparkles with trademark Southern sass, right down to the folksy pronunciation of “GIH-tar.” And then just as quickly, the fog returns.

It was a phenomenon with which I was familiar, having watched my father spend his final years succumbing to Alzheimer’s, only to fire the occasional spark of humor or recollection. For anyone desperately hoping to see the return of the person they knew, it’s easy to be seduced by notions of an unexplained reversal, or perhaps a miraculous misdiagnosis. But my siblings and I knew the diagnosis to be correct, and we also knew there was no reversing the course of this disease.

I was in the doctor’s office with my dad when the diagnosis came. It was matter-of-fact, even cold, considering the weight of such news. Dad took it with puzzled acceptance and his trademark good cheer. I’ll never know whether he really understood the implications. I didn’t ask. It would be devastating to process, and he could be excused for shrugging it off the way he would a poor glucose test. Shortly after the doctor’s visit, we had lunch at his favorite diner, the “dad jokes” flying fast and furious. I wasn’t in a laughing mood, but even then, he scored a couple of direct hits that had me cracking up. For a moment the old Dad sparkled, and for a moment I entertained notions of a reversal or misdiagnosis.

“I’ll Be Me” features an eerily similar doctor’s visit for Campbell, who meets the presentation of MRI scans and medical mumbo jumbo with the too-eager “uh-huhs” and “oh yeahs” of a student in biology class. Like my dad, he didn’t show it, but I suspect he knew what all of it meant. And yet, he responded by launching a last-hurrah tour and agreeing to have filmmakers document it. The burning question is: why?

His reasons weren’t clear to me, but perhaps Campbell wanted to give Alzheimer’s a face. Not just one face, but many faces — the sufferer, his family, friends or anyone else affected by the robbing of a person’s mind in the most brutal and heartless way imaginable. The film doesn’t sugarcoat the increasing difficulties Campbell has with touring, and the accompanying headaches his family navigates in managing their erratic headliner. There are some very uncomfortable moments in “I’ll Be Me,” and I struggled with the family’s insistence on going forward with a tour that reduced this once-great performer to a lost, confused, occasionally angry high-wire circus act robbed of his most important remaining possession — his dignity. I had to take it on faith that the entire venture, while endlessly skirting disaster, was in accordance with Campbell’s wishes.

When Glen Campbell died last week at the age of 81, I immediately thought of “I’ll Be Me.” I was aware of his tremendous musical legacy, beginning with his early days as a Beach Boys fill-in before breaking through with solo hits like “Wichita Lineman” and “Gentle on My Mind.” His defining “Rhinestone Cowboy,” a radio staple during my youth, made him a bona fide pop star, but it wasn’t until later that I discovered what a marvelous GIH-tar player Campbell was. And yet, it was “I’ll Be Me” that gave real meaning to his death. I felt not only grief but relief at the peace that had finally come to him and his family.

Performing artists succeed as pop stars because of their ability to effectively convey common themes. Through their expression of universal emotions, we feel less alone, and that’s why we listen to them. Before I watched “I’ll Be Me,” I stupidly regarded our family’s experience with our dad as somehow unique, a bizarre and embarrassing dysfunction that outsiders couldn’t comprehend. The film was a revelation, and perhaps that answers the “why” that nagged me as I began to recognize so much of my father on the screen. Campbell was conveying — in fact demonstrating — an experience that’s more universal than any of us imagined, and most importantly, true to his chosen vocation, by sharing his pain and loss, he made me feel less alone in mine.

A different drum

It’s often puzzled me that I’ve never heard a decent cover version of the Beatles’ “Helter Skelter.” Yes, it’s the Beatles — arguably the best recording act the world has seen, and not to be taken lightly by prospective imitators.

But like many of the Beatles’ best-covered songs (“I Wanna Be Your Man,” “Come and Get It”), “Helter Skelter” is written from the straight-up rock ‘n’ roll playbook. It has the horsepower that should make lesser bands sound better than they are. With its buzzsaw guitar intro, blood-curdling vocals and the refrain’s killer machine-gun riff, it’s a muscle car just begging for a worthy rock act to test drive. And many have tried. It’s not that they’re bad. They’re simply missing something, and I think I’ve figured out what it is: Ringo Starr.

History has been kind in assigning proper belated recognition to Ringo Starr, but he still ranks alongside producer George Martin as an overlooked staple of the Beatles arsenal, a critical piece of their musical machinery in the vein of McCartney’s Rickenbacker bass or Abbey Road’s magical mixing board. But Ringo was much more than that, and “Helter Skelter” shows why.

We only need to listen to the aforementioned pretenders who’ve come up short in taking on this formidable classic: Aerosmith, Pat Benatar, Siouxsie and the Banshees, Mötley Crüe and, yes, U2. These are no slouches. Yet each one focuses too heavily on the front end of the Beatles’ original — the guitar and vocals — while, you guessed it, overlooking Ringo Starr’s unique contribution on the rhythm side. As a result, the drummers tend to follow the frenetic pace of the song — like that muscle car, it’ll go as fast as you want it to. And that’s where the problem lies. “Helter Skelter” was McCartney’s attempt to write the loudest, brashest song he could, and the guitars and his vocals do their part to make that happen. But what Ringo adds is the mayhem. He doesn’t pace the song, he bludgeons it. He smashes his way around the requisite fills, abusing his kit with such primitive fury (“I’ve got blisters on my fingers!”) that it likely got Keith Moon’s attention. And it keeps the song from gaining too much form, which is exactly the point.

It’s a lesson on how a great drummer can make a song better by working against its grain, becoming a counterbalance of sorts. On the Beatles’ “I Am the Walrus,” for instance, Ringo takes the opposite approach. The song opens with an unfathomable soup of organ and strings before he counts in with a steady beat to provide much-needed structure under the wash of instruments and John Lennon’s lyrical stream of consciousness. Many drummers would say “this is a trippy song, I need to be trippy.” Not Ringo. He played straight man to Lennon’s clown in the same way he took a high-octane, quarter-mile McCartney rocker for a punishing off-road ride. It’s a rare talent indeed, and rare talent, especially when unrecognized, defies imitation.

Blazing a new trail

One of the great failings in environmental debate is the tendency to view environmentalism as an ethic that removes humans from the equation. Such rhetoric comes from both the for-and-against extremes, doing a great disservice to the sensible middle. The tree-huggers envision a return to the bucolic ecosystems of yesteryear despite those systems’ need to somehow accommodate 7 billion human beings. Meanwhile, their opponents argue society’s economic health comes first, not realizing that whatever we do to the environment, we do to ourselves.

Here in Wisconsin, we sit in the sensible middle. It’s a state with a tremendous environmental legacy — the modern movement was set in motion by our former governor, U.S. senator and Earth Day founder Gaylord Nelson. Yet it’s no Wyoming or Utah, where areas the size of small Eastern states are set aside for parkland. Even in its sparsely populated northern reaches, Wisconsin doesn’t have undeveloped space on that scale. There’s tourism. There are Northwoods cabin getaways. There’s the timber industry. In other words, there are people.

As a result, we sometimes see environmentalism operate at the micro rather than macro level. Yes, there are state parks and national forests managed via vast bureaucracies that we trust have the right intentions. But there are times when the local folks take the reins. Case in point: the Ice Age Trail.

OK, it’s technically the Ice Age National Scenic Trail. But that’s a federal designation bestowed only after tireless volunteers at the local level organized the patchwork project into something viable. It’s a success story brought to light in Melanie Radzicki McManus’ recently published “Thousand-Miler: Adventures in Hiking the Ice Age Trail.” In chronicling her attempt to complete the 1,100 miles approximating the furthest glacial advance of the last Ice Age, McManus highlights a state treasure hidden in plain sight.

There are two gratifying components to the Ice Age Trail. First, it came together not via governmental decree but organically, out of residents’ desire to set aside some green space within their communities. That desire reflected a number of favored recreational activities — camping, hunting, fishing, snowmobiling, horseback riding, biking and, yes, hiking. Second, it’s here in Wisconsin. You don’t have to go to Appalachia or the Mountain West to enjoy a world-class outdoors experience.

As McManus’ tale shows, this is hardcore hiking. Eleven hundred miles is not for beginners. Yet the culture, supported by a quasi-governing body called the Ice Age Trail Alliance, is one that encourages participation. McManus’ attempt qualifies as a “thru-hike,” meaning the person pounds the trail — for weeks, months, however long it takes — until they’ve finished it from end to end. They rest each night, either camping or staying at local motels or the homes of “trail angels” — local volunteers who offer a bed, a hot meal or a ride to the next trailhead, where the hike continues. However, many people “section hike,” completing portions as time allows and returning to their homes and jobs in the meantime, until they’ve covered every piece of the trail. It doesn’t matter if it takes years. They’re given credit as finishers right along with the thru-hikers.

Either way you do it, it’s no walk in the park. McManus encounters a number of serious medical issues, mostly with her feet (not uncommon for distance hikers), that threaten to derail her. There are bears and wolves, particularly in the northern segments, while connecting highway routes can expose hikers to hazardous traffic. Mosquitoes and ticks, along with overgrown vegetation that impedes rougher portions of the trail, take their toll on exposed skin. But the biggest pain McManus contends with is losing her way. The yellow blazes affixed to sign posts and trees indicate the official route, but posts and trees sometimes fall over. Fast-growing vegetation can hide signs, leading hikers to unwittingly take a wrong turn or continue straight when they should turn.

“Thousand-Miler” is at heart an adventure story, deftly sprinkled with educational components about the history and current operation of the trail, but the book’s most charming asset is its equal amounts of affection for people and place. Distance hiking has its own subculture, and McManus meets a number of characters on the trail, each with his or her own reasons, and strategies, for walking it. As a thru-hiker, she also benefits from the assistance of family members, friends and trail angels who serve as her crew, supplying her with food and water mid-hike, ferrying her to and from the night’s lodging and, most importantly, providing moral support. While an 1,100-mile hike affords the solitude and reflection of a personal journey, it becomes clear that there’s more to the story than one woman and the vast wilderness.

McManus acknowledges as much near the end of the book. Her own quest complete, she relates a recent effort by a veterans’ group to send military service members returning from war zones out on the trail. The idea is to plan and complete a hike as a way of therapeutically reintegrating into civilian life. The natural spaces give the veterans room to breathe, while their reliance on support from family, friends and trail angels reminds them of the goodness in humanity. They find peace through the land, and its people.

All of this comes together through the preservation of natural spaces. Score one for the tree-huggers. But it’s a broader experience than traipsing through pristine forests undisturbed by human influence and hearing birds chirp unperturbed by human voices. What sense is there in creating an 1,100-mile trail if there’s no one there to hike it — or write about it? This is environmentalism in its totality — one that has a place for people.


Days of future past


One of the neat characteristics of a legacy blue-collar industrial city like Green Bay, Wis., is the endurance of commercial markers long since erased from mainstream America. You’ll find them hidden in plain sight, typically in older parts of town — the shoe repair shop on University, the hardware store at East Mason and Main where the guy will make duplicate keys for you, and, dotted around pockets of downtown with surprising frequency, the traditional barbershop.

To walk into a barbershop on Broadway on the west side of downtown, as I did recently, is to glimpse vestiges of not just the past, but a host of pasts associated with various period accoutrements. Like an archaeologist digging through layers of history, you’ll find pieces of the 1940s in the building’s size (tiny); the ’50s in its male-only clientele; the ’60s in the classic stainless steel chairs and other barber paraphernalia; the ’70s in the bulky (not flat-screen) TV affixed to the wall; and the ’80s and ’90s in the VCR and DVD player connected to it. It seems any number of bygone eras find representation here.

The most interesting archaeological discovery for me was a hardcover booklet, mixed in with the assorted Field & Stream and Sports Illustrated magazines spread around a table in the waiting area, based on the popular 1970s “Six Million Dollar Man” TV series. Part kids’ reader and part graphic novel, such digests were fairly common as producers tried to cash in on a hit show with accompanying light literature, posters and lunch boxes. I remember as a kid happily thumbing through beat-up “Star Trek”- and “Star Wars”-associated publications at our public library. They tended to be heavy on visual elements to go with simplified variations on the plots established by the parent franchise.


The adventures of Steve Austin were never that complex. But the idea of a six-million-dollar man was fairly high-concept, in effect juicing up the traditional superhero narrative with highly plausible technological underpinnings. Spider-Man derived his powers from the bite of a radioactive spider, and absent further explanation, you simply had to take it on faith that such a thing could happen. “The Six Million Dollar Man” gave you a wonderful mix of robotics and medicine that, while fantastical, didn’t require too large an imaginative leap from the actual advances of the 1970s (much less today). How Col. Austin’s bionic eye actually worked, or how, as my sister once pointed out, his unbionic back could withstand the strain of lifting a car, wasn’t of much concern if you accepted, as Americans have done since the latter half of the 20th century, that scientists and doctors knew what they were doing and it was Luddite to question them.

As little understanding as I had of science and technology as a youngster, the appeal of a show like “The Six Million Dollar Man” having a foundation in reality stuck with me, and it carried over to later pseudo-scientific adventure series like “MacGyver.” Even the feature film “Back to the Future” captivated me with its fairly detailed, though entirely ludicrous, explanation of how time travel could be achieved. The whole idea was to marry fantasy with reality via test tubes and beakers.

So what happened? Hollywood’s current addiction to superhero franchises like “Spider-Man” and “X-Men” has shifted the adventure narrative toward men and women transformed by murkier processes than doctors in lab coats. But really, we’re talking about distant cousins here — these are all stories that involve people coping (and sometimes struggling) with superhuman powers. Given their preference for established brands, it wouldn’t surprise me if studio executives saw easy dollar signs in a TV reboot or even a feature-film version of “The Six Million Dollar Man.” I’ll keep my fingers crossed, but for now, Steve Austin belongs, like the Green Bay barbershop, to another era.


A win for the fanatics

We’re all familiar with sports-as-war metaphors. I see them regularly in my day job with Packersnews.com: the “battles” between linemen “in the trenches,” the quarterback as “field general,” the NFL draft “war room.” It can be a bit much, particularly for those who have served and suffered in the real deal.

But there’s a purpose to such staged combat. Sports provide a safe outlet for our primal inclination toward periodic social conflict. Uniforms and logos establish which tribes we belong to. The winner is decided in accordance with game rules enforced by mediators, all of which is agreed to by combatants who shake hands as compatriots regardless of the outcome. We’ve satisfied our tribal and territorial impulses, with no one getting (badly) hurt.

Modern organized sports thrive off the cohesion that has otherwise gone missing in much of Western society. Outdated concepts such as honor, loyalty and territorial defense still matter — at least in the stands. While players play for whoever pays them, fans exhibit a devotion that, as the root word “fanatic” indicates, borders on the irrational. It’s about the hometown, the colors, the mascot. “We” define ourselves by not being “them.” It’s a tribal thing.

All of this is fairly harmless (not to mention big business) as long as people let it go at the final gun. Sure, there will always be sporadic bar fights and intra-family snubs as a few of us take the big game a little too seriously. But what happens when a sports team grows beyond a mild dalliance with tribalism, transcends geographical borders and comes to define an ethnic or racial identity? That’s the subject of “Forever Pure,” an Independent Lens documentary recently screened on PBS.

“Forever Pure” explores the unique story of the Beitar Jerusalem Football Club, an Israeli soccer team historically identified with the nation’s political right. Supported by a rabid fan base that has not only embraced but steered the team’s ideological bent, Beitar demands enthusiastic and vocal allegiance from any Likud party member with political ambitions. So it was a bombshell in Israeli sports and politics when, in 2012, Beitar’s enigmatic owner, Arcadi Gaydamak, signed a pair of Muslim players. Never mind that the two athletes were Chechen; for supporters of a team that had never signed an Arab player, the act was tantamount to treachery.

The documentary follows the football club through its turbulent 2012-13 campaign, as sports and politics intermingle to sometimes uncomfortable degrees. After a smiling, highly orchestrated public welcome to Israel, the two young Chechens are subjected to Jackie Robinson levels of taunts, abuse and threats from fans. Their Israeli teammates, some well-meaning and others reluctant, offer half-hearted encouragement that scoring goals and winning games will turn the crowds in their favor. But as the Israeli players begin enduring collateral damage from the stands, whatever lukewarm support was initially extended to the guest Muslims fades, and all quickly sour on the experience.

Gaydamak presses on, touting motives that on the surface appear altruistic, if somewhat perplexing. He admits he has little interest in the game of soccer, or sports in general, and seems intent on conducting a massive social experiment. But as losses pile up and fan boycotts leave the team playing in a nearly empty stadium, the businessman succumbs to the bottom line and ultimately sells the club. For the Beitar faithful, purity, with all its racial and ethnic connotations, is restored.

Sports movies are notoriously sappy. If “Forever Pure” had been a scripted Hollywood feature, Beitar would’ve gelled midseason, rallying together for an improbable championship run that wins over the fans. A Chechen Muslim would score in overtime to win the title game, and Jews and Arabs would hug in the streets. Pie-in-the-sky, to be sure, but there are brief flashes in “Forever Pure” that hint at such possibilities. You want it to happen. The reality that makes that scenario impossible is an overriding political narrative that relegates the players on the field, the coaches on the sideline and the wealthy owner in the luxury box to secondary roles behind the irrational fanatics in the stands.