Guitar hero

There’s a scene in the 2014 documentary “Glen Campbell: I’ll Be Me” that goes something like this:

Campbell, the faded country-crossover musician battling Alzheimer’s disease, is in the doctor’s office for a checkup. The doctor asks him to wiggle his fingers like he’s playing the piano, to which Campbell retorts, “but I’m a guitar player.”

Somehow, through the fog of Alzheimer’s, a lucid, wise-cracking Campbell sparkles with trademark Southern sass, right down to the folksy pronunciation of “GIH-tar.” And then just as quickly, the fog returns.

It was a phenomenon with which I was familiar, having watched my father spend his final years succumbing to Alzheimer’s, only to fire the occasional spark of humor or recollection. For anyone desperately hoping to see the return of the person they knew, it’s easy to be seduced by notions of an unexplained reversal, or perhaps a miraculous misdiagnosis. But my siblings and I knew the diagnosis to be correct, and we also knew there was no reversing the course of this disease.

I was in the doctor’s office with my dad when the diagnosis came. It was matter-of-fact, even cold, considering the weight of such news. Dad took it with puzzled acceptance and his trademark good cheer. I’ll never know whether he really understood the implications. I didn’t ask. It would have been devastating to process, and he could be excused for shrugging it off the way he would a poor glucose test. Shortly after the doctor’s visit, we had lunch at his favorite diner, the “dad jokes” flying fast and furious. I wasn’t in a laughing mood, but even then, he scored a couple of direct hits that had me cracking up. For a moment the old Dad sparkled, and for a moment I entertained notions of a reversal or misdiagnosis.

“I’ll Be Me” features an eerily similar doctor’s visit for Campbell, who meets the presentation of MRI scans and medical mumbo jumbo with the too-eager “uh-huhs” and “oh yeahs” of a student in biology class. Like my dad, he didn’t show it, but I suspect he knew what all of it meant. And yet, he responded by launching a last hurrah tour and agreeing to have filmmakers document it. The burning question is: Why?

His reasons weren’t clear to me, but perhaps Campbell wanted to give Alzheimer’s a face. Not just one face, but many faces — the sufferer, his family, friends or anyone else affected by the robbing of a person’s mind in the most brutal and heartless way imaginable. The film doesn’t sugarcoat the increasing difficulties Campbell has with touring, and the accompanying headaches his family navigates in managing their erratic headliner. There are some very uncomfortable moments in “I’ll Be Me,” and I struggled with the family’s insistence on going forward with a tour that reduced this once-great performer to a lost, confused, occasionally angry high-wire circus act robbed of his most important remaining possession — his dignity. I had to take it on faith that the entire venture, while endlessly skirting disaster, was in accordance with Campbell’s wishes.

When Glen Campbell died last week at the age of 81, I immediately thought of “I’ll Be Me.” I was aware of his tremendous musical legacy, beginning with his early days as a Beach Boys fill-in before breaking through with solo hits like “Wichita Lineman” and “Gentle on My Mind.” His defining “Rhinestone Cowboy,” a radio staple during my youth, made him a bona fide pop star, but it wasn’t until later that I discovered what a marvelous GIH-tar player Campbell was. And yet, it was “I’ll Be Me” that gave real meaning to his death. I felt not only grief but relief at the peace that had finally come to him and his family.

Performing artists succeed as pop stars because of their ability to effectively convey common themes. Through their expression of universal emotions, we feel less alone, and that’s why we listen to them. Before I watched “I’ll Be Me,” I stupidly regarded our family’s experience with our dad as somehow unique, a bizarre and embarrassing dysfunction that outsiders couldn’t comprehend. The film was a revelation, and perhaps that answers the “why” that nagged me as I began to recognize so much of my father on the screen. Campbell was conveying — in fact demonstrating — an experience that’s more universal than any of us imagined, and most importantly, true to his chosen vocation, by sharing his pain and loss, he made me feel less alone in mine.

A different drum

It’s often puzzled me that I’ve never heard a decent cover version of the Beatles’ “Helter Skelter.” Yes, it’s the Beatles — arguably the best recording act the world has seen, and not to be taken lightly by prospective imitators.

But like many of the Beatles’ best-covered songs (“I Wanna Be Your Man,” “Come and Get It”), “Helter Skelter” is written from the straight-up rock ‘n’ roll playbook. It has the horsepower that should make lesser bands sound better than they are. With its buzzsaw guitar intro, blood-curdling vocals and the refrain’s killer machine-gun riff, it’s a muscle car just begging for a worthy rock act to test drive. And many have tried. It’s not that they’re bad. They’re simply missing something, and I think I’ve figured out what it is: Ringo Starr.

History has been kind in assigning proper belated recognition to Ringo Starr, but he still ranks alongside producer George Martin as an overlooked staple of the Beatles arsenal, a critical piece of their musical machinery in the vein of McCartney’s Rickenbacker bass or Abbey Road’s magical mixing board. Ringo was much more than that, and “Helter Skelter” shows why.

We only need to listen to the aforementioned pretenders who’ve come up short in taking on this formidable classic: Aerosmith, Pat Benatar, Siouxsie and the Banshees, Motley Crue and, yes, U2. These are no slouches. Yet each one focuses too heavily on the front end of the Beatles’ original — the guitar and vocals — while, you guessed it, overlooking Ringo Starr’s unique contribution on the rhythm side. As a result, the drummers tend to follow the frenetic pace of the song — like that muscle car, it’ll go as fast as you want it to. And that’s where the problem lies. “Helter Skelter” was McCartney’s attempt to write the loudest, brashest song he could, and the guitars and his vocals do their part to make that happen. But what Ringo adds is the mayhem. He doesn’t pace the song, he bludgeons it. He smashes his way around the requisite fills, abusing his kit with such primitive fury (“I’ve got blisters on my fingers!”) it likely got Keith Moon’s attention. And it keeps the song from gaining too much form, which is exactly the point.

It’s a lesson on how a great drummer can make a song better by working against its grain, becoming a counterbalance of sorts. On the Beatles’ “I Am the Walrus,” for instance, Ringo takes the opposite approach. The song opens with an unfathomable soup of organ and strings before he counts in with a steady beat to provide much-needed structure under the wash of instruments and John Lennon’s lyrical stream of consciousness. Many drummers would say “this is a trippy song, I need to be trippy.” Not Ringo. He played straight man to Lennon’s clown in the same way he took a high-octane, quarter-mile McCartney rocker for a punishing off-road ride. It’s a rare talent indeed, and rare talent, especially when unrecognized, defies imitation.

Blazing a new trail

One of the great failings in environmental debate is the tendency to view environmentalism as an ethic that removes humans from the equation. Such rhetoric comes from both the for-and-against extremes, doing a great disservice to the sensible middle. The tree-huggers envision a return to bucolic ecosystems of yesteryear despite those systems’ need to somehow accommodate 7 billion human beings. Meanwhile their opponents argue society’s economic health comes first, not realizing that whatever we do to the environment, we do to ourselves.

Here in Wisconsin, we sit in the sensible middle. It’s a state with a tremendous environmental legacy — the modern movement was set in motion by our former governor, U.S. senator and Earth Day founder Gaylord Nelson. Yet it’s no Wyoming or Utah, where areas the size of small Eastern states are set aside for parkland. Even in its sparsely populated northern reaches, Wisconsin doesn’t have undeveloped space on that scale. There’s tourism. There are Northwoods cabin getaways. There’s the timber industry. In other words, there are people.

As a result, we sometimes see environmentalism operate at the micro rather than macro level. Yes, there are state parks and national forests managed via vast bureaucracies that we trust have the right intentions. But there are times when the local folks take the reins. Case in point: the Ice Age Trail.

OK, it’s technically the Ice Age National Scenic Trail. But that’s a federal designation bestowed only after tireless volunteers at the local level organized the patchwork project into something viable. It’s a success story brought to light in Melanie Radzicki McManus’ recently published “Thousand-Miler: Adventures in Hiking the Ice Age Trail.” In chronicling her attempt to complete the 1,100 miles approximating the furthest glacial advance of the last Ice Age, McManus highlights a state treasure hidden in plain sight.

There are two gratifying components to the Ice Age Trail. First, it came together not via governmental decree but organically out of residents’ desire to set aside some green space within their communities. That desire reflected a number of favored recreational activities — camping, hunting, fishing, snowmobiling, horseback riding, biking and, yes, hiking. Second, it’s here in Wisconsin. You don’t have to go to Appalachia or the Mountain West to enjoy a world-class outdoors experience.

As McManus’ tale shows, this is hardcore hiking. Eleven hundred miles is not for beginners. Yet the culture, supported by a quasi-governing body called the Ice Age Trail Alliance, is one that encourages participation. McManus’ attempt qualifies as a “thru hike,” meaning the person pounds the trail — for weeks, months, however long it takes — until they’ve finished it from end to end. They rest each night, either camping or staying at local motels or the homes of “trail angels” — local volunteers who offer a bed, a hot meal or a ride to the next trailhead, where the hike continues. However, many people “section hike,” or complete portions as time allows, returning to their homes and jobs in the meantime, until they’ve completed every section of the trail. It doesn’t matter if it takes years. They’re given credit as finishers right along with the thru-hikers.

Either way you do it, it’s no walk in the park. McManus encounters a number of serious medical issues, mostly with her feet (not uncommon for distance hikers), that threaten to derail her. There are bears and wolves, particularly in the northern segments, while the connecting routes along highways can expose hikers to hazardous traffic. Mosquitoes and ticks, along with overgrown vegetation that impedes rougher portions of the trail, take their toll on exposed skin. But the biggest pain McManus contends with is losing her way. The yellow blazes affixed to sign posts and trees indicate the official route, but posts and trees sometimes fall over. Fast-growing vegetation can hide signs, leading hikers to unwittingly take a wrong turn or continue on when they should turn.

“Thousand-Miler” is at heart an adventure story, deftly sprinkled with educational components about the history and current operation of the trail, but the book’s most charming asset is its equal amounts of affection for people and place. Distance hiking has its own subculture, and McManus meets a number of characters on the trail, each with his or her own reasons, and strategies, for walking it. As a thru-hiker, she also benefits from the assistance of family members, friends and trail angels who serve as her crew, supplying her with food and water midhike, ferrying her to and from the night’s lodging and, most importantly, providing moral support. While an 1,100-mile hike affords the solitude and reflection of a personal journey, it becomes clear that there’s more to the story than one woman and the vast wilderness.

McManus acknowledges as much near the end of the book. Her own quest complete, she relates a recent effort by a veterans’ group to send military service members returning from war zones out on the trail. The idea is to plan and complete a hike as a way of therapeutically reintegrating into civilian life. The natural spaces give the veterans room to breathe, while their reliance on support from family, friends and trail angels reminds them of the goodness in humanity. They find peace through the land, and its people.

All of this comes together through the preservation of natural spaces. Score one for the tree-huggers. But it’s a broader experience than traipsing through pristine forests undisturbed by human influence and hearing birds chirp unperturbed by human voices. What sense is there in creating an 1,100-mile trail if there’s no one there to hike it — or write about it? This is environmentalism in totality — one that has a place for people.

 

Days of future past


One of the neat characteristics of a legacy blue-collar industrial city like Green Bay, Wis., is the endurance of commercial markers long since erased from mainstream America. You’ll find them hidden in plain sight, typically in older parts of town — the shoe repair shop on University, the hardware store on East Mason and Main where the guy will make duplicate keys for you, and, dotted around pockets of downtown with surprising frequency, the traditional barbershop.

To walk into a barbershop on Broadway on the west side of downtown, as I did recently, is to glimpse vestiges of not just the past, but a host of pasts associated with various period accoutrements. Like an archaeologist digging through layers of history, you’ll find pieces of the 1940s in the building’s size (tiny); the ’50s in its male-only clientele; the ’60s in the classic stainless steel chairs and other barber paraphernalia; the ’70s in the bulky (not flat-screen) TV affixed to the wall; and the ’80s and ’90s in the VCR and DVD player connected to it. It seems any number of bygone eras find representation here.

The most interesting archaeological discovery for me was a hardcover booklet, mixed in with the assorted Field & Stream and Sports Illustrated magazines spread around a table in the waiting area, based on the popular 1970s “Six Million Dollar Man” TV series. Part kids’ reader and part graphic novel, such digests were fairly common as producers tried to cash in on a hit show with accompanying light literature, posters and lunch boxes. I remember as a kid happily thumbing through beat-up “Star Trek”- and “Star Wars”-associated publications at our public library. They tended to be heavy on visual elements to go with simplified variations on the plots established by the parent franchise.


The adventures of Steve Austin were never that complex. But the idea of a six-million-dollar man was fairly high-concept, in effect juicing up the traditional superhero narrative with highly plausible technological underpinnings. Spider-Man derived his powers from the bite of a radioactive spider, and failing further explanation, you simply had to take it on faith that that could happen. “The Six Million Dollar Man” gave you a wonderful mix of robotics and medicine that, while fantastical, didn’t require too large an imaginative leap from the actual advances of the 1970s (much less today). How Col. Austin’s bionic eye actually worked, or how, as my sister once pointed out, his unbionic back could withstand the strain of lifting a car, wasn’t of much concern if you accepted, as Americans have done since the latter half of the 20th century, that scientists and doctors knew what they were doing and it was Luddite to question them.

As little understanding as I had of science and technology as a youngster, the appeal of a show like “The Six Million Dollar Man,” with its foundation in reality, carried over to later pseudo-scientific adventure series like “MacGyver.” Even the feature film “Back to the Future” captivated me with its fairly detailed, although entirely ludicrous, explanation of how time travel could be achieved. The whole idea was to marry fantasy with reality via test tubes and beakers.

So what happened? Hollywood’s current addiction to superhero franchises like “Spider-Man” and “X-Men” has shifted the adventure narrative toward men and women transformed by murkier processes than doctors in lab coats. But really, we’re talking about distant cousins here — these are all stories that involve people coping (and sometimes struggling) with superhuman powers. Given their preference for established brands, it wouldn’t surprise me if studio executives saw easy dollar signs in a TV reboot or even feature film version of “The Six Million Dollar Man.” I’ll keep my fingers crossed, but for now, Steve Austin belongs, like the Green Bay barbershop, to another era.


A win for the fanatics

We’re all familiar with sports-as-war metaphors. I see them regularly in my day job with Packersnews.com: the “battles” between linemen “in the trenches,” the quarterback as “field general,” the NFL draft “war room.” It can be a bit much, particularly for those who have served and suffered in the real deal.

But there’s a purpose to such staged combat. Sports provide a safe outlet for our primal inclination toward periodic social conflict. Uniforms and logos establish which tribes we belong to. The winner is decided in accordance with game rules enforced by mediators, all of which is agreed to by combatants who shake hands as compatriots regardless of the outcome. We’ve satisfied our tribal and territorial impulses, with no one getting (badly) hurt.

Modern organized sports thrive on the cohesion that has otherwise gone missing in much of Western society. Outdated concepts such as honor, loyalty and territorial defense still matter — at least in the stands. While players play for whoever pays them, fans exhibit a devotion that, as the “fanatic” root word indicates, borders on the irrational. It’s about the hometown, the colors, the mascot. “We” define ourselves by not being “them.” It’s a tribal thing.

All of this is fairly harmless (not to mention big business) as long as people let it go at the final gun. Sure, there will always be sporadic bar fights and intra-family snubs as a few of us take the big game a little too seriously. But what happens when a sports team grows beyond a mild dalliance with tribalism, spills across geographical borders and comes to define an ethnic or racial identity? That’s the subject of “Forever Pure,” an Independent Lens documentary recently screened on PBS.

“Forever Pure” explores the unique story of the Beitar Jerusalem Football Club, an Israeli soccer team that is historically identified with the nation’s political right. Supported by a rabid fan base that has not only embraced but steered the team’s ideological bent, Beitar demands enthusiastic and vocal allegiance from any Likud party member with political ambitions. So it was a bombshell in Israeli sports and politics when in 2012, Beitar’s enigmatic owner, Arcadi Gaydamak, signed a pair of Muslim players. Never mind that the two athletes were Chechen; for supporters of a team that had never signed an Arab player, the act was tantamount to treachery.

The documentary follows the football club through its turbulent 2012-13 campaign, as sports and politics intermingle to sometimes uncomfortable degrees. After a smiling, highly orchestrated public welcome to Israel, the two young Chechens are subjected to Jackie Robinson levels of taunts, abuse and threats from fans. Their Israeli teammates, some well-meaning and others reluctant, offer half-hearted encouragement that scoring goals and winning games will turn the crowds in their favor. But as the Israeli players begin enduring collateral damage from the stands, whatever initial lukewarm support extended to the guest Muslims fades, and all quickly sour on the experience.

Gaydamak presses on, touting motives that on the surface appear altruistic, although somewhat perplexing. He admits he has little interest in the game of soccer, or sports in general, and seems intent on conducting a massive social experiment. But as losses pile up and fan boycotts leave the team playing in a nearly empty stadium, the businessman succumbs to the bottom line and ultimately sells the club. For the Beitar faithful, purity, with all its racial and ethnic connotations, is restored.

Sports movies are notoriously sappy. If “Forever Pure” had been a scripted Hollywood feature, Beitar would’ve gelled midseason, rallying together for an improbable championship run that wins over fans. A Chechen Muslim would score in overtime to end the title game, and Jews and Arabs would hug in the streets. Pie-in-the-sky to be sure, but there are brief flashes in “Forever Pure” that hint at such possibilities. You want it to happen. The reality that makes that scenario impossible is an overriding political narrative that relegates the players on the field, the coaches on the sideline and the wealthy owner in the luxury box to secondary figures behind the irrational fanatics in the stands.

Country market

One of the defining characteristics of pop music is its transitory nature. No one sits still for long with the next big thing nipping at their heels. As a result, the music tends to lack any substantive sense of permanence, and the values it projects are a moving target.

Look no further than the original pop masters, the Beatles. In less than five years, the Fab Four went from shaggy but clean-shaven crooners to mustachioed psychedelic messengers, then bearded socialist peaceniks. I don’t doubt there were genuine awakenings of conscience taking place, but the Beatles were also savvy enough to devise a go-to tactic of the pop playbook — always keep your audience guessing. It gave them a strategic advantage over bands like the Dave Clark Five, who were every bit their equal in 1965 but dated relics by the end of that decade.

In soul music, Marvin Gaye underwent a similar evolution that separated him from “safe” Motown acts like Smokey Robinson and the Supremes. And in rock, punkers pushed beyond the comfortable boundaries established by legacy bands such as the Who, Pink Floyd and Led Zeppelin.

But some audiences don’t want to be kept eternally guessing. It’s no surprise then that a considerable number of abandoned masses gravitated to the Eagles-led country rock revolution and its successor in modern country. Amid the political and social upheaval of the 1960s and ’70s, country remained steadfastly committed to its core values. As its name indicates, it draws upon a vision of rural America where life centers on family, faith and patriotism. Throw in an undercurrent of rowdiness, and there’s a winning formula.

Let me be clear: Country artists are in my estimation as musically engaging and innovative as their pop, soul and rock counterparts. But their marketing strategy is unique and, I would add, quite ingenious. In a pop cultural landscape that’s subject to a dizzying pace of change, they offer a slice of permanence. To be sure, the music has evolved into various interesting hybrids that Hank Williams wouldn’t recognize, incorporating pop, hard rock, soul and even hip hop. But the message remains firmly grounded in country’s rural roots. Even its impish rebelliousness — a necessity in all pop-oriented music — is framed in the larger paradigm of traditional American life.

Modern country’s ascendance into the mainstream somewhat mirrors that of NASCAR, with both institutions slickly presenting rural sensibilities in ways that resonate with urban audiences. In referencing a simpler past, the themes echo a universal yearning for the good times. You don’t have to be a redneck to relate. Lyrical references are heavy on beer-drinking and truck-driving and, other than mild patriotism, light on politics.

It’s a legitimate point then that country music, as a social force, favors cohesion over challenge and too often stands up for the status quo. But it’s that way by design. These are not, as some derisively suggest, dumb hicks. Country artists know what their audiences want, right down to the cowboy hats and cut-off sleeves. A case in point: This observation I made of two concerts Kenny Chesney played at Lambeau Field, four years apart.

 

Chesney is selling a good time, and a big part of the pitch is the packaging, a big part of which is familiarity. There’s a reason you’ll find McDonald’s hamburgers and Budweiser beer to be identical in Rhode Island and Wyoming. Like those companies, Chesney understands his market. Is it hokey? Well, he sold out Lambeau Field — twice — something only a few acts in the world can do.

Again, this isn’t a knock on the merits of country music. I simply marvel at its grasp of essential marketing. Other artists do it — think KISS wearing the same makeup and outfits they wore in the 1970s, or Journey going to great lengths to replicate former singer Steve Perry’s vocals. Whatever your view of artistic integrity, popular music, country or otherwise, is a business, and experimentation invites commercial risk. While the Beatles may have won critical accolades for their adventurous later albums, it was their early output that sold millions of records and filled concert venues. By its association with a more conservative fan base, country has been able to bottle and sell, with its own distinctive twang, the spirit of pop music at its peak.

Hitwomen

Say what you will about “token” minority appreciation months — I always seem to learn something by indulging the concept. For instance, in honor of Women’s History Month, I thought I’d run down a short list of the female recording artists who have expanded my musical horizons, and I found myself naming name after name after name. After name. It turns out they’re not a subset of popular music. They’re an integral part of it, pushing songs up and down the charts, innovating, regressing, occasionally blowing our minds, and yes, occasionally flopping. It occurs to me that the story of women in pop music is really just the story of pop music. They’ve been playing right along with the men, and although I didn’t know it, I’ve been listening the whole time.

As I said, whittling my favorite female artists down to a list of 10, or 50, or 100 is a ridiculous exercise. But since I want to pay tribute to women whose music means the most to me, I narrowed the field to those who wrote the music they recorded, highlighted by the one song that epitomizes their talents. Make no mistake: These are musical giants by any measure. Consider the five I’ve named here not as a listing of “best women in music,” but a sampling of the very best artists in pop music who happen to be women.

Jackie DeShannon, “When You Walk in the Room” (1963): She’s best known for recording Burt Bacharach and Hal David’s “What the World Needs Now is Love,” but Songwriters Hall of Fame inductee Jackie DeShannon was no slouch with the pen, composing the classic “Put a Little Love in Your Heart” and co-writing “Bette Davis Eyes,” later a chart-topping hit for singer Kim Carnes. “When You Walk in the Room” echoes many of the female-oriented hits of the early 1960s — notably the Carole King/Gerry Goffin penned “Will You Love Me Tomorrow” — that regarded love with the tentative shyness and anxiety of a teenager*. To cite individual lines does an injustice to the entire piece, which paints an agonizing portrait of a lovesick wallflower whose world lights up when the man of her dreams enters her room. Painfully, she keeps her distance, observing from afar what she believes is unattainable. It’s heartbreakingly sweet, an ode to teen awkwardness that presumably fades in adulthood.

*It should be noted that the Searchers covered “When You Walk in the Room.” Like most great love songs, the lyrics are not specific to gender or, for that matter, sexual orientation.

Joni Mitchell, “Free Man in Paris” (1974): I was never big on Joni Mitchell as one of the folk mothers of the ’60s, but her influence on the genre is undeniable. Her “Woodstock,” covered famously by both Crosby, Stills, Nash & Young and Matthews Southern Comfort, became a touchstone for a generation defined by that legendary music festival. In the 1970s, Mitchell moved toward fuller, jazz-infused instrumentation that produced such hits as “Help Me” and “Free Man in Paris.” The latter recounts an excursion to Paris she made with friend David Geffen, during which the ascending record executive, accustomed to dealing with “dreamers and telephone screamers,” was able to forget his cares, if only briefly. One can imagine Geffen letting his responsibilities slide away on the Champs-Élysées to the refrain of “I was a free man in Paris, I felt unfettered and alive. Nobody calling me up for favors, no one’s future to decide.” That Mitchell nailed him so precisely is a credit both to the depth of their relationship and her songwriting mastery.

Liz Phair, “Mesmerizing” (1993): For a few albums in the 1990s, Liz Phair was so original, so genuine and so refreshing. I’d never heard anyone like her. She talked about sex, love, relationships, marriage, divorce and children in absolutely frank terms. She gave the listener unfiltered access to what was on her mind, turning from schoolgirl sweet to shockingly filthy in the same stanza. Her 1993 debut “Exile in Guyville” carries so much interesting baggage, as titles such as “F–k and Run” and “Divorce Song” indicate, but “Mesmerizing,” in which Phair lyrically empties both barrels in a blaze of defiance and hurt, showed me a new way of recording music. It’s sparse, with little more than an electric guitar, a hint of percussion and her world-weary voice. But it holds up. Phair grew more polished in follow-up albums, including the excellent “Whitechocolatespaceegg,” but slowly fizzled into obscurity the further she strayed from the cut-to-the-bone sound that made her so, well, “Mesmerizing.”

Carole King, “It’s Too Late” (1971): Carole King is a no-brainer for this list; it’s just a matter of which song to single out. Luckily, as hugely productive as her years with husband and songwriting partner Gerry Goffin were, those were collaborative efforts and don’t count here. Her extensive solo catalog doesn’t make the job a whole lot easier, but given the enormous impact of her landmark 1971 album “Tapestry,” let’s start there. Even then, with a loaded track list that includes the wrenching “So Far Away,” I basically had to flip a coin to arrive at “It’s Too Late.” First off, I’m a sucker for major 7 chords, which she throws liberally into both songs. The tiebreaker then is the latter’s unique appraisal of why a relationship failed, best summed up in the line, “Now you look so unhappy, and I feel like a fool.” Maybe it’s the female perspective. Maybe it’s just great writing. Either way, there’s a keen understanding that breakups aren’t always about what went wrong or who’s at fault. Sometimes unhappiness and foolishness just happen.

Stevie Nicks, “Silver Springs” (1977): As part of the rock ‘n’ roll soap opera that was Fleetwood Mac, Stevie Nicks had a treasure trove of intraband relationships and dalliances to mine for writing material. She did not waste them, crafting soul-baring songs like “Gold Dust Woman,” “Sara” and “Landslide.” How it was that “Silver Springs” led an orphaned existence as a B-side left off the group’s 1977 “Rumours” album will forever be a mystery to me, as it’s one of her finest efforts, both as a writer and singer. Aimed squarely at bandmate and ex-lover Lindsey Buckingham, “Silver Springs” explores betrayal, musing over what could have been before devolving into the obsessive ranting of a stalker. If it’s not clear from the line, “And can you tell me was it worth it/Really I don’t want to know,” how emotionally devastated this woman is, the refrain seals it with “I know I could have loved you but you would not let me.” Nicks alternately voices pleading, scorn and finally desperation as she pledges to “follow you down till the sound of my voice will haunt you.” And haunt it does. It’s rare to see heartbreak so powerfully written. It’s even rarer to see it so expertly illustrated.