Recollections / April 24, 2017
I was born in 1975, at the nadir of the 1970s birth trough in America and before the mini Baby Boom of the early ’80s. My birth cohort is small; my grade school classes were the smallest they’d be for the next 30-odd years. So, as a late Gen-Xer, my earliest pop culture memories are of the late 1970s. I was a kid in the 1980s. But I came of age at the end of history.
In 1989, political scientist Francis Fukuyama published an essay titled “The End of History?” In it, he proposed that humankind had reached “the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.” Even at the time, Fukuyama’s thesis was controversial, but with the exception of thinkers like Jacques Derrida, no one was offering a competing narrative for the end of Communism and the triumph of market economics and liberal democracy. It simply was, and it was an unalloyed good. It’s worth noting, though, that in the three years between Fukuyama’s provocative essay and the release of his 1992 book, The End of History and the Last Man, his thesis, once a question, had hardened and reified into fact.
How much of this was on my radar as a 14-year-old about to enter his first year at Malden Catholic High School? I’d say very little; but in 1989 you would have had to be completely cut off from mass media not to see the wave of revolutions sweeping through the Warsaw Pact countries. And in so many cases—Poland, Czechoslovakia, East Germany—there was real joy and jubilation on the part of the populace in being freed from the yoke of decades of Soviet imperialism. Of course, things took a much darker turn in Romania, where ruling couple Nicolae and Elena Ceaușescu were summarily executed, and in the former Yugoslavia, where the fall of Communist federalism led to nearly a decade of ethnic war (and phone calls from U2’s Bono). But for the most part, the images beamed into our homes on the nightly network news (and, increasingly, on CNN out of Atlanta) from the crumbling Warsaw Pact were positive, jubilant.
In the America of 1989, exhausted from a decade of heightened Cold War anxieties, there were no qualms or reservations. This was our victory. Ronald Reagan, recently put out to pasture, was cited by conservatives and liberals alike as “the man who won the Cold War.” In a matter of months after the de facto reunification of Germany, actual no-fooling pieces of the Berlin Wall were on sale in my local Service Merchandise catalog showroom, which itself would go out of business in a decade thanks to the nascent internet. (At first I thought this might be a confabulated memory filtered through the last few years I’ve spent deep in Marxist theory, but no, this absolutely actually happened.)
This euphoria about the end of the Cold War even caught fire in pop culture. British pop group Jesus Jones released a single called “Right Here, Right Now” in 1990 that rocketed to the top of the charts. Its lyrics were painfully literal (“I saw the decade in/When it seemed the world could change/At the blink of an eye…”), the video even more so. There’s that euphoria again. Was it tempered at all by fear of the future and the unknown? In retrospect, it doesn’t feel like it. No one wondered what would rush in to fill the void of the Cold War. We were meant to believe this was another V-E Day, a triumph for the forces of democracy.
Transitioning from childhood to adulthood at this particular moment in history made an already confusing process even more bewildering. I certainly wasn’t alone as a teenager struggling to find and forge an identity in his adolescent years. Luckily (and Jesus Jones aside), I was coming of age at a great time in pop culture. (Of course, nostalgia goggles may be fooling me into thinking the offerings of those years were so great.) But simply examining the state of, say, music in the period 1989–1993 shows that a real revolution was going on, both artistically and commercially. The widespread adoption of the compact disc in the early ’80s and the introduction of SoundScan in 1991 nicely bookend this period of music industry upheaval. Throughout the 1980s, so-called “alternative” music had been bubbling under the surface on college radio throughout North America. Every now and again, the monoliths of the day—MTV and the major record labels—would snag one of these underground groups and give them some spit and polish for a larger audience. But it was really only in those first post-Cold War years that the “alternative nation” started to find venues outside the whisper networks of indie culture, and that the pop titans who’d ruled the 1980s—Madonna, Bruce Springsteen, Prince, and Michael Jackson—began to fall.
What this meant for someone like me, who’d spent much of my junior high years (’87-’88) listening to the then-monolithic radio genres of classic rock and heavy metal, is that by the time 1989 rolled around there was a wealth of new and interesting artists to seek out and find. And I was certainly helped in this process by major media outlets. MTV had introduced the program 120 Minutes in 1986 in order to showcase leading bands on the indie circuit. By the time I was in high school, it had become a Sunday late-night institution and was appointment television. I filled countless VHS tapes with “mixtapes” featuring videos found on the show. (For those of you whose memories of the show are as fond as mine, please check out this 120 Minutes archive site with playlists and more from the actual episodes.) MTV had also started tentatively to explore original programming, from the TV-generation-oriented quiz show Remote Control to the cutting-edge animation showcase Liquid Television.
I was also fortunate to grow up in Boston, home to a burgeoning local indie scene that had birthed alternative rock legends like the Pixies, Throwing Muses (and the successor bands to both, The Breeders and Belly), Blake Babies, Buffalo Tom, Tribe, and so many others I can’t take the time to list here. This scene was fostered by the duo of weekly alternative paper The Boston Phoenix and radio station WFNX (101.7). Thanks to the tastemakers at both of the Phoenix’s media outlets, I discovered new musical groups from well outside the local Boston scene. WFNX was particularly crucial in bringing UK bands of the period to the US, which is how I first heard Madchester stalwarts like The Charlatans (UK), The Stone Roses, and Happy Mondays, not to mention grebo/acid house/“baggy” groups like the KLF, Pop Will Eat Itself, Ned’s Atomic Dustbin, and yes, even Jesus Jones and EMF before their hit singles broke through. And if I wanted to find weird imports by my favorite 4AD bands like the Pixies or Throwing Muses or Lush, all I had to do was head to my local branch of Newbury Comics, where import CD singles from the UK were plentiful and new releases were almost always $9.99, well within the budget of a high school student working part-time at Staples.
Of course, it’s no exaggeration to say everything changed with the coming of “Smells Like Teen Spirit” in 1991. I first saw the now-classic video during off-peak hours on MTV in the fall of ’91: notably, not on 120 Minutes. At first, Nirvana just seemed like another Seattle alternative band getting some MTV airplay. But by the end of 1991, the middle of my junior year, Nirvana was everywhere: alternative radio, classic rock radio, MTV, and the larger culture. I remember my high school’s smallish contingent of old-school punks—aficionados of the Ramones and the Clash—mocking Kurt Cobain’s singing style. My first taste of the concept of “indie-r than thou” was definitely a tough one to take. Me, though, I was all in: grabbing flannel shirts at the local thrift store and buying boots at the Army-Navy store as I vacillated between grunge and industrial fashion looks throughout the last two years of high school. I attended Lollapalooza in 1992, seeing a bill that included Lush, Pearl Jam, Soundgarden, The Jesus and Mary Chain, Ice Cube, and Ministry, with the Red Hot Chili Peppers headlining. Ministry’s set, right before the Chili Peppers’, was marked by the crowd ripping up the sod of the Great Woods lawn, tearing down the fences, and starting multiple bonfires. It was like landing in the midst of a midsummer pagan ritual.
Inevitably, battle lines were drawn. The high school landscape is a place where you pick your tribe and defend your territory viciously. The metal kids hated the alternative kids, the alternative kids didn’t trust the mainstream kids who liked hip hop and R&B, and so on. I have a lot of regret today for being an obnoxious high school “rockist” and not fully appreciating the mini golden age of R&B that new jack swing delivered to pop radio in the early ’90s. I’d loved hip hop in junior high, and still followed groups like Public Enemy and Digital Underground into high school (and of course the Beastie Boys, who released their classics Paul’s Boutique and Check Your Head during these years), but I tended to shun anything with “pop” appeal. Little did I know that by the mid-’90s, alternative would be the new pop, and my own tastes would begin to be considered hopelessly mainstream.
Music helped form my identity, but so did my hobbies. I was, of course, a nerd from way back, and I’d started running 1st edition Advanced Dungeons & Dragons (AD&D), Marvel Super Heroes, and other tabletop RPGs for my friends back in junior high. Once I got to high school, our usual junior high ritual of hitting the movie theater on Friday nights and the local bowling alley/billiard hall/video arcade on Saturdays had withered back to just the video games on Saturdays. It was while playing Cyberball at that arcade in late 1989 that I met a couple of friends of friends who’d eventually form the core of my high school D&D group. It was also in 1989 that TSR released the second edition of AD&D, and that release was the catalyst for my gaming group’s four-year campaign in Gary Gygax’s original Greyhawk setting. All-night Friday sessions of AD&D followed, fueled by $5 pizzas, 3-liter bottles of Coke, and judiciously applied doses of Jolt Cola (this was, after all, an era before “energy drinks” like Red Bull).
So while video games had been the uniting force in forming my close-knit group of high school friends, it was still an era in which we had to game together, physically, in the same location (with one notable exception, which I’ll get to in a bit). And this was also that strange liminal, fallow period between the mid-to-late-’80s dominance of the Nintendo Entertainment System and the coming mid-’90s wave of fifth-generation consoles. The Super Nintendo and the Sega Genesis cast long shadows, but neither caught on with us; console gaming was mostly not on our radar. We either played computer games on our 386 and 486 PCs (and yes, we were still largely running our favorite computer games, like the Gold Box AD&D series, from MS-DOS; I didn’t get Windows 3.1 on my machine until very late in high school) or we went to the aforementioned arcades. The golden age of the 2D fighting game launched late in my high school years: Street Fighter II (and later its variant successors, Street Fighter II: Champion Edition and Street Fighter II Turbo) was the sine qua non of competitive gaming for us at the time. Informal challenge structures formed at our local arcades, movie theaters, and sub shops. Laconic neighborhood legends gained enough status to be granted sobriquets like “Bulls Hat” based on their trademark items of clothing. Quests to the far edges of Boston’s North Shore to find “virgin” machines whose joysticks had not been ruined by too many neighborhood kids amateurishly trying to pull off uppercuts were entirely sincere. The introduction of the exaggerated bloody carnage of copycat Mortal Kombat in 1992 seemed to mark the end of an era; MK introduced a younger, less refined crowd to the local arcade community, or so we felt at the time.
More important, this was the period right before the rise of the internet as we know it today. No online gaming, of course—with one exception. Our group had found local dial-up bulletin board systems (BBSs), one of the few “online” communities available at the time. We had a list of maybe a couple of dozen WWIV-style BBSs that we could dial into and check out. I didn’t really use the bulletin boards themselves or the strange feature called “FidoMail”; the BBS for me wasn’t for communicating but for playing massive asynchronous multiplayer games like TradeWars 2002. While I didn’t use BBSs for much in the way of “social networking,” I got a set of 3.5″ floppies that allowed me to install the online service Prodigy on my PC during my senior year of high school, and I used it to chat with people all over the country about alternative music. I made some really great friends on Prodigy; I can still remember the excitement I’d feel when the “NEW MAIL” notification would pop up while I was surfing the boards. In a few months, I’d head to college, and the very early World Wide Web and Usenet would open up to me, but in late 1993 and early 1994, I felt they paled in comparison to the community we created on Prodigy.
Even with this sneak peek around the corner at the coming computer revolution, the main conduit of media into the home was still good old-fashioned television. The early ’90s were a pretty incredible time for TV: much as with pop music, the titans of the ’80s were beginning to fade away, and more important, the family sitcom format that had dominated the decade was beginning to fall from its heights. ABC had been a venue for some of the most challenging programming out there—Max Headroom may have been the late-’80s harbinger of this wave of new creative voices on TV, but the debut of Twin Peaks in 1990 changed the game utterly. Twin Peaks took pop culture by storm, twisting the tried-and-true format of the nighttime soap (Dallas, Dynasty, Knots Landing, Falcon Crest, etc.) and taking it (and network television) to weird new places. ABC also offered the memorable near-future cyberpunk setting of the miniseries Wild Palms in 1993. Northern Exposure, which debuted on CBS in 1990, played with narrative forms, introduced a network audience to what we would then have euphemistically called “alternative lifestyles,” and offered an often-thoughtful examination of the intersection of settler culture with American Indians (most often pushed in that thoughtful direction by the Native actors the producers had insisted upon casting). Whatever the case, network television programs were beginning to crawl out of the strict formulae handed down from the early days of TV. This was also the era of politicians getting mad at “liberal” TV shows like Murphy Brown. And let us not forget the rise of upstart fourth network Fox and its wildly popular hit The Simpsons, which George H.W. Bush took to task during the run-up to his 1992 re-election campaign as his Gulf War popularity began to fade.
(In true ironic ’90s style, both shows responded to the controversy in character, further blurring the lines between fiction and reality.)
While network TV retained some of the monolithic power it had held in the ’80s and before, the influence of cable networks in the early ’90s could no longer be denied. Our household got cable back in 1983 or ’84, when there were maybe 30 or so channels putting out programming nationally. By the early ’90s, that roster had expanded considerably, and networks like CNN had suddenly become a cultural force. CNN arguably “came of age” itself during the Gulf War, with the memorable reporting of Bernard Shaw and his compatriots from “under a table” as the bombs began to fall on Baghdad and Desert Shield became Desert Storm in January of 1991. CNN’s ascendancy was almost entirely thanks to its 24-hour coverage of this first post-Cold War American conflict. I remember when Saddam’s forces rolled into Kuwait, a few days before my 15th birthday. That week, the invasion was a hot topic at the school newspaper as we put together the first issue of the school year during some summer hours. The editorial board, most of whom were rising seniors, seriously feared that the coming conflict would expand into a Vietnam-style quagmire, and openly wondered whether the U.S. military would need to reinstate the draft. In retrospect, it may seem silly for us to have thought that such a brief conflict would escalate in such a fashion. But looking at Iraq after the 2003 invasion, maybe our fears weren’t so far-fetched after all.
CNN was the only place for this kind of wall-to-wall news coverage back in the early ’90s. When the Los Angeles uprising took place in 1992 in the aftermath of the verdict in the Rodney King beating trial, CNN focused and galvanized public opinion through careful and judicious use of images: Korean-American grocers acting as snipers on rooftops, the ambush of truck driver Reginald Denny, and racist commentators opining about how the riots were “opportunistic.” This same dynamic would persist in the mid-’90s as Los Angeles again convulsed under a new media spectacle: the murder of Nicole Brown Simpson and Ron Goldman and the chase and eventual trial of O.J. Simpson. As a suburban white teenager in the early ’90s, my opinions about war, global geopolitics, social justice, and even American history were formed by the portrayals of these issues in mass media. As such, my views might not have been considered very sophisticated at the time. Still, when the moment came to put my voice forward in 1992, I and many others in my age cohort saw Bill Clinton, the first Baby Boomer to have a real shot at the Presidency, as a deliverer from 12 years of Reagan-Bush cultural conservatism, at the very least. (Of course, this concern betrays how important George H.W. Bush’s opinions on The Simpsons and Dan Quayle’s on Murphy Brown were to me, and how little I cared at the time about Bill Clinton’s cynical use of the Sister Souljah news cycle around the L.A. uprising to build up his bona fides with white “Reagan Democrat” voters.) On Election Day 1992 (I was not yet old enough to vote), I had my college admissions interview, and I was able to channel the hope for a political revolution I’d written about in one of my essays into the interview. I ended up getting into that Ivy League college, and Bill Clinton ended up being elected President. Neither experience over the next four years was as revolutionary as I’d hoped.
George H.W. Bush, in his address to Congress during the build-up to the first Gulf War in 1990, asked us to consider a “New World Order.” And the four years of Bush’s Presidency ushered in that very new world order, for good and for ill. The end of the Cold War was destabilizing on a lot of levels, pop culture included. New voices were indeed finding their way to briefly commandeer those few media megaphones, but how much was really being revolutionized? In retrospect, the early ’90s now feel like a period when the groundwork was laid for the severe stratification and inequality we experience today. As exciting as revolutions both large and small may have felt back then, so much of that energy was simply dissipated, poisoned, or co-opted by the existing power structures. Theorist Phillip E. Wegner, in his Life between Two Deaths, 1989-2001: U.S. Culture in the Long Nineties (the “two deaths” here being the death of the Cold War and 9/11), looks at pop culture during the ’90s. He notes that grunge and the resurgence of punk helped lay the cultural foundation for the later anti-globalization movement: “the argument could be advanced that grunge punk helped set the context for Seattle’s place later in the period in the emergence of the counterglobalization movement.” I will play the cynic in this debate and note that Seattle also fostered the tech revolution (thanks to the groundwork laid during the Cold War by defense darling Boeing) that birthed both of the colossal tech monopolies of the late 20th and early 21st centuries: Microsoft and Amazon (to say nothing of global brand phenomenon Starbucks). All that cultural capital of “cool” gets recuperated into the control apparatus so easily; Seattle’s was so thoroughly commodified by the powers-that-be that one of the era’s greatest hoaxes, the “grungespeak” dictionary created by Megan Jasper at Sub Pop Records, was swallowed hook, line, and sinker by the august (and out-of-touch) New York Times.
When I think about these years, the nostalgia feels so different from the faint yet deep memories of the ’70s and the vivid neon-colored images I associate with the 1980s. I always feel a strange sense of complicity about coasting through my high school years, blissfully unaware of the impact of the colossal events happening all around me. As politically engaged as I might have been at 17, life was still comfortable for me. The future looked bright, and the idea that I would live a life like my parents’—get a full-time job, a home, and a family—was a foregone conclusion. My material conditions would obviously be even better than theirs had been! I came of age, like young Americans going all the way back to the end of World War II, knowing deep down that the forces of history would not interfere in my quintessentially American pursuit of happiness. Now I’m middle-aged, and I realize instead that my generation, and especially those who came after, were the canaries in the coal mine signaling the end of that American dream. The passing of the Cold War released the ghosts of that half-century, and those ghosts have come home to roost in the United States over the past quarter-century, economically and geopolitically. The end of the Cold War was, indeed, as Jesus Jones sang, “the world waking up from history.” It just turned out that the new day dawning was far darker than the night that preceded it.