But: At the turn of the 20th century, a young Eleanor Roosevelt, on a field trip to the tenements, is horrified to see young children handmaking tchotchkes in their airless homes until they collapsed. Those in and adjacent to power were mostly oblivious (willfully or not) to the plights of the vulnerable, a privilege that smartphone and body cameras have all but done away with.
Social media amplifies and accelerates social movements to great effect, but have I read Beowulf? (And what else are cameras recording?) The consequence of a world in which the many plights of the many vulnerable can be made immediate is… TikTok. We can’t have both firsthand insight into every variant of the human experience and be expected to select one for 90 minutes’ contemplation. Complexity costs simplicity.
Speaking of, let’s return to modern discourse. From up here on my high horse, I enjoy staring down my nose at the naïveté of those who would summarize a policy objective in an Instagram story you don’t even need to hold your thumb on long enough to read. I roll my eyes and mumble “something something nuance.” I like the “nuance” meme because it’s very useful for me to gasbag away all the questionable choices I’ve made in my adult life rather than having to summarize them honestly! (“For the money.”)
I think that the rightest thing, morally speaking, is that which can be expressed simply. You shouldn’t need ten paragraphs to explain your value system.
But many right things can be true at once, and the challenge of governance is to build Frankensystems that serve as many of those right things as possible without collapsing. This is the blessing and curse of “hashtivism” (I hate simplicity but I love a portmanteau!) and technology: the speed, breadth, and concision of a given cause leave little room to jostle it into the Jenga tower of all the worthy ones.
Justice requires that all demands be satisfied. Power gets away with catering to a few. Case in point: our fractured Democratic party that pleases no one. (It’s unclear to me why the Democrats are communicating any message other than “A rising tide lifts all boats,” but I suspect they have some overeager digital strategists who have forgotten that there’s life outside their @ mentions.) Case in point: every corrupt government everywhere and every attempt to stem corruption that lapses into either dictatorship or just more corruption.
I like the old saw about the arc of moral history bending toward justice. I believe in the idea that we can build incrementally toward systems of governance that serve more people justly. (Or we could all just move to Denmark.) I think perhaps it’s impossible to do that without muddying each objective. Is that unimaginative? Lazy? An excuse to uphold a system that serves me well? Realistic?
Does it matter? I don’t plan on going into politics — performative all-nighters aren’t my thing — or even living in a swing state. I can see enough of the Wall Street Journal’s op-ed page through the hands over my eyes to know what I see as pragmatic and incremental is viewed by many as radical. (They are so mad about middle schoolers learning about slavery!) And I only need to spend five minutes thumbing through my Instagram feed to know that most of my friends would think I’m one sensible Ann Taylor necklace away from writing an op-ed for the Journal myself. My politics please no one, and secretly I think that’s the best kind of politics to have.
It’s possible that I spent 90 minutes thinking about this one thing this afternoon. I guess I can say I went to English finishing school now, too.
P.S. Part of me feels like I should apologize for writing about politics, but I’m only going to apologize if I start documenting my workouts or hawking spiralizers. (Which, to be clear, might happen if I don’t get out of my house again soon. Now that I have mostly learned to cook, it’s only my piss-poor photography skills stopping me from becoming a wellness blogger, and with enough time on my hands I might become Ansel Adams!!!!!)
Spotify’s Throwback Thursday playlist was Pride-themed last week. This post is not about how many of the songs on the playlist I had sung (4), choreographed (2), or danced to (7), but even setting the memory of my star* turn as Alexi Darling in the 2009 Vassar College production of “Rent” aside, I associate many of these with very specific moments in my life: My dear friend and former manager at the dancewear store, a person who does not lose her chill, flying into a blind rage any time “Love Shack” came on the radio. Doing step-claps to “Work Bitch” to warm up before my community theatre production of “A Chorus Line” in Virginia in 2013. And the boys in my fourth grade class, dressed in T-shirts cut as muscle tanks, dancing on the stage in our multi-purpose room to “Macho Man.” Wait, what?
For a couple years in elementary school we put on productions where each class did a little lip sync and dance to a song from the same decade. In third grade we did the ’60s in tie-dyed shirts and I think my mom did my hair in a bouffant a la Brigitte Bardot. My class did “Pretty Woman” with “choreography” that amounted to all the girls in the class chasing the handsome boy that all of us had a crush on anyway. I have a distinct memory of there being some kind of dance-off audition between him and the boy that all of us thought was gross, who obviously lost, and I won’t swear we didn’t lobby for that outcome. The teachers did Nancy Sinatra in cowboy boots. It would have made for quite the 2017 exposé.
The next year we did the ’70s, hence “Macho Man.” Have you listened to “Macho Man” recently? I mean, really listened? (Also, did you know that the Village People released a Christmas single a hundred years ago in December 2019 that hit the Billboard Top 20? The answer to both of those questions is, presumably, no.) It’s like a minute and a half of “Macho, macho man” and a minute and a half of variants on the phrase “BODY! Don’t you want to feel my BODY?” It is not, by any measure, a song you should trot a bunch of nine-year-olds out to, no matter how much fun they have flexing their chicken arms in front of a bunch of shrieking moms. (Is this my weirdest childhood memory? Is it weirder than when I won a contest for knowing the most about penguins and got to spend an afternoon yelling penguin facts at unsuspecting strangers at the children’s museum?)
Of course, none of us had any idea that the boys were flexing to a gay disco anthem.
Amid our current moment I find myself borne intensely back into memories that I feel compelled to question. Like — did literally nobody notice that they had a bunch of children dancing to blatantly sexual lyrics? (Where were the Mormons? I once got tattled on for taking part in a game of “Light as a Feather, Stiff as a Board” because in Provo they thought that was witchcraft.) Like — what does it indicate that I have a deep personal connection with so many songs considered to be LGBTQ anthems when I’m not part of that community? And for all of those connections to be from happy times that I treasure? There’s something a little off about having only joyful memories of music whose creators have a lot of memories that are less than joyful. The other day a friend shared some thought piece along the lines of “Straight people, check yourself before you go to Pride because you think it’s a fun party.” Relevant? Relevant.
I got a little weepy reading Ben Brantley in the Times the other day on memorable performances. I’m lucky to have seen a lot of really, really stellar live entertainment in my life: “Hamilton” in previews on Broadway (a fact I will continue to mention in every other conversation until I die), Patti LuPone as Mrs. Lovett with a tuba, Tiler Peck doing “Other Dances” at SPAC, John Cameron Mitchell reprising Hedwig in a knee brace, and some people would probably like me to add Ben Platt as Evan Hansen but that show was awful and I will die on that hill. Anyway, all that and still what I remember most vibrantly is being a fourteen-year-old freshman at the performing arts high school in the fall of 2003, watching the senior boys dance Ailey’s “Sinnerman” from “Revelations.” It was the first time I’d seen modern dance and I felt like I’d exited my body. They were good; I’m sure I’ve seen better since. It doesn’t matter.
Then I thought about my own favorite experiences as a performer and again, in spite of everything — the New York choreographers, the crowd-pleasing musicals with crowd-pleasing numbers, the classic ballets — I came back to the year the head of the RAD program at my ballet school set an African dance piece for our annual recital finale. I was probably twelve or thirteen and was better than mediocre (not by much) at ballet, tap, and jazz, and I fucking loved it. We all did. The boys got to draw makeup “muscles” on their chests and we wore whatever zebra/glitter unitard monstrosity the costume catalogue had on offer that year, shellacking feathers into our buns with AquaNet in fruitless hope that they’d stay put while we flung our heads back.
Our teacher was a Black woman from South Africa who had the perfect mix of fearsome authority and the desire to make us all share in her pure love for dance. You got good in Miss Carole’s ballet, tap, and jazz classes, but you also tapped into joy in a way that most dance teachers aren’t capable of cultivating. Every year we started preparing for the June recital in January, and she told us once that she liked to start choreographing during the layovers on her long trip home for Christmas, from Las Vegas to South Africa, marking out dances in the corner of the airport. Her son studied with us and he always seemed a little embarrassed about the whole dance thing. I can only imagine him cringing next to the gate agent while his mother did a jazz routine with her headphones on.
In college I was a member of Vassar’s dance company. A friend and fellow dancer who left modern and ballet for hip hop and street dance after we graduated recently posted on Instagram about how shocking it was to look back and realize that even at our progressive institution, the dance program was built entirely around historically white forms of dance. (J. Bouey unpacks this big statement briefly here.)
That was a real red pill moment for me (NB: I watched “The Matrix” for the first time like six weeks ago and boy howdy, was THAT good timing!). Much of what else I’ve confronted recently I had already read about in the New Yorker or whatever, got outraged about briefly, and forgot, as most high horse-sitting liberals tend to. This was an incontrovertible truth about a world I lived in for some fourteen years that I never thought to observe. What’s worse: You knew about injustice already and forgot, or you never noticed it in the first place?
As a child, you have no idea that you’re dancing to a gay disco anthem. You don’t think about how it is that your Black South African dance teacher ended up in Las Vegas teaching English-style ballet at a school in an overwhelmingly white suburb. As a young adult, even as I was writing history papers about Black choreographers, it didn’t occur to me to ask why I’d almost exclusively been taught traditionally white forms of dance, or why I hadn’t sought out African dance classes after learning how much I enjoyed the form. I’m 31 and learning more from a month of Instagram memes than I did in years of traditional education.
Anyway, weird time. As much as I’d rather not bear witness to mass death as a consequence of institutional racism, sclerotic bureaucracy, and bad luck (how many bat-borne zoonotic diseases DON’T land on a jetsetter at the seafood market?), I realize that it’s a privilege to live through this moment. And I mean “live” literally, not figuratively. It feels good to unwind the anxiety spiral into a productive line of inquiry. I never wanted to live through a dystopian novel, but I always wanted to live through history. It’s a privilege to watch the world change in real time and to change along with it.
As I start thinking about how to eventually market myself as an author, I’ve set myself to actually participating in social media. In case you’re wondering how that’s going for me, a misanthrope, here is an actual excerpt from my diary this morning:
“It’s nice to see how engaging on social media begets engagement on my own content, though even as I write that I’m filled with anxiety about being sucked into a miasmic echo chamber in which no art can be produced because all artists are preoccupied by the endless cycle of quipping and liking and being liked and at the end of the day, all we’re left with is, effectively, a circle jerk.
Is this basically the same thing as the patronage system of yore [Ed.: Yes, I did use the word ‘yore’ in my diary], though? Is it better, since artists have more agency over who they engage with, or worse, since it’s not as if most of us (“us”) are on Twitter at the behest of someone who’s paying us, only on the strength of the collective delusion that this is the only way to eventually get published? (And because it’s a good way, speaking of delusions, to feel productive without having to actually produce anything?)
Anyway, I haven’t had to shower yet today, and it’s fun to have a famous author acknowledge your existence / feel like you’re on the same plane, so let me just truck along with these endorphins and I’ll be copacetic.”
And in case you were further wondering how my manuscript is going, rest assured that I’m far too busy thrilling over having been followed by the NYT bestselling author I @ mentioned yesterday to do anything as pedestrian as actually write. (I did remove all of the quotation marks from my novel-in-progress to see if it would make it seem more literary. It did, but it also made it incomprehensible, and made me look like a dick, so then I put them all back. #crushingit)
As an aside, participating in Twitter as an aspiring author kind of feels like getting in line for the slaughter. Is it time to get canceled yet?
(Related: I’ve been combing and organizing this blog’s archives and have been genuinely alarmed by some of the language I casually threw around in my very early twenties, in 2011. Yikes, past self! I always suspected that this blog would be one of many reasons I can never run for elected office, but boy howdy, did I traffic in some tired tropes not long enough ago for it to be cute. Woof. Is this what the rest of my life as a white lady is going to be like? One cringe-inducing trip down memory lane after another?)
“To understand just one life, you have to swallow the world.” — Salman Rushdie, Midnight’s Children
Where you were when
When September 11th happened, I was twelve, a couple weeks into seventh grade. The footage on television was terrifying, but my classmates and I had never been to New York, and it felt consequential but not visceral. Our adults kept telling us that we needed to remember where we were that day “when we heard.”
They talked about JFK being assassinated and the Challenger exploding, and I didn’t feel like it mattered much that I, Dana Cass, was plugging in my curling iron in the Las Vegas suburbs when I heard a disarming report on the radio. But here I am nineteen years later, still conjuring the feeling of the bathroom tiles underneath my feet before I ran downstairs to turn on the television.
I’ve had some excellent history teachers who have taught me to properly interpret what I hear, see, and read, and of course now every podcaster whose closet has decent acoustics is out debunking one established symbol of history or another. For a long time, I’ve groused that we flatten history into a series of events that photograph well, and that in doing so we distort our understanding of how we got here and there.
Case in point: I remember Where I Was When Obama was elected for the first time (in a crowd of fellow first-time voters in the student center at Vassar, next to a friend who was weeping into a travel mug spiked with raspberry vodka) and Osama bin Laden was killed (nested amid a pile of books on my last standard-issue twin bed, writing the last mediocre paper of my college career, flipping between Microsoft Word and Safari open to CNN.com, the May breeze blowing through a window whose screen had been ripped open the prior weekend when campus security broke up our party and the attendees fled through my bedroom).
I also remember, bizarrely, applauding a radio broadcast that announced the conviction of Sandy Murphy for the murder of her casino billionaire husband Ted Binion following a trial so lurid it could only have taken place in Las Vegas, from the swimming pool in my best friend’s backyard in 2000, after my mom bought a couple pallets of water from Costco in a perfunctory nod to Y2K but before my next-door neighbor read aloud a poem her parents had been emailed called “How the Gore-inch Stole the Election,” during our morning carpool, and I learned about partisan politics for the first time.
Waiting for when
I don’t remember any one historic moment between November 2008 and spring 2011; I do remember that in 2010, I saw over someone’s shoulder what turned out to be a faux New York Times headline proclaiming “IRAQ WAR ENDS.” I had a brief remember-where-you-are moment before I realized it was fake, though it’s taken until recently for me to understand that the Iraq war wasn’t — isn’t — the kind of conflict that was going to be sewn up with a V-E Day.
(Neither was World War II, but my early education mostly elided V-J Day and the war beyond Europe more broadly, especially where American moral clarity was in question. If it weren’t for crossword puzzles, I might still not know that Ethiopia was among the theaters in which WWII was fought.)
There would be no soldier dipping a nurse, no symbol of war as something that begins, yes, and is terrible, but reliably ends. Part of me, having been steeped in the American tradition of moral certitude and ham-handed symbolism, is still waiting for that ending.
If it’s not on Twitter, is it even history?
The ubiquity of photography, and the ensuing barrage of images as indelible as the Zapruder film or the billowing orange contrails where the Challenger was supposed to be, has made history even more like a boiled frog than usual. I can’t figure out whether everything is a watershed moment or nothing is.
This isn’t a hot take. Every third person wringing their hands over the advent of social media and the 24-hour news cycle shares this sentiment.
But I — wait for it; this is about to be a real stretch — recently read Salman Rushdie’s Midnight’s Children (hence the epigraph) and think there’s something to be said for understanding the subtle gradations that color transitions from one saturated moment (Tahrir Square; a Trump rally) to the next, and how the mundanity of individual memory and experience better captures the zeitgeist as it evolves.
My excellent history teachers imparted to me the value of primary sources, extrapolating a cultural moment from individual lived experiences. We’re swimming in primary sources. History is Zapruder and contrails, but history is also a twelve-year-old plugging in her curling iron and a 21-year-old staring for the first time at the war machine in action.
(A few years later, when I was working as a proposal writer for a defense contractor, someone printed that photo of Obama and company in the Situation Room and taped it to my office door with “WAR ROOM” written on top in ballpoint, so that everyone passing would know that my officemate and I were hard at work chasing a new contract for the war machine itself.)
I remember that period in the ’90s when photomosaics became popular. The other week I saw an installation at the Barbican composed of labeled images from the ImageNet database that has enabled automated image recognition. It’s apt to compare the recognizability of a photomosaic (it’s the Mona Lisa! Made up of everyone who came to see the Mona Lisa this year! Etc.) to the anarchy of one arbitrary slice of the modern Internet.
But it only requires some imagination to extrapolate the implications of, e.g., the series of images of besuited men labeled “venture capitalist,” and similarly you don’t have to work hard to roll your eyes at a twelve-year-old white girl in her suburban bedroom who couldn’t have found Afghanistan on a map squinting her eyes shut to fix the memory of Where I Was When the bad men came to attack American values.
And here I am now, passing the “Prepare for Brexit” signs posted at bus stops on my way to my office, having left the US after the morning when I eavesdropped on a businesswoman opening her conference call with appropriate solemnity (“We’re all a little quiet this morning…”) in the airport lounge en route to Japan, where the friendly Japanese man who led us in entirely the wrong direction off the top of Mount Inari shook his fist and said “Trump!” at us fiercely when we told him we had come from New York. I read about Leonard Cohen’s death a few days later in a coffee shop in Shimokitazawa. Does it matter? Do I matter? Time will tell.
I can’t figure out why YouTube wants me to watch Carpool Karaoke and segments from the Ellen Show so badly. Most days, I watch two videos on YouTube: a yoga video from one of a few channels that I like, and one of two videos with a sequence of exercises for unlocking your lockjaw, both from a movie-handsome chiropractor who wears a wedding ring the size of a cuff scrounged from antiquity.
I know I should relax and learn to love the algorithm, but I don’t get it. I watched the Carpool Karaoke episode where they acted out “The Sound of Music” on the streets of L.A., and obviously I’ve watched every Taylor Swift interview Ellen has conducted, but beyond that, it’s all yoga and that chiropractor teaching me how to massage my suboccipitals, all the time.
(Pause to acknowledge how precisely on-brand my YouTube tastes are. Second pause to glowingly recommend Taylor Swift’s Netflix documentary, which you should absolutely catch before it wins Best Picture in a few hours.)
Current working conspiracy theories:
1. Nobody binge-watches yoga classes or chiropractic instruction videos (though I did unearth what might be a subculture for watching videos of other people having chiropractic adjustments; just leaving that nugget there for you all to chew on so I don’t have to keep doing it alone) and the algorithm is trying to point me to videos that I’m likelier to binge-watch. Counterpoint: The algorithm isn’t pointing me to videos from Tony Awards performances from the ’70s and ’80s, so why is it even bothering?
2. The chiropractor’s channel is several helpful videos on at-home exercises to alleviate various ailments, from TMJ syndrome (the technical term for “I can’t open my mouth because I grind my teeth so ferociously that I’ve nearly bitten through my night guard”) to tennis elbow, and… a video hawking the benefits of not vaccinating your children. The algorithm, which resents my affinity for woo-woo, is subtly trying to point me back toward science. If only the algorithm could see the look on my disgruntled face right now as I listen to a child cry in my vicinity.
3. If you seek relief from stress through yoga and movie-handsome chiropractor videos, and also you’ve watched the original Broadway cast of Les Mis perform “One Day More” at the 1987 Tonys more than three times, your innate character is one that wants to watch Carpool Karaoke. Lie back and think of England (James Corden’s accent will help).
The problem with #3 is my niggling paranoia that one day the Internet is going to disappear and I won’t be able to do anything anymore. I usually think about this when I’m spinning around in a circle on the sidewalk trying to figure out where the blue dot is telling me to walk, but sometimes I wonder whether I could even choose my own reading material if left to my own devices (or, more precisely, without them).
Do I even know what I like anymore? (I guess I’ve been pondering this for a while.) I immediately forget most of what I read. I’d been blaming it on my attention span, but it occurred to me recently that maybe I just hate most of what I read. I slept terribly all last week because I started Tana French’s latest on Monday and I kept staying up long past my bedtime — reading, and then wondering if the shadows in my bedroom were intruders, and then wondering, if the shadows in my bedroom were intruders, what seemingly insignificant incident from my childhood had triggered their presence? Also, are all murder detectives shrewd and pithy calculators who can sniff out human weakness like the tasting notes in a fine wine, or just Irish ones? Also, how do you pronounce Gardai?
Anyway, it’s been nice to remember that books can be good. There are also only fifteen or so albums that have been released in the past decade that I actually want to listen to over and over again. (All of them are “1989.” Kidding! Maybe! See footnote.) I dutifully listen to my Spotify New Release Radar every Friday, but little speaks to me.
I guess the problem is that taste is eclectic. I was going to say that my taste is eclectic, but that seems unfair to everyone else who is more mercurial than predictable about what they like and don’t, which I assume is most people. How do you square that with predictive recommendation algorithms?
I read the first four Harry Potter books upwards of 40 times each as a child and then, while I waited for the next three, tried and discarded the canon of derivative books about boy wizards (sorry, Artemis Fowl), then gave up entirely on fantasy as a genre until I read The Night Circus, following which I wrote an honest-to-God fan letter to Erin Morgenstern. I love Tana French, yes, and I loved Gone Girl, but every subsequent entry into the unreliable-female-narrator genre is trash and I won’t be convinced otherwise. I am over misogyny as an artistic technique but I can’t stop reading Murakami, except 1Q84, which is a doorstop, not a novel, and I loved Super Sad True Love Story, though I hated Lake Success. I hit peak dystopia after the first Hunger Games and slogged through not only the rest of the trilogy but also the abominable Divergent series, which offended me so badly I swore off anything set in a future; but then the genre went highbrow, and I rolled my eyes but can’t say I wasn’t unmoored by Station Eleven (a book nobody should read until all cruise ships have been released from their coronavirus quarantines. Trust me) and Severance. I’ve already forgotten every novel I read in 2019 except Trust Exercise, even the ones that are also about bad people in positions of power, with and without clever plot devices. I’m a little devastated to admit that I think I’ve outgrown YA, though excited to eventually be ready to read Mrs. Dalloway.
TL;DR: My tastes are mercurial. I like books that speak to me. If I were to draw a thread between my favorite books, it’s protagonists that exist at a slight but impassable remove from reality: friendless boy wizards who make friends only to discover that friendlessness hardens into a character quality (cf. Harry Potter but also The Magicians), educated twentysomethings ashamed of their lack of ambition (Sweetbitter), educated twentysomethings ashamed of their lack of ambition even as they flee a global pandemic (Severance). I like books where the slight but impassable remove from reality is incidental, not the plot itself (ergo my dislike of Divergent, although I also prefer my books to read like they were edited at some point).
I’m not sure that’s a quality you can write into an algorithm. I like what I like.
So — where was I? YouTube’s seemingly baseless recommendations. I completely lost the plot there, didn’t I? How do I sew this back up into something? Conclusion: Art doesn’t need to be a buy-one-get-one situation; anomalies are precious. Half the reason I liked the Carpool Karaoke “Sound of Music” video was its sheer weirdness. No book that sets out trying to be Gone Girl can be as audacious. Dystopias were over before we entered into one. I’ve even developed an affinity for my anti-vax chiropractor and how he stares into my soul while he teaches me how to massage my masseter muscles. I don’t want YouTube to find me another chiropractor; I want YouTube to find me something radical that I can’t unsee. Is there a setting for that?
 1989, yes, but also Badlands, 1000 Forms of Fear, Strange Desire, By The Way I Forgive You, Queen of the Clouds, 3 Rounds and a Sound, The Fool, Blue Neighbourhood… I’m sure there are a few more, but I can’t think of them now.
I was sour all this week. Logically, I knew it was because it’s January and there’s nothing good about January, especially not in this year of our lord 2020 when the next ten months are going to be an even more arduous slog toward inevitable disappointment than usual. Emotionally, I decided to blame it on “hot-desking,” a lesser-known scourge of work in the age of lifehacking wherein one isn’t assigned a desk but is instead invited to share a “pod” with their teammates. To me, this is a nightmare on par with weddings without seating charts, and I yearn for my past life as a dancer when barre spots weren’t assigned, per se, de jure, but God help you if you stood at the spot furthest from the mirrors on the barre nearest the courtyard because everyone knew that was my spot.
I was also sour because I’ve been trying to read more twentieth-century classics and so I’m gnashing my teeth through Lucky Jim, by Kingsley Amis. It’s a sendup of postwar England in which the hapless protagonist suffers, among other indignities, the hysterics of his would-be ex-girlfriend upon trying to dump her. Actual hysterics. Screaming, sobbing, frothing at the mouth until someone slaps her in the face. I’m too humorless and militant a misandrist to abide tired stereotypes, even in the context of satire.
To be fair, I was predisposed to dislike Kingsley Amis, the second husband of Elizabeth Jane Howard, my favorite literary discovery in 2019. She wrote the popular Cazalet Chronicles, five volumes of family saga that span pre- to postwar England, among other well-reviewed novels, but during her marriage to Kingsley her career took a backseat to his because that’s what was done then, and so I hate him out of allegiance to “Jane.” Sorry, Kingsley. (Besides, who the fuck names their kid Kingsley? Honestly. Brits.)
At the beginning of 2016 I decided to spend the year reading only books by authors who weren’t straight white men. It was a terrific experiment that took on unexpected poignance that November (I watched the election returns in front of a literal shrine to women leaders in history that my friend built for us to celebrate in front of, in case you were somehow confused about where my loyalties lay) and one that’s stuck with me, in terms of both the books I select now and my view on books I’ve read in the past. In my early twenties I read a lot of Philip Roth and John Updike and I couldn’t figure out why I felt so dejected every time I finished an American Pastoral or Rabbit, Run.
I obviously appreciate erudite writing that captures a time and place indelibly, and I love to read about socially unacceptable human foibles, but it’s only been in recent years — after immersing myself in voices from the margins, and in the era of #MeToo — that I’ve realized that I just don’t really like misogyny as a literary technique. God help me if I have to wade through another gratuitous description of the hysterical wife of a put-upon man chafing at the bonds of corporate servitude and his milquetoast children. Give me Eileen and her constipation any day.
I didn’t have the energy to deal with hot-desking this week, so instead of a desk I sat at a countertop between the video games and the pool table (recall that I work in Silicon Valley, where employment contracts are Faustian bargains, though it turns out the eternal youth gets old once you hit thirty). Fortunately, I joined the London location of The Wing in November, where I can leave behind the animal screams of post-adolescent coders taking breaks from “deep work” to hear women dressed in the millennial British uniform of that Zara dress over Chelsea boots under a boxy pastel car coat use the phrase “side hustle” in a sentence.
I felt especially grateful for The Wing during a week that felt spectacularly male with Kingsley Amis prattling on about the unbearable lightness of women who don’t follow recommendations on what lipstick to pair with your pallid skin tone and the only Bernie bro I know tweeting prolifically. It feels extravagant to pay for a coworking space when I already have a home and an office, but I have to spend the rest of 2020 and also, probably, my life catching up on the great misogynists of twentieth-century literature and being governed by the great misogynists of twenty-first-century politics and riding the Tube to work underneath male armpits. If shelling out an arm and a leg to sit underneath an oil portrait of Phoebe Waller-Bridge gets me through paying taxes to two governments led by men who have single-handedly inspired white women to rage-knit more performatively than ever, then it’s money well spent.
When I think about unpacking writing to its constituent biological processes, or to the rules of grammar and tone that compose it, I feel nauseous like I do when I think about what’s outside of the universe, or God, or my most profoundly embarrassing moments.
At best, my writing is Martha Graham’s quickening translated through me into action. Writing, when I do it well, isn’t something I think about; it’s something that I do with my body, as I did in my past life as a dancer. That writing might be something other than an incalculable force is anathema to the confidence that I’ve developed over the past several years since I first came to realize that there was, in fact, a thing that I was good at.
Writing this essay was an out-of-body experience. I was 25 and absolutely wretched with despair. I was crawling with feelings and memories. I didn’t know where to put my rage and shame, nor did I know how to ask the world to pity me, and then finally I began to feel something bubbling up at the base of my skull, and I put my hands to the keyboard and then there was my heart, articulated. It was the first time I had felt powerful in months. It was the most powerful that I had felt. It was no biological process or series of instructions that a computer could execute; it was unfathomable. It was transcendent.
In truth, I know that I, a writer, am a machine. I consume the New York Times Morning Briefing and Reddit threads about the misery of the Tube and I listen to my colleagues tell me all the ways in which they would do my job if they were me and I catch sight of a long-gone lover rock-climbing with his new girlfriend on Instagram and I scroll through movie reviews and restaurant reviews and gadget reviews email after email after email after email. What comes out the other end is one sheet from the multiverse, a dispatch from the version of me who crammed onto this morning’s Central Line to White City underneath the armpit of a man listening to a song that I haven’t heard since the long-gone lover played it for me in, for reasons that escape me, a parking lot.
In the same way that baking is chemistry, and you can’t eyeball the baking powder, so is artistic expression. What I put on paper is the product of the precise number of hours I spent in the thrall of my A.P. English teacher in 2006 and the precise number of times that I’ve reread the first love letter I received as a semi-grown woman and the precise feeling I get when I forget that I’m brushing up against a stranger’s sweat, jostling for a grip as the train rattles from St Paul’s to Bank, and remember instead that I live five thousand miles from where I was born. Had Mrs. Hampton retired five years earlier, I could be writing investigative journalism, not prose poems about the normal things I hate.
What I create bears the mark of what I’ve consumed. And does that make me any different from a bot recapping the high school baseball season or a Russian troll farm regurgitating Stormfront in a Facebook ad?
I like to think of myself as exalted. I’m an artist. You can’t teach an algorithm to feel where the commas go in its bones. I’ve never felt that my talent is explicable or that my job, to speak bluntly, is at risk of being automated away. I don’t know where to put commas because I memorized Chicago; I know where to put commas because I feel it in my bones. I’ve made a career of putting commas in such a way that the person on the other end can’t help but feel what I’m feeling or buy what I’m selling. It’s a function of my being one with the commas. It’s innate.
But, then, how did I learn to drive? How did I learn to scale the shelves in the stockroom at the store where I worked in high school to restock a cartonful of shoes in the twenty minutes I had left before they stopped paying me whether or not I was done? Speeding down I-15 outside of Las Vegas, through the alien desert with mountains looming high above, is a task that a robot can do, but a joy that only a human can feel. Once, at the store, I fit a woman with half a foot missing for a pair of shoes; I held her damaged foot in my hand and we looked one another in the eye while she told me what she needed to be comfortable.
It’s precious of me to imagine that being good at something that’s hard to teach makes me immune to the force of technology. I don’t get paid — yet — for the kind of writing that makes me really tick. And the writing I do get paid for can be such a slog that I might envy the robot that could dispassionately listen to the engineer line-editing my copy on the basis of his having once written for his college paper. (Perhaps we could train the robot to also dispassionately flag every time the engineer suggests language that is a little phallic for a technology marketing document. It, being neither a woman nor sentient, might get better results than I.)
And yet. I write because I think it’s the best thing I have to offer the world, but I also write because it’s the best thing the world has to offer me. I can live with the idea that I might never drive a car again. I can’t live with the idea that one day holding a pen and scratching it on paper or letting my fingers fly along the keyboard might be quaint, that my naked human prose might not pass muster next to the output of a machine that has read more of Proust than I have. (Which is none, as long as I’m offering up naked human prose.)
I want desperately to make a career of letting people see themselves in what I write and I’m scared to think that I might be up against not just the army of Buzzfeed listicle writers who have bafflingly landed book deals and an industry that only buys knockoffs of Gone Girl, but… robots. Or, more specifically, the decay of attention devoted to good writing. Machines can get the job of imparting information done. You can call it utilitarian, but what’s to say that writing — mine, or anyone’s — is more than that?
It’s rich to claim that what I exude when I’m feeling productive is unique or valuable. You could, as Seabrook finds, mix up the same ingredients in another pot, and the consumer might be none the wiser. So who am I to imagine or even wish for a stop to the technology that so inexorably marches over what others hold as dear as I do writing?
When I was little, growing up in Las Vegas, I liked to name the colors I saw outside. I had the jumbo box of Crayolas, and I reacted almost synesthetically when they named the colors right. Cerulean made me tingle. It was blue like I’d never seen before, blue like they don’t have in the desert or even in the ocean off Mission Beach, and the name was like the fairytale kingdoms that I used to write stories about in my piles of spiral notebooks. Asparagus made me nauseous and so did its eponym (and anyway, jungle green was the only green that mattered). Robin’s-egg blue was pretty but predictable; razzmatazz was cheap and trashy.
When my aunt used to visit from Santa Barbara we’d walk slow through the Red Rock and name every color we saw. It was how I tolerated the Mars-red desert, so beautiful and alien from my fluorescent everyday that I could hardly stand it. This is still how I respond to beauty: I feel it overtake me and then I want to make it mine. Looking isn’t enough. I want to bottle the second act of Giselle and eat the vista of fir trees that blanket the German Alps and stash the gold foil of Wat Phra That Doi Suthep in my pocket for later.
I started taking ballet classes because I thought it would make me feel how I felt when I watched the Nevada Ballet dancers in their tutus on the stage at the university. (It did, and once every two years when I take class nowadays it still does, even when I catch sight of myself in the mirror and remember that like Jody Sawyer, I wasn’t born with turnout.) Dance gave me what I lost from music after a prodigiously talented sixth-grader swooped in and stole the first chair from me in the Becker Middle School orchestra. I was all set to be indignant, but then he started to practice Bach’s Cello Suites, and I forgot for a moment what anger even was. I don’t suppose there was much I could do to come back from the shame of being in the middle school orchestra but even so, I was unwilling to risk it by doing something so gauche as actually watching him, so instead I looked at my shoes and flicked my eyes leftward every so often to peek at him hunched over his cello, sawing and swaying like it was part of his body.
I wanted to play like that too and sometimes when I practiced, when no one was home, I would try to sway my body along with “La Cinquantaine.” But it didn’t work for me. The music didn’t live in my bones like it lived in his. I swallowed the desire and stared at my shoes and told my friends stories about “Weird Cello Boy” who moved his body in time with his bow like he was possessed. That was the same year I started ballet in earnest and in time, I began to feel the beauty I craved in my bones.
I see a lot of beautiful things these days. I live in Europe now, and one of my favorite things to do in a new city is to visit its museums. I grew up with a print of “Starry Night” on my bathroom wall, and I was nine when the Bellagio hotel opened in my hometown of Las Vegas and I saw Monets from Steve Wynn’s collection for the first time. Las Vegas is a grim place to learn about beauty, but the Bellagio was a game-changer. I had never seen simple rooms like the ones the Impressionists painted, wood floors and iron bedposts and windows that flung open onto vistas of endless corn.
I drank it in and then puberty hit and I forgot all about visual art, losing myself instead in the sweet release of dance. Then a decade later at Vassar, I steered clear of art history because it was the domain of the lank-haired girls with New York private school pedigrees and coke habits (also, I was afraid I’d fall asleep every day). Today I can’t get enough. Travel can be overwhelming and art compresses it into something I can understand.
I thought a lot about art and how I digest it when I was reading what turned out to be my favorite book from last year, Elif Batuman’s The Idiot. The protagonist Selin is a college freshman and the book is mostly about her experiencing sublimity for the first time. Life becomes overwhelming, and art (and semiotics) compresses it into something she can understand.
I remember vividly how the raw emotion of young adulthood, the wringer of heartbreak, betrayal, watching the US bomb the shit out of the Middle East, etc., gave way to realizing other people felt those emotions too, and that art was what they did to make them manifest. I nearly lost my mind several times during AP English my senior year of high school. I tucked a printout of “Good Country People” into the back of a textbook to read during a lecture I found boring, and I was so overcome by the ending that I got up from my desk and walked down the hall to find my English teacher and flap my arms at her until she sent me back to class. This teacher also read us “The Hollow Men” out loud one day and I remember that she looked almost sly during the final lines, as if she knew already what she’d see when she looked up after the end (“not with a bang, but a whimper”). I guess she’d been teaching for long enough to expect twenty slack-jawed seventeen-year-olds looking at her like she’d just elucidated, I don’t know, string theory. It was 2006. We were bombing Iraq and life was very long besides. We were all too aware of the Shadow.
Years later, I learned the word “sublime.” I don’t know philosophy well and maybe I’m perverting the definition, but this is how I think of sublimity, as my urge to shake myself free of what “Good Country People” means about humanity or my fear that the silence following “The Hollow Men” would never end.
I had forgotten about the idea of the sublime until I read The Idiot. There’s a scene where Selin and her friend Svetlana, who are eighteen or nineteen, take up standing in front of paintings for thirty minutes at a stretch. It’s the kind of thing I used to do as a child — I recall distinctly sitting on the toilet for far longer than I needed to stare at that “Starry Night” print on the wall opposite — and the kind of thing I’ve forgotten to do now that I’m an adult, and busy, and living in a time when everything is ephemeral (the algorithmic timeline) but nothing disappears (the LiveJournal whose password I’ve forgotten). I think about my taste more than I act on it, and I’m ashamed of how I’ve gone to some fifteen European museums in the past year and yet all I want to do is beeline to the paintings that look most like Monet.
Last summer I went to a Picasso exhibit at the Louisiana Museum north of Copenhagen, next to the sea at Humlebæk. It was mostly his ceramics, and they were charming and I sent photos to my friend who likes when human faces appear on inanimate objects, but I was more interested in the tiny photos of his blue paintings on the timeline of his life pasted to the wall in one of the side rooms. If I had gone to the Picasso Museum when I visited Paris last year instead of spending 45 minutes in line for a galette across the street at Cafe Breizh, I might already have known about his “Blue Period.” In my defense, it was a really good galette, and I had already been through the emotional wringer of walking through Shakespeare & Co. to the sound of some hipster playing The Killers (the default soundtrack for every Las Vegan whose youth was a) misbegotten and b) in the early 2000s) on one of the bookstore pianos and then leaving only to see Notre Dame rising above the Seine through a strand of Edison-bulb Christmas lights, and I think maybe if I’d seen the Blue Period at that juncture I might have had to Javert myself straight off the Pont-Neuf.
The Blue Period paintings remind me of when I traveled to New Zealand for business in 2015. I was in a blue period of my own, and for two weeks I went jogging every morning along the Oriental Bay listening to Halsey and Sia. The water was the cerulean blue I only ever saw in crayons as a child, and that Halsey song “Colors” kept looping on my Spotify (“everything is blue, his pills, his hands, his jeans”). It was synchronistic, and poignant, and I felt grateful to have seen cerulean in real life but in utter disarray nonetheless.
Later I was ashamed to have been so sent by the synchrony between a teenager’s pop song and the ocean, which is probably the most pedestrian natural thing you can find to be moved by. I was ashamed again, in Humlebæk, to be ignoring Picasso’s little-seen, avant-garde ceramics so I could wax emotional over something so literal as blue standing in for sadness. And I’m ashamed every time I try and fail to make eye contact with a Basquiat or one of those wacky Pop Surrealist paintings that give me nightmares.
But lately, I’ve felt inclined to treat myself more generously. I feel so anxious to take in all the culture that Europe has to offer while I live here that I trot through museums staring at paintings that make me ill instead of standing like I want to in front of Woman with a Parasol until I will myself into a field in Argenteuil. Reading about Selin and Svetlana reminded me that I can still access the sublime, and that to do so requires giving myself over to it. There’s no point in giving myself over to something that doesn’t move me and no use in trying to be moved by something for the sake of performing sophistication.
I have also wanted lately to put away my camera and to feel sublimity in my bones again, not through a lens, to listen to what my body tells me about beauty rather than to try to measure it in likes. I put on my ballet slippers for the first time in a few years the other week and eased my way through a barre, and I remembered how it felt to be giving beauty back to the world.
I guess we’re all feeling this these days, in our collective awakening to the destructive forces of technology. I don’t think taking photos to satisfy the hunger that beauty evokes in me is any better or worse than naming the colors I see in the desert. It’s all just one means after another of negotiating my place in the world, and I’d argue that even looking at the world through my cracked iPhone lens I’m still better off than this French art thief who tried to cat burgle his way into taming his hunger for the sublime. Though Lord help me the next time I’m in Paris if I’m feeling as delicate as I was the last time. Give me another dose of acoustic piano, Camembert crepes, Gothic cathedrals, and my favorite Crayola crayon color that also reminds me of being 25 and heartsick and I might just have to grab the “Sleepy Drinker” and run.
This is part 2 of an ongoing* series about the Internet. Last week, I talked about how social media was my conduit to self-actualization (at least once I emerged from underneath the rock where I’d been hiding from Instagram for five years). This week, I counter that thesis by arguing that the Internet is a medium that is destroying our messages, and I’m not just talking about being limited to 140 characters. Next week, I’ll write about the meaning of identity in the machine learning era.
*It was going to be 3 parts and then it was going to be 2 parts but now it’s going to be 3 parts again and in the course of writing those 3 parts I’ve realized that I have A LOT OF FEELINGS ABOUT THE INTERNET, so why limit myself?
I didn’t expect that trying to learn about search engine optimization would trigger my latest existential crisis, but there you have it. (It’s been that kind of year, hasn’t it? I can’t figure out if it’s the omnipresent threat of nuclear war or if this is just what it’s like to be 28.)
I was trying to figure out what you’re supposed to be doing if you actually want people to read your blog. This in and of itself wasn’t that eye-opening, because I know perfectly well that there’s a metric fuckton of content on the Internet and you’re supposed to be doing some voodoo magic to make sure that when people Google “Dana Cass” they don’t come up with someone’s Florida mugshot. (Someone else’s. I’ve never been arrested in Florida, although I did consider burning down Harry Potter World when I went there in October, realizing that I had paid the equivalent of three new pairs of shoes to lay waste to my most precious childhood memories. The frozen butterbeer was really good, though.)
So I was reading about SEO, which already feels like the used car salesman patter of the digital age, and then I came across this saga of how mattress reviews are actually just a proxy for the battle to dominate an oversaturated market. And then I was trying to figure out what to do with my books while I’m living abroad next year, and it turns out you basically can’t find anything unbiased about long-term storage. It’s literally all so-called sponsored content. (Pardon me if I don’t link it here lest I negatively impact my SEO with links to low-quality content. You, too, can Google “long term storage nyc” if you want to dispel the few illusions you had left about the democratization of information being net positive.)
“Indeed, it is only too typical that the ‘content’ of any medium blinds us to the character of the medium.”
“Sponsored” is a euphemism for “paid,” which means that what you’re reading is an advertisement disguised as neutral information. This is not the first time I’ve thought about the elusiveness of truth on the Internet. As it turns out, that’s a hot topic lately. But I’ve felt lately that a number of threads I’ve been tracking are beginning to converge, specifically: there is a metric fuckton of words on the Internet and consequently, the words themselves matter less and less.
I was reading some casual media theory a few weeks back. (Quick piece of advice: reconnecting with my academic self has been a great way to navigate the apocalypse without going completely insane. I balance out the New York Times with selections from my college bookshelf.) I didn’t spend much energy in college on anything that happened in the past hundred years. I spent most of my time on the nineteenth century — including a semester where, memorably, I managed to write more than one final term paper on the relatively narrow topic of the Shakers — so last month was the first time that I’d actually read Marshall McLuhan of “the medium is the message” fame.
In the course of my work, I spend a lot of time thinking about data and technology and the impact their use and misuse have on our daily lives. I spend much of my spare time writing. I don’t often think about the connection between the two beyond how I apply my talent as a writer in service of my company, where I was hired in 2012 to write proposals and white papers. I had heard the term “content marketing” and I assumed that that was what I was doing: writing things to get people to buy something. It was only when I started applying to content marketing jobs that I learned that even though I’m a better writer than most people I know, writing is not actually the point.
A massive cottage industry has sprung up around “content marketing,” which is not the art of writing well to describe what your company can offer a client but the science of getting in front of as many eyeballs as possible. It’s “the medium is the message” taken to the extreme, where every resource is brought to bear against the medium and the message itself is, if anything, an afterthought. The objective is no longer truth or even precision but rather a sort of association, where if you walk away thinking Manhattan Mini Storage is long-term storage the content marketer has done their job right.
“The effects of technology do not occur at the level of opinions or concepts, but alter sense ratios or patterns of perception steadily and without any resistance. The serious artist is the only person able to encounter technology with impunity, just because he is an expert aware of the changes in sense perception.”
I have always held writing as sort of a pure act, even in the context of my profession. I write to convey truth. I don’t hold sales or marketing as antithetical to the pursuit of truth, at least not in their traditional forms. Content marketing, though, strikes me as a bastardization of my talents as a writer. Nobody has any illusions about the intent of a proposal or a white paper or even an advertisement on the subway. But an advertisement disguising itself as advice on how to improve your work from home experience? No, thank you. Stick with product placement and let the writers pursue their truth. (And when it comes to how the art of writing has been bastardized in service of moneymaking, don’t even get me started on internet journalism.)
Some time after I discovered that I can’t be a content marketer because I didn’t come out of the womb knowing how to optimize my blog content for search engines, I moved to a new role inside of my company. Today, I often help people who aren’t speakers prepare talks for large audiences. Most of this work is therapy — reminding people that “The audience wants to hear you share what you have to say!” in hopes that they will remember that their arms are attached to their bodies and that they might even consider occasionally moving them — but a surprising amount of it is simply trying to get people to just say what they’re trying to get across in plain language.
How does this relate to content marketing? It’s just another symptom of the epidemic of not being able, or no longer caring, to speak meaningfully. I work mostly with engineers who think a lot about data — information — and how to make it usable. They tend to think about speaking in the same way, where the actual thing that they’re trying to say is secondary to the way in which they say it. “So I’m going to talk about x, y, and z,” they tell me. We go into rehearsal a few weeks later, and they talk all around x, y, and z, and they ask me for ways to visualize x, y, and z, and at some point I look at them and say, “Well, why don’t you just say x, y, and z?”
Every time, it’s somehow a revelation to both of us that it can actually be that simple. In a world where we are inundated by content, speaking truth without the trappings of search engine optimization or fancy slides feels as impractical as speaking truth without a microphone. The message doesn’t matter if it’s buried in the medium. (I think I’m abusing McLuhan here, but bear with me.)
That’s upsetting, isn’t it? I’ve been in ongoing conversation with a singer-songwriter friend of mine who recently deleted his Facebook account because he’s sick of how promotion on social media — and, increasingly, success as an artist — depends on your ability and willingness to manipulate the ranking system. He doesn’t feel like tying his success to his being able to fund Facebook ads, nor does he feel like his success should be something that Facebook gets to monetize.
This is even more insidious when you think about the inevitable politicization of the mediums we’ve come to rely on to speak our truths. Maybe it was idealistic to think that art and truth were pure — patronage has always existed; newspapers have always had editors — but today it feels that they are elusive. Why are you reading what you’re reading, or listening to what you’re listening to? Who paid for it to reach you? What’s their end goal and how do you, the content consumer, figure into it? Are you the actor or the audience and who wrote the script, anyway? Do art, truth, and opinion still exist or are they all just a function of who’s paying whom to do what?
And man! All you wanted to do was buy a new mattress.
Every so often, I give up on pretending that I have sophisticated taste in music and turn on the kind of thing I used to wallow to in high school. It’s a sure ticket to the past, which has been especially welcome lately—nothing like escaping to the good old days when the president was just a war criminal and Chandler’s mom was still a punch line on Friends, am I right?!—and easier than ever now that everything’s on Spotify. (Just remember to turn off sharing, unless you’re proud that it’s 2017 and you’re still listening to Something Corporate. You shouldn’t be, in case that wasn’t obvious.)
So the other day, in between wondering if I should quit my job and counting the number of dystopian novels that I didn’t think to take as cautionary tales, it occurred to me to turn on Jason Mraz. While he’s arguably a better musician than most of his contemporaries on my high school playlists, it’s still difficult to justify the existence of a lyric like “it takes a crane to build a crane,” and let’s not even broach the subject of his newer albums. Like Alanis in the Jagged Little Pill era versus Alanis now, it would be for everybody’s benefit if he’d just get dumped already. Success in love does not a good singer-songwriter make.
To step back into my teenage shoes, though, is to set aside the issue of quality. More precisely, it’s to set aside nuance. On many counts, I was inarguably a better person when I was a teenager. For example, when I was seventeen, I submitted an essay proposing that Congress vote anonymously to authorize military actions overseas to “allow politicians greater freedom to vote the way they feel is correct rather than be pressured by the party line.” This is probably not even the most preposterous thing that I thought was practical when I was a teenager, but it’s the only one I still have in my Dropbox, so it’ll have to do. Later in this essay, I also suggest that the United States would be able to end the genocide in Darfur—it was 2007—“if only we were willing to commit the troops to do so.” (Those troops, of course, would be committed through anonymous vote. Like YikYak, but for war!)
“Better” probably isn’t the right word: I was, if anything, purer. I thought that Congress was made up of good people who were simply at the mercy of their uneducated constituents. I thought that “it takes a crane to build a crane” was a genius observation that had never been articulated better. (I sort of still do. Congress, on the other hand, is obviously a lost cause.) Today, I can argue myself in circles; where I once nearly stormed out of the classroom in a heated debate with my World Affairs teacher over the best way to end the practice of female genital mutilation, I now hear myself using the dreaded phrase “I see where you’re coming from.” And I don’t even follow it up with “…and it proves my hypothesis that you’re a goddamn sociopath who wouldn’t recognize nuance if it punched you in the face.”
I miss the comfort of certainty. Writing cringingly naive social studies essays, blasting something like “Coin-Operated Boy” on my way through the Del Taco drive-through… nowadays it takes me a solid thirty minutes to decide what to order from Seamless, and even then I only pick because I know that if I don’t have something more than stale pretzels in my apartment within the next 45 minutes, I will chew off my own arm. (This is also in part why I don’t cook. I cannot handle grocery stores. I would say it’s an eating disorder thing, but it’s the same reaction I have to the New York Public Library eBooks catalog.) I’m too aware at any given juncture that whatever route I take will inevitably be the wrong one. What I wouldn’t give to be seventeen again and know that I am, without question, right!
Now I’m all too aware of nuance, and it means that I’m incapable of going in anywhere with guns blazing. That’s not entirely true, as just about all of my coworkers and the senior leadership of my company can attest, but that blaze flames out so quickly, the second I open my eyes and realize that there’s another perspective to be considered. My intractable stubbornness has given way to… waffling. I’ve been catching myself lately vacillating wildly between different positions depending on how well they’re being argued to me. Protests are useless! “But they’re the only way to get the public read onto a cause! Look at how the attorneys mobilized via social media to help out travelers being detained at JFK!” Okay, protests are great! “They’re political theatre!” Those pink hats are still ugly! Okay, I’m done now. That one is an incontrovertible fact.
I guess the tradeoff is that while I might no longer be bullheaded enough to get myself sent to the dean’s office rather than submit myself to standing during the Pledge of Allegiance, I’m also no longer dumb enough to, say, get myself sent to juvenile court with a summons for drinking underage (in full “seventies roller disco regalia.” With tube socks. After trying to hide under a car). Or leave a Burger King soft drink cup full of Dr. Pepper in my cupholder for hours in the Las Vegas sun and not expect the cup to give way, sending Dr. Pepper leaking… everywhere. Or forget to look behind me before I make a U-turn and send my car straight into the path of an automated gate, practically knocking my bumper off (Dad, if you’re reading this, that’s the genesis of that massive scrape on my back bumper. Not a shopping cart. Just in case you happened to have bought that airtight excuse).
That isn’t to say that I’m not still incompetent—have I mentioned yet on the blog the time last year that I managed to miss a transatlantic flight by a full 24 hours?—but that nothing seems as consequential as it did when I had no concept of nuance. The photos of me wearing tube socks haven’t yet sunk my political campaign. I cleaned up the Dr. Pepper. (And United didn’t charge me for that mishap, which is probably because I have already sold them my soul.) It got better, as they say.
But that, too, is why the music I listened to when I was sixteen doesn’t resonate the way it used to. Everything felt so final, or so urgent: I needed Jason Mraz strumming his stupid guitar and singing to me that “it takes a night to make it dawn,” because just as I was sure in my World Affairs essay that using “media infiltration” to “alert the citizens [of the Middle East… no, literally, the whole thing] that a freer world does, in fact, exist” would bring about peace, so, too, was I sure that getting a B on a trigonometry test meant living the rest of my life behind the cash register at Capezio. I live now in a constant state of awareness that everything evens out to… well, mediocrity, I guess, since that’s what you get when you can’t forget that the highs are as temporary as the lows.
It was nice the other day to walk down Seventh Avenue with my headphones on, listening to music that is only sort of good, remembering what it was like to be confident that everything I said was right and everything I knew was true. It’s not a state that I’d return to—for one thing, I’d take going toe-to-toe with my boss’s boss’s boss’s boss any day over my eminent social studies teachers, and Lord help me if I ever see a look on my mother’s face like the night I got caught drinking Smirnoff Ice in tube socks!—but it’s good to remember that I have, in the past, been capable of taking a position, of making a decision. And, for what it’s worth, of listening to a second-tier singer-songwriter because it makes me feel better about the world, without concerning myself with what the world might feel about me.
NB: My final argument in that World Affairs essay was that the U.S. should remove troops from the Middle East “because at this point, all that that is accomplishing is proving the theory that Americans are evil.” While this is unquestionably true, and I congratulate my younger self for having had the foresight to recognize that this would be an issue in the future, I recognize now that at least epistemologically, I was a little confused.