The Common Good: Imagination

If someone had asked you in 1996, the year before the first of the seven celebrated novels was published, what you thought about “Harry Potter,” you probably would have stared at them in confusion. Before any of us knew what someone was talking about when they said muggle, Quidditch, or Dumbledore, the entire universe of Harry Potter existed only in the mind of author J.K. Rowling. Fast-forward to the present week, and the third Wizarding World of Harry Potter is set to open at Universal Studios Hollywood, allowing people to smell, taste, touch, and hear the world of the story in physical form. You can kick back with friends over butterbeers in Hogsmeade or take a picture in front of Hogwarts.

The power of imagination is astonishing. What once existed in only one person’s brain can go on to sweep through the rest of the world, causing new structures and ways of life to emerge. Words and images, whether on a page, in a speech, or on a screen, can create dramatic social change. Imagination has shaped the world we live in now, and it can shape the world we live in tomorrow.

Before there were cities, cars, computers, the 40-hour workweek, hospitals, political parties, recycling, and countless other things we take for granted as normal now, certain people thought them up, shared their ideas with others, and constructed them as real, concrete things in the world.

We used to have great imagination about what society could be like. When no other country had set aside expanses of nature to preserve for generations of public enjoyment, America created the National Parks system. When the United States was rife with some of the worst racism and structural inequality in its history, Martin Luther King, Jr. gave a famously profound speech about his dream of a different kind of humanity. When for centuries people had looked up at the moon and wondered what it was like up there, John F. Kennedy proclaimed in 1961 that we would send a man to the moon by the end of the decade.

Where are we now?

Is the United States a country that treasures nature even more than when the first Parks were formed? A good chunk of Americans won’t even acknowledge the science of climate change and the painful consequences to come in our lifetime.

Is America a country that’s realized Dr. King’s dream–respecting the life and worth of every human being no matter their race, gender, age, or anything else that makes them unique? We have a contending presidential candidate succeeding largely because of racist, misogynistic, xenophobic rhetoric.

Is the United States spearheading greater space exploration, pushing the limits of what we know, where we can travel, and who can go there? NASA is so strapped for cash that any real space endeavors are being contracted out to private companies like SpaceX. Though space is the necessary frontier for the future of humanity, things are hardly different from–if not worse than–the days when we enthusiastically launched astronauts to the moon decades ago.

Our collective imagination has disintegrated and died out. Our visions of what this country could be are uninspired or nonexistent. We’re stuck in the status quo, occasionally fighting over relatively negligible changes.

When we should be coming up with a compelling, comprehensive vision of what work in the 21st century should look like, so that every person has the resources they need to live well, it’s “pie in the sky” to even push for something as meager as a $15 federal minimum wage. To be sure, $15 would be an appreciated improvement for many people, but it’s still almost $4 per hour short of where the minimum wage would be had it risen at the same rate as overall productivity. We should already have a $19 minimum wage nationally; instead, we’re bickering about maybe getting to $15 sometime in the next decade. And we’ve hardly begun to think about how we’ll deal with rampant unemployment as more and more jobs are taken over by automated technology.

We have to get back to dreaming big, together, and transforming society into the better world it can be. Take what we know about what’s good for people, look at where we are today, and invent a future that brings everyone closer to the common good.

If we can turn Hogwarts and butterbeer into real things for millions of people to see and taste as if they were actually wizards, we can surely imagine and construct a better world in the theme park of our nonfiction world.

 

Let’s Let People Be People

I am riveted by the current presidential campaign. Each presidential election is historic and interesting in its own way. But this one is different. Traditional candidates and predictable narratives are being subverted. There seems to be a groundswell of desire to move beyond the status quo in dramatic ways. In the long run, we’ll see what that means concretely: who wins and what (if anything) changes in society. Right now the race is still up in the air. I hope you’re enjoying watching and participating as much as I am. We need as many people as possible to be invested in this process.

But even in an unpredictable and entertaining race, some things never change. I’ve been especially bothered in recent weeks by the lazy use of stereotypical identity attributes to lump people into monoliths. Media, candidates, pundits, and others do this regularly. African-Americans must all think and vote one way because they’re African-American. Elderly people must think and vote the same way because they’re elderly. Pick out a single trait or two–an age range, a gender, an income level, a race or ethnicity, a religion–and you can probably find some commentary treating that whole group of people as essentially homogeneous.

One example among many: last night in the Democratic debate, Hillary Clinton and Bernie Sanders had a spirited exchange over the significance of Henry Kissinger being a valued mentor of Clinton’s. In the short time since the debate, I’ve heard Anderson Cooper and others on CNN and my local NPR station–and seen various tweets–say something sarcastic like, “I bet that played well with Millennials–since they have no idea who Kissinger is.”

Surely some younger adults don’t know. But quite possibly some do know and have an opinion about it. Why assume an entire age range of the country fits into a uniform group that doesn’t know and couldn’t care less? And who’s to say a younger person can’t deem it important and figure it out? Between Internet searches, books, documentaries, and other sources, it’s pretty easy to give yourself a decent introductory lesson in things you don’t know. I’m inclined to believe that a number of people care enough to do so–and not just the ones who fit into a nonspecific age range reduced to “Millennials.”

Analogous things could be said about how predominant voices are talking about Hispanics who live in Nevada, African-Americans who live in South Carolina, young women, older women, and many others.

We should all be insulted by this. We can do better.

Identity generalizations may be convenient for a stump speech or a news segment. But they certainly do not represent or empower the individual human beings they are made about. You are a complex person. I (like to think I) am a complex person. Even if you belong in some fundamental way to this or that race, class, or generation, you have a unique web of motivations, interests, knowledge, experiences, and beliefs. So does the person next to you. However much we are like others who share certain identity traits, there are meaningful idiosyncrasies that make each of us profoundly different from one another.

So let’s move beyond lazily and simplistically grouping people together. Let’s call it out and challenge it whenever we see it–from national media to our conversations with each other. Let’s let people be people.

 

Eating Well: Food Doctrine of the Mean

Scallops and juniper (Photo credit: Netflix)

Food is the most universal language. In many parts of the world, you can get by without knowing a word of Spanish, Mandarin, or French. They are widely spoken, but not widely enough to encompass all of humanity. No single, literal language does. But if you’re a human being anywhere, you cannot get by for more than a handful of days without consuming and digesting some kind of edible flora or fauna. The cultivation and intake of food are among the most fundamental activities of being a person. We have to eat. As such, much of our civilization and culture has emerged around the things that constitute a meal and their sources. Over the last few hundred years, many and varied branches of a thoroughgoing food industry have grown, and we now find ourselves high up in the canopy of the culinary tree with neither a view of nor an interest in the trunk and roots below that made its growth possible.

We now watch food competitions on television and take smartphone pictures of food seemingly as much as or more than we spend time eating it. We order out, drive through, and snack on the go while our pans and pantries collect dust at home. We readily recognize the sight of plastic-wrapped packages of meat in bulk without having any idea of how the animal was raised and butchered. We have kids who struggle to identify tomatoes and potatoes in their original, just-picked state. There are chefs and food industry experts who are nearly as popular and well known as Hollywood film celebrities.

We are enraptured by the consumption of food, and there are two polar extremes. On the one hand, trying too hard: bombastic, absurdist gastronomy with excessive technique and uncomplementary ingredients forced together (which the average person will never be able to taste or learn to make anyway). On the other, indolence: processed junk with grotesque amounts of sugar, fat, and carbohydrates, encouraging us to indulge in foods that undermine our well-being. In both, there is a lack of understanding of true craft: the essential interconnections in the ecosystems where edible plants and animals grow, what constitutes quality, true skill, and approachability. Food is the universal language, but few of us are fluent anymore–even many chefs. We live in one of the two extremes. We’ve lost the meaning of a well-prepared meal and the proper reverence it deserves.

In such a paltry conceptual environment, Netflix’s original series Chef’s Table could not have been released at a better time. It’s food television, to be sure, but it supersedes existing programs in a way that makes it more of an artistic, philosophical reflection than pop entertainment. Chef’s Table compellingly presents a middle ground between the extremes of fetishized gastronomy and profane processed food. Each of the chefs featured is struggling to break free from the status quo of the culinary world and provide people with a resonant, grounded food experience.

Massimo Bottura has established the third-best restaurant in the world by capturing the essence of traditional small-town Italian flavors and presenting them with the playfulness of a child sneaking tastes in a grandmother’s kitchen. Dan Barber is a prophet for the understanding that the best flavors are inherent to the best ingredients, which ultimately depend on the health of the soil and the rhythms of nature. His literal farm-to-table restaurant in a barn in rural New York just won the James Beard Award for Outstanding Restaurant–the Best Picture Oscar of cuisine. Francis Mallmann is a revered chef with the utmost classical technique who prefers to cook over open fire on a remote Patagonian island with the closest of companions, journeying through life as a sort of renegade band. Niki Nakayama uses the memory of past meals and personalization for every diner to imbue her cooking with additional layers of curated thoughtfulness and relatability. Ben Shewry shows that exceptional food need not be haughty or showy, that creativity often comes out of necessity, and that care for family and friends is just as important as the aspiration to brilliance. And in the season finale, it becomes fully clear that eating well is not limited by place or by expectations of how a restaurant and its kitchen should function: Magnus Nilsson prepares some of the most renowned food in the world by picking and preserving what each season yields in remote Sweden, later preparing it in the intimacy of a 12-seat, tightly staffed lodge.

Each chef narrates their journey of ambition and failure, of perseverance, insight, and originality. As with the Aristotelian virtues, flourishing occurs in the balance, or mean, between the extremes. The culinary arts as genuine art are about humility toward the dirt that produces everything we take and use to eat; learning (and sometimes failing at) technique in order to later play with it freely like a virtuoso instrumentalist; and connecting with other people by prioritizing simplicity and enjoyment over pride and recognition. Show creator David Gelb hopes people will “watch these films and then look at their own lives and the places where they eat and see how it changes their perspective.”

In The Unquiet Grave, Cyril Connolly writes, “The secret of happiness (and therefore of success) is to be in harmony with existence, to be always calm, always lucid, always willing, ‘to be joined to the universe without being more conscious of it than an idiot,’ to let each wave of life wash us a little farther up the shore.”

This kind of harmony–such moderation between trying too hard and indolence–is the way the chefs of Chef’s Table engage with cooking, and with the earth that brings about the bounty of what can be cooked, in preparation for the people who eat it. We, too, can find genuine happiness through food by coming to see the culinary mean between the extremes as we dine.