The Abject Righteousness of the Civil Rights Movement

O, yes,

I say it plain,

America never was America to me,

And yet I swear this oath–

America will be!

-Langston Hughes, “Let America Be America Again,” 1935


Recently I’ve been reflecting on citizenship, and all its connotations, in uncomfortable registers. I’ve been motivated by preparing for the 2018 midterms, including which candidates to support and how to get involved, and I am also newly conscious of how Trump’s election and governance have scrambled my understanding of American politics and history. I’m hardly alone in this. It feels like every week there is a new confusion on the left about what is going on and why it’s happening, as a first step toward deciding whether to care about it at all.

One haute example of this is the debate between Francis Fukuyama’s and Louis Menand’s philosophies of history. Fukuyama’s idealism proposes that history is predestined “given current trends,” while Menand instead suggests it is defined more by shuffling, disagreements, and unpredictable coalitions. We’ve seen Obama himself appear to waffle about this, first by echoing the call that the arc of the moral universe, while long, does in fact bend towards justice, and later by suggesting, post-Trump, that history “zigs and zags”.

From long reflection, I have come to believe the phenomenon of Trump hinges predominantly on race, and I will explore this topic in future posts. For now, I’d like to prepare those substantive reflections—and also expand on the themes of earlier writings on this blog and elsewhere—by briefly commenting on what seems to make American history tick or flow in certain directions, and how this organizes my sense of what should be done. It’s easy for conversations like this to descend into navel-gazing or airy abstractions, and indeed there’s a bravado in even trying that I suspect makes most commentators avoid it on principle.

Let me say up front that, echoing a recent Facebook post by my friend Nicholas Mulder, if you assign any significance or weight to the daily news, you need a philosophy of history. Whether you think events are stochastic or statistically predictive, or if grand narratives are even possible, is implicitly at stake if you are inclined to groan while reading the Huffington Post or to tweet #metoo.

In America there’s always been this conflict between the “We” and the “I.” It’s seared into our cultural makeup that America is both a collective work-in-progress and a vehicle for personal freedom. Here I mobilize the distinction in a more specific, political sense: a profound and abiding disagreement over how republican politics should operate in this country, and how decisions should be made. The debate is between epistocracy (i.e. the idea that those who know best and are most qualified should rule) and democracy proper. Both themes are present in the Declaration of Independence, which famously contrasts a deliberative “we” against the failures of King George III. Yet we know from history that proto-Federalists like Alexander Hamilton and John Adams favored administrative power and a centralized financial system, while remaining skeptical of the “common man” and putting down Shays’ Rebellion. Though we mythologize the country’s founding as a democratic revolution, this tension between independence and interdependence remained, written into the Constitution and justified in The Federalist Papers.

Let me say provocatively (because I cannot hope to fully elaborate the point here) that this conflict between the “We” and the “I” is shot all the way through American political history, that we seem to have been living in a great Age of Identity since the social protests of the 60s and 70s, and that there are now signs of a reborn “We” in the forms of protest and resistance following Trump’s election. Again, this is still early days, and we will need a complete diagnosis of this past age in order to learn how to move forward, which neither I nor anyone else can now provide. But I have already found the “We” and the “I” useful for schematizing past upheavals.

Allow me to elaborate on this stance with reference to the Civil Rights Movement. I recently finished watching the documentary series Eyes on the Prize, which traces the movement from Brown v. Board of Education to Harold Washington’s election as the first black mayor of Chicago in 1983. I was alerted to it by a passing mention in an essay that claimed viewing it was necessary for one’s civic education to be complete (!).

The episodes are filled with moments so harrowing that I sometimes broke out in a sweat from anxiety, and spent long minutes on my phone as scenes passed by on my laptop that I was too embarrassed to consider head-on. John F. Kennedy, audibly embarrassed, on the phone with the governor of Mississippi, pleading that a single black student needed to be admitted to Ole Miss or else the National Guard would need to be deployed. Young white women explaining to a reporter their own shifting beliefs about whether blacks are racially inferior or just uneducated, as in the background the Little Rock Nine pass down the sidewalk into school. An older white woman in Cicero, Illinois, possibly Italian or Polish but who looked exactly like my mother, confiding to an interviewer that “of course it’s fine for Negroes to live here, but the question is, are they Negroes or ****ers?” The Detroit conflagration as block after block of the city was swallowed up in racial fury. Faculty at Howard University explaining to students why courses in black identity (what would later be known as African-American Studies) were counter to the mission of Historically Black Colleges. New York Jews discovering their inner whiteness as they raged against blacks and “black consciousness” allies over community control of the Ocean Hill-Brownsville school district in Brooklyn. Robert F. Kennedy talking softly to a dirt-poor family in the Black Belt about the importance of learning to read, and tentatively stroking the face of the youngest boy as he walks away. The literal drowning of Resurrection City from downpours on the National Mall, extinguishing the Poor People’s Campaign. A woman crying fifteen years later as she describes singing the “Battle Hymn of the Republic” under a full moon as RFK’s hearse drove past the Lincoln Memorial. 
A middle-aged Asian woman shoved into the back of a police van, trying to get out the tune to “We Shall Overcome” and plaintively holding the van door open so she remains visible to the TV cameras. A reporter gagging out of moral sickness as, just behind him, the Attica prison revolt was doused in gas, calling out to his cameraman to cut the feed as the mic can only pick up the sound of bones being broken. White Bostonians overturning police cars and attacking horses while screaming racial slurs I’d never heard before, protesting the forced desegregation of public schools.

And yet, there were moments of unimaginable heroism. The decision of Emmett Till’s mother to have an open coffin at her son’s funeral so that thousands of blacks in Chicago could see “what they did to my baby” when he spoke up to a white woman in Mississippi, was lynched, and was stuffed at the bottom of the Tallahatchie River. Members of SNCC training for hours at a time and days on end as their own members, white and black, assaulted and beat them in preparation for the real abuse of sit-ins at lunch counters. The mayor of Nashville vaporizing his own political base by answering the persistent queries of a civil rights worker that while as a public official he swore to uphold all laws including segregation, as a man he could not defend its practice. An uncut three-minute sequence of marchers crossing the Edmund Pettus Bridge, seeing what was about to happen, yet continuing straight into a charging horse-mounted policeman’s baton. Fannie Lou Hamer ignored at the 1964 DNC for having the gall to ask if Mississippi’s closed society was in fact America. Stokely Carmichael inventing, on the spot, the phrase “black power” during the March Against Fear, defying the philosophy and temperament of Martin Luther King, Jr. even as they marched step by step. Cassius Clay, a couple of years before becoming Muhammad Ali, insisting over and over that he is “already beautiful” as a sports announcer questions whether he “looks good enough for the cameras yet.” The spontaneous gelling of black consciousness as, for the first time in Howard University’s history, the elected homecoming queen would sport an afro.

In other words it was thirty years of history, well-told and ill-gotten.

I find some of its lessons difficult to absorb. I had never before seen Martin Luther King, Jr.’s achievements as a product of failure, but his own moral arc was defined by personal trauma and strategic blunders that stymied the movement repeatedly. It was out of such blunders—which to him felt less like setbacks than like a totalizing morass of major depression—that King became the moral force we have mythologized. The central example came in the form of Laurie Pritchett, chief of police in Albany, Georgia, who broke the SCLC’s naïve attempt to overwhelm Southern jails. Pritchett figured out how to fight nonviolence with nonviolence by coordinating with other sheriffs to keep his own jails from overflowing. King retreated to lick his wounds for weeks at his home, focusing thereafter on more specific, symbolic victories like Selma and Montgomery (Howard Zinn was part of the Albany movement and viewed this pivot as a major, tragic blunder). Watching “I Have A Dream” after seeing King be broken made me hear, for the first time, the desperation, the despair, and the raw deferment of any near-term expectation of recognition and dignity in that speech. What got me wasn’t that King was defeated—it was that he was confused. History was not moving in the inexorable direction inaugurated by the big “We” of the Freedom Riders, SNCC, and SCLC.

It was this conflict between King and Pritchett that oddly reminded me most of our current political moment, in which Cory Booker refers to his own release of confidential Kavanaugh documents as “civil disobedience”, and some on the left advocate the exclusion of Trump supporters from restaurants and boutiques. At the time, King was realizing that the great moral authority of one “We” was running up against, and gummed up by, the indignation and recalcitrance and brute effectiveness of a counter-mobilized “We”. So he became willfully symbolic, shifting into the King of the history books, and alienating many of his more radical adherents in the process.

In effect King gambled that his “We” would scale better and win a war of attrition against Southern anachronism. He refused to take any action or lead any protest that was not permitted to proceed by federal authorities and judges, including the crown jewel of Selma, finally extinguishing the Jim Crow South by outing it as politically unworkable and contrary to the American experiment–that it was in fact an “I” pretending to be a “We”. Yet his movement ran aground when it went north to confront the quieter, more restrained, more economical, and ultimately more pernicious racism of redlining.

As today, there was a strategically unworkable yet shifting relationship between political tactics and moral compasses: plate tectonics that could not be refastened by human hands even as the ground moved beneath them.

During the Civil Rights Era both sides unequivocally claimed the moral high ground. Both were convinced of their own righteousness. However, I slowly realized, episode by episode, that the movement’s defining feature was not this righteousness but its insistence on democracy as a radically moral and potentially self-destructive commitment to personal autonomy. As the movement ate itself alive, as the SCLC and SNCC eroded and splintered into the Black Panthers against the institutionalists, its true irreverence came to the surface. Why should I fight so hard to be considered part of America? Why can’t I myself be America? What is this paltry “America” that whites think they own? Don’t ask for their permission to free the slave that’s inside you, do it yourself. The rise of black epistemology was, in this sense, a declaration of independence from the old-form civil rights movement, just as the Founders’ was from King George III. The “We” that King helped birth was bleeding out into multitudes of “I’s”. It is not surprising, then, that when asked what they most remember about the year 1968, O.J. Simpson’s teammates have cited his winning the Heisman Trophy.

Still, in the process, the movement rewrote the moral calculus by which anyone in this country could take up the mantle of citizenship. We remain stuck today in the molds of resentment and civic pride first cast at that time.

Today I think that mold, particularly for whites, has become brittle and unworkable. In future posts I will explore what this means for being a citizen in the Trump era, and possibly after.


The Scientific Spirit and The Death of Absolute Objectivity


A primary motivation for scientific inquiry has always been the notion that our scientific discoveries offer us transcendence from our subjectivity, that they give us access to absolute objective truths (or at least some approximation of them). Stephen Hawking’s famous line about knowing “the mind of god” captures both the essence and the intellectual history of this idea. Physicists, for example, frequently treat their discoveries as a secular analogue of divine wisdom, with the Einstein equation or Newton’s laws as sacred to them as scripture is to the pious. And this scientific spirit, however opposed scientific culture currently stands to organized religion, grew out of Western theology. It is no accident that Newton and Copernicus were devout Christians.

But the philosophers realized long ago that “God is dead.” Centuries of metaphysical attempts to rationalize Christianity failed and were abandoned. Nietzsche pointed out that without God, there was nothing standing in the way of a descent into nihilism. Much of philosophy became devoted either to resolving or to confronting this new reality. Nietzsche believed that this inconvenient truth was recognized at least subconsciously by all, but that Christians remained in a state of denial out of fear or angst.

The death of God presaged the death of the analogous scientific idol, absolute objectivity, which would come with the advent of quantum mechanics. The world can be understood, but according to the most straightforward interpretations of quantum theory, this turns out to be a matter of establishing a consistent intersubjective reality more than of accessing a single overarching objective one. As Niels Bohr said:

“Physics is to be regarded not so much as the study of something a priori given, but rather as the development of methods of ordering and surveying human experience. In this respect our task must be to account for such experience in a manner independent of individual subjective judgement and therefore objective in the sense that it can be unambiguously communicated in ordinary human language.”

“The Unity of Human Knowledge” (October 1960) (emphasis mine)

“There is no quantum world. There is only an abstract quantum physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature…”

As quoted in “The philosophy of Niels Bohr” by Aage Petersen

The Schrödinger’s cat thought experiment makes a lot more sense once you’re willing to demote the wavefunction from a descriptor of some transcendent, objective reality to an effective bookkeeping device for describing an (information-theoretic) relationship that the observer has with the cat. Nowadays, physicists have taken many of these ideas even further, e.g. with Black Hole Complementarity.

This picture of science also resolves another paradox, presented in the first pages of Roger Penrose’s The Road to Reality. It goes like this. The physical universe, according to the standard scientific picture, is a subset of all mathematical possibilities. In fact one could say the goal of science is to distinguish which subset is the correct one. Mathematics, in turn, is itself a subset of our mental activities. That is, our mind can in principle, if not in practice, access all of mathematical reality, but it is also capable of non-mathematical things like emotions. Finally, our brains are proper subsets of the physical universe. This circular picture is paradoxical. Penrose did not claim to know the resolution.

[Penrose’s diagram relating the mathematical, physical, and mental worlds]

But the paradox dissolves if we embrace the view of science advocated by Bohr and based on the lessons of quantum mechanics. The realm of mathematical possibility is not an independent world, but a collection of methods available for relating our minds to the world. Mathematics is what connects our minds to the world, it isn’t a separate platonic realm of its own, independent of our minds and the physical universe.

One might argue that a careful reading of Kant’s Critique of Pure Reason bolsters this framework for science as well: we cannot access things in themselves. So this philosophy of science far predates quantum theory and was born in the aftermath of the original scientific revolution of Newton and his contemporaries. The quantum nature of the atomic realm merely reinforces it.

Nevertheless, many or even most scientists have not accepted this paradigmatic change and properly confronted its implications. Almost all are aware, but like Nietzsche’s Christians many avoid or deny it, lest they fall into a kind of nihilism of the scientific spirit. (For a recent example, see the awakening of the great Steven Weinberg to the problem.)

It may be that this sort of cognitive dissonance is what drives much of the research on alternative interpretations of quantum theory. Many people are drawn to science in general and physics in particular because of the desire to reach for absolute objective truth. It’s a psychological impossibility for them to accept that the findings of their pursuit undermine its original purpose. Famously, even Einstein could not accept these lessons. And how could he? His belief in “Spinoza’s God” had led him to General Relativity, one of the greatest intellectual achievements of human history.

But if we’re honest, we’re faced with a problem. How can we accept some of the more sobering lessons of fundamental physics without losing our sense of purpose in the process? If the moon isn’t there unless we’re looking at it, should we really bother to wonder about it?

There are of course technological reasons to pursue scientific research. The practical value of science is not diminished, and of course that matters. But what of the scientific spirit? Are we left with some form of scientific nihilism? An existential despair of the scientific spirit?

This is worth working through and worth discussing. But I can offer a personal solution based on my own journey of coming to terms with quantum mechanics. I’ve found that, for myself, connection can replace transcendence as a motivation for a scientific life. When I reflect carefully on what’s driven me to study science, it was really a desire to feel connected to the universe, and to broaden and sharpen my experience of reality. None of the revolutions of quantum mechanics need diminish this. I think there was always an unnecessary arrogance in the idea of an absolute objectivity, the belief that we could access a “view from nowhere”, that our minds could be mirrors for the world, that we could reconstruct the universe in totality in our heads. To instead treat science as more about connecting ourselves to the world around us, and about our relationship to it, feels more honest. It doesn’t diminish my motivation, and it has the added benefit of embodying a humility commensurate with the reverence for the world that all scientists feel.

How to Be a Scientist


Science is an unsettling enterprise. This is much of what makes it different from law, or politics, or art, or religion, or other noble and worthwhile pursuits. Science is not interested in conveying fundamental truths, or in ensuring people’s safety, or in improving lives, or telling stories, although it has played a role in all these endeavors and will continue to do so. It is first and foremost a commitment to the unknown and a desire to transform it into something tractable; as Galileo put it, “to measure what is measurable, and make measurable what is not so.” This quote offends some of my friends as dehumanizing, and rightly so. The fact of the matter is that science often tells us things we don’t want to hear, in ways that are hard to understand unless you have a PhD, and for reasons that are opaque to outsiders. Insiders spend most of their time either doing experiments that don’t produce publishable findings or throwing out bad ideas. Even successful papers are boring. We do acknowledge all this kerfuffle as worthwhile and meaningful, and we certainly depend on science’s discoveries. But science, pound for pound, is a mercurial idol, and we haven’t learned how to worship it in a way that feels collectively satisfying.

If we instead look to pop culture, and consider what most people who are looking for intellectual fulfillment or affirmation find exciting and stimulating, it seems that philosophy wins out. Western thinking, after all, was inaugurated by a troll. Socrates’ allegory of the cave in The Republic remains an acceptable and accessible metaphor for how almost everyone lives their lives and, in the figure of those who escape to the sunlight, an image of what we all strive to achieve. Never mind that most people don’t actually try to do this; the idea is a beautiful one, and pleasing to think about. It is much easier to imagine yourself as someone striving to reach for ultimate, real, uncompromised Truth than as a scientist, trying to make out shadows on the wall in ever greater and verifiable detail. It is true that modern people have a drive for distinction, but in almost all cases this is expressed as a drive for consumption rather than self-expression—the shows we watch, the food we eat, the people we elect to hang out with, and so on. Perhaps out of laziness or wishful thinking, we like to have our own perspectives regularly affirmed. We allow ourselves to expect some easy alchemy from our ties and associations even though we know that lead and gold are atomically distinct elements.

This game of social tautology has its limits. We watch Neil deGrasse Tyson and expect some kind of realization from his descriptions of the cosmos. We watch The Big Bang Theory not exclusively because we like to make fun of nerds, but because Sheldon Cooper reminds us of Buster Keaton. Bill Nye is kind of sexy. There is an insecurity in this that needs to be unpacked. Why do we sometimes expect science to speak to us in ways that it demonstrably cannot? What task can it perform for our own spiritual, cultural, or libidinal fulfillment?

I am not trying to level an attack on science as a professional endeavor. Nor is my aim to defend the idea, which already lies at the foundation of Western philosophy, of living scientifically, even as I recognize that idea’s importance in my own life. Instead, I am suggesting that the vocation of science and the idea of living scientifically are even more wondrous when they are brought together—that Constructed Truth and Absolute Truth should be one and the same. This may be impossible to achieve, but it can at least act as a principle that guides our actions. It is not that science is our way of understanding the True, or that the True is an object of endless inward search, but that science can and should be approached as a vehicle for self-discovery, just as that process of discovery can only be called a scientific one.

There is a tradition of American thinking that is attuned to this. Thoreau’s grand experiment in Walden expresses the sentiment well: “It is something to be able to paint a particular picture, or to carve a statue, and so to make a few objects beautiful; but it is far more glorious to carve and paint the very atmosphere and medium through which we look, which morally we can do. To affect the quality of the day — that is the highest of arts. Every man is tasked to make his life, even in its details, worthy of the contemplation of his most elevated and critical hour.” Religion and art may more often serve this purpose, and they are less cognitively demanding, but our insecurity can be explained by our collective sense that scientists, however boring and self-contained their machinations are, are tapping into the “quality of the day” and embodying it in a register that other cultural luminaries cannot. Slavoj Žižek can convince me that capitalism monetizes my fetishes, the Dalai Lama can help me be more mindful, but science allows me to read them on my smartphone. Science takes the inner reality of nature and somehow makes it into a vehicle for our own self-affirmations. All four fundamental forces must have been accounted for and operationalized for me to use my laptop to watch Rachel Maddow remind me I am not dreaming even though Donald Trump is now President-elect.

I could also cite Emerson: “There are always sunsets, and there is always genius; but only a few hours so serene that we can relish nature or criticism.” The problem here is that the relation of genius to the world is not just expressed in a shortage of time but in the multiple ways of appreciating that world. If how we experience the world determines how we interpret it, then a vast division of interpretive labor is necessary for its richness to be available to each of us. We crave a unity from this manifold, but we have a thirst that the very shape of our mouths makes impossible to quench.

Is it possible, then, to do science both professionally and existentially? To be both heroes at once? It’s a tall order. In my experience, many who pursue science as a vocation fail to live scientifically. Doing science is hard enough—being it is a whole other level. In fact, these two forms of science seem to actively oppose each other. I have studied at some of the world’s best universities, and I have interviewed some of the best scientists at those universities, and within a few moments of conversation it would often become clear that I was speaking with a person whose fundamental worldview became entrenched the moment an authority figure complimented their ability to intelligently transpose some trusted pattern of thought. The cultural values that accrue in the wake of this pivotal moment of ego reinforcement—among them the laudable traits of independent thinking and critical reflection—are distressing for not themselves embodying a hypothesized and systematically tested relation to the person’s life. They were adopted by force of habit, and crystallized as a person excelled at creating what he or she was told would be recognized by one’s elective community as honorable and good and worthwhile insofar as it would set up others to skillfully create in a likewise recognizable way. In this way, the existential foundations of science, and especially the most reproducible science, are almost always dogmatic in spirit.

The vocation of science encourages and even depends on deliberation. But it is almost antithetical to living deliberately. The school of Romanticism—willfulness, subjectivity, free expression, poeticizing, seduction—was itself formed in opposition to the pretension of science to discover the fundamental nature of reality. Having studied the intellectual history of that movement, I have always found it ironic and even tragic that this body of beliefs is essentially just a hypothesized instance of counterfactual reasoning: if the values of science are basically wrong, then the opposite ones might be better. Many of the modern world’s most brilliant artists and storytellers have been moved to rage against the spiritual “disenchantment” that science has caused, and told wonderful stories as a result, and in doing so they were living scientifically. The deliberate refusal of Schlegel, Kierkegaard, Shelley, and Novalis to see life singularly, and instead to see it as a playful and experimental domain for their own fantasies—what is more scientific than that?

We have here two models of heroism. The former is dependable, analytical, reproducible, and black box-able. Scientists train themselves into compartmentalizing Truth, and we gladly buy up their Truth-furniture, and use it to furnish our lives. The latter is protean, brave, original, daring, and intense. Artists paint experience in their own image, and we let these paintings color our own dreams and aspirations beyond the lives we now live. These two cultures, and their temperaments, seem utterly orthogonal. What might it mean to bring them together?

As you get older you realize that who you distinctively are is defined more and more by the breadth of your experiences than the field of your opportunities. People treat you more as a product of where you’ve been than what you could potentially become. What Thoreau and Emerson are getting at is that it is desirable, if not to reverse this process, then at least to operationalize it into a controllable variable. It need not just be “a part of life” or a force of nature. To make oneself into something that could be falsified, but nevertheless seems to be true—to transform the world into a laboratory for one’s own exposition—that is the scientific calling in the fullest sense of the term. Life must be lived forwards, but it can also be understood forwards, if understanding is elevated to a principle of vitality rather than reflection.

This all sounds very postmodern. David Foster Wallace, in his brilliant commencement address at Kenyon College, spoke of a liberal arts education similarly, as empowering us to choose what we pay attention to and how we make meaning from experience. But I would go a step further and assert that the scientific life has not just an aesthetic but also a moral and even religious wind at its back. It is a lifestyle choice, but also a vocation, and even more fundamentally a kind of calling. The goal is not to escape boredom or to become happy with one’s own station but to reorient the logic of one’s relation to the world into something testable. We cannot escape the expectations of others, but we can willfully subject those expectations to a court of fair play where the result, i.e. ourselves, is more than arbitrary. To be fulfilled, rather than simply happy, means your happiness passes a test of statistical significance against the reality you have had a hand in constructing.

So to be a scientist, and to also live scientifically, would be to make the entirety of oneself into something tractable and fungible. Even as I write this, it sounds abhorrent and gross. And yet we know from the biographies of great writers and artists, who went out of their way to accumulate interesting and unprecedented life experiences and then to tell new kinds of stories out of them, that they were much better at living scientifically than many so-called scientists themselves. We need not make this compromise so long as the truth that you want to discover and make understandable is the latent truth of yourself, something that is itself made in the course of your search for it. The maxim here is to make this process into something deliberate rather than something that merely happens. To be a scientist means to master the conditions of oneself.

Let my lovers be case studies, let my days be data points, let me sleep under the covering-laws of my own Procrustean bed.

How Not to Be a Metaphysician

Most people live in such a way that they are the center of their own lives. I believe that mindset can be happy and valuable when deliberately chosen, but we run into problems when it becomes our default setting. I call someone who is unreflectively self-centered in this way a “metaphysician.” In philosophy, a metaphysician is someone who spends time thinking about the first principles of things. I mean it in a more existential sense: someone who assumes that the bones of reality lie inside himself or herself, rather than considering that they may lie somewhere outside. Recently, I became interested in developing some practical tips on how to become something other than this. They are included below:

-Find a reliable way of reminding yourself that you do not have anything close to all the answers.

-Meditate twice a day.

-Understand the difference between knowing and understanding.

-Make sure that your own decisions have a proximate relation to the forces driving your life.

-Maintain friendships with people from different parts of your life.

-Don’t be mysterious on purpose.

-Indulge yourself as often and as richly as it is interesting to do so.

-Ruminate forward, not backward.

-See movies without consulting Rotten Tomatoes or Metacritic.

-If it’s not an emotion you’re comfortable sharing with anyone, you probably shouldn’t be having it.

-Grow in ways that encourage you to grow.

-Keep track of the times when you start to be a certain way and of what had just happened.

-Be more interested in what you hear and see than in what you have to say and reveal.

-If it’s not a decision you can avoid, it’s also not a decision you should rush.

-See how long you can go without forming an opinion.

-Fail as often as you are able to learn from failure—which is most of the time.

-Cultivate a meaningful distance between your decisions and your actions.

-Use anger as a forge, not as armor.

-Aim for a filter that keeps things spiffy and doesn’t just keep things hidden.

-Try to get better at trying.

-Trust your gut; it knows more than you do.

-Aim for skillful coping, not managing success.

-Decide which parts of your life you aren’t going to try to make better right now.

-Experiment in such a way that you don’t foreclose the possibility of future experiments.

-Experiment as often as possible.

-Don’t play games with yourself you can’t win.

-When all else fails, recognize that survival is its own virtue.

-Cultivate a relationship between your ends and your means of realizing them.

-Fill what’s empty, empty what’s full, scratch where it itches.

-Make the dysfunctional functional.

-Check that the parts of your life you don’t think about are the parts that are working.

-Forge meaning, build identity.

-Be willing to sacrifice uniqueness for distinctiveness.

-Learn the difference between being outstanding and standing out.

-Date.

-Fuck up artfully. At least make it a good story.

-Do not throw away your shot, but wait for it.

-When in doubt, get more data.

-Be aware of the things you put inside your body and your mind.

-Be discreet and not duplicitous.

-Master the art of forgiving yourself.

-Ensure the ambiguities of your life are healthy.

-Be glad to share the story of your past. Be afraid to know the story of your future.

-Get in the habit of forming habits.

-If you can’t laugh about it, don’t talk about it to people you don’t trust.

-Know what it would take for you to be happy, and don’t be indifferent to the stakes.

-Discover life’s contradictions, treat them as challenges, and share them as parables.

-See fear as a resource.

-Minimize fantasies, maximize projects.

-Gain intuition about your own limits. Be the person who decides what they are.

-Front-load your pain.

-Befriend your insecurities.

-Never be afraid to compromise in the pursuit of becoming yourself. In fact, jump at the chance.

-Live one day at a time.

-Attempt to please others, and work hard for what you care about, but strive to be yourself.

-Don’t try to be a great “man.” Just be a “man,” and let history make its own judgments.

-Take a gender studies class.