Cast Down Your Bucket Where You Are


My favorite scene in Black Panther is the interrogation between Ulysses Klaue, played by Andy Serkis, and CIA operative Everett K. Ross, played by Martin Freeman. T’Challa (Chadwick Boseman) and Okoye (Danai Gurira), both Wakandans, watch as Klaue and Ross banter—Ross wants to know how Klaue got his hand(s) on vibranium, while Klaue, an Afrikaner, insists on the existence of Wakanda as a “technological marvel” on par with El Dorado. The scene plays little role in the story other than to establish character, but cleverly recasts the “bad guy is ok with being caught” trope as a metaphor for race relations: Serkis’s white supremacist (“I can see you!” he insists) debates with Freeman’s well-meaning but naïve white professional (“I’m doing you guys a favor by letting you even be in here”) about whether blacks are more powerful and independent than they let on, as Boseman and Gurira (“Americans…” she hisses) roll their eyes from the other side of a one-way mirror standing in for the color line. The effect is to convey both the black and white experiences such that the viewer can relate to them simultaneously, no matter where they come from.

Boseman is from South Carolina. Gurira was born in Iowa and raised in Zimbabwe. Serkis and Freeman are both English. None of the actors are portraying their actual nationality, and the two British actors are performing a pantomime about the existential status of a country that, after all, doesn’t exist. The fact that we can still take this scene seriously is a direct product of Barack Obama’s own approach to race. There are important cultural—and political—costs to this.

Before reading Rising Star, I had forgotten my own skepticism of Obama from his 2004 convention speech through fall 2007, when I believed he was not ready and was advertising his opposition to the Iraq War out of proportion to his accomplishments. I had forgotten the electric inspiration of his Iowa speech, which made his campaign feel like a movement. I had forgotten my anger at my parents, my sister, and myself for being so slow to support him. And I had not realized the extent to which these sentiments were emblematic of a new culture just then being born: claiming to navigate the color line while suppressing an actual politics of race.

It’s a huge book, but its underlying thesis is simple: “Barack Obama” doesn’t actually exist. Barack Hussein Obama, a failed biracial writer raised in Indonesia and Hawaii, is frustrated at the world for not giving him a history or sense of orientation from which he could describe who he is or what stories he should bother to tell. Inspired by misreadings of Malcolm X and Martin Luther King Jr., he remakes himself into an inheritor of their legacies as a post-racial politician, able to be all things to all people and forsaking a firm set of ideals or principles. The failures of his Presidency—Republican political gains, growing inequality, continuing racial strife, Trump’s election—reflect Obama’s fundamental lack of self-understanding and directionless ambition.

These are strong claims, and to be blunt I do not take them seriously. It is true Obama exhibited a profound willingness to compromise and reframe his own positions in response to changing political realities, but this clearly flows from his commitment to pragmatism. I find his emotional intelligence—and sheer intelligence—intimidating, not underwhelming. If there has always been an aloofness or superficiality to his public persona, I credit this to the kenosis of trying to be a national figure while also writing the most resonant and insightful American political speeches of the twenty-first century. Still, this book demands to be taken seriously, and the perspective it offers has provided grounds for reflection on my previous posts as well as myself.

Obama is a “We” politician whose life and training reflect the politics of the “I”. His tragedy and his accomplishment are that he found a way to identify not as black, nor as white, nor as biracial, but as a navigator of the color line, as a mediator of the difference it once enforced. This empowered him to reflect and refract the color line without fundamentally transcending its limitations, to displace its effects rather than demolish them. This distinction is crucial for understanding how Obama laid the groundwork for Trump’s Presidency, which is not an aberration from but an exacerbation of Obama’s rhetorical strategy.

I will explore these ideas in future posts. For now, I will summarize the narrative steps and events of Rising Star that suggest how this identity was formed:

-Obama is raised mainly by his grandparents, and feels severe abandonment at the hands of his mother (he later transferred these feelings to his entirely symbolic “father”, Obama Sr., whom he never knew). He is a slacker in high school, goes to college in California to meet expectations, and is annoyed at professors who expect more from him than B writing assignments. He transfers to Columbia and dates a white woman who was also raised in Indonesia and who shares his sense of disorientation.
-Ambitious but unsure of himself, he signs up for a community organizing gig in Chicago, apparently looking for writing material. He is only hired because the team’s exclusively white members are not trusted in Roseland, on the far south side. While there he immerses himself in the philosophy of Saul Alinsky, which helped him understand community organizing as a quest to identify heroes and villains in order to gin up anger and resistance, before ultimately rejecting its strategies as parochial and unable to sustain real political change across diverse communities facing the same structural problems caused by a changing economy.
-Obama’s sense of abandonment is quenched by the love he finds there: from his teammates, from role models like Jeremiah Wright, from the community members he helps (and sometimes sleeps with), from a biracial UChicago grad student with whom he goes steady and to whom he later proposes, and ultimately from himself, when he discovers his destiny to be President. Obama quickly maps out the next thirty years of his life: he will go to Harvard (as his father did), he will return to Chicago and become mayor (as Harold Washington did), then governor of Illinois.
-His experiences in Chicago—and in particular his failure to convince his girlfriend’s parents, who perceive him as entitled and arrogant, of his worthiness—motivate him to break up with her and identify fully as black before undertaking his master plan. Somehow he realizes that the ticket to his own political ascendancy is to accept and inhabit the standpoint of a black man who can understand whites from within, rather than a biracial man at home in both worlds (or neither). He imagines marrying a dream woman with the “mind of Toni Morrison and the body of Whitney Houston”.
-He spends his time at Harvard Law running circles around other students and professors alike, armed with an organizer’s sense of resolving interpersonal conflicts and an actual knowledge of how race and inequality work beyond the classroom. He learns nothing there except how to win friends and influence people, and knows it. He decides not to apply to join the Law Review (but is roped in), not to run for its Presidency (but is roped in), and not to pursue clerkships (though he is offered all of them and wins even more respect for saying no). In his spare time he wins Michelle.
-He experiences failure back in Chicago: loses potential black allies by challenging Alice Palmer’s signatures to become a state senator, is horrified at the consumption of graft and cuisine and loose sex in Springfield, runs for Congress in 2000 and loses, struggles to impregnate Michelle, and does not quit smoking. He toys with the idea (which Michelle encourages) of leaving politics entirely, before a broad coalition of white allies (e.g. Mayor Daley, who does not want Obama to challenge him, and David Axelrod, who wants to redeem himself from Daley’s corporatism and lack of ideals) and key black supporters (e.g. the Springfield majority leader, who wants him to be a feather in his cap rather than a thorn in his side) persuades him to run for U.S. Senate. He also pens a dishonest personal account of growing up black in Hawaii and combines or erases his non-black girlfriends, portraying his life as a story of race relations.
-He wants to bide his time in the U.S. Senate and soak up achievements before running for President, but others’ expectations—supporters across Illinois, his own Chicago network, the media, and finally national Democrats who realize the party will embarrass itself if he doesn’t run—force him to declare after just two years. His abandonment issues flare up when Wright leaves the campaign, but this affords an opportunity for more personal reflection on race and gives him the political grounds to cut Clinton off at the color line, which he does in “A More Perfect Union”. His nomination is sealed when he loses Indiana by a smaller-than-expected margin.

The rest is (mostly) history. Here are my more analytical takeaways:

-Every post-college career step Obama took was made possible by his adopted racial identity as a black man. He is considered more prepared for a position because, and not in spite of, his blackness. It is his main qualification, like a bullet point on his resume.
-Obama never truly understood or took seriously the foundation of white racial anxiety, despite possessing a profound understanding of its mechanics and how to assuage them. Nor did he understand the causes of the black parochialism that made Chicago neighborhoods not see each other as allies, or made some abandon him after he turned on Alice Palmer and Bobby Rush. For him, race is a heritage you adopt in order to find your place in the world and hitch your wagon to a wider sense of purpose, what allows you to relate to other people. But the experience of race as something you possess against your will, of ethnicity as ultimately rooted in nationality, in a sense of place that is exclusive, to which others simply cannot belong and which they cannot fully appreciate, was beyond him.
-There is a ghostly cynicism and indifference about Obama that never goes away. There’s something deeply sad about a man who decides when he’s 25 exactly how he will spend every year of the rest of his life, and I got the sense that Obama gave up trying to become an independent writer when he realized that he could just embody the great unfinished story of the Civil Rights Movement. The hunger of whites around Obama from Harvard on, who openly expect him to be that movement’s capstone, is disturbing, as is Obama’s willingness to feed that hunger.
-His career steps all take the same shape: 1) join a new organization, 2) be horrified at its backwardness, racism, and insufficiency, 3) adopt a principled stance of “aren’t we better than this” that impresses a set of old institutional hands who nod silently at his pronouncements, 4) resist their calls for him to take charge of it before giving in, 5) briefly take charge, 6) abandon responsibility for its maintenance when the next opportunity is in sight. Per his Mandela speech on global capitalism and populist anger, Obama’s current arc is somewhere between 2) and 3).
-The tragic dénouement offered by his Chicago girlfriend is that Obama’s willingness to meet the world as it is in order to make the world what it should be—his pragmatism—came at the cost of his sensuousness, the capacity to feel and breathe and just be himself without already evaluating whether it is possible or practical to feel those things. This is disturbing to me and I am still processing its implications. Axelrod observed that his famous 2004 convention speech would have been conventional and over-rehearsed if not for his being interrupted by the Kansas delegation’s cheers—he smiles, points to them, puts his shoulders back, and is suddenly “there” in a way he wasn’t before. At key moments in his path, this sensuousness or “We”-ness somehow pointed the way for his next great evolutionary leap, before receding into an “I”-ness of strategy and calculation and post-racial hobnobbing.

“Cast down your bucket where you are” is one of the most infamous statements on race relations in American history. Uttered by Booker T. Washington, it is often interpreted as the need for African-Americans to know their place and not try to rise above their station in life—to become sharecroppers and shoe-shiners before politicians and poets. The hero of Invisible Man, which apparently served as both inspiration and structural model for Obama’s Dreams from My Father, recites it after taking part in a bloody battle royal for the amusement of white patrons, who reward the speech with a scholarship to a historically black college. But Washington’s parable illustrating the line is instructive:

“A ship lost at sea for many days suddenly sighted a friendly vessel. From the mast of the unfortunate vessel was seen a signal, ‘Water, water; we die of thirst!’ The answer from the friendly vessel at once came back, ‘Cast down your bucket where you are.’ A second time the signal, ‘Water, water; send us water!’ ran up from the distressed vessel, and was answered, ‘Cast down your bucket where you are.’ And a third and fourth signal for water was answered, ‘Cast down your bucket where you are.’ The captain of the distressed vessel, at last heeding the injunction, cast down his bucket, and it came up full of fresh, sparkling water from the mouth of the Amazon River.”

Race is such a complex category because it is both existential and sensuous. It must be both owned and felt, inhabited and accepted, chosen as well as obeyed, for a true reconciliation to occur. It is both a story you tell every day and a script you cannot help but follow. What intrigues me about Obama after Rising Star, and which makes me respect him more even as I admire him less, is his willingness to align his own process of ethnic self-actualization with the demands of his chosen political destiny. Had he not done this, it’s doubtful he could have become President, and we are richer for his having done it, if more disoriented in turn.

Obama, despite his thirst in New York and Harvard and Springfield and Washington, refused to cast down his bucket where he was, but not because he was ambitious or lacked principle. It was because he deliberately abandoned the effort to make sense of himself in terms of himself, and with it whatever organic versions of himself may have emerged. He learned to navigate the color line at the expense of learning to navigate and accept the full blessed contradictions of his own being. Without him there is no Ta-Nehisi Coates, or Jordan Peele, or Childish Gambino, or Ava DuVernay. But there probably also would not be Sheriff Joe Arpaio, and assuredly not Donald Trump. We are now all navigating that estuary, many of us against our will. How many of us are willing to cast down our buckets where we are?

I cannot help but imagine some other version of Obama who, after successfully linking up with and inhabiting the black experience on the south side, rejected the Harvard offer in favor of Northwestern (where he was also accepted) and stayed with his UChicago girlfriend. School allegiances notwithstanding, it would have been hard. He would probably have become a civil rights lawyer, stayed involved in organizing, perhaps run for local office. It would have meant fewer speeches, less money, and certainly less attention. He would not have married, and would have either given up the effort to reinterpret himself as black or dedicated himself to a more protracted struggle for identity. But he might have begun to put down and cast earth around the roots he so desperately craved, and perhaps arrived at truths about race that are less inspiring but more edifying than the ones he is known for. I can’t claim to understand what it means to be biracial or post-racial, but after growing up in the political order he helped articulate, I recognize they are not the same thing. Thanks, Obama.


The Abject Righteousness of the Civil Rights Movement

O, yes,
I say it plain,
America never was America to me,
And yet I swear this oath–
America will be!

-Langston Hughes, “Let America Be America Again,” 1935


Recently I’ve been reflecting on citizenship, and all its connotations, in uncomfortable registers. I’ve been motivated by the question of how to prepare for the 2018 midterms, including which candidates to support and how to get involved, and also newly conscious of how Trump’s election and governance have scrambled my understanding of American politics and history. I’m hardly alone in this. It feels like every week there is a new confusion on the left about what is going on right now and why it’s happening, before anyone can even begin to decide whether to care about it.

One haute example of this is the debate between Francis Fukuyama’s and Louis Menand’s philosophies of history. Fukuyama’s idealism proposes that history is predestined “given current trends,” while Menand instead suggests it is defined more by shuffling, disagreements, and unpredictable coalitions. We’ve seen Obama himself appear to waffle about this, first by echoing the call that the arc of the moral universe, while long, does in fact bend towards justice, and later by suggesting, post-Trump, that history “zigs and zags”.

From long reflection, I have come to believe the phenomenon of Trump hinges predominantly on race, and I will explore this topic in future posts. For now, I’d like to prepare those substantive reflections—and also expand on the themes of earlier writings on this blog and elsewhere—by briefly commenting on what seems to make American history tick or flow in certain directions, and how this organizes my sense of what should be done. It’s easy for conversations like this to descend into navel-gazing or airy abstractions, and indeed there’s a bravura in even trying to do it that I suspect makes most commentators avoid it on principle.

Let me say up front that, echoing a recent Facebook post by my friend Nicholas Mulder, if you assign any significance or weight to the daily news, you need a philosophy of history. Whether you think events are stochastic or statistically predictive, or if grand narratives are even possible, is implicitly at stake if you are inclined to groan while reading the Huffington Post or to tweet #metoo.

In America there’s always been this conflict between the “We” and the “I.” It’s seared into our cultural makeup that America is both a collective work-in-progress and a vehicle for personal freedom. Here I mobilize it in a more specific, political sense: a profound and abiding disagreement over how republican politics should operate in this country, and how decisions should be made. The debate is between epistocracy (i.e. that those who know best and are most qualified should rule) and democracy proper. Both themes are present in the Declaration of Independence, which famously contrasts a deliberative “we” against the failures of King George III. Yet we know from history that proto-Federalists like Alexander Hamilton and John Adams favored administrative power and a centralized financial system, while remaining skeptical of the “common man” and backing the suppression of Shays’ Rebellion. Though we mythologize the country’s founding as a democratic revolution, this tension between independence and interdependence remained, written into the Constitution and justified in The Federalist Papers.

Let me say provocatively (because I cannot hope to fully elaborate the point here) that this conflict between the “We” and the “I” is shot all the way through American political history, that we seem to have been living in a great Age of Identity since the social protests of the 60s and 70s, and that there are now signs of a reborn “We” in the forms of protest and resistance following Trump’s election. Again, this is still early days, and we will need a complete diagnosis of this past age in order to learn how to move forward, which neither I nor anyone else can now provide. But I have already found the “We” and the “I” useful for schematizing past upheavals.

Allow me to elaborate on this stance with reference to the Civil Rights Movement. I recently finished watching the documentary series Eyes on the Prize, which traces the movement from Brown vs. Board of Education to Harold Washington’s election as the first black mayor of Chicago in 1983. I was alerted to it by a passing mention in an essay that claimed viewing it was necessary for one’s civic education to be complete (!).

The episodes are filled with moments so harrowing that I sometimes broke out in a sweat from anxiety, and spent long minutes on my phone as scenes passed by on my laptop that I was too embarrassed to consider head-on. John F. Kennedy, audibly embarrassed, on the phone with the governor of Mississippi, pleading that a single black student needed to be admitted to Ole Miss or else the National Guard would need to be deployed. Young white women explaining to a reporter their own shifting beliefs about whether blacks are racially inferior or just uneducated, as in the background the Little Rock Nine pass down the sidewalk into school. An older white woman in Cicero, Illinois, possibly Italian or Polish but who looked exactly like my mother, confiding to an interviewer that “of course it’s fine for Negroes to live here, but the question is, are they Negroes or ****ers?” The Detroit conflagration as block after block of the city was swallowed up in racial fury. Faculty at Howard University explaining to students why courses in black identity (what would later be known as African-American Studies) were counter to the mission of Historically Black Colleges. New York Jews discovering their inner whiteness as they raged against blacks and “black consciousness” allies over community control of the Ocean Hill-Brownsville school district in Brooklyn. Robert F. Kennedy talking softly to a dirt-poor family in the Black Belt about the importance of learning to read, and tentatively stroking the face of the youngest boy as he walks away. The literal drowning of Resurrection City from downpours on the National Mall, extinguishing the Poor People’s Campaign. A woman crying fifteen years later as she describes singing the “Battle Hymn of the Republic” under a full moon as RFK’s hearse drove past the Lincoln Memorial. A middle-aged Asian woman shoved into the back of a police van, trying to get out the tune to “We Shall Overcome” and plaintively holding the van door open so she remains visible to the TV cameras. A reporter gagging out of moral sickness as, just behind him, the Attica prison revolt was doused in gas, calling out to his cameraman to cut the feed as the mic can only pick up the sound of bones being broken. White Bostonians overturning police cars and attacking horses while screaming racial slurs I’d never heard before, protesting the forced desegregation of public schools.

And yet, there were moments of unimaginable heroism. The decision of Emmett Till’s mother to have an open coffin at her son’s funeral so that thousands of blacks in Chicago could see “what they did to my baby” when he spoke up to a white woman in Mississippi, was lynched, and stuffed at the bottom of the Tallahatchie River. Members of SNCC training for hours at a time and days on end as their own members, white and black, assaulted and beat them as training for the real abuse of sit-ins at lunch counters. The mayor of Nashville vaporizing his own political base by answering the persistent queries of a civil rights worker that while as a public official he swore to uphold all laws including segregation, as a man he could not defend its practice. An uncut three minute sequence of marchers crossing the Edmund Pettus Bridge, seeing what was about to happen, yet continuing straight into a charging horse-mounted policeman’s baton. Fannie Lou Hamer ignored at the 1964 DNC for having the gall to ask if Mississippi’s closed society was in fact America. Stokely Carmichael inventing, on the spot, the phrase “black power” during the March Against Fear, defying the philosophy and temperament of Martin Luther King, Jr. even as they marched step by step. Cassius Clay, a couple years before becoming Muhammad Ali, insisting over and over that he is “already beautiful” as a sports announcer questions whether he “looks good enough for the cameras yet.” The spontaneous gelling of black consciousness as, for the first time in Howard University’s history, the elected homecoming queen would sport an afro.

In other words it was thirty years of history, well-told and ill-gotten.

I find some of its lessons difficult to absorb. I had never before seen Martin Luther King, Jr.’s achievements as a product of failure, but his own moral arc was defined by personal trauma and strategic blunders that stymied the movement repeatedly. It was out of such blunders—which to him felt less like setbacks than like a totalizing morass of major depression—that King became the moral force we have mythologized. The central example came in the form of Laurie Pritchett, chief of police in Albany, Georgia, who broke the SCLC’s naïve attempt to overwhelm Southern jails. Pritchett figured out how to fight nonviolence with nonviolence by coordinating with other sheriffs to keep his own jails from overflowing. King retreated to lick his wounds for weeks at his home, focusing thereafter on more specific, symbolic victories like Birmingham and Selma (Howard Zinn, who was part of the Albany movement, long viewed this pivot as a major, tragic blunder). Watching “I Have A Dream” after seeing King be broken made me hear, for the first time, the desperation, despair, and raw deferment of any near-term expectation of recognition and dignity in that speech. What got me wasn’t that King was defeated—it was that he was confused. History was not moving in the inexorable direction inaugurated by the big “We” of the Freedom Riders, SNCC, and SCLC.

It was this conflict between King and Pritchett that oddly reminded me most of our current political moment, in which Cory Booker refers to his own release of confidential Kavanaugh documents as “civil disobedience”, and some on the left advocate the exclusion of Trump supporters from restaurants and boutiques. At the time, King was realizing that the great moral authority of one “We” was running up against, and being gummed up by, the indignation and recalcitrance and brute effectiveness of a counter-mobilized “We”. So he became willfully symbolic, shifting into the King of the history books, and alienating many of his more radical adherents in the process.

In effect King gambled that his “We” would scale better and win a war of attrition against Southern anachronism. He refused to take any actions or lead any protests that were not permitted to proceed by federal authorities and judges, including the crown jewel of Selma, finally extinguishing the Jim Crow South by outing it as politically unworkable and contrary to the American experiment–that it was in fact an “I” pretending to be a “We”. Yet his movement ran aground when it went north to confront the quieter, more restrained, more economical, and ultimately more pernicious racism of redlining.

Like today, there seems to have been a strategically unworkable, yet changing relationship between political tactics and moral compasses, whose plate tectonics could not be refastened by the hands of men while the ground shifted beneath them.

During the Civil Rights Era both sides unequivocally claimed the moral high ground. Both were convinced of their own righteousness. However, I slowly realized, episode by episode, that the movement’s defining feature was not this righteousness but its insistence on democracy as a radically moral and potentially self-destructive commitment to personal autonomy. As the movement ate itself alive, as the SCLC and SNCC eroded and splintered into the Black Panthers against the institutionalists, its true irreverence came to the surface. Why should I fight so hard to be considered part of America? Why can’t I myself be America? What is this paltry “America” that whites think they own? Don’t ask for their permission to free the slave that’s inside you, do it yourself. The rise of black epistemology was, in this sense, a declaration of independence from the old-form civil rights movement, just as the Founders’ was from King George III. The “We” that King helped birth was bleeding out into multitudes of “I’s”. It’s not surprising that, when asked what they most remember about the year 1968, O.J. Simpson’s teammates have cited his winning the Heisman Trophy.

Still, in the process, the movement rewrote the moral calculus by which anyone in this country could take up the mantle of citizenship. We remain stuck today in the molds of resentment and civic pride first cast at that time.

Today I think that mold, particularly for whites, has become brittle and unworkable. In future posts I will explore what this means for being a citizen in the Trump era, and possibly after.

The Scientific Spirit and The Death of Absolute Objectivity

“God is dead. Marx is dead. And I don’t feel so well myself.”

-Eugène Ionesco

A primary motivation for scientific inquiry has always been the notion that our scientific discoveries offer us transcendence from our subjectivity, that they give us access to absolute objective truths (or at least some approximation of them). Stephen Hawking’s famous line about knowing “the mind of God” captures both the essence and the intellectual history of this idea. Physicists, for example, frequently treat their discoveries as a secular analogue of divine wisdom, with the Einstein equation or Newton’s laws sacred to them just as scripture is to the pious. And this scientific spirit, however opposed scientific culture currently stands to organized religion, grew out of Western theology. It’s not an accident that Newton and Copernicus were devout Christians.

But the philosophers realized long ago that “God is dead.” Centuries of metaphysical attempts to rationalize Christianity failed and were abandoned. Nietzsche pointed out that without God, there was nothing standing in the way of a descent into nihilism. Much of philosophy became devoted either to resolving or confronting this new reality. Nietzsche believed that this inconvenient truth was recognized at least subconsciously by all, but that Christians remained in a state of denial out of fear or angst.

The death of God presaged the death of the analogous scientific idol, absolute objectivity, which would come with the advent of quantum mechanics. The world can be understood, but according to the most straightforward interpretations of quantum theory, this understanding turns out to be a matter of establishing a consistent intersubjective reality more than of accessing a single overarching objective one. As Niels Bohr said:

“Physics is to be regarded not so much as the study of something a priori given, but rather as the development of methods of ordering and surveying human experience. In this respect our task must be to account for such experience in a manner independent of individual subjective judgement and therefore objective in the sense that it can be unambiguously communicated in ordinary human language.”

“The Unity of Human Knowledge” (October 1960) (emphasis mine)

“There is no quantum world. There is only an abstract quantum physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature…”

As quoted in “The philosophy of Niels Bohr” by Aage Petersen

The Schrödinger’s cat thought experiment makes a lot more sense once you’re willing to demote the wavefunction from a descriptor of some transcendent, objective reality to an effective bookkeeping device for describing an (information-theoretic) relationship that the observer has with the cat. Nowadays, physicists have taken many of these ideas even further, e.g. with black hole complementarity.
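To make the bookkeeping reading concrete, here is a minimal sketch using only the standard textbook formalism (my illustration and notation, not anything drawn from Bohr or the sources above). Before the box is opened, the observer’s description of the atom-plus-cat system is the entangled state

\[
|\Psi\rangle = \tfrac{1}{\sqrt{2}}\Big( |\text{undecayed}\rangle \otimes |\text{alive}\rangle + |\text{decayed}\rangle \otimes |\text{dead}\rangle \Big),
\]

which, on this reading, is not a claim that the cat is objectively half-alive. It is a record of what this observer can say: each outcome carries Born-rule probability \(|1/\sqrt{2}|^{2} = 1/2\). Opening the box updates the record, and the same observer thereafter describes the cat with \(|\text{alive}\rangle\) or \(|\text{dead}\rangle\) alone, much as a Bayesian replaces a prior with a posterior after seeing new data. The formalism is unchanged; only its job description shifts, from portrait of a transcendent reality to ledger of what can be unambiguously communicated.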

This picture of science also resolves another paradox, presented in the first pages of Roger Penrose’s The Road to Reality. It goes like this. The physical universe, according to the standard scientific picture, is a subset of all mathematical possibilities. In fact, one could say the goal of science is to distinguish which subset is the correct one. Mathematics, on the other hand, is itself a subset of our mental activities. That is, our mind can in principle if not in practice access all of mathematical reality, but is also capable of non-mathematical things like emotions. Finally, our brains are proper subsets of the physical universe. This circular picture is paradoxical. Penrose did not claim to know the resolution.
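Stated schematically (my notation, not Penrose’s, and identifying our mental activity with what our brains do, as the paragraph above implicitly does): write \(P\) for the physical universe, \(M\) for the realm of mathematical possibility, and \(\Sigma\) for our mental life. The three claims are then

\[
P \subseteq M, \qquad M \subseteq \Sigma, \qquad \Sigma \subsetneq P,
\]

with the last inclusion proper. Read literally as set inclusions, chaining them gives \(P \subseteq \Sigma \subsetneq P\), hence \(P \subsetneq P\), a contradiction. That is why the picture is a paradox rather than a harmless circle: at least one of the three “is a subset of” claims cannot be taken at face value.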


But the paradox dissolves if we embrace the view of science advocated by Bohr and based on the lessons of quantum mechanics. The realm of mathematical possibility is not an independent world, but a collection of methods available for relating our minds to the world. Mathematics is what connects our minds to the world; it isn’t a separate Platonic realm of its own, independent of our minds and the physical universe.

One might argue that a careful reading of Kant’s Critique of Pure Reason bolsters this framework for science as well. We cannot access things “in themselves.” So this philosophy of science far predates quantum theory and was born in the aftermath of the original scientific revolution of Newton and his contemporaries. The quantum nature of the atomic realm merely reinforces it.

Nevertheless, many or even most scientists have not accepted this paradigmatic change or properly confronted its implications. Almost all are aware of it, but like Nietzsche’s Christians many avoid or deny it, lest they fall into a kind of nihilism of the scientific spirit. (For a recent example of this, see the awakening of the great Steven Weinberg to the problem.)

It may be that this sort of cognitive dissonance is what drives much of the research on alternative interpretations of quantum theory. Many people are drawn to science in general and physics in particular because of the desire to reach for absolute objective truth. It’s a psychological impossibility for them to accept that the findings of their pursuit undermine its original purpose. Famously, even Einstein could not accept these lessons. And how could he? His belief in “Spinoza’s God” had led him to General Relativity, one of the greatest intellectual achievements of human history.

But if we’re honest, we’re faced with a problem. How can we accept some of the more sobering lessons of fundamental physics without losing our sense of purpose in the process? If the moon isn’t there unless we’re looking at it, should we really bother to wonder about it?

There are of course technological reasons to pursue scientific research. The practical value of science is not diminished, and of course that matters. But what of the scientific spirit? Are we left with some form of scientific nihilism? An existential despair of the scientific spirit?

This is worth working through and worth discussing. But I can offer a personal solution based on my own journey coming to terms with quantum mechanics. I’ve found that, for myself, connection can replace transcendence as a motivation for a scientific life. When I reflect carefully on what’s driven me to study science, it was really a desire to feel connected to the universe, and to broaden and sharpen my experience of reality. None of the revolutions of quantum mechanics need diminish this. I think there was always an unnecessary arrogance in the idea of an absolute objectivity, the belief that we could access a “view from nowhere”, that our minds could be mirrors for the world, that we could reconstruct the universe in totality in our heads. To instead treat science as more about connecting ourselves to the world around us, and about our relationship to it, feels more honest. It doesn’t diminish my motivation, and it has the added benefit of embodying a humility commensurate with the reverence for the world around us that all scientists feel.

How to Be a Scientist


Science is an unsettling enterprise. This is much of what makes it different from law, or politics, or art, or religion, or other noble and worthwhile pursuits. Science is not interested in conveying fundamental truths, or in ensuring people’s safety, or in improving lives, or telling stories, although it has played a role in all these endeavors and will continue to do so. It is first and foremost a commitment to the unknown and a desire to transform it into something tractable; as Galileo put it, “to measure what is measurable, and make measurable what is not so.” This quote offends some of my friends as dehumanizing, and rightly so. The fact of the matter is that science often tells us things we don’t want to hear, in ways that are hard to understand unless you have a PhD, and for reasons that are opaque to outsiders. Insiders spend most of their time either doing experiments that don’t produce publishable findings or throwing out bad ideas. Even successful papers are boring. We do acknowledge all this kerfuffle as worthwhile and meaningful, and we certainly depend on science’s discoveries. But science, pound for pound, is a mercurial idol, and we haven’t learned how to worship it in a way that feels collectively satisfying.

If we instead look to pop culture, and consider what most people who are looking for intellectual fulfillment or affirmation find exciting and stimulating, it seems that philosophy wins out. Western thinking, after all, was inaugurated by a troll. Socrates’ metaphor of the cave in The Republic remains an acceptable and accessible metaphor for how almost everyone lives their lives and, in the figure of those who escape to the sunlight, the image of what we all strive to achieve. Never mind that most people don’t actually try to do this; the idea is a beautiful one, and pleasing to think about. It is much easier to imagine yourself as someone striving to reach for ultimate, real, uncompromised Truth than as a scientist, trying to make out shadows on the wall in ever greater and verifiable detail. It is true that modern people have a drive for distinction, but in almost all cases this is expressed as a drive for consumption rather than self-expression—the shows we watch, food we eat, people we elect to hang out with, etc. Perhaps out of laziness or wishful thinking, we like to have our own perspectives be regularly affirmed. We allow ourselves to expect some easy alchemy from our ties and associations even though we know that lead and gold are atomically independent elements.

This game of social tautology has its limits. We watch Neil deGrasse Tyson and expect some kind of realization from his descriptions of the cosmos. We watch The Big Bang Theory not exclusively because we like to make fun of nerds, but because Sheldon Cooper reminds us of Buster Keaton. Bill Nye is kind of sexy. There is an insecurity in this that needs to be unpacked. Why do we sometimes expect science to speak to us in ways that it demonstrably cannot? What task can it perform for our own spiritual, cultural, or libidinal fulfillment?

I am not trying to level an attack on science as a professional endeavor. Nor is my aim to vouch for the idea, which already lies at the foundation of Western philosophy, of living scientifically, even as I recognize that idea’s importance in my own life. Instead, I am suggesting that the vocation of science and the idea of living scientifically are even more wondrous when they are brought together—that Constructed Truth and Absolute Truth should be one and the same. This may be impossible to achieve, but it can at least act as a principle that guides our actions. It is not that science is our way of understanding the True, or that the True is an object of endless inward search, but that science can and should be approached as a vehicle for self-discovery, just as that process of discovery can only be called a scientific one.

There is a tradition of American thinking that is attuned to this. Thoreau’s grand experiment in Walden expresses the sentiment well: “It is something to be able to paint a particular picture, or to carve a statue, and so to make a few objects beautiful; but it is far more glorious to carve and paint the very atmosphere and medium through which we look, which morally we can do. To affect the quality of the day — that is the highest of arts. Every man is tasked to make his life, even in its details, worthy of the contemplation of his most elevated and critical hour.” Religion and art may more often serve this purpose, and they are less cognitively demanding, but our insecurity can be explained by our collective sense that scientists, however boring and self-contained their machinations are, are tapping into the “quality of the day” and embodying it in a register that other cultural luminaries cannot. Slavoj Zizek can convince me that capitalism monetizes my fetishes, the Dalai Lama can help me be more mindful, but science allows me to read them on my smartphone. Science takes the inner reality of nature and somehow makes it into a vehicle for our own self-affirmations. All four fundamental forces must have been accounted for and operationalized for me to use my laptop to watch Rachel Maddow remind me I am not dreaming even though Donald Trump is now President-elect.

I could also cite Emerson: “There are always sunsets, and there is always genius; but only a few hours so serene that we can relish nature or criticism.” The problem here is that the relation of genius to the world is not just expressed in a shortage of time but in the multiple ways of appreciating that world. If how we experience the world determines how we interpret it, then a vast division of interpretive labor is necessary for its richness to be available to each of us. We crave a unity from this manifold, but we have a thirst that the very shape of our mouths makes impossible to quench.

Is it possible, then, to do science both professionally and existentially? To be both heroes at once? It’s a tall order. In my experience, many who pursue science as a vocation fail to live scientifically. Doing science is hard enough—being it is a whole other level. In fact, these two forms of science seem to actively oppose each other. I have studied at some of the world’s best universities, and I have interviewed some of the best scientists at those universities, and within a few moments of conversation it would often become clear that I was speaking with a person whose fundamental worldview became entrenched the moment an authority figure complimented their ability to intelligently transpose some trusted pattern of thought. The cultural values that accrue in the wake of this pivotal moment of ego reinforcement—among them the laudable traits of independent thinking and critical reflection—are distressing for not themselves embodying a hypothesized and systematically tested relation to the person’s life. They were adopted by force of habit, and crystallized as a person excelled at creating what he or she was told would be recognized by one’s elective community as honorable and good and worthwhile insofar as it would set up others to skillfully create in a likewise recognizable way. In this way, the existential foundations of science, and especially the most reproducible science, are almost always dogmatic in spirit.

The vocation of science encourages and even depends on deliberation. But it is almost antithetical to living deliberately. The school of Romanticism—willfulness, subjectivity, free expression, poeticizing, seduction—was itself formed in opposition to the pretension of science to discover the fundamental nature of reality. Having studied the intellectual history of that movement, I have always found it ironic and even tragic that this body of beliefs is essentially just a hypothesized instance of counterfactual reasoning: if the values of science are basically wrong, then the opposite ones might be better. Many of the modern world’s most brilliant artists and storytellers have been moved to rage against the spiritual “disenchantment” that science has caused, and told wonderful stories as a result, and in doing so they were living scientifically. The deliberate refusal of Schlegel, Kierkegaard, Shelley, and Novalis to see life singularly, and instead to see it as a playful and experimental domain for their own fantasies—what is more scientific than that?

We have here two models of heroism. The former is dependable, analytical, reproducible, and black box-able. Scientists train themselves into compartmentalizing Truth, and we gladly buy up their Truth-furniture, and use it to furnish our lives. The latter is protean, brave, original, daring, and intense. Artists paint experience in their own image, and we let these paintings color our own dreams and aspirations beyond the lives we now live. These two cultures, and their temperaments, seem utterly orthogonal. What might it mean to bring them together?

As you get older you realize that who you distinctively are is defined more and more by the breadth of your experiences than by the field of your opportunities. People treat you more as a product of where you’ve been than of what you could potentially become. What Thoreau and Emerson are getting at is that it is desirable, if not to reverse this process, then at least to operationalize it into a controllable variable. It need not just be “a part of life” or a force of nature. To make oneself into something that could be falsified, but nevertheless seems to be true—to transform the world into a laboratory for one’s own exposition—that is the scientific calling in the fullest sense of the term. Life must be lived forwards, but it can also be understood forwards, if understanding is elevated to a principle of vitality rather than reflection.

This all sounds very postmodern. David Foster Wallace, in his brilliant commencement address at Kenyon College, spoke of a liberal arts education similarly as empowering us to choose what we pay attention to and to choose how we make meaning from experience. But I would go a step farther and assert that the scientific life has not just an aesthetic but also a moral and even religious wind at its back. It is a lifestyle choice, but also a vocation, and even more fundamentally a kind of calling. The goal is not to escape boredom or to become happy with one’s own station but to reorient the logic of one’s relation to the world into something testable. We cannot escape the expectations of others, but we can willfully subject those expectations to a court of fair play where the result, i.e. ourselves, is more than arbitrary. To be fulfilled, rather than simply happy, means your happiness passes a test of statistical significance with the reality you have had a hand in constructing.

So to be a scientist, and to also live scientifically, would be to make the entirety of oneself into something tractable and fungible. Even as I write this, it sounds abhorrent and gross. And yet we know from the biographies of great writers and artists, who went out of their way to accumulate interesting and unprecedented life experiences and then tell new kinds of stories out of them, that they were much better at living scientifically than many so-called scientists themselves. We need not make this compromise so long as the truth we want to discover and make understandable is the latent truth of ourselves, something that is itself made in the course of our search for it. The maxim here is to make this process into more than something that just happens. To be a scientist means to master the conditions of oneself.

Let my lovers be case studies, let my days be data points, let me sleep under the covering-laws of my own Procrustean bed.

How to Not Be a Metaphysician

Most people live in such a way that they are the center of their own lives. I believe that mindset can be happy and valuable when deliberately chosen, but we run into problems when it becomes our default setting. I call someone who is unreflectively self-centered in this way a “metaphysician.” In philosophy, a metaphysician is someone who spends time thinking about the first principles of things. I mean it in a more existential sense: someone who assumes that he or she has access to the bones of reality inside himself or herself, rather than considering that they may lie somewhere on the outside. Recently, I became interested in developing some practical tips on how to become something other than this. They are included below:

-Find a reliable way of reminding yourself that you do not have anything close to all the answers.

-Meditate twice a day.

-Understand the difference between knowing and understanding.

-Make sure that your own decisions have a proximate relation to the forces driving your life.

-Maintain friendships with people from different parts of your life.

-Don’t be mysterious on purpose.

-Indulge yourself as often and as richly as it is interesting to do so.

-Ruminate forward, not backward.

-See movies without consulting Rotten Tomatoes or Metacritic.

-If it’s not an emotion you’re comfortable sharing with anyone, you probably shouldn’t be having it.

-Grow in ways that encourage you to grow.

-Keep track of the times when you start to be a certain way and of what had just happened.

-Be more interested in what you hear and see than in what you have to say and reveal.

-If it’s not a decision you can avoid, it’s also not a decision you should rush.

-See how long you can go without forming an opinion.

-Fail as often as you are able to learn from failure. Which is most of the time.

-Cultivate a meaningful distance between your decisions and your actions.

-Use anger as a forge, not as armor.

-Aim for a filter that keeps things spiffy and doesn’t just keep things hidden.

-Try to get better at trying.

-Trust your gut, it knows more than you do.

-Aim for skillful coping, not managing success.

-Decide which parts of your life you aren’t going to try to make better right now.

-Experiment in such a way that you don’t foreclose the possibility of future experiments.

-Experiment as often as possible.

-Don’t play games with yourself you can’t win.

-When all else fails, recognize that survival is its own virtue.

-Cultivate a relationship between your ends and your means of realizing them.

-Fill what’s empty, empty what’s full, scratch where it itches.

-Make the dysfunctional functional.

-Check that the parts of your life you don’t think about are the parts that are working.

-Forge meaning, build identity.

-Be willing to sacrifice uniqueness for distinctiveness.

-Learn the difference between being outstanding and standing out.

-Date.

-Fuck up artfully. At least make it a good story.

-Do not throw away your shot, but wait for it.

-When in doubt, get more data.

-Be aware of the things you put inside your body and your mind.

-Be discreet and not duplicitous.

-Master the art of forgiving yourself.

-Ensure the ambiguities of your life are healthy.

-Be glad to share the story of your past. Be afraid to know the story of your future.

-Get in the habit of forming habits.

-If you can’t laugh about it, don’t talk about it to people you don’t trust.

-Know what it would take for you to be happy, and don’t be indifferent to the stakes.

-Discover life’s contradictions, treat them as challenges, and share them as parables.

-See fear as a resource.

-Minimize fantasies, maximize projects.

-Gain intuition about your own limits. Be the person who decides what they are.

-Front-load your pain.

-Befriend your insecurities.

-Never be afraid to compromise in the pursuit of becoming yourself. In fact, jump at the chance.

-Live one day at a time.

-Attempt to please others, and work hard for what you care about, but strive to be yourself.

-Don’t try to be a great “man.” Just be a “man,” and let history make its own judgments.

-Take a gender studies class.