“In the seventh century,” writes Lewis Mumford in Technics & Civilization, “by a bull of Pope Sabinianus, it was decreed that the bells of the monastery be rung seven times in the twenty-four hours. These punctuation marks in the day were known as the canonical hours, and some means of keeping count of them and ensuring their regular repetition became necessary.” The instrument that would help the monasteries ring bells on a regular basis was the mechanical clock, whose “‘product’ is seconds and minutes.” Standard, measurable sequences of time are not a latent property of the universe but the output of a man-made machine. Mumford proposes that the monastic desire for order, the desire to cultivate a way of being where surprise, doubt, caprice, and irregularity were held at bay, was the cultural foundation that created the clock, but that the clock went on to “give human enterprise the regular collective beat and rhythm of the machine; for the clock is not merely a means of keeping track of hours, but of synchronizing the actions of men.”
This effect continues to impact us today. Most of us structure our existence by the synchrony of the industrial work week, waking up Monday through Friday at a certain time, commuting on crowded trains or highways with everyone else at a certain time, breaking for lunch at a certain time, showing up to meetings punctually and ordering the exchange of information and ideas to fit a pre-determined 30 or 60 minutes (the skill of managing meetings to maximize communicative efficacy a byproduct of the need to keep time), coveting weekends or vacations because we crave a moment of unstructured respite, crave the opportunity to vaunt our enlightened ability to take a device-free day (so we can return fresher and more productive on Monday), all the while watching Monday peer over the horizon and looking forward, once more, to the following Friday at 5:00 pm.
Perhaps more profoundly (or perhaps as a byproduct of the way we live day to day), we continue to share the assumption that time, when working normally, flows at the same standard pace for all individuals and for the same individual at different periods in their life. I infer that this is a standard assumption because of how much it interests us to explore the contrary, namely that our subjective experience of time is not standard, that you might think our activity dragged on for hours while I thought it flashed by in seconds, that time from our childhood seemed to pass so much more slowly than it does in old age.
At 35, I cannot give a rich, inner account of what time feels like for a 70-year-old or 80-year-old. But over the past few weeks, I asked a handful of people 70 and above to describe their experience of time and account for why they think time feels faster as they get older. What follows are four accounts of why time speeds up as we age and a few suggestions for things we can do to slow time down. I take it for granted that slowing down time increases our sensation of living a meaningful life. For there is power in the continuum: if there’s an upper bound on the number of years we can live, why not focus on expanding our perception of the duration of each year, of each instant? At the theoretical limit, we would achieve immortality in a moment of living (an existentialist take on Zeno’s paradox, which any good pragmatist should and could easily shut down, and which Jorge Luis Borges elegantly explored in The Secret Miracle).
Why Does Time Speed Up as We Age?
Let’s start with a physics argument proposed by Duke professor Adrian Bejan. In his short article Why the Days Seem Shorter as We Get Older, Bejan focuses on how the structure of the eye changes with age, lengthening the periods between which we can perceive a change in a succession of images, i.e., can experience a unit of time:
Time represents perceived changes in stimuli (observed facts), such as visual images. The human mind perceives reality (nature, physics) through images that occur as visual inputs reach the cortex. The mind senses ‘time change’ when the perceived image changes…The sensory inputs that travel into the human body to become mental images–‘reflections’ of reality in the human mind–are intermittent. They occur at certain time intervals (t1), and must travel the body length scale (L) with a certain speed (V)…L increases with age because the complexity of the flow path needed by one signal to reach one point on the cortex increases as the brain grows and the complexity of [path flows in the eye] increases…At the same time, V decreases because of the aging (degradation) of the flow paths. Two conclusions follow: (i) More recorded mental images should be from youth and (ii) The ‘speed’ of time perceived by the human mind should increase over life.
So, essentially, because we perceive fewer changes in images as we get older, time seems to flow more quickly.
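Bejan's scaling argument can be compressed into a back-of-the-envelope sketch. This is my paraphrase, not his derivation verbatim, though it uses the symbols from the passage quoted above (t1, L, V):

```latex
% Interval between successive mental images: grows as the path length L
% increases and the signal speed V degrades with age.
t_1 \sim \frac{L}{V}
% The perceived "frame rate" of experience is the reciprocal:
f = \frac{1}{t_1} \sim \frac{V}{L}
% With age, L increases and V decreases, so f falls: fewer perceived
% changes fit into one clock hour, and the hour feels shorter.
```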
Neuroscientist David Eagleman makes a similar argument with a different rationale. Eagleman builds on experiments where people are shown images of cats, each image for 0.5 seconds. Experiment participants are presented with an image of the same cat multiple times, and then presented with an image of a different cat: results show that participants feel like the new cat is on the screen for a longer period of time (even though all images are presented for only 0.5 seconds). Eagleman’s conclusion is that “when the brain sees something that’s novel, it has to burn more energy to represent it, because it wasn’t expecting it. The feeling that things are going in slow motion is a trick of memory.” As children, he continues, we are constantly bombarded by novelty as we work to figure out the rules of the world and, importantly, write down a lot of memory. By contrast, in old age we sink into routines and habits, and no longer need to sample as much from the world to navigate it. Because we perceive less, we remember less, and looking back on the past year it seems to have flown by. Note that the mechanisms accounting for time speeding up differ slightly from those provided by Bejan: Eagleman invokes expectation and memory, our brain’s models of the world, as the core explanation for why we seem to be taking in and recording less data from the environment we inhabit. Similar between the two, however, is that these mechanisms operate beyond the horizon of our own conscious perception: they happen without our knowing it or being able to describe the experience.
When I asked people in their 70s and 80s how time feels as they age, they all reported it seems to go by faster, but focused their explanations on higher-level relative experience.
Men tended to focus on the relative length of one day’s existence vis-à-vis the total amount of time they’d lived or the total amount of time they presumed they had left to live. So, if you’ve only lived 365 days, each day is 1/365th of your total existence; if you’ve lived 27,375 days (75 years old), each day is 1/27,375th of your total existence. As a much smaller percentage of your total lived experience, the day will feel like it passes more quickly. The flip side is the sense that there are fewer years left to live, as well as the increasing awareness of the inevitability and proximity of death. Here, each day feels more important, as there are only so many more to live. The subjective experience of time feels faster because it is more precious.
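The arithmetic behind this proportional account (sometimes called the ratio theory of subjective time) is easy to sketch. The function below is purely illustrative, with my own naming and the same 365-day simplification used above:

```python
def day_as_fraction_of_life(age_years: int) -> float:
    """One day as a fraction of all days lived so far.

    Toy model: ignores leap years and treats every day as equal.
    """
    return 1 / (age_years * 365)

# At 1 year old, a day is 1/365 of lived experience; at 75, it is
# 1/27,375 of lived experience -- proportionally 75 times smaller.
shrinkage = day_as_fraction_of_life(1) / day_as_fraction_of_life(75)  # ≈ 75
```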
Women tended to focus on the perception of how quickly younger people (i.e., grandchildren) in their lives change. We grow and learn quickly as children (and sometimes as adults), but inhabiting our own consciousness, we aren’t (often or always) observing ourselves in time-lapse. The changes occur quickly but nonetheless gradually and continuously, and absent epiphany we don’t factor our own change into our perception of time. But many grandparents, especially in North America, don’t see their grandchildren on a daily basis. They can observe massive week-by-week, month-by-month, or year-by-year changes, which occur much faster than the sameness they perceive in their own minds, bodies, and existences. It was interesting to me that women focused more on their experience of others, that their very notion of time was relative to how they experience others. I don’t want to claim gender essentialism, and am sure there are men out there who would also focus their sense of time on changes they see in others. But this was what I found in my small interview sample.
How Can We Slow Down Time?
As mentioned above, I’m going to take for granted that we’d want to slow down time to live a richer and more meaningful life (rather than wanting time to speed up as we age just to get things over with). Here are a few ideas on how to make that happen:
Introduce novelty into daily life – In the video above, Eagleman references activities as simple as brushing our teeth with the opposite hand or wearing our watch on the opposite wrist. These are small actions we can take to break mundane habits, actions that nonetheless force the brain to do more work than it normally does. An extreme example would be learning to ride a bike whose handlebars steer the wheel in the opposite direction from the one we learned originally.
Travel – There’s a banal account of travel that would focus on seeing a new culture and exploring an environment different from the one we see every day. Another take on novelty. But I think the impact of travel can be much more profound. First, it’s an opportunity to consciously activate more of our senses than we pay attention to in our daily lives. I’m against the idea that viewing an image of a foreign place or experiencing it through virtual reality is enough to satisfy our craving to experience that new place. This betokens a focus on sight as the primary sense for knowledge, forgetting the importance of sound, smell, touch, and taste. When I visited Bangalore and Madurai, India, I was constantly amazed by the extreme juxtaposition of smells on the streets, where one second’s sensation of exhaust and filth would be followed the next second by whiffs of jasmine from the woven necklaces at a small street stand. These smells oriented my sense of space, oriented what I saw around me, and shaped my memories of the experience. Second, some of the most profound memories I have of time dilated almost to a standstill have been while traveling alone. Living outside any social connections and pressures, abstracted from my past as from my future, I could focus on the gait of passersby, on the tiredness I sensed in my legs on day three after walking around cities or mountains to take in sights, on sensing aloneness without feeling the pangs of loneliness that one feels ensconced in a social context. Everything is heightened, even if I wish I had the opportunity to share what I’m experiencing with others. This isn’t just about novelty; it’s about providing a context to practice sensing more from the surrounding environment, without the myopia of driving an outcome or goal.
Do one thing at a time – This is living mindfully. Not just meditating 20 minutes per day, but doing daily activities in a mindful way. One of the hardest is to eat without doing anything else. I don’t mean focusing on the social bond created by sharing a meal with others. I mean eating by oneself without reading something or writing an email or watching TV or doing something else while eating. Just focusing on taste, texture, how long the food stays in the mouth before swallowing, what the plating looks like, what the colors look like, how it tastes to combine two things together (or whether it’s cleaner to keep them separate), what the temperature is like on tongue or lips or cheeks or glasses, what a utensil looks like as it interacts with food (like a spoon going in and out of soup), how the body’s sensations change during the meal. Eating is a good starting point that could serve as a practice ground for many other simple activities in life: walking, reading (without getting distracted by the internet), nursing a baby, caring for a sick person, sitting and breathing.
Enliven the present through analogy and memory – I recently read The Art of Living, which Buddhist monk Thich Nhat Hanh wrote at the age of 91. In one passage, Nhat Hanh shows how “ten minutes is a lot or a little [depending] on how we live them.” He goes on to describe how, when preparing for a talk, he “opened the faucet a little so that only a few drops came out, one by one.” He then imagined these icy drops as melted snow falling in his hand, which transported him back to memories of the Himalayan mountains he’d experienced in his youth, far away now that he was in a hut in a monastery in France. He then abided in the metaphor, seeing dew on the grasses he passed outside as he walked to his talk as more drops of Himalayan snow, seeing the water on his face and in his body as connected to this Himalayan snow. His account is interesting because it skirts how we normally think about mindfulness. This wasn’t about focusing attention to observe what’s there, but about hopping from one analogical connection to another to bring out unity in experience. Key, of course, was that he wasn’t focused on what came next, wasn’t anxious about what others would think about him when he gave his upcoming talk. He dilated a few drops of water into a grand theory of interconnectedness, traveling through the vehicle of his own associations and memories. As we age, we carry with us this lapsed time, these lapsed experiences. It may be that it’s how we relate to our own past that is the secret for how we expand the meaning of our present.
This post was primarily about sharing thoughts I’ve had over the past few weeks. The real significance is to try to recover the habit of writing regularly, a habit which has dwindled over the past year. One must start somewhere. I’ve always found that a regular writing practice expands what I perceive around me, as I feel motivated to capture as much as possible as material for what I’ll write. Perhaps that’s the dual significance of this post.
The featured image was taken at sunset at Grenadier Pond in High Park in Toronto in December 2019. Mihnea and I were on an evening walk. The sun grew as it capped the horizon, pitching grass tufts into relief. A few people walked by with dogs; a woman with a thick French accent insisted we walk further south to catch this beauty. Time slowed as we focused on the changing hue of the grass tufts, which became darker at their center and lighter around the edges.
Most writing defending the value of the humanities in a world increasingly dominated by STEM focuses on what humanists (should) know. If Mark Zuckerberg had read Mill and De Tocqueville, suggests John Naughton, he would have foreseen political misuse of his social media platform. If machine learning scientists were familiar with human rights law, we wouldn’t be so mired in confusion on how to conceptualize the bias and privacy pitfalls that sully statistical models. If greedy CEOs had read Dickens, they would cultivate empathy, the skill we all need to “put ourselves in someone else’s shoes and see the world through the eyes of those who are different from us.”
I agree with Paul Musgrave that “arguments that the humanities will save STEM from itself are untenably thin.” Reading Aristotle’s Nicomachean Ethics won’t actually make anyone ethical. Reading literature may cultivate empathy, but not nearly enough to face complex workplace emotions and politics without struggle. And given how expensive a university education has become, it’s hard to make the case for art for art’s sake when only the extremely elite have the luxury not to build marketable skills.
But what if training in the humanities actually does build skills valuable for a STEM economy? What if we’ve been making the wrong arguments, working too hard to make the case for what humanists know and not hard enough to make the case for how humanists think and behave? Perhaps the questions could be: what habits of mind do students cultivate in the humanities classroom and are those habits of mind valuable in the workplace?
I wrote about the value of the humanities in the STEM economy in early 2017. Since that time, I’ve advanced in my career from being an “individual contributor” (my first role was as a marketing content specialist for a legal software company) to leading teams responsible for getting things done. As my responsibilities have grown, I’ve come to appreciate how valuable the ways of speaking, writing, reading, and relating to others I learned during my humanities PhD are to the workplace. As a mentor Geoffrey Moore once put it, it’s the verbs that transfer, not the nouns. I don’t apply my knowledge of Isaac Newton and Gottfried Leibniz at work. I do apply the various critical reading and epistemological skills I honed as a humanities student, which have helped me quickly grow into a leader who can weave the communication fabric required to enable teams to collaborate to do meaningful work.
This post describes a few of these practical skills, emphasizing how they were cultivated through deep work in the humanities and are therefore not easily replicated by analogous training in business administration or communication.
David Foster Wallace’s 2005 Kenyon College Commencement speech shows how powerful arguments in favor of a liberal arts education can be when they need not justify material payoff.
Socratic Dialogue and Facilitating Team Discussion
Humanities courses are taught differently from math and science courses. Precisely because there is no one right answer, most humanities courses use the Socratic Method. The teacher poses questions to guide student dialogue around a specific topic, helping students question their assumptions and leave with a deeper understanding of a text than they had when they came to the seminar. It’s hard to do this well. Students get off topic or cite arguments or examples that aren’t common knowledge. Some hog the conversation and others are shy. Some teachers aren’t truly open to dialogue, and pretend to discuss when they’re really just leading students to accept their interpretation.
At Stanford, where I did my graduate degree, History professor Keith Baker stands out as the king of Socratic dialogue. Keith always reread required reading before class and opened discussion with one pregnant question, dense with ways the discussion might unfold. Teaching D’Alembert and Diderot’s preface to the French Encyclopédie, for example, he started by asking us to explain the difference between an encyclopedia and a dictionary. The fact this feels like common sense is what made the question so pointed, and, for the French Enlightenment authors, the distinction between the two revealed much about the purpose of their massive work. The question forced us to step outside our contemporary assumptions and pay attention to what the same words meant in a different historical context. Whenever the discussion got off track, Keith gracefully posed a new question to bring things back on point without offending a student, fostering a space of intellectual safety while maintaining rigor.
The habits of mind and dialogue trained in a Socratic seminar are directly applicable to product management, which largely consists in facilitating structured discussions between team members who see a problem differently. In my work leading machine learning product teams, I frequently facilitate discussions with scientists, software developers, and business subject matter experts. Each thinks differently about what should be done and how long it will take. Researchers are driven by novelty and discovery, by developing an algorithm that pushes the boundary of what has been possible but which may not work given constraints from data and the randomness in statistical distributions. Engineers want to find the right solution to meet the constraints of a problem. They don’t mind if things change, but they need enough clarity and stability to build. The business team represents the customer, focusing on success metrics and what will please or hook a user. The product manager sits between all these inputs and desires, and must take into account all the different points of view, making sure everyone is heard and respected, but getting the team to align on the next action.
Socratic methods are useful in this situation. People don’t want to be told what to do; they want to be part of a collective decision process where, as a team, they have each put forth and understood compromises and trade-offs, and collectively decided to go forward with a particular approach. A great product manager starts a discussion the same way Keith Baker would, by providing a structure to guide thinking and posing the critical question to help a group make a decision. The product manager pays attention to what everyone says, watches body language and emotional cues to capture team dynamics. She nudges the dialogue back on track when teams digress without alienating anyone and builds an ethos of collective and autonomous decision making so the team can progress forward. She applies the habits of mind and dialogue practiced over years in a humanities classroom.
Philology and Writing Emails for Many Audiences
My first year in graduate school, I took a course called Epic and Empire, which traced the development of the Western European epic literary tradition from Homer’s Iliad to Milton’s Paradise Lost. The first thing we analyzed when starting a new text was how its opening lines compared and contrasted with those we’d read before. Indeed, epics start with a trope called the invocation of the muse, where the poet, like a journalist writing a lede, informs the reader of the poem’s subject using a humble-boasting move that asks the gods to imbue him with knowledge and inspiration.
So Homer in the Iliad:
Sing, Goddess, sing the rage of Achilles, son of Peleus— that murderous anger which condemned Achaeans…
And Vergil signaling that the Aeneid is Rome’s answer to the Iliad, but that an author as talented as Vergil need not depend on the support from the gods:
Arms and the man I sing, who first made way, predestined exile, from the Trojan shore to Italy, the blest Lavinian strand.
And Ariosto, an Italian author, coyly signaling that it’s time women had their chance at being the heroines of epics (the Italian starts with Le Donne):
Of loves and ladies, knights and arms, I sing,
Of courtesies, and many a daring feat;
In the same strain of Roland will I tell
Things unattempted yet in prose or rhyme…
And finally Milton, who, writing in the wake of Cromwell’s challenge to the English monarchy, signals his aim to critique contemporary politics and society by deftly merging the Judaic and Greek literary traditions:
OF Mans First Disobedience, and the Fruit
Of that Forbidden Tree, whose mortal tast
Brought Death into the World, and all our woe,
With loss of Eden, till one greater Man
Restore us, and regain the blissful Seat,
Sing Heav’nly Muse…
Studying literature this way, one gains not just knowledge of historical texts, but also the techniques authors use to respond to those who came before them. Students learn how to tease out an extra layer of meaning above and beyond what’s written. The first layer of meaning in the first lines of Paradise Lost is simply what the words refer to: this is a story about Adam and Eve eating the forbidden fruit. But the philologist sees much more: Milton holds the direct invocation of the muse until line six so he can foreground a succinct encapsulation of the being in time of all Christians, waiting from the time of the fall until the coming of Christ; does that mean he wanted to signal that first and foremost this is a Christian story, with the Greek tradition, signaled by the reference to Homer, only arriving five lines later?
Reading between the lines like this is valuable for executive communications, in particular in the age of email, where something written for one audience is so easily forwarded to others without our intending or knowing. Business communications don’t just articulate propositions about states of affairs; they present facts and findings to persuade someone to do something (to commit resources to a project, to spend money on something, to hire or fire someone, to alter the way they work, or simply to recognize that everything is on track and no worry is required at this time). Every communication requires sensitivity to the reader’s presumed state of mind and knowledge, reconstructing what we think they know or could know to ensure the framing of the new communication lands. Each communication should build on the last, not using the stylistic invocation of the muse like Homer, but presenting what’s said next as a step in a narrative in time. And at any one moment, different people in different roles interpret communications differently, based on their particular point of view, but more importantly their particular sensitivities, ambitions, and potential to be threatened or impacted by something you say. Executives have to think about this in advance, writing things as if they were to be shared far beyond the intended recipient, each reader bringing his or her own point of view and stakes in the situation. Philological training in classes like Epic and Empire is good preparation for the multi-vocal aspects of written communications.
Making Sense of Another’s World: The Practice of Analytical Empathy
In 2013, I gave a talk about why my graduate work in intellectual history formed skills I would later need to become a great product marketer. As in this post, my argument in that talk focused not on what I knew about the past, but how I thought about the past: as an intellectual historian focused on René Descartes’ impact on 17th-century French culture, I sought to reconstruct what Descartes thought he was thinking, not whether Descartes’ arguments were right or wrong and should continue to be relevant today or relegated to the dustbin of history (as a philosopher would approach Descartes).
Doing this well entails getting outside the inheritance of 400 years of interpretation that shapes how we read something like Descartes’ famous Cogito, ergo sum, I think, therefore I am. Most philosophers get accustomed to seeing Descartes show up as a strawman for all sorts of arguments, and consider his substance dualism (i.e., that mind and body are totally separate kinds of matter) junk in the wake of improved understanding about the still mysterious emergence of mind from matter. They solidify an impression of what they think he’s saying as seen from the perspective of the work philosophy and cognitive science seek to do today. As an intellectual historian, I sought to suspend all temptation to read contemporary assumptions into Descartes, and to do what I could to reconstruct what he was thinking when he wrote the famous Cogito. I read about his Jesuit education and read Ignatius of Loyola’s Spiritual Exercises to better understand the genre of early-modern meditations, I read the texts by Aristotle he was responding to, I read seminal math texts from the Greeks through the 16th-century Italians to understand the state of mathematics at the time he wrote the Géometrie, and I read not only his Meditations, but also all the surrounding responses and correspondence to better understand the work he was trying to accomplish in his short and theatrical philosophical prose. And after doing all this work, I concluded that we’ve misunderstood Descartes, and that his famous Cogito isn’t a proposition about the prominence of mind over body, but rather a meditative mantra philosophers should use to train their minds to think “clear and distinct” thoughts, the axiomatic backbones for the method he wanted to propose to ground the new science. And Descartes was aware we are all prone to fall into old habits, that we had to practice mantras every day to train the mind to take on new habits and complete a program of self-transformation.
I didn’t care if he was right or wrong; I cared to persuade readers of my dissertation that this was the work Descartes thought the Cogito was doing.
A talk I gave in 2012 about why my training in intellectual history helped me become a good product marketer, a role that requires analytical empathy.
This skill, the skill of suspending one’s own assumptions about what others think, of not approaching another’s worldview to evaluate whether it’s right or wrong, but of working to make sense of how another lives and feels in the world, is critical for product management, product marketing, and sales. Product has migrated from being an analytical discipline focused on triaging what feature to build next to maximize market share to being an ethnographic discipline focused on what Christian Madsbjerg calls analytical empathy (Sensemaking), a “process of understanding supported by theory, frameworks, and an engagement with the humanities.” This kind of empathy isn’t just noticing that something might be off with another person and searching to feel what that other person likely feels. It’s the hard work of coming to see the world the way another sees it, patiently mapping the workflows and functional significance and emotions and daily habits of a person who encounters a product or service. When trying to decide what feature to build next in a software product, an excellent product manager doesn’t structure interviews with users by posing questions about the utility of different features. They focus on what the users do, seek to become them, just for one day, watch what they touch, where they move cursors on screens, watch how the muscles around their eyes tighten when they get frustrated with a button that’s not working or when they receive a stern email from a superior. They work to suspend their assumptions about what the user wants or needs and to be open to experiencing a whole different point of view. Similarly, an excellent salesperson comes to know what makes their buyers tick, what personal ambitions they have above and beyond their professional duties. They build business cases that reconstruct the buyers’ world and convincingly show how that world would differ after the introduction of the seller’s product or service.
They don’t showcase bells and whistles; they explain the functional significance of bells and whistles within the world of the buyer. They make it make sense through analytical empathy.
Business school, in particular as its curriculum exists today, isn’t the place to practice analytical empathy. And humanities courses that are diluted to hone supposedly transferable skills aren’t either. The humanities, practiced with rigor and fueled by the native curiosity of a student seeking deeply to understand an author they care about, are an avenue to build the hermeneutic skills that make product organizations thrive.
Narrative Detail in Feedback and Coaching
It’s table stakes that narrative helps get early funding and sales at a startup, in particular for founders who lack product specifics and have nothing to sell but an idea (and their charisma, network, and reputation). But the pitch deck genre is so formulaic that humanities training may be a hindrance, not an asset, to succeeding at creating them. Indeed, anyone versed in narratology (the study of narrative structure) can easily see how rigid the pitch deck genre is, and anyone with creative impulses will struggle to play by the rules.
I first understood this while attending a TechStars FinTech startup showcase in 2016. A group of founders came on stage one by one and gave pithy pitches to showcase what they were working on. By pitch three, it was clear that every founder was coached to use the exact same narrative recipe: describe the problem the company will address; imagine a future state changed with the product; sketch the business model and scope out the total addressable market; marshal biographical details to prove why the business has the right team; differentiate from competitors; close with an ask to prospective investors. By pitch nine, I had trouble concentrating on the content beyond the form. It reminded me of Vladimir Propp’s Morphology of the Folktale.
This doesn’t mean that storytelling isn’t part of technology startup lore. It is. But the expectations of how those stories are told are often so constrained that rigorous humanities training isn’t that helpful (and it’s downright soul-destroying to feel forced to adopt the senseless jargon of most tech marketing). In my experience, narrative has been more poignant and powerful in a different area of my organizational life: coaching fellow employees through difficult interpersonal situations or life decisions.
A first example is the act of giving feedback to a colleague. There are many different takes on the art of making feedback constructive and impactful, but the style that resonates most with me is to still all impulses towards abstraction (“Sally is such a control freak!”) and focus on the details of a particular action in a particular moment (“In yesterday’s standup, Sally interrupted Joe while he was reviewing his daily priorities to say he should do his task differently than planned.”). As I described in a former post, what sticks with me most from my freshman year Art History 101 seminar was learning how to overcome the impulse towards interpretation and focus on observing plain details. When viewing a Rembrandt painting, everyone defaulted to symbolic interpretation. And it took work to train our vision and language to articulate that we saw white ruffled shirts and different levels of frizziness in curly hair and tatters on the edges of red tablecloths and light emanating from one side of the painting. It’s this level of detailed perception that is required to provide constructive feedback, feedback specific enough to enable someone to isolate a behavior, recognize it if it comes up again, and intentionally change it. When stripped of the overtones of judgment (“control freak!”) and focused on the impact a behavior has had on others (“after you said that, Joe was withdrawn throughout the rest of the meeting”), feedback is a gift. Now, no training in art history or literature prepares one to brave the emotional awkwardness of providing negative feedback to a colleague. I think that only comes through practice. But the mindset of getting underneath abstraction to focus on the details is certainly a habit of mind cultivated in humanities courses.
A second example is in relating something from one’s own experience to help another better understand their own situation. Not a day goes by when a colleague doesn’t feel frustrated at having to do a task they feel is beneath them, anxious about the disarray of a situation they’ve inherited, confused about whether to stay in a role or take a new job offer, resentful towards a colleague for something they’ve said or done, etc… When someone reaches out to me for advice, I still impulses to tell them what to do and instead scan my past for a meaningful analogue, either in my own experience or someone else’s, and tell a story. And here narrative helps. Not to craft a fiction that manipulates the other to reach the outcome I want him or her to reach, but to provide the right framing and the right amount of detail to make the story resonate, to provide the other with something they can turn back to as they reflect. Wisdom that transcends the moment but can only be transmitted in full through anecdote rather than aphorism.
I’ll close with an example of an executive speech act that humanities education does not help prepare for. A constructive and motivating company-wide speech is, at least in my experience, the hardest task executives face. Giving an excellent public speech to 2000 people is a cakewalk in contrast to giving a great speech about a company matter to 100 colleagues. The difficulty lies in the kind of work a company speech does.
The work of a public speech is to teach something to an audience. A speaker wants to be relevant, wants to know what their audience knows and doesn’t know, reads and doesn’t read, to adapt content to their expectations and degree of understanding. Wants to vary the pace and pitch in the same way an orchestra would vary dynamics and phrasing in a performance. Wants to control movement and syncopate images and short phrases on a slide with the spoken word to maximally capture the audience’s attention. There are a lot of similarities between giving a great university lecture and giving a great talk. This doesn’t mean training in the humanities prepares one for public speaking. On the contrary, most humanists read something they’ve written in advance, forcing the listener to follow long, convoluted sentences. Training in the humanities would be much more beneficial for future industry professionals if the format of conference talks were a little more, well, human.
The work of a company speech is to share a decision or a plan that impacts the daily lives and sense of identity of individuals who share the trait that, at this time, they work in a particular organization. It’s not about teaching; the goal is not to get them to leave knowing something they didn’t know before. The goal is to help them clearly understand how what is said impacts what they do, how they work, how they relate to this collective they are currently part of, and, hopefully, to help them feel inspired by what they are asked to accomplish. Unnecessary tangents confuse rather than delight, as the audience expects every detail to be relevant and cogent. Humor helps, but it must be tactfully deployed. It helps to speak with awareness of different individuals’ predispositions and fears: “If I say this this way, Sally will be reminded of our recent conversation about her product, but if I say it that way, Joe will freak out because of his particular concern.” People join and leave companies all the time, and a leader has to still impulses towards originality to make sure newcomers hear what others have heard many times before without boring people who’ve been in the company for a while. The speech that resonates best is often extremely descriptive and leaves no room for assumption or ambiguity: one has to explicitly communicate assumptions or rationale that would feel cumbersome in most other settings, almost the way parents describe every next movement or intention to young children. At the essence of a successful company talk is awareness of what everyone else could be thinking, about the company, about themselves, and about the speaker, as one speaks. It’s a funhouse of epistemological networks, of judgment reflected in furrowed brows and groups silently leaving for coffee to complain about decisions just after they’ve been shared. It’s really hard, and I’m not sure how to train for it outside of learning through mistakes.
What This Means for Humanities Training
This post presented a few examples of how habits of mind I developed in the humanities classroom helped me in common tasks in industry. The purpose of the post is to reframe arguments defending the value of the humanities from knowledge humanists gain to the ways of being humanists practice. Without presenting detailed statistics to make the case, I’ll close by mentioning Christian Madsbjerg’s claim in Sensemaking that humanities students may be best positioned for non-linear growth in business careers. They start off making significantly lower salaries than STEM counterparts, but disproportionately go on to make much higher salaries and occupy more significant leadership positions in organizations in the long run. I believe this stems from the habits of mind and behavior cultivated in a rigorous humanities education, and that we shouldn’t dilute it by making it more applicable to business topics and genres, but focus on articulating just how valuable these skills can be.
The featured image is of the statue of David Hume in Edinburgh. His toe is shiny because of tourist lore that touching it provides good fortune and wisdom, a superstition Hume himself would have likely abhorred. I used this image as the nice low-angle shot made it feel like a foreboding allegory for the value of the humanities.
This is the fourth post in an indefinite series on love. Here are post 1, post 2, and post 3. Someday the love series will coalesce into a book.
A master in the dark arts of anxiety with a mystic’s appreciation for the beauty latent in our day-to-day, I’ve listened to countless podcasts about meditation. My favorites, like this Tim Ferriss podcast with Jack Kornfield or this Krista Tippett podcast with Carlo Rovelli, brought tears to my eyes as I listened to them while walking to work. I’d take a moment to recompose before entering the office, hang up my coat, and walk–more like jog–directly to Charu Jaiswal and Katharine Marek, two former colleagues, gushing with enthusiasm about what I’d heard and watching them react with a combination of surprise at being accosted, curiosity about the content, and joy at the purity of my intent. Through all these hours of listening, however, I never heard anyone speaking about meditating with another. I don’t mean next to someone, but entangled with someone, touching him or her, looking into his or her eyes, meditating together to feel, think, and breathe as one. Love as meditation, or meditation as love.
I bet it would strike most as horrendously awkward and invasive to meditate looking directly into another’s eyes, even if they were the eyes of a lover or spouse. Context alters what feels natural: gazing into someone’s eyes is romantic on a first date, bonding after sex (when authentic…), and cherished during a proposal. But pausing our lives to sit down and look at one another without speaking just feels weird. It’s certainly not something we’d practice at a yoga class or meditation retreat. Sitting for five minutes in silence with strangers at a respectable distance is hard enough; having strangers sit on top of us and look into our eyes would annihilate any sliver of the inner peace and focus meditation is designed to promote.
But it’s deeper than that. The way of being we cultivate during meditation is often a being disentangled from the minds and emotions of others. The quest in the west is to use meditation to appease our anxiety, depression, and fear, to chip away at crusty carapaces of self-hatred we’ve built over the years. It’s a hermetic place guarded in ritual, a precious state of mind that is relief from the turbulence and distraction we return to the minute we turn off our app and check email or Facebook. Many meditators find it difficult to bring the inner stillness they find on the mat to the office chair or the grocery store check-out line. The world impinges on us. Others impinge on us, entangling their histories, their emotions, their states with ours. And the spell is broken.
This isn’t to say that this is all meditation should or could be about. On the contrary, practices like metta (loving kindness) are geared towards focusing on others, wishing well-being, safety, and health to loved ones and strangers alike. The enlightened are beyond the petty, world-forming powers of anxiety, envy, and ambition, able to embrace all with equanimity and love divorced from the pain of projected possibility and power. In Buddhism, as far as I understand it, each person’s meditation promotes all beings’ ability to liberate ourselves from samsara, the purgatorial repetition of earthly existence characterized by suffering. So there should and could be value in practicing being fully connected with another, finding the same inner silence we encounter sitting by ourselves while our minds and emotions are entangled with another’s. Making space for emotions and thoughts we can’t control or directly observe and all the while experiencing deliverance.
My fiancé Mihnea and I recently started an almost daily practice of meditating with one another. On one another. In one another. We didn’t start on a quest for entangled enlightenment. We started because I wanted to meditate and we wanted to be together and it didn’t feel right to meditate next to one another, apart and independent, because we don’t love that way. It’s not our way of being. Ours is a love of oneness and interdependence, so it was more natural for us to meditate on one another than next to one another. So began a practice that yielded challenging and graceful experiences neither of us expected.
Here’s a sample. Mihnea has told me that he recognizes his own experiences in these descriptions.
How We Sit
Both Mihnea and I are prolific creators. We create to find stillness. We write, learn, build teams, build futures, ground our constant creativity in disciplined habits and rituals. As creators, we both have the instinct to experiment with meditation techniques and poses, to make meditation itself an instance of creative power. And yet, we (almost) always meditate in the same position: he sits on a couch or chair and I straddle him with my knees bent. Our perspectives are different: I tilt my head down and he tilts his up, when he bends his neck I see the crown of his head and when I bend mine he sees the tip of my forehead, I look down into his eyes and he looks up into mine.
My hunch is that we keep returning to this same position because it foregrounds the ability to look directly into one another’s eyes (or perhaps not, perhaps it’s all waiting and preparation, twenty minutes spent building magnetic potential like rising bread that finds release in an embrace that is deep, pure, reaching towards essence, in the moment when I collapse into his chest like a child and he softens his hands to stroke my back and kiss my hair, a moment bursting with waiting, watching, noticing what wonders emerge when the world stretches flat from the density of a concentrated gaze). During a recent sit, Mihnea’s gaze hit me like a solid beam, unearthing a latent memory of the early Italian Renaissance Neoplatonic philosopher Marsilio Ficino, who described the eyes as a conduit to exchange blood and spirits, capable of beaming soul-rays into another. Here’s his commentary on Plato’s Phaedrus in De Amore:
Put before your eyes, I beg of you, Phaedrus the Myrrhinusian, and that Theban who was seized by love of him, Lysias the orator. Lysias gapes at the face of Phaedrus. Phaedrus aims into the eyes of Lysias sparks of his own eyes, and along with those sparks transmits also a spirit. The ray of Phaedrus is easily joined to the ray of Lysias, and spirit easily joined to spirit. This vapor produced by the heart of Phaedrus immediately seeks the heart of Lysias, through the hardness of which it is condensed and turns back into the blood of Phaedrus as before, so that now the blood of Phaedrus, amazing though it seems, is in the heart of Lysias. Hence each immediately breaks out into shouting: Lysias to Phaedrus: “O, my heart, Phaedrus, dearest viscera.” Phaedrus to Lysias: “O, my spirit, my blood, Lysias.”
Humoring Renaissance Humorism and Christian-mystic-transubstantiation, there is something intense and powerful about our eyes becoming vessels to exchange blood and bile. While we meditate, it’s clear that our connection is centered through our eyes. The many other channels of communication and connection–my hands on his stomach breathing his breath, my bottom sensing pulses and twitches in his quadriceps, the heat from his body creating a temperature differential between my chest (facing him) and back (facing away)–are present but dim in contrast to the encompassing power of Mihnea’s gaze, the window to the inside, the locus that dominates my awareness and makes the rest feel like static in the background, surprising me when it comes to the fore. Given this connection, we often meditate with our eyes open; indeed, it feels slightly awkward for me when I close my eyes, as if I’m cutting off our connection and imparting distance. I’m learning to see with my eyes closed, to be ok if his eyes remain open and watching even when my eyes are closed. To feel blood seep through eyelid gates and pump his heart with mine.
My knees inevitably get so sore I have to lie flat and stretch them after we sit. Once we inverted our positions and he sat on me; he wasn’t heavy, but the fear that he might be tightened his leg muscles. I tried to relax him. Someday I’d like us to stand holding hands, stand with our hands down near our haunches but with some part touching one another, lie on our sides facing one another, lie as if we were two dead people in coffins with my back touching his front, sit with our backs facing one another, etc. There’s no rush.
Breathing in Syncopation
One of the first things a new meditator learns is how to focus on the breath. Breathing is a marvelous anchor because we all easily recognize it as an eminently noticeable act that almost always goes unnoticed as we think about or do something else. The psychologist William James went so far as to claim that consciousness is nothing but breath, and that what we mistake for consciousness is a fictitious thing philosophers made up to name the thing that knows its thinking:
I am as confident as I am of anything that, in myself, the stream of thinking (which I recognize emphatically as a phenomenon) is only a careless name for what, when scrutinized, reveals itself to consist chiefly of the stream of my breathing. The ‘I think’ which Kant said must be able to accompany all my objects, is the ‘I breath’ which actually does accompany them. There are other internal facts besides breathing (intracephalic muscular adjustments, etc., of which I have said a word in my larger Psychology), and these increase the assets of ‘consciousness,’ so far as the latter is subject to immediate perception; but breath, which was ever the original of ‘spirit,’ breath moving outwards, between the glottis and the nostrils, is, I am persuaded, the essence out of which philosophers have constructed the entity known to them as consciousness. That entity is fictitious, while thoughts in the concrete are fully real. But thoughts in the concrete are made of the same stuff as things are. (Italics original)
There are different breathing techniques and different techniques for attending to breath. I like to start a meditation session with controlled, hyper-dilated breathing: 60-second inhale-exhale cycles interspersed with rapid cycles if I get short of breath. Five breaths (minutes) in, I notice that my brain feels different. It’s hard to describe, but it’s as if the center of my brain’s activity shifts from the forehead to the back of the skull, as if a spidermonculus had emerged from hibernation in my axons to canvas my neural pathways in delicate, shimmying webs. Eventually I stop controlling the pace and observe myself breathing naturally. While keeping partial attention on the breath, I sometimes expand awareness to scan the sensations in one localized body part, like my right pinky toe; this lopsided focus is most fun when it induces pinky toe hallucinations, stretching my torso and face into oblong pizza dough like a reflection in a funhouse mirror. Dogmatic mindfulness meditators insist that one shouldn’t control the breath actively, that the practice is about noticing what the mind and body are doing and stilling the instinct to control. The breath, in mindfulness, is a home base to return to when thoughts do what thoughts do and plan and worry and wonder and plan and worry and wonder and criticize and compare and plan and worry and criticize and evaluate and self-hate and worry about planning and remember and ruminate and plan about worrying and ruminate about self-hate and remember about planning and plan about worrying ad infinitum. In the pranayama tradition, meditators actively restrict the breath using fingers or hold the breath at the top or bottom of a cycle. Given my proclivities for experimentation, I like to experiment with different techniques and observe what happens.
When Mihnea and I started meditating together, I began with my habitual practice of long, controlled inhales and exhales. But it didn’t work. My prefrontal cortex stayed engaged, comparing my breathing cycle with his. I observed myself as I imagined he observed me, projecting my meditative I into his gaze such that the I watching me breathe was no longer the meditative I, but a socialized I, an outside I seeing my face and skin, judging me within my projections of what another sees. This doesn’t mean this is what Mihnea sees, or even what I think Mihnea sees. It was rather that his presence activated my superego, activated a bifurcation of the observing I that includes the awareness that others are watching. The discomfort was exacerbated by frustration: I sought to replicate the experience I cherish when meditating alone and grew frustrated that I couldn’t, that the situation was different, that I didn’t have the same control.
So I pivoted. Focused on his breath instead of mine. It’s a different experience, a different way to cultivate inner stillness, but a way better suited to meditation with another. When I focus on my own breath, the act of attention is coupled with an act of will that controls the observed phenomenon: as I observe myself breathing at a pace I dictate, it’s as if my body were an extension of my attention. When I focus on his breath, the act of attention is decoupled from the observed phenomenon: sometimes Mihnea will imitate my breathing pace, but it normally lasts no more than 2-3 cycles. I notice the union and the difference, but the syncopation doesn’t bug me or create distance between us. It brings me closer to him, attunes me to him to the point where I stop noticing me and identify entirely with his breathing instead. It’s even deeper when I place my hands on his front body, palms facing down, one hand on his chest and one hand on his abdomen. My hands become like eyes, absorbing the heat and movements from his body as if they were x-rays observing his inner motions. When my hands breathe his breath, I close my eyes to amplify the sensation. It’s one of the few times when I feel more comfortable meditating with him with my eyes closed.
Mihnea recently built an application that synchronizes breathing in a group of people. Users put a belt around their chest to measure and join breath cadences. His research has shown that people who breathe in sync are more likely to register and remember the same thing: they become like one observer. I’ve certainly experienced communion like this in group meditation sessions, the experience strongest when the sound of my Om harmonizes with the resounding Oms of others (best when there are baritones and basses present). I think this makes the syncopation between Mihnea’s and my breath in our private meditation all the more interesting. We don’t breathe in sync. But it’s precisely the syncopation that draws me out of myself and onto him, to decouple attention from will and give myself to his being.
Knowing the Self Through Touch
Whereas I place my hands on Mihnea’s chest and torso to breathe his breath, he laces his fingers behind the small of my back to balance and support me. The awareness of me in his hands, of his hands on me, interestingly, is an outside-in way of perceiving the self, a meditation practice emphasizing self and mind as integrated with body (Yoga is similar, just aligning mind with body through movement rather than focusing on the self as body through touch). Note that the organic awareness of the boundaries of the self through touch is very different from the deleterious projections of how another would see the self as described above. Watching hands don’t judge; they feel their way to vision.
Finding self awareness in touch (rather than through a recursive loop in the mind) reminds me of responses some French Enlightenment philosophers had to Cartesian epistemology. Descartes emphasized the split between the res cogitans (thinking thing) and the res extensa (extended thing, or matter), claiming that we can build the world (and God) from a clear and distinct perception of the self because it’s impossible to say “I don’t exist” (who’s the I who says I don’t exist?). A corollary was that knowledge does not depend on sensory experience, that truth is pre-wired in the mind (explanation beyond the scope of this post). Enlightenment empiricists thought this was bollocks and worked to show how all knowledge starts with sensory experience. One of my favorite pieces in the tradition (referenced in a former post on consciousness) is Etienne Bonnot de Condillac’s Treatise on Sensations, which opens with a fable about a statue that comes to know herself by touching another, implying that the mind alone does not suffice to stratify the self. Here’s how I paraphrased Condillac on May 21, 2010:
Imagine a statue that can only smell. Waft a rose under its nose. To an observer, it will be a statue that smells a rose. But to itself, it will simply be the odor of rose, of carnation, of jasmine, of violet, according to the objects that stimulate it. The odors the statue smells will seem to it not as properties of an external object, but rather as its own manners of being. Now think that the statue can only hear. Again, when the wind blows the oak leaves and rustles the willows, it will be that rustle and when the rain pitters the roof above, it will pitter with the rain. Now let the statue only be able to taste and smell. Place on its tongue your honey and thyme, and it will be that honey and thyme. Place on its tongue your cream and your salt, and it will be your cream and salt. It will be a collection of manners of being.
Now let the statue touch. First let the statue touch its own hands, its own legs. Then, place a rock on the table in front of your statue and let it touch it. It will rapidly pull back its hand in fear! For the statue will know that its me, the me that feels modified in its hands, does not feel modified in its body when it touches the rock. It is, then, the sensation of touch by which the soul passes from itself outside itself. By touching the rock, your statue will awake to its existence as different from the rest of the external world.
Now, Pygmalion, let the statue touch your own hand. Wait out your statue’s initial fear. Gradually, she will recognize that you are like her, a form similar to her own. But she will recognize that you are more than her and will think that her existence might change places and pass entirely into this second half of herself. She will want to give you all her being; a vivid desire will return and take over her whole existence, as a new manner of being, as a new awareness of a self that is a complete surrender of self into another. She will feel the birth of a sixth sense. Let us call this sixth sense dependence, vulnerability, or love.
Mihnea’s hands, however, do more than anchor me in the present as a body in space. They open another dimension and transport me through time. He has stubby fingers, incommensurate with the grace of his being. But he’s an extremely skilled pianist who expresses the musicality of his being–and the musicality of the world channeled through his being–through touch, absorbing and reflecting my needs and emotionality. When he strokes my hair, he transports me back to my childhood: I am three again, five again, comforted at last by my parents’ touch after hours of fretful insomnia. His hands cradle my fear and remind me I am no longer alone, ease me to sleep after the storm. Time unfolds in our moment of meditation, collapsing my life into the sensation of his fingers on my back. It pulses, breathing like a seal asleep under beach sun.
Practicing Stillness When Entangled With Another’s Mind
A Mind Like Sky is one of my favorite Jack Kornfield meditations. It calls for expansive attention (rather than the focused and controlled attention described above) to cultivate a mind “vast like space, where experiences both pleasant and unpleasant can appear and disappear without conflict, struggle or harm.” (From the Majjhima Nikaya) Every once in a while when I practice like this, the boundaries between myself and other relax (it’s too strong to say disappear). I identify with the sky, with the vase of breadsticks sitting aslant to my left as I write, with the red spindly branches of the still leafless plant on our back porch. When the boundaries of the self expand to include and encompass everything, our ethical calculus changes. The golden rule stops making sense.
I have yet to feel this kind of universal identification when meditating with Mihnea. I trust it will come in time, but so far having my mind entangled with his has thwarted my ability to generalize my self awareness: his powerful and immediate presence grounds me in an us that is part of but not inclusive of everything. This is in part caused by the inviscid movements of our non-verbal communication. When Mihnea notices something that stimulates him, his eyes flicker and twitch with activity. When his eyes spark during meditation, I wonder what he’s thinking, what he noticed, get locked in his mind’s movement. Sometimes he opens his eyes so wide that the skin on his forehead folds like waves on a pond. His mouth opens slightly. He looks at me in utter surprise and I can’t help but laugh.
The openness and stillness our meditation cultivates is the kind needed to be good friends and colleagues. It’s practice being able to register the actions, words, and emotions of others in an encounter, rather than focusing on one’s own inner world, emotions, and thoughts. One of the limitations many meditators encounter is the difficulty transferring the same grounded bliss from the mat to the boardroom: the tendency is to default right back to our same old selves upon entering into the magnetic field of a given epistemological network and context. I’ve felt this frustration, and wondered if all the morning practice would ever amount to transformative change at work and in the rest of my life. There’s a good case to be made that meditating with someone else is better practice for the entangled consciousness we experience with others. It’s awkward at first, but it’s where the real work takes place. Almost like the initial resistance to taking an improv class that can go on to work wonders for one’s ability to act bravely and brazenly in other areas of life. I’d love to transform myself into a blank and open vessel, always open to others, able to see them for who they are, with strengths and weaknesses and beauty and blemishes, and to help them grow with equanimity. To register every last detail of every encounter and replicate the details in my mind. It’s a work in progress.
Seeing the Details
Meditating with Mihnea gives me time to study him with the minute gaze of an entomologist. I study the curve of his chin, the two little freckles near his right eye (left-looking to me as we face one another), the curve of the bottom of his earlobes into the side of his face (he has attached earlobes like Clint Eastwood), the distribution of grey and black beard hairs at different lengths depending on when he last shaved, the shape of his lips, the chappedness of his lips at any given time, the puncture of a single beard hair through the center of his bottom lip, stains on the side of his teeth, the odor of his breath (so often hinting tangerine oil), the smell of love in his veins and through his cartilage, the aura that emanates from behind his neck, like a halo on a Fra Angelico fresco, the furrow of his brow, the uneven distribution of his forehead pores, the even distribution of his skin melanin, the precision of his hair line depending on when he last got a haircut, the depth of the inlay into his spine in his lower back. It can go on to infinity, confirming the glorious skepticism of all that is there to be known.
I like to pay particular attention to the details of Mihnea’s eyes, and see different things every time. Their colors are enormously complex: he has hazel eyes with brown concentrated near the pupils (which sometimes dilate or constrict extremely rapidly after he returns his head back to look at me after bending his neck, only to encounter the shock of the bright lights above), followed by rays of green that eventually give way to violet lining around the iris. His sclera tend to be bloodshot in the evening, which is when we tend to sit on weekdays. His eyes are galaxies, Leibnizian monads whose lines narrate a universe’s worth of history. And every time we meditate, water collects at the bottom of Mihnea’s eyes. I always wonder if it heralds tears. Sometimes it does. When it does, and I ask him afterwards why he cried, he says it is to wash away my pain.
One time I examined his tears. We didn’t let them keep us from meditating. We stayed silent. The tears were slow to fall, and collected in concave meniscuses like water in a glass. They hung in suspension, stopping time in the dense event horizon of his hands laced behind my back. Finally they fell. They fell down his cheek and dampened the two freckles under his right eye. I wiped away the rivulets with the pads of my index fingers, reabsorbing the pain he felt for me as we sat.
Just different, this time.
 I immediately bought and devoured Rovelli’s The Order of Time after hearing him on the On Being podcast. The book inspired this post. I got my mom a copy for Christmas and learned that my dad loved it after speaking with him on a recent phone call (Mihnea and I got my dad multiple physics books for Christmas, but didn’t think to get him this one. My dad appreciates lyricism, however, so I’m not surprised he loved it too). The book’s lyrical style inspired me: it gave me license to incorporate my own emotionality into the book I thought I was going to write about machine learning. Naturally, as I work on the book, the subject matter has changed slightly. I’m still in this purgatorial space where it’s trying to figure out its identity and is currently a gangly teenager with braces experimenting with different genders. Anxiety doesn’t help much and my dear friend reason-simulating-excuses likes to do pull-ups near my right ear over my right shoulder, whispering that this is character building, reminding me that wrestling with the content is the only way to write something worth reading. She is a rascal.
 Perhaps my most meaningful experience at integrate.ai was my weekly meditation session with Katharine Marek. Our meditation club started off much larger. In the first session, a whole slew of us sat together in a glass-exposed room at the WeWork offices at Yonge and Bloor and did our best to concentrate and focus in the mid-afternoon as passersby buzzed by like worker bees. Participation fell like lemmings off a cliff. Shradha Mittal stayed with us for a few more mornings, but she only worked part-time so the habit wasn’t regular. It ended up being just me and Katharine. And it was marvelous. As it was just the two of us, we could begin each session sharing our worries, doubts, anxieties, emotions, thoughts. She shared with me and I shared with her. I’d teach her different techniques, mindfulness one day, metta the next, to expose her to various kinds of meditation and let her pick which one stuck. Ever suffering from misophonia (a byproduct of being highly sensitive and anxious), I’d do my best to ignore the sound of Yevgeniy Kissin tapping his spoon on a ceramic bowl as he ate a bowl of oatmeal like clockwork at the same time every morning (it didn’t help that I knew exactly when the clinking would start; Yev knows this and knows I love him). Katharine would smirk with joy at the oddity. Being with her reminded me of the course I taught at Stanford that had only one student, the magnificent Josefina Massot. We read Milton and Hobbes and Rousseau and Kleist like two Renaissance scholars sharing ideas. Josefina also had episodes of depression, and we supported one another through the quarter, teacher loving student loving teacher. I live for these experiences and vow to keep the details alive through frequent remembrance.
 I’m a fan of long podcast interviews. Tim Ferriss’s interview with Jack Kornfield, for example, is 03:02:03. I don’t feel any need to consume content within a window of time. To have a book end and have a thought be quick and compact like a T-shirt we can tidily fold and tuck away into a drawer. I listen to podcasts like I read books: I read, stop, bookmark the page, and pick up where I left off. The assumption that content should be easily digestible is patronizing. Just noticing while writing this that we use food metaphors for content: we consume and digest words. I may be the odd one out here, as I’ve heard from others who self-identify as productive, busy, important executives that they want their learning churned and scoped down like the tasks they jump between in their day-to-day lives.
 It’s in the Phaedrus that Plato suggests the technology of writing may have a deleterious impact on human memory (for if we outsource memory to writing, we won’t exercise the capability and it will fade over time). It’s a good reminder that there’s a long-standing fear of how some new technology will lead to our gradual degradation into mere automata. I tend to think this stems from a lack of imagination and a proclivity towards fear: we ground our predictions of what the future will look like after some change in the context of what we see and know today. The wonderful thing about participating in non-linear, complex systems is that they can react in ways we don’t predict in advance. Tim Harford does an excellent job showing the complex entanglement of social developments from new technologies and inventions in 50 Inventions That Shaped the Modern Economy, illustrating, for example, the relationship between birth control and gender equality in the workplace. Plato feared writing. Some people fear AI. Based on my own experience, I most fear distraction-inducing technologies, even something as simple as the notifications on our devices. Notifications annihilate coherence, which Mihnea and I both prize. He recently shared that enough Generation Zsters use closed captioning to keep their attention focused on movies and videos that it’s capturing media attention.
 I wanted to show a picture of pranayama finger positions but everything I found looked ridiculous. Here, for example, is the wikiHow picture. In my brief but fruitless search, I also learned that Hillary Clinton swore by pranayama techniques to calm herself in the aftermath of the 2016 presidential election. Naturally, this is only interesting because it’s Hillary Clinton, both because of the star factor (our curiosity to know famous people’s habits, as if this were some sort of privileged knowledge; the phenomenon of completely disregarding privacy when it comes to famous people is bizarre. Is it cultivated by their constant visibility in the media or an intrinsic default of human group psychology and dominance hierarchies?) and the surprise factor of thinking about Hillary behaving this way. Then again, Jeb Bush was on a paleo diet leading up to the elections. Politicians are people, too.
 Three very smart Johns I know, John Hall (CEO of Intapp), John Frankel (Managing Partner at ffVC), and John Deighton (Professor at Harvard Business School), all believe that technology moves in cycles between centralization and decentralization: the mainframe was followed by client-server computing, which, after virtualization, was followed by the public cloud, which, now that we’re getting queasy about the power of so much centralized data and have a network of mobile devices and IoT-enabled cars and toasters and things hopping around the world, will be followed by decentralization once more, once we can get GPUs and TPUs small enough to work well on mobile devices. We are very, very close. Distributed ledgers and databases are also harbingers of what’s to come. I’m keen to know what this means for the ideological superstructures on top of the material backbone of society. Or maybe information technologies will alter some of Marx’s axioms? It’s definitely the case that we need a new economic model for data-powered software, the same way SaaS subscription business models were created for the cloud.
 It’s funny that Bertrand Russell chose to embody his self-referential set-theoretic paradox (a forerunner of the self-reference at the heart of Gödel’s incompleteness theorems) as the barber paradox: the barber is the one who shaves all those, and those only, who do not shave themselves. The question is, does the barber shave himself? I haven’t seen much recent work in the AI community grounding consciousness in recursive loops the way Hofstadter did back in the days of symbolic AI, as canonized in Gödel, Escher, Bach. As intimated in this post, Mihnea and I are both after an articulation of minds as entangled phenomena, selves not as static brains in vats but as dynamic, complex systems entangled in different social contexts.
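 For the formally inclined, the barber sentence is easy to state in first-order notation (a standard textbook rendering, not Russell’s own formulation); instantiating the universal quantifier at the barber himself produces the contradiction immediately:

```latex
% There is a barber b who shaves exactly those who do not shave themselves:
\exists b \,\forall x \,\bigl(\mathit{Shaves}(b, x) \leftrightarrow \lnot \mathit{Shaves}(x, x)\bigr)
% Setting x = b yields the contradiction:
\mathit{Shaves}(b, b) \leftrightarrow \lnot \mathit{Shaves}(b, b)
```

 The statement refutes itself the moment it refers to itself, which is exactly the recursive loop Hofstadter found so generative.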
 Mihnea is a priest of language. He ends his book Inside Man with a gesture towards the communicative ethics that guide his internal dialogues as much as his dialogues with others. He cultivates minute precision in language in part because generalities and abstractions leave room for an interlocutor to extend a comment or criticism to encompass their entire being: “try tilting your wrist a little to the right when you swing the tennis racket” turns into “you suck at tennis and that means you can’t learn anything new and that means your career is ruined and that means you’re a pathetic failure, oh yeah, and that also means I think you’re a pathetic failure.” He’s helped me come to understand how dangerous it can be when others cannot refer to a particular behavior or activity when they provide feedback, and instead have couched multiple vague impressions into a narrative that leads to nothing but harm to a student or teammate.
The featured image is the Contemplative Bodhisattva, National Treasure of Korea No. 83. Insured for an estimated 50 billion won, it is the most expensive Korean national treasure. The semi-seated figure is Maitreya, a bodhisattva (someone working towards Buddhahood, but who has not yet attained it) prophesied to appear on Earth, achieve enlightenment, and teach the dharma, the way of being in line with the right order of the universe. I learned how remarkable this statue is when I attempted to crop the photo to make the proportions fit more nicely into the frame of my posts (I prefer square images or flat images with narrow heights, rather than long upright rectangles). It lost its aura when I cropped it. There is a majestic balance between the narrow, smooth lines of Maitreya’s chest, the silent grace of his necklaces, and the textured flow of the draping cloth. His bent knee carries forth the line sketched by the upturned rim of cloth upon his seat. He’s leaning forward slightly (apparent when you view the figure in profile) and it’s as if the lower half is required to cradle his balance and keep the figure unified and whole. The chipped enamel reminds me of the skin of yellow beets, shedding ground dirt to reveal concentric circles that beam inside.
Not every event in your life has had profound significance for you. There are a few, however, that I would consider likely to have changed things for you, to have illuminated your path. Ordinarily, events that change our path are impersonal affairs, and yet are extremely personal. – Don Juan Matus, a (potentially fictional) Yaqui shaman from Mexico
The windowless classroom was dark. We were sitting around a rectangular table looking at a projection of Rembrandt’s Syndics of the Drapers’ Guild. Seated opposite the projector, I could see student faces punctuate the darkness, arching noses and blunt hair cuts carving topography through the reddish glow.
“What do you see?”
Barbara Stafford’s voice had the crackly timbre of a Pablo Casals record and her burnt-orange hair was bi-toned like a Rothko painting. She wore downtown attire, suits far too elegant for campus with collars that added movement and texture to otherwise flat lines. We were in her Art History 101 seminar, an option for University of Chicago undergrads to satisfy a core arts & humanities requirement. Most of us were curious about art but wouldn’t major in art history; some wished they were elsewhere. Barbara knew this.
“A sort of darkness and suspicion,” offered one student.
“Smugness in the projection of power,” added another.
“But those are interpretations! What about the men makes them look suspicious or smug? Start with concrete details. What do you see?”
No one spoke. For some reason this was really hard. It didn’t occur to anyone to say something as literal as “I see a group of men, most of whom have long, curly, light-brown hair, in black robes with wide-brimmed tall black hats sitting around a table draped with a red Persian rug in the daytime.” Too obvious, like posing a basic question about a math proof (where someone else inevitably poses the question and the professor inevitably remarks how great a question it is to our curious but proud dismay). We couldn’t see the painting because we were too busy searching for a way of seeing that would show others how smart we were.
“Katie, you’re our resident fashionista. What strikes you about their clothing?”
Adrenaline surged. I felt my face glow in the reddish hue of the projector, watched others’ faces turn to look at mine, felt a mixture of embarrassment at being tokenized as the student who cared most about clothes and appearance and pride that Barbara found something worth noticing, in particular given her own evident attention to style. Clothes weren’t just clothes for me: they were both art and protection. The prospect of wearing the same J Crew sweater or Seven jeans as another girl had been cruelly beaten out of me in seventh grade, when a queen mean girl snidely asked, in chemistry class, if I knew that she had worn the exact same salmon-colored Gap button-down crew neck cotton sweater, simply in the cream color, the day before. My mom had gotten me the sweater. All moms got their kids Gap sweaters in those days. The insinuation was preposterous but stung like a wasp: henceforth I felt a tinge of awkwardness upon noticing another woman wearing an article of clothing I owned. In those days I wore long ribbons in my ponytails to make my hair seem longer than it was, like extensions. I often wore scarves, having admired the elegance of Spanish women tucking silk scarves under propped collared shirts during my senior year of high school abroad in Burgos, Spain. Material hung everywhere around me. I liked how it moved in the wind and encircled me in the grace I feared I lacked.
“I guess the collars draw your attention. The three guys sitting down have longer collars. They look like bibs. The collar of the guy in the middle is tied tight, barely any space between the folds. A silver locket emerges from underneath. The collars of the two men to his left (and our right) billow more, they’re bunchy, as if those two weren’t so anal retentive when they get dressed in the morning. They also have kinder expressions, especially the guy directly to the left of the one in the center. And then it’s as if the collars of the men standing to the right had too much starch. They’re propped up and overly stiff, caricature stiff. You almost get the feeling Rembrandt added extra air to these puffed up collars to make a statement about the men having their portrait done. Like, someone who had taste and grace wouldn’t have a collar that was so visibly puffy and stiff. Also, the guy in the back doesn’t have a hat like the others.”
Barbara glowed. I’d given her something to work with, a constraint from which to create a world. I felt like I’d just finished a performance, felt the adrenaline subside as students turned their heads back to face the painting again, shifted their attention to the next question, the next comment, the next brush stroke in Syndics of the Drapers’ Guild.
After a few more turns goading students to describe the painting, Barbara stepped out of her role as Socrates and told us about the painting’s historical context. I don’t remember what she said or how she looked when she said it. I don’t remember every class with her. I do remember a homework assignment she gave inspired by André Breton’s objet trouvé, a surrealist technique designed to get outside our standard habits of perception, to let objects we wouldn’t normally see pop into our attention. I wrote about my roommate’s black high-heeled shoes and Barbara could tell I was reading Nietzsche’s Birth of Tragedy because I kept referencing Apollo and Dionysus, godheads for constructive reason and destructive passion, entropy pulling us ever to our demise. I also remember a class where we studied Cindy Sherman photos, in particular her self portraits as Caravaggio’s Bacchus and her film still from Hitchcock’s Vertigo. We took a trip to the Chicago Art Institute and looked at a few paintings together. Barbara advised us never to use the handheld audio guides as they would pollute our vision. We had to learn how to trust ourselves and observe the world like scientists.
In the fourth paragraph of the bio on her personal website, Barbara says that “she likes to touch the earth without gloves.” She explains that this means she doesn’t just write about art and how we perceive images, but also “embodies her ideas in exhibitions.”
I interpret the sentence differently. To touch the earth without gloves is to see the details, to pull back the covers of intentionality and watch as if no one were watching. Arts and humanities departments are struggling to stay relevant in an age where we value computer science, mathematics, and engineering. But Barbara didn’t teach us about art. She taught us how to see, taught us how to make room for the phenomenon in front of us. Paintings like Rembrandt’s Syndics of the Drapers’ Guild were a convenient vehicle for training skills that can be transferred and used elsewhere, skills which, I’d argue, are not only relevant but essential to being strong leaders, exacting scientists, and respectful colleagues. No matter what field we work in, we must all work all the time to notice our cognitive biases, the ever-present mind ghosts that distort our vision. We must make room for observation. Encounter others as they are, hear them, remember their words, watch how their emotions speak through the slight curl of their lips and the upturned arch of their eyebrows. Great software needs more than just engineering and science: it needs designers who observe the world to identify features worth building.
I am indebted to Barbara for teaching me how to see. She is integral to the success I’ve had in my career in technology.
Of all the memories I could share about my college experience, why share this one? Why do I remember it so vividly? What makes this memory profound?
I recently read Carlos Castaneda’s The Active Side of Infinity and resonated with the book’s premise as “a collection of memorable events” Castaneda recounts as an exercise to become a warrior-traveler like the shamans who lived in Mexico in ancient times. Don Juan Matus, a (potentially fictional) Yaqui shaman who plays the character of Castaneda’s guru in most of his work, considers the album “an exercise in discipline and impartiality…an act of war.” On his first pass, Castaneda picks out memories he assumes should be important in shaping him as an individual, events like getting accepted to the anthropology program at UCLA or almost marrying Kay Condor. Don Juan dismisses them as “a pile of nonsense,” noting they are focused on his own emotions rather than being “impersonal affairs” that are nonetheless “extremely personal.”
The first story Castaneda tells that don Juan deems fit for a warrior-traveler is about Madame Ludmilla, “a round, short woman with bleached-blond hair…wearing a red silk robe with feathery, flouncy sleeves and red slippers with furry balls on top” who performs a grotesque strip tease called “figures in front of a mirror.” The visuals remind me of a dream sequence from a Fellini movie, filled with the voluptuousness of wrinkled skin and sagging breasts and the brute force of the carnivalesque. Castaneda’s writing is noticeably better when he starts telling Madame Ludmilla’s story: there’s more detail, more life. We can picture others, smell the putrid stench of dried vomit behind the bar, relive the event with Castaneda and recognize a truth in what he’s lived, not because we’ve had the exact same experience, but because we’ve experienced something similar enough to meet him in the overtones. “What makes [this story] different and memorable,” explains don Juan, “is that it touches every one of us human beings, not just you.”
Don Juan calls this war because it requires discipline to see the world this way. Day in and day out, structures around us bid us to focus our attention on ourselves, to view the world through the prism of self-improvement and self-criticism: What do I want from this encounter? What does he think of me? When I took that action, did she react with admiration or contempt? Is she thinner than I am? Look at her thighs in those pants–if I keep eating desserts the way I do, my thighs will start to look like that too. I’ve fully adopted the growth mindset and am currently working on empathy: in that last encounter, I would only give myself a 4/10 on my empathy scale. But don’t you see that I’m an ESFJ? You have to understand my actions through the prism of my self-revealed personality guide! It’s as if we live in a self-development petri dish, where experiences with others are instruments and experiments to make us better. Everything we live, everyone we meet, and everything we remember gets distorted through a particular analytical prism: we don’t see and love others, we see them through the comparative machine of the pre-frontal cortex, comparing, contrasting, categorizing, evaluating them through the prism of how they help or hinder our ability to become the future self we aspire to become.
Warrior-travelers like don Juan fight against this tendency. Collecting an album of memorable events is an exercise in learning how to live differently, to change how we interpret our memories and first-person experiences. As non-warriors, we view memories as scars, events that shape our personality and make us who we are. As warriors, we view ourselves as instruments and vessels to perceive truths worth sharing, where events just so happen to happen to us so we can feel them deeply enough and experience the minute details required to share them vividly with others. Warriors are instruments of the universe, vessels for the universe to come to know itself. We can’t distort what others feel because we want them to like us or act a certain way because of us: we have to see others for who they are, make space for negative and positive emotions. What matters isn’t that we improve or succeed, but that we increase the range of what’s perceivable. Only then can we transmit information with the force required to heal or inspire. Only then are we fearless.
Don Juan’s ways of seeing and being weren’t all new to me (although there were some crazy ideas of viewing people as floating energy balls). There are sprinklings of my quest to live outside the self in many posts on the blog. Rather, The Active Side of Infinity helped me clarify why I share first-person stories in the first place. I don’t write to tell the world about myself or share experiences in an effort to shape my identity. This isn’t catharsis. I write to be a vessel, a warrior-traveler. To share what I felt and saw and smelled and touched as I lived experiences that I didn’t know would be important at the time but that have managed to stick around, like Argos, always coming back, somehow catalyzing feelings of love and gratitude as intense today as they were when I first experienced them. To use my experiences to illustrate things we are all likely to experience in some way or another. To turn memories into stories worth sharing, with details concrete enough that you, reader, can feel them, can relate to them, and understand a truth that, ill-defined and informal though it may be, is searing in its beauty.
This post features two excerpts from my warrior-traveler album, both from my time as an undergraduate at the University of Chicago. I ask myself: if I were speaking to someone for the first time and they asked me to tell them about myself, starting in college, would I share these memories? Likely not. But it’s worthwhile to wonder if doing so might change the world for the good.
When I attended the University of Chicago, very few professors gave students long reading assignments for the first class. Some would share a syllabus, others would circulate a few questions to get us thinking. No one except Loren Kruger expected us to read half of Anna Karenina and be prepared to discuss Tolstoy’s use of literary form to illustrate 19th-century Russian class structures and ideology.
Loren was tall and big-boned. A South African, she once commented on J.M. Coetzee’s startling ability to wield power through silence. She shared his quiet intensity, demanded such rigor and precision in her own work that she couldn’t but demand it from others. The tiredness of the old world laced her eyes, but her work was about resistance; she wrote about Brecht breaking boundaries in theater, art as an iron-hot rod that could shed society’s tired skin and make room for something new. She thought email destroyed intimacy because the virtual distance emboldened students to reach out far more frequently than when they had to brave a face-to-face encounter. About fifteen students attended the first class. By the third class, there were only three of us. With two teaching assistants (a French speaker and a German speaker), the student:teacher ratio became one:one.
Loren intimidated me, too. The culture at the University of Chicago favored critical thinking and debate, so I never worried about whether my comments would offend others or come off as bitchy (at Stanford, sadly, this was often the case). I did worry about whether my ideas made sense. Being the most talkative student in a class of three meant I was constantly exposed in Loren’s class, subjecting myself to feedback and criticism. She criticized openly and copiously, pushing us for precision, depth, insight. It was tough love.
The first thing Loren taught me was the importance of providing concrete examples to test how well I understood a theory. We were reading Karl Marx, either The German Ideology or the first volume of Das Kapital. I confidently answered Loren’s questions about the text, reshuffling Marx’s words or restating what he’d written in my own words. She then asked me to provide a real-world example of one of his theories. I was blank. Had no clue how to answer. I’d grown accustomed to thinking at a level of abstraction, riding text like a surfer rides the top of a wave without grounding the thoughts in particular examples my mind could concretely imagine. The gap humbled me, changed how I test whether I understand something. This happens to be a critical skill in my current work in technology, given how much marketing and business language is high-level and general: teams think they are thinking the same thing, only to realize that with a little more detail they are totally misaligned.
We wrote midterm papers. I don’t remember what I wrote about but do remember opening the email with the grade and her comments, laptop propped on my knees and back resting against the powder-blue wall in my bedroom off the kitchen in the apartment on Woodlawn Avenue. B+. “You are capable of much more than this.” Up rang my old friend impostor syndrome: no, I’m not, what looks like eloquence in class is just a sham, she’s going to realize I’m not what she thinks I am, useless, stupid, I’ll never be able to translate what I can say into writing. I don’t know how. Tucked behind the fiddling furies whispered the faint voice of reason: You do remember that you wrote your paper in a few hours, right? That you were rushing around after the house was robbed for the second time and you had to move?
Before writing our final papers, we had to submit and receive feedback on a formal prospectus rather than just picking a topic. We’d read Frantz Fanon’s The Wretched of the Earth and I worked with Dustin (my personal TA) to craft a prospectus analyzing Gillo Pontecorvo’s Battle of Algiers in light of some of Fanon’s descriptions of the experience of colonialism.
Once again, Loren critiqued it harshly. This time I panicked. I didn’t want to disappoint her again, didn’t want the paper to confirm to both of us that I was useless, incompetent, unable to distill my thinking into clear and cogent writing. The topic was new to me and out of my comfort zone: I wasn’t an expert in negritude or post-colonial critical theory. I wrote her a desperate email suggesting I write about Baudelaire and Adorno instead. I’d written many successful papers about French Romanticism and Symbolism and was on safer ground.
Her response to my anxious plea was one of the more meaningful interactions I’ve ever had with a professor.
Katie, stop thinking about what you’re going to write and just write. You are spending far too much energy worrying about your topic and what you might or might not produce. I am more than confident you are capable of writing something marvelous about the subject you’ve chosen. You’ve demonstrated that to me over the quarter. My critiques of your prospectus were intended to help you refine your thinking, not push you to work on something else. Just work!
I smiled a sigh of relief. No professor had ever said that to me before. Loren had paid attention, noticed symptoms of anxiety but didn’t placate or coddle me. She remained tough because she believed I could improve. Braved the mania. This interaction has had a longer-lasting impact on me than anything I learned about the subject matter in her class. I can call it to mind today, in an entirely different context of activity, to galvanize myself to get started when I’m anxious about a project at work.
The happiest moments writing my final paper about the Battle of Algiers were the moments describing what I saw in the film. I love using words to replay sequences of stills, love interpreting how the placement of objects or people in a still creates an emotional effect. My knack for doing so stems back to what I learned in Art History 101. I think I got an A on the paper. I don’t remember or care. What stays with me is my gratitude to Loren for not letting me give up, and the clear evidence she cared enough about me to put in the work required to help me grow.
 This isn’t the first time things I learned in Barbara’s class have made it into my blog. The objet trouvé exercise inspired a former blog post.
 I ended up having my own private teaching assistant, a French PhD named Dustin. He told me any self-respecting comparative literature scholar could read and speak both French and German fluently, inspiring me to spend the following year in Germany.
 I picked up my copy of The Marx-Engels Reader (MER) to remember what text we read in Loren’s class. I first read other texts in the MER in Classics of Social and Political Thought, a social sciences survey course that I took to fulfill a core requirement (similar to Barbara’s Art History 101) my sophomore year. One thing that leads me to believe we read The German Ideology or volume one of Das Kapital in Loren’s class is the difference in my handwriting between years two and four of college. In year two, my handwriting still had a round playfulness to it. The letters are young and joyful, but look like they took a long time to write. I remember noticing that my math professors all seemed to adopt a more compact and efficient font when they wrote proofs on the chalkboard: the a’s were totally sans-serif, loopless. Letters were small. They occupied little space and did what they could not to draw attention to themselves so the thinker could focus on the logic and ideas they represented. I liked those selfless a’s and deliberately changed my handwriting to imitate my math professors. The outcome shows in my MER. I apparently used to like check marks to signal something important: they show up next to straight lines illuminating passages to come back to. A few great notes in the margins are: “Hegelian–>Too preoccupied w/ spirit coming to itself at basis…remember we are in (in is circled) world of material” and “Inauthenticity–>Displacement of authentic action b/c always work for later (university/alienation w/ me?)”
There has to be a ton of analytic philosophy ink spilled on this question, but it’s interesting to think about what kinds of thinking are advanced by pure formalisms that would be hampered by ties to concrete, imaginable referents, and what kinds of thinking degrade into senseless mumbo jumbo without ties to concrete, imaginable referents. Marketing language and politically correct platitudes definitely fall into category two. One contemporary symptom of not knowing what one’s talking about is the abuse of the demonstrative adjective that. Interestingly enough, such demonstrative abusers never talk about thises; they only talk about thats. This may be used emphatically and demonstratively in a Twitter or Facebook conversation: when someone wholeheartedly supports a comment, critique, or example of some point, they’ll write This as a stand-alone sentence with super-demonstrative reference power, power strong enough to encompass the entire statement made before it. That’s actually ok. It’s referring to one thing, the thing stated just above it. It’s dramatic but points to something the listener/reader can also point to. The problem with the abused that is that it starts to refer to a general class of things that are assumed, in the context of the conversation, to have some mutually understood functional value: “To successfully negotiate the meeting, you have to have that presentation.” “Have that conversation — it’s the only way to support your D&I efforts!” Here, the listener cannot imagine any particular that that these words denote. The speaker is pointing to a class of objects she assumes the listener is also familiar with and agrees exist. A conversation about what? A presentation that looks like what? There are so many different kinds and qualities of conversations or presentations that could fit the bill. I hear this used all the time and cringe a little inside every time.
I’m curious to know if others have the same reaction I do, or if I should update my grammar police to accept what has become common usage. Leibniz, on the other hand, was a staunch early modern defender of cogitatio caeca (Latin for blind thought), which referred to our ability to calculate and manipulate formal symbols and create truthful statements without requiring the halting step of imagining the concrete objects these symbols refer to. This, he argued against conservatives like Thomas Hobbes, was crucial to advancing mathematics. There are structural similarities in the current debates about the explainability of machine learning algorithms, even though that which is imagined or understood may lie on a different epistemological, ontological, and logical plane.
People tell me that one reason they like my talks about machine learning is that I use a lot of examples to help them understand abstract concepts. Many talks are structured like this one, where I walk an audience through the decisions they would have to make as a cross-functional team collaborating on a machine learning application. The example comes from a project former colleagues worked on. I realized over the last couple of years that no matter how much I like public speaking, I am horrified by the prospect of specializing in speaking or thought leadership and not being actively engaged in the nitty-gritty, day-to-day work of building systems and observing first-person how people interact with them. I believe the existential horror stems from my deep-seated beliefs about language and communication, from my deep-seated discomfort with words that don’t refer to anything. Diving into this would be worthwhile: there’s a big difference between the fictional imagination, the ability to bring to life the concrete particularity of something or someone that doesn’t exist, and the vagueness of generalities lacking reference. The second does harm and breeds stereotypes. The first is not only potent in the realm of fiction, but, as my fiancé Mihnea is helping me understand, may well be one of the master skills of the entrepreneur and executive. Getting people aligned and galvanized around a vision can only occur if that vision is concrete, compelling, and believable. An imaginable state of the world we can all inhabit, even if it doesn’t exist yet. A tractable as if that has the power to influence what we do and how we behave today so as to encourage its creation and possibility.
I believe this is the first time I’ve had a footnote referring to another footnote (I did play around with writing an incorrigibly long photo caption in Analogue Repeaters). Funny that this ties to the footnote just above (hello there, dear footnote!) and even funnier that footnote 4 is about demonstrative reference, including the this discursive reference. But it’s seriously another thought, so I felt it merited its own footnote as opposed to being the second half of footnote 5. When I sat down to write this post, I originally planned to write about the curious and incredible potency of imagined future states as tools to direct action in the present. I’ve been thinking about this conceptual structure for a long time, having written about it in the context of seventeenth-century French philosophy, math, and literature in my dissertation. The structure has been around since the Greeks (Aristotle references it in Book III of the Nicomachean Ethics) and is used in startup culture today. I started writing a post on the topic in August 2018. Here’s the text I found in the incomplete draft when I reopened it a few days ago:
A goal is a thinking tool.
A good goal motivates through structured rewards. It keeps people focused on an outcome, helps them prioritize actions and say no to things, and stretches them to work harder than they would otherwise. Wise people say that a good goal should be about 80% achievable. Wise leaders make time to reward and recognize inputs and outputs.
A great goal reframes what’s possible. It is a moonshot and requires the suspension of disbelief, the willingness to quiet all the we can’ts and believe something surreal, to sacrifice realism and make room for excellence. It assumes a future outcome that is so outlandish, so bold, that when you work backwards through the series of steps required to achieve it, you start to do great things you wouldn’t have done otherwise. Fools say that it doesn’t matter if you never come close to realizing a great goal, because the very act of supposing it could be possible and reorienting your compass has already resulted in concrete progress towards a slightly more reasonable but still way above average outcome.
Good goals create outcomes. Great goals create legacies.
This text alienates me. It reminds me of an inspirational business book: the syncopation and pace seem geared to stir pathos and excitement. How curious that the self evolves so quickly, that the I looking back on the same I’s creations of a few months ago feels like she is observing a stranger, someone speaking a different language and inhabiting a different world. But of course that’s the case. Of course being in a different environment shapes how one thinks and what one sees. And the lesson here is not one of fear around instability of character: it’s one that underlines the crucial importance of context, the crucial importance of taking care to select our surroundings so we fill our brains with thoughts and words that shape a world we find beautiful, a world we can call home. The other point of this footnote is a comment on the creative process. Readers may have noted the quotation from Pascal that accompanies all my posts: “The last thing one settles in writing a book is what one should put in first.” The joy of writing, for me, as for Mihnea and Kevin Kelly and many others, lies in unpacking an intuition, sitting down in front of a silent wall and a silent world to try to better understand something. I’m happiest when, writing fast, bad, and wrong to give my thoughts space to unfurl, I discover something I wouldn’t have discovered had I not written. Writing creates these thoughts. It’s possible they lie dormant with potential inside the dense snarl of an intuition and possible they wouldn’t have existed otherwise. Topic for another post. With this post, I originally intended to use the anecdote about Stafford’s class to show the importance of using concrete details, to illustrate how training in art history may actually be great training for the tasks of a leader and CEO.
But as my mind circled around the structure that would make this kind of intro make sense, I was called to write about Casteñeda, pulled there by my emotions and how meaningful these memories of Barbara and Loren felt. I changed the topic. Followed the path my emotions carved for me. The process was painful and anxiety-inducing. But it also felt like the kind of struggle I wanted to undertake and live through in the service of writing something worth reading, the purpose of my blog.
 About six months ago, I learned that an Algerian taxi driver in Montréal was the nephew of Ali La Pointe, the revolutionary martyr hero in Battle of Algiers. It’s possible he was lying, but he was delighted by the fact that I’d seen and loved the film and told me about the heroic deeds of another uncle who didn’t have the same iconic stardom as Ali. Later that evening I attended a dinner hosted by Element AI and couldn’t help but tell Yoshua Bengio about the incredible conversation I had in the taxi cab. He looked at me with confusion and discomfort, put somewhat out of place and mind by my not accommodating the customary rules of conversation with acquaintances.
The featured image is the Syndics of the Drapers’ Guild, which Rembrandt painted in 1662. The assembled drapers assess the quality of different weaves and cloths, presumably, here, assessing the quality of the red rug splayed over the table. In Ways of Seeing, John Berger writes about how oil paintings signified social status in the early modern period. Having your portrait done showed you’d made it, the way driving a Porsche around town would do so today. When I mentioned that the collars seemed a little out of place, Barbara Stafford found the detail relevant precisely because of the plausibility that Rembrandt was including hints of disdain and critique in the commissioned portraits, mocking both his subjects and his dependence on them to get by.
I did my first TED Talk in October, part of a TED Salon in New York City. I’ve thanked Alex Moura, TED’s technology coordinator, for inviting me and coaching me through the process, but it never hurts to say thanks multiple times. I’d also like to extend thanks to Cloe Sasha, Crawford Hunt, Corey Hajim, and the NYC production and makeup crew. As I wrote about last summer, stage crews are pre-talk angels that help me metabolize anxiety through humor, emotion, and the simple joy of connecting with another individual. At the TED headquarters, the crew gave me a detailed rundown of how the audio and video systems work, how they edit raw recordings, how different speakers behave before talks, and why they decided to be in their field. I learned something. I can precisely recall how the production manager laced his naturally spindly, Woody-Allenesque voice with a practiced hint of care, cradling the speakers with confidence before we took the stage. His presence exuded joy through focused concentration, the joy of a professional who does his work with integrity.
Here’s the talk. My hands look so theatrical because I normally release nervous energy as kinetic energy by pacing back and forth. TED taught me that’s a chump move that distracts the audience: a great presenter stays put so people can focus. The words, story, tone, pitch, dynamics, emotions, and face cohere into an impression that draws people in because multiple parts of their brain unite to say “Pay attention! Valuable stuff is coming your way!” I have some work to do before I master that. When I gave the talk, my mind’s albatross meta voice was clamoring “Stay put! Resist the urge! Stop doing that! In channel 8 you’re at chunk 3 of your AI talk right? Enter spam filter to illustrate the difference between Boolean rules and a learned classifier…what’s that person in row 2 thinking? Is the nod…yeah, seems ok but then again that eyebrow…interference from channel 18 your fucking vest isn’t hooked…shit…” resulting in moments in my talk where my gestures look like moves from Michael Jackson’s Thriller. Come to think of it, I’m often appalled by the emotions my face displays in snapshots from talks. Seeing them is stranger than hearing the sound of my voice on a recording. My emotional landscape shifts quickly, perhaps quickly enough that the disgust of one femtosecond resolves into contentment in the next, keeping the audience engaged as time flows but leading to alien distortions when time freezes.
As I argued in my dissertation (partially summarized here), I’m a fan of Descartes’ pedagogical opinion that we learn more from understanding how someone discovered something–and then challenging ourselves to put this process into practice so we can come to discover something on our own–than we do from studying the finished product. I therefore figured it may be useful for a few readers if I shared how I wrote this talk and a few of the lessons I learned along the way.
Like Edgar Allan Poe in his Philosophy of Composition, I started by thinking about the intention of my talk. Like Poe, I wanted to write a talk “that should suit at once the popular and the critical taste.” This was TED, after all, not an academic lecture at the University of Toronto. I needed an introduction to AI that would be accessible to a general audience but had enough rigor to merit respect from experts in the field. I often strive to speak and write according to this intention, as I find saying something in plain old English pressure-tests how well I understand something. Poe then talks about the length of a poem, wanting something that can be read in one sitting. Most of the time, the venue tells you how long to speak, and it’s way harder to write an informative 10-minute talk than a cohesive 45-minute talk. TED gave me a constraint of 12 minutes. So my task was to communicate one useful insight about AI that could change how a non-expert thinks about the term, to show this one insight a few different ways in 12 minutes, and to do this while staving off the inevitable grimaces from experts.
Alex and I started with a brief call and landed, provisionally, on adapting a talk I gave at Startupfest in 2017. The wonderful Rebecca Croll, referred by the also wonderful Jana Eggers, invited me to speak about the future of AI. As that’s just fiction, I figured I’d own it and tell the fictional life story of one Jean-Sebastian Gagnon, a boy born the day I gave my talk. The talk sets futurism alongside nostalgia, showing how age-old coming-of-age moments, like learning about music and falling in love, are shaped by AI. He eventually realizes I’m giving a talk about him and intervenes to correct errors in my narrative. Alex suggested I adapt this to a business applying AI for the first time. I was open to it, but didn’t have a clear intuition. It sat there on the back burner of possibility.
The first idea that got me excited came to me on a run one morning through the Toronto ravines. The air was crisp with late summer haze. Limestone dust bloomed mushroom clouds that hovered, briefly, before freeing the landscape into whitewashed pastel. The flowers had peaked, each turgid petal blooming signs of imminent decay. Nearby a woman with violet grey hair that matched the color of her windbreaker and three-quarter length leggings used her leg strength to hold back a black lab who wanted nothing more than to run beside me down the widening path. With music blaring in my ears, I was brainstorming different ideas and, unsurprisingly, found myself thinking about an anecdote I frequently use to ground the audience’s intuitions about AI when I open my talks. It’s a story about my friend Andrew bootstrapping a legal opinion from a database of former law student responses (I wrote about it here and used it to open this talk). I begin that narrative with a reference to Tim Urban’s admirable talk about procrastination. And suddenly it hit: a TED talk within a TED talk! Recursion, crown prince of cognitive delight. “OMG that’s it! I’ll generate a TED talk by training a machine learning model on all the past TED talks! Evan Macmillan’s analysis of word frequency in all the former Data Driven talks worked well. Adapt his approach? Maybe. Or, I can recount the process of creation (meta reference not intended) to show what happens when we apply AI: show the ridiculous mistakes our model will make, show the differences between human and machine creation, grapple with the practical tradeoffs between accuracy and output, and end with a heroic message about a team bridging the different ways they see the same problem to create something of value. Awesome!” I called my parents, thrilled. Wrote Slack messages to engage two of my colleagues. They were thrilled. We were off to the races, pushed by the adrenaline of discovery amplified by the endorphins of a morning run.
The joy of my mind racing, creating at a million miles an hour, provided uncanny relief. It was one of the first morning runs I’d taken in a while, after experimenting over the summer with a schedule that prioritized cognitive output. As my mind is clearest between 6:00 and 10:00 am, I got to the office before 6:30 am and shifted exercise to the afternoons or evenings. I didn’t neglect my body, but deprioritized taking care of it. And in doing so, I committed the cardinal sin of Western metaphysics, fell into the Cartesian trap I myself tried to undo in my dissertation! The mistake was to think of the mind and its creations as independent from the body. Feeling my creativity surge on that morning run hit home: I thought in a way I hadn’t for months. I’ve since changed my schedule, and do make time for creative work in the morning. But I also exercise. It fuels my coherence and my generative range, the two things I value most.
I wrote to Alex at TED to tell him about my awesome idea. He was like, “yeah it’s cool, but we don’t have enough time to do that. Can you send me a draft on another topic so we can iterate together asap?”
Back to the drawing board.
Now, having a day job as an executive, I don’t have a lot of time to write my talks. I’ve perfected the art of putting aside two or three hours just before I have to give a presentation to tailor a talk to an audience. These kinds of constraints empower and fuel me. They don’t leave enough time for the meta critical voices to pop up and wave their scary-ass Macbeth brooms in my mind. I just produce. And, given the short time constraints, I never write my talks. Instead, I write the talk equivalent of a jazz scaffold, with pictures like a chord series upon which the musicians improvise. My talk’s logic is normally partitioned into phrases, stanzas, chunks. Another way to think about the slides is as the leitmotifs bards used in oral poetry, like the “rosy-fingered dawn” we see pop up in Homer. And I’ve done this, with reward feedback loops, for a few years. It’s ossified into a method.
A TED talk isn’t jazz. It’s like a classical concerto. You write it out in advance, practice it, and execute the harmonics, 16th notes, and double stops with virtuosic precision (I’m a violinist, so these are kinds of things I worry about). You rehearse, you don’t improvise.
That challenged me in a few ways.
First, breaking my habits spurred the nervousness we feel when we learn how to do something new. When I sat down to write a draft, my head went into talk writing mode. I did what I always do: started with an anecdote like the Andrew story, composed in chunks, represented the chunks with mnemonic devices in the form of random pictures that mean something to me and me only (until the audience sees the humorous, synthetic connection to the content). I booked an hour in my calendar to make progress on a draft but that was it: I had to move on and deliver the outcomes I’m responsible for as an executive. I just couldn’t prioritize the talk yet. So the anxiety would arise and fester and fuel itself on the constraints. I’d cobble together a few pictures and some quick notes and send them to the TED team and some friends who wanted to help review, knowing what I was sending was far from decent and near to incomprehensible.
For, of course, they couldn’t make heads or tails of the random pictures that work at a different level of meaning than the talk itself. They’d respond with compassion, do their best to gently communicate their befuddlement. It was humorously bad. And every time they expressed confusion, it eroded my confidence in the foundation of the talk. I felt a need to start over and find a new topic. I face communication foibles like this in other areas of my life and, as I work to be a better leader, try to make the time to think through the perspectives of those receiving a communication before I send it. I imagine other artists or former academics face similar challenges: we are used to prizing originality, only writing or saying something if it hasn’t been said before. In business, there’s value in repetition: saying something multiple times to increase the probability that people have heard and understood it, saying the same thing different ways in different genres and for different audiences, teaching everyone in the company to say the same thing about what you do so there is unity of message for the market. It’s against my instincts, and it requires vigilance to preserve predictability for my teams and consistency for the market. Even for my TED talk, I felt a need to say something new. To titrate AI to its very essence, to its extraordinary quintessence. Some people coaching me said, “no, darling, they just want you to say what you’ve said a million times before. You say it so well.”
Finally, I wasn’t used to rehearsing, sharing drafts, or exposing the finished product outside the act of performance. There are two sub-points here. The first has to do with the relative comfort different people have in exposing partial ideas to others. I’m all for iteration, collaboration, getting feedback early to save time and benefit from the possible compounded creativity of multiple minds. But, as an introvert often mistaken to be an extrovert and someone trained in theoretical mathematics, I feel at ease when I have time to compose compound thoughts, with deliberately ordered, sequential parts, before sharing them with others. Showing a part is like starting a sentence and stopping midway. Not sharing early enough or quickly enough is another foible I have to constantly overcome to fit into tech culture; I believe many introverts feel similarly. Next, the pragmatic and performative context of the speech act changes it. As mentioned above, I’m a violinist, so I have rehearsed and then performed many times before. But I don’t rehearse talks and really only have myself as audience, repeating turns of phrase out loud until they hit the pithy eloquence I prize. What I didn’t know how to do was to rehearse before others. And it’s a hell of a lot harder to give a talk to one person than it is to an audience. It’s as if my self gets turned inside out when I give a talk to a single observing individual. I feel the gaze. Project what I suspect they see into a distorted clown mirror. It’s like a supreme act of objectification, all men’s eyes gazing intently on the body of Venus. Naked. Bare. Exposed. It’s 50,000 times easier to give a speech to an audience, where each individual becomes a part of an amorphous crowd, enabling me, as speaker, to speak to everyone and no one, focusing on the ideas while also picking up the emotional feedback signals acting on a different level of my brain.
Learning how to collaborate, how to overcome these performative fears, was useful and something I’ll carry forward.
Midway upon this journey, finding myself within a forest dark, having lost the straightforward pathway, I came across the core idea I wanted to communicate: to succeed with AI, businesses have to invert the value of business processes from techniques to reduce variation and increase scale to techniques to learn something unique about the world that can only be learned by doing what they do. Most people think AI is about using data to enhance, automate, or improve an existing business process. That will provide sustaining innovation, but isn’t revolutionary. Things get interesting when existing products are red herrings to generate data that can create whole new products and business lines. I liked this. It was something I could say. I could see the work of my talk being to share a useful conceptual flip. My friend Jesse Bettencourt helped formulate the idea and sent me this article about Google Maps as an example of how Google uses products and platforms to generate new data. I sat down to read it one Saturday and footnote six caught my attention:
I loved the oddness of the detail. And the more I searched, the stranger it got. Thomas Edison helped Henry Ford start a charcoal production facility after Ford had too much wood left over as a byproduct of his car business? Kingsford charcoal was a spinoff, post acquisition by Clorox, of a company founded by Henry Ford? Henry Ford pulled one of those oblique marketing moves like the Michelin Brothers, using the charcoal business to advertise the wonderful picnics that waited at the end of a long car ride to give people somewhere to drive to? It gripped me. Provided joy in telling it to others. I wanted people to know this. Gripped me enough that it became almost inevitable that I begin my talk with it. After all, I needed a particular story to ground what was otherwise an abstract idea. I go back and forth on my opinion of the Malcolm Gladwellian New Yorker article, which I (potentially erroneously, as I am not a Gladwell expert) structure as:
Start with an anecdote that instantiates an abstract idea
Zoom out to articulate the abstraction
Show other examples of the abstraction
Potentially come back to unpack another aspect of the abstraction
Give a conclusion people can remember
This is more or less how my talk is structured. I would have loved to use a whole new form, stretching the genre of the talk to push the boundaries of how we communicate and truly lead to something new. All in due time.
Even after finding and getting excited about Henry Ford, I vacillated. I wasn’t sure if the thesis was too analytical, wasn’t sure if I wanted to use these precious 12 minutes to show the world my heart, not my mind. There was a triumphant 40 or so hours where I was planning to tell the world’s most boring story from the annals of AI, a story about a team at a Big Four firm that builds a system to automate a review of the generally accepted accounting principles (GAAP) checklist. I liked it because its moral was about teams working together. It showed that real progress with AI won’t be the headlines we read about, it will be the handwritten digit recognition systems that made the ATM possible, the natural language processing tools that made swipe typing on Android possible. Humble tools we don’t know about but that impact us every day. This would have been a great talk. Aspects from it show up in many of my talks. Maybe some day I’ll give it.
In the end, I never actually wrote my TED talk. I told people I was memorizing an unwritten script. For me, the writing you read on this blog is a very different mode of being than the speaking I do in talks. The acts are separate. My mind space is separate. So, I gave up trying to write a talk and went back to my phrases, my chunks. Went back to rehearsing for an invisible audience. I walked 19 miles up the Hudson River on the day I gave my talk. I had my AirPods in and pretended I was talking on the phone so people didn’t think I was crazy. I suspect I needed the pretense to cushion my concentration in the first place. It was a gorgeous day. I took photos of ships and docks and branches and black struts and rainbow construction cones in bathroom entrances. I repeated sentences out loud again and again until I found their dormant lyricism. I practiced my concerto by myself. And then, I practiced in front of my boyfriend in the hotel room one last time before the show. He listened lovingly, patiently, supportively. He was proud. I felt comfortable having him watch me.
I pulled off the performance. Had a few hiccups here and there, but I made it. I reconnected with Teresa Bejan, a fellow speaker that evening and a former classmate from the University of Chicago. I did one dress rehearsal with my team at work and will always remember their keen attention and loving support. And I found a way to slide my values, my heart, into the talk, ending it with a phrase that encapsulates why I believe technology is and always will be a celebration of human creativity:
Machines did not see steaming coffee, grilled meats and toasted sandwiches in a pile of sawdust. Humans did. It’s when Ford collaborated with his teams that they were able to take the fruits of yesterday’s work and transform them into tomorrow’s value. And that’s what I see as the promise of artificial intelligence.
 In Two lessons from giving talks, I explained why having the AV break down two thirds of the way into a talk may be a hidden secret to effective communication. It breaks the fourth wall, engaging the audience’s attention because it breaks their minds’ expectation that they are in “listen to a talk” mode and engages their empathy. When this has happened to me, the sudden connection with the audience then fuels me to be louder and more creative. We imagine the missing slides together. We connect. It’s always been an incredibly positive feedback loop.
Michael Jackson felt the need to “stress”, at the beginning of the Thriller music video, that “due to his strong personal convictions, [he wishes] to stress that [Thriller] in no way endorses a belief in the occult.” It’s worth pausing to ask how it’s possible that someone could mistake Thriller for a religious or cult-like ritual, not seeing the irony or camp. The gap between what you think you are saying and what ends up being heard never ceases to amaze me. It’s bewilderingly difficult to write an important email to a group of colleagues that communicates what you intend it to communicate, in particular when emotions, selective information flow, and a plurality of goals and intentions kick in. One of my first memories of appreciating that acutely was when people commented on a 5-minute pitch version of my dissertation I gave at the Stanford BiblioTech conference in 2012. One commenter took at face value an opinion I intended to attribute to Blaise Pascal, reading it as something I endorsed. I thought I was reporting on what someone else thought; the other heard it as something I thought. These performative nuances of meaning are crucial, and, I believe, a crucial leadership skill is to be able to manage them as a virtuosic novelist manages the symphony of voices and minds in the characters of her novel. This is one of the many reasons that we need to preserve and develop a rigorous humanistic education in the 21st century. The nuances of how humans make meaning together will be all the more valuable as machines take over more and more narrow, skill-based tasks. Fortunately, the quantum entanglement of human egos will keep us safe for years to come. I’d like to resurrect programs like BiblioTech. They are critically important.
Side note 2. In May, 2005, an old Rasta, high as a kite, told me at the end of a hike through a mountain river near Ocho Rios that I looked like Michael Jackson, presumably towards the end of his life when his skin was more white. That was also a bizarre moment of viewing myself as others view me.
Alex also told me about an XPRIZE for an AI-generated TED talk back in 2014. Also, my dear friend and forever colleague Dan Bressler came up with the same idea down the line, and we held a little memorial, eulogizing a stillborn idea.
 Andrew Ng has shared similar ideas in many of his talks about building AI products, like this one.
The Michelin restaurant guide, in my mind, predated the contemporary fad of selling experiences rather than products. Imagine how difficult it would be to market tires. A standard product marketing approach would quickly tire of describing ridges and high-quality rubber. But giving people awesome places to drive to is another story. I think it’s genius. Here’s a photo of an early guide.
The featured image is of an unopened one-pound box of Ford Charcoal Briquets, dating back to 1935-37 and available for purchase on WorthPoint, The Internet of Stuff TM. Seller indicates that she does “not know if briquettes still burn as box has not been opened.” The briquettes were also sold by the ton. Can you imagine a ton of charcoal? One of my favorite details, which made its way into my talk, is that a “modern picnic” back in the 1930s featured “sizzling broiled meats, steaming coffee, and toasted sandwiches.” The back of the box marketed more of the “hundred uses” of this “concentrated source of fuel”: to build a cheerful fire in the home (also useful for broiling steak and popping corn), to add a distinct flavor to broiled lobster and fish on boats and yachts without smoking up the place, to make tastier meats in restaurants that keep customers coming back. I find the particularity of the language a delightful contrast to the meaninglessness of the language we fill our brains with in the tech industry, peddling jargon that we kinda think refers to something but that we’re never quite sure refers to anything except the waves of a moment’s trend. Let’s change that.
You might prefer to read the second part of this post first. It will seem more familiar, as its I is closer in voice and referent to the I in other posts on this blog. Or, you might prefer to read the first part first and see how you feel.
It became possible when I noticed him noticing the quick assuredness of Agilulf’s hands arranging pine cones in an isosceles triangle at dawn. It was one of the early moments where he identified completely with the nonexistent knight. Where they shared a feeling. Where his need to feel in Agilulf a presence more solid and concrete than the other paladins was met with and mirrored by Agilulf’s own need to count objects, arrange them in geometric problems, resolve problems of arithmetic, apply himself in any way possible to restore precision to a world faintly touched, just breathed on by light. In that hour in which one is least certain of the world’s existence. I was manically focused on Agilulf at the time, so focused that I was unable to recognize Raimbault’s sensitivity through the mask of his youth. But here, now, having come into my desire, the recollection changes. I am able to see that, had Raimbault not sought solidity, and, what’s more, not questioned this very act of seeking as he started to sense that the tiresome need to tuck himself into a ready-made belief system, a system retrofit with ritual and rules of conduct, might actually signal cowardice, he wouldn’t have felt oppression upon seeing the nonexistent knight counting trees, leaves, stones, lances, pine cones, anything in front of him. There was undeniable kinship when we first met, but it was hoplite kinship, the homoerotic, fraternal bonds Plato describes in the Symposium, the mute community born in battle. I stayed silent. Refrained from speech lest anyone, even he whom I protected, discern my womanhood. I’ve grown accustomed to the dull pain of concealing my identity. It rings in my ears like tinnitus. Reminds me that I will always be excluded because I am a woman. When I first entered the knighthood, I tried to be one with them, to participate in the fraternity that arises when they, we, together, act according to the oaths we have taken as knights. But my path is one of solitude. 
As a woman, I am unable to fulfill their aching needs on the battlefield. I watch how they relive Gilgamesh’s love for Enkidu, recover Achilles’ love for Patroclus, how they seek a mirror self to offset the traits they now know they lack and therefore desperately seek to replenish in another. I can only feel this bond, this mute community, from a place of pretense, by covering my gender, a portion of my identity, so they see what they need to see and so I can continue doing what I was born to do. What I love. This is where Raimbault, at first, was so mistaken. He reduced me to a caricature, assumed, because of the excellence of my practice, that I existed, that I was definite. He couldn’t grasp that what I sought was an entirely different way of existing, one that reached the apotheosis of form in the form only embodied by the nonexistent knight. That the vagaries of men tired me. Their slothfulness. Their corpulence. Their farts and burps. I wanted more. Would go so far as to enter a nunnery to learn the dark arts of sublimation, of esteeming the permanent above the fleeting joys of the world. It was a bold act of autonomy, a clamoring for existence so that I no longer had to endure the alienation. In retrospect, I’ve come to feel pity for Charlemagne. Like me, he has been rendered myth. Like mine, his story has been written and rewritten so many times. My entrance into the narrative space of play was set in stone epics ago, imprinted on sandstone by Virgil and twisted, like a variation on a musical theme, through Ariosto and Tasso. The Christians fight the Moors (or the Greeks the Turks, or the Romans the Celts). The damsel comes, our virgin Sophronia. She is abducted and flown on a hippogriff through farfetched twists and turns of fortune to a dragon’s den, her virginity kept sacrosanct so the knights can have their way. Her skin is white as lilies, her hair long and flowing like the Nile for the privileged one (few?) who see it free from diadems and braids. 
And in the heat of battle, just when a hero is about to meet his fate at the brutal swords of the Moors, I appear, a man. I appear with mastery and skill, vanquishing the enemy to save the hero. He then seeks me out to express gratitude to his kin. But, of course, I’ve disappeared, to the riverside to wash my wounds and calves. When he stops looking (and, concomitantly, the reader has forgotten), he finds me. And discovers, what ho!, that I’m a woman. And since I’ve already forgone the stereotypes that define my gender, so too might I forgo the innocence of idealized sexuality. At least in the freer times of Ariosto, I am naturally also the representation of homoerotic desire. The voice of the oppressed. For everyone who reads, including the knights, wants nothing more than to watch as another, a woman, caresses my milk white breasts. And that is precisely why my path is solitary. Why I had no authentic place on the battlefield. Why, tired of this narrative, I entered the convent to subvert it. This time playing a different female role, that of a nun, of Sister Theodora of the order of Saint Colomba. But even here I found myself caught in a new nexus of alienation, bearing the weight of my elected verisimilitude. For what would a nun, who has no experience apart from religious ceremonies, triduums, novenas, gardening, harvesting, vintaging, whippings, slavery, incest, fires, hangings, invasion, sacking, rape and pestilence, know of battle and knighthood? Nothing. Of battles, I now feign to say, I know nothing. I must rid myself of my past precisely as I go about reconstructing it in my tale. But perhaps it is this distancing from my past that grants me freedom to create a new future? 
Perhaps it is this distancing that has granted me the ability to create worlds within from a pen-stroke act of will, as here on the river’s bank I set a mill, and there, beyond the town I trace out a wood, and in this wood follow Agilulf as he scours it through and through, follow him to Priscilla’s palace where I live my own dream of chaste seduction as the nonexistent knight subverts all direct acts of concupiscence and sexuality. There was a rush to the power. A draw to a newfound ability, to a creativity sheltered from the pressures and pulls of others, a place of repose. But I also felt fear at how precarious and coincidental the space available to my imagination turned out to be. One morning, for example, as I was writing I was constantly distracted by the clatter of copper and earthenware as the sisters in my convent washed platters for dinner. It reeked of cabbage. The smells infiltrated me. And when I went back the next day to observe my creation I was appalled to see I’d brought the convent to the book, describing the mess hall and how out of place Agilulf seemed at the feast. The contingent determinism of my environment, of a bunch of nuns making cabbage soup, seemed crass in contrast to the dusty aura of the epics that had grafted my existence before. So I stopped. Wrote more and more quickly. Abandoned the details. Didn’t retain the discipline required to recreate the scene, to help you, reader, live it, feel it, enter it fully. Jumped from France to England, England to Africa, and Africa to Brittany with utter disregard for the Aristotelian rules of time and place. I relaxed the constraint that weighed me down, the discipline of a cohesive third-person chronicle and even went so far as to address the book I was writing in the vocative like I did when I wrote in my journal as a child. Book, I wrote, now you have reached your end. And miraculously, at this moment of abandon and decadence, I heard a horse come up a narrow track. 
I recognized the voice of Raimbault. And while I’ll never love him ardently, never find in him the elision I seek with another to finally know the world, I know from what I’ve chronicled how much he loves me, how he has loved me since he noticed me peeing in the stream after I saved his life. I’ll rush to meet him, let him guide my pen as life urges along, and mount the crupper of my horse to find my future. Because no one would have expected it. It’s not the plot. Nothing I’d create. It could not be Bradamante. Therein, perhaps, I’ll find the possibility of freedom.
What you’ve just read is an experiment. An attempt to become a better reader.
First, it meant engaging with the text actively–and returning to it a few times in a short time frame–to better register and remember it. This took effort, even emotional effort. Reading literature and philosophy passively is at once an indulgent pastime and an attempt to keep a former self intact and alive, a self who spent most of her time reading books and writing about books, whose job it was to say something about books that no one else had said before. My professional success no longer hangs on my knowledge of literature and the artfulness and ingenuity of my interpretation. I changed. Moved to technology. Strive for excellence according to the standards and conventions of a different social circle and profession. But, pretend though I may, my transition was not a complete epigenetic phase shift. Reading still matters to me. And I experience unnerving discomfort when a passage I read just a week ago, a passage that was so alive and vivid while I was reading it, has disappeared like vapor on a car window or footsteps washed away by the sea. My emotional discomfort, therefore, stemmed partially from self-criticism, frustration that total recall wasn’t a given, didn’t effortlessly arise from passive consumption. It was a recognition that I had work to do, coupled with the desire to keep on doing the same because it was easier. So I had to make it fun. Do something creative. Trick myself into making engagement effortless to silence the lacerations of the superego.
Which led me to fiction. Writing a companion piece that grappled with the question: who is the narrator of The Nonexistent Knight? I’d engage with the book by replicating it, adapting it, making it my own by assimilating it.
This is almost accurate. I actually started by composing a different blog post, one whose I was close in voice and referent to the I in most of my other posts. But I felt too exposed. Projected judgment around the triteness of my conclusions. Felt silly writing a post about trying to remember what I’d read. I needed fiction to protect myself. Needed Bradamante to exorcise my fear. Without her, I wouldn’t feel comfortable writing these words.
Fortunately this cloak of fiction added a layer of reflexivity. The Nonexistent Knight is itself a kind of reading, where, like a medieval artist painting her version of the annunciation or the pietà, Calvino engages with the familiar stories of the Carolingian epics. His writing is a reading of Ariosto, of Tasso, of scenes and memes intimately familiar to his readers. Or at least to some readers. For us, today, there can be no presumption that people know those tales. No awareness of the tradition into which the stories fall. The text doesn’t have a shadow. It lacks the trappings of identity it would have had in a time when it was a given that people knew Bradamante, knew Charlemagne, had grown up with the tales. We’re bid to ask what it means to rewrite a Renaissance romance in an age when people don’t recognize it’s a recapitulation, but are reading it for the first time. It kaleidoscopes the nonexistence of the protagonist in the text.
And there was more.
Why did I care about becoming a better reader in the first place? What did I seek? Why did my lack of recall create such a rabid sense of discomfort and shame?
The system of identity Calvino grapples with through the character of the nonexistent knight constitutes selfhood through the embodiment and application of codes of conduct and structures of belief. Knights do X in Y setting; one becomes a knight by passing through Z ritual. Take away a paradigm sanctioned by others’ recognition that you fit into their code, that you act in a way that confirms how they view themselves, and all that remains is the raw encounter with experience. The constitution of self through an amassed collection of experiences. But this self as conversation with the world stands on precariously flimsy stilts unless one can recall with fidelity, unless there is a distancing from the vagaries of momentary subjectivity. In short, it mattered that I could remember things accurately because my very identity was at stake, an identity constituted from a series of encounters and experiences. I wanted, needed, to know the book for what it was, to love it for what it was, so I myself could stand on firmer ground. It’s a lesson I can apply elsewhere, a moral attitude for engaging with the world. An attempt to know the world and its people for what it is and who they are.
One final thought.
There are a few passages in The Nonexistent Knight that I’d remember without all this effort and alienation. The introduction of Gurduloo in chapter three, for example, is hilarious. Gurduloo is the foil of Agilulf, the man who exists but doesn’t think he does as opposed to the man who doesn’t exist but thinks he does. He’s marvelous. Sees ducks and joyfully becomes duck. Sees frogs and mindlessly becomes frog. Sees the king and impudently becomes king. It’s a variation on the joker from an opera buffa, who speaks the truth no one else can tell. Like Gurduloo to his surroundings, the passages existed without needing to think they existed. They just were. There to be enjoyed without all the reflexive reflection. It feels cliché, but it’s too true not to acknowledge. It’s the bliss of the poet. The ability to be so engaged with the world that it sticks with us and shapes us.
 In retrospect, I wish I had been a more dialogical scholar of literature. I admire how my Stanford colleague Harris Feinsod, now a professor at Northwestern, wrote articles in response to and in dialogue with other active literary critics. Responding to someone wasn’t on my mind when I was a graduate student. I engaged with secondary literature, engaged with others’ ideas about the text I was working on, but felt I was arguing with an absent ideal rather than a person.
The featured image is from Pino Zac’s 1971 film adaptation of Italo Calvino’s The Nonexistent Knight. I presume (because I only skimmed the film) the man with his arms raised is the King of the Grail, who abdicates the moral weight and responsibility of killing innocent people in the name of fanaticism, which is too often guised as a perverse form of Enlightenment. It is a paragon of the B film, weaving technicolor, black and white, and animation to visually represent the different ontological levels Calvino sculpts in the book (Agilulf and Raimbault as real characters in color; Charlemagne and the other knights as animations; Sister Theodora, the author of the work, imprisoned in her penance of black and white). My partner Mihnea found its style to be unmistakably in the tradition of Federico Fellini. I saw hints of Alejandro Jodorowsky. It’s a fascinating artifact. I’m glad it exists.
On Wednesday evening, I was the female participant on a panel about artificial intelligence. The event was hosted at the National Club on Bay Street in Toronto. At Friday’s lunch, a colleague who attended to support mentioned that the venue smelled like New York, carried the grime of time in its walls after so many rain storms. Indeed, upon entering, I rewalked into Manhattan’s Downtown Association, returned to April, 2017, before the move to Toronto, peered down from the attic of my consciousness to see myself gently placing a dripping umbrella in the back of the bulbous cloak room, where no one would find it, feeling the mahogany enclose me in peaty darkness, inhaling a mild must that could only tolerate a cabernet, waiting with my acrylic green silk scarf from Hong Kong draped nonchalant around my neck, hanging just above the bottom seam of my silk tunic, dangling more than just above the top seam of my black leather boots, when a man walked up, the manager, and beaming with welcome he said “you must be the salsa instructor! Come, the class is on the third floor!” I laughed out loud. Alfred arrived. Alfred who was made for another epoch, who is Smith in our Hume-Smith friendship, fit for the ages, Alfred who had become a member of the association and, a gentleman of yore, would take breakfast there before work, Aschenbach in Venice, tidily wiping a moist remnant of scrambled eggs from the right corner lip, a gesture chiseled by Joseon porcelain and Ithaca’s firefly summer, where he took his time to ruminate about his future, having left, again, his past.
Upstairs we did the microphone dance, fumbling to hook the clip on my black jeans (one of the rare occasions where I was wearing pants). One of my father’s former colleagues gave the keynote. He walked through the long history of artificial intelligence, starting with efforts to encode formal logic and migrating through the sine curve undulations of research moving from top-down intelligent design (e.g., expert systems) to bottom-up algorithms (e.g., deep convolutional neural networks), abstraction moving ever closer to data until it fuses, meat on a bone, into inference. He proposed that intellectual property had shifted from owning the code to building the information asset. He hinted at a thesis I am working to articulate in my forthcoming book about how contemporary, machine learning-based AI refracts humanity through the convex (or concave or distorted or whatever shape it ends up being) mirror of the space of observation we create with our mechanisms for data capture (which are becoming increasingly capacious with video and Alexa in every home, as opposed to being truncated to blips in clickstream behavior or point of sale transactions), our measurement protocol, and the arabesque inversions of our algorithms. The key thing is that we no longer start with an Aristotelian formal cause when we design computational systems, which means we no longer imagine the abstract, Platonic scaffold of some act of intelligence as a pre-condition of modeling it. Instead, as Andrej Karpathy does a good job articulating, we stipulate the conditions for a system to learn bottom-up from the data (this does not mean we don’t design, it’s just that the questions we ask as we make the systems require a different kind of abstraction that is affiliated with induction (as Peter Sweeney eloquently illustrates in this post)). This has pretty massive consequences for how we think about the relationship between man and machine. We need to stop pitting machine against man. 
And we need to stop spouting obsequious platitudes that the “real power comes from the collaboration of man and machine.” There’s something of a sham humanism in those phrases that I want to get to the bottom of. The output of a machine learning algorithm always already is, and becomes even more so as the flesh of abstraction moves closer to the bone of data (or vice versa?), the digested, ruminated, stomach acid-soaked replication of human activity and behavior. It’s about how we regurgitate. That’s why it does indeed make sense to think about bias in machine learning as the laundering of human prejudice.
A woman in the audience posed the final question to the panelists: you’ve spoken about the narrow capabilities of machine learning systems, but will it be possible for artificial intelligence to learn empathy?
A fellow panelist took the Turing Test approach: why yes, he said, there has been remarkable progress in mimicking even this sacred hallmark of the limbic system. It doesn’t matter if the machine doesn’t actually feel anything. What matters is that the machine manifests the signals of having felt something, and that may well be all that matters to foster emotional intelligence. He didn’t mention Soul Machines, a New Zealand-based startup making “incredibly life-like, emotionally responsive artificial humans with personality and character,” but that’s who I’d cite as the most sophisticated example of what UX/UI design can look like when you fuse the skill set of cinematic avatars, machine learning scientists, and neuroscientists (and even the voice of Cate Blanchett).
I disagreed. I am no affect expert (just a curious generalist fumbling my way through life), but believe empathy is remarkably complex for many reasons.
I looked at her directly, deeply. And not just at her: I looked into her. And what I mean by looking into her is that I opened myself up a little, wasn’t just a person protected by the distance of the stage (or, more precisely, the 4 brown leather bar stools with backs so low they only came up to vertebra 4 or 5, and all of us leaned in and out trying to find and keep a dignified posture, hands crossed into serenity, sometimes leaning forward). Yes, when I opened myself to engage with her I leaned forward almost to the point of resting my elbows on my thighs, no longer leaning back and, every few moments, returning my attention to the outer crevices of my eyes to ensure they were soft as my fellow panelists spoke. And I said, think about this. I’m up here on stage perceiving what I’m perceiving and thinking what I’m thinking and feeling what I’m feeling, and somehow, miraculously, I can project what I think you’re perceiving, what I think you’re thinking, what I think you’re feeling, and then, on top of that, I can perhaps, maybe, possibly start to feel what you feel as a result of the act of thinking that I think what you perceive, think, and feel. But even this model is false. It’s too isolated. For we’ve connected a little, I’m really looking at you, watching your eyes gain light as I speak, watching your head nod and your hands flit a little with excitement, and as I do this we’re coming together a little, entangling ourselves to become, at least for this moment, a new conjoint person that has opened a space for us to jointly perceive, think, and feel. We’re communicating. And perhaps it’s there, in that entangled space, where the fusion of true empathy takes place, where it’s sound enough to impact each of us, real enough to enable us to notice a change in what we feel inside, a change occasioned by connection and shared experience.
An emotional Turing test would be a person’s projection that another being is feeling with her. It wouldn’t be entangled. It would be isolated. That can’t be empathy. It’s not worthy of the word.
But, how could we know that two people actually feel the same feeling? If we’re going to be serious, let’s be serious. Let’s impose a constraint and say that empathy isn’t just about feeling some feeling when you infer that another person is feeling something, most often feeling something that would cause pain. It’s literally feeling the same thing. Again, I’m just a curious generalist, but I know that psychologists have tools to observe areas of the brain that light up when some emotional experience takes place; so we could see if, during an act of empathy, the same spot lights up. Phenomenologically, however, that is, as the perceived, subjective experience of the feeling, it has to be basically impossible for us to ever feel the exact same feeling. Go back to the beginning of this blog post. When I walked into the National Club, my internal experience was that of walking into the Downtown Association more than 1.5 years earlier. I would hazard that no one else felt that, no one else’s emotional landscape for the rest of the evening was then subtly impacted by the emotions that arose during this reliving. So, no matter how close we come to feeling with someone when our emotional world is usurped, suddenly, by the experience of another, it’s still grafted upon and filtered through the lens of time, of the various prior experiences we’ve had that trigger that response and come to shape it. As I write, I am transported back to two occasions in my early twenties when I held my lovers in my arms, comforting and soothing them after each had learned about a friend’s suicide. We shared emotion. Deeply. But it was not empathy. My experience of their friends’ suicide was far removed. It was compassion, sympathy, but close enough to the bone to provide them space to cry.
So then we ask: if it’s likely impossible to feel the exact same feeling, should we relax the constraint and permit that empathy need not be deterministic and exact, but can be recognized within a broader range? We can make it a probabilistic shared experience, an overlap within a different bound. If we relax that constraint, then can we permit a Turing test?
I still don’t think so. Unless we’re ok with sociopaths.
But how about this one. Once I was running down Junipero Serra Boulevard in Palo Alto. It was a dewy morning, dewy as so many mornings are in Silicon Valley. The rhythms of the summer are so constant: one wakes up to fog, daily, fog coming thick over the mountains from the Pacific. Eventually the fog burns and if you go on a bike ride down Page Mill road past the Sand Hill exit to 280 you can watch how the world comes to life in the sun, reveals itself like Michelangelo reveals form from marble. There was a pocket of colder, denser, sweeter smelling air on the 6.5-mile run I’d take from the apartment in Menlo Park through campus and back up Junipero Serra. I would anticipate it as I ran and was always delighted by the smell that hit me; it was the smell of hose water when I was a child. And then I saw a deer lying on the side of the road. She was huge. Her left foot shook in pain. Her eyes were pleading in fear. She begged as she looked at me, begged for mercy, begged for feeling. I was overcome by empathy. I stopped and stood there, still, feeling with her for a moment before I slowly walked closer. Her foot twitched more rapidly with the wince of fear. But as I put my hand on her huge, hot, sweating belly, she settled. Her eyes relaxed. She was calmed and could allow her pain without the additional fear of further hurt. I believe we shared the same feeling at that moment. Perhaps I choose to believe that, if only because it is beautiful.
The moment of connection only lasted a few minutes, although it was so deep it felt like hours. It was ruptured by men in a truck. They honked and told me I was an idiot and would get hurt. The deer was startled enough to jump up and limp into the woods to protect herself. I told the men their assumptions were wrong and ran home.
You might say that this is textbook Turing test empathy. If I can project that I felt the exact same feeling as an animal, if I can be that deluded, then what’s stopping us from saying that the projection and perception of shared feeling is precisely what this is all about, and therefore it’s fair game to experience the same with a machine?
The sensation of love I felt with that deer left a lasting impression on me. We were together. I helped her. And she helped me by allowing me to help her. Would we keep the same traces of connection from machines? Should empathy, then, be defined by its durability? By the fact that, if we truly do connect, it changes us enough to stay put and be relived?
There are, of course, moments when empathy breaks down.
Consider breakdowns in communication at work or in intimate relationships. Just as my memory of the Downtown Association shaped, however slightly, my experience at Wednesday’s conference, so too do the accumulated interactions we have with our colleagues and partners reinforce models of what we think others think about us (and vice versa). These mental models then supervene upon the act of imagination to perceive, think, and feel like someone else. It breaks. Or, at the very least, distorts the hyperparameters of what we can perceive. Should anything be genuinely shared in such a tangled web, it would be the shared awareness of the impossibility of identification. I’ve seen this happen with teams and seen it happen with partners. Ruts and little walls that, once built, are very difficult to erode.
Another that comes to mind is the effort required to empathize deeply with people far away from where we live and what we experience. When I was in high school, Martha Nussbaum, a philosopher at the University of Chicago who has written extensively about affect, came and gave a talk about the moral failings of our imagination. This was in 2002. I recall her mentioning that we obsess far more deeply, we feel far more acutely, about a paper cut on our index finger or a blister on our right heel, than we do when we try to experience, right here and now, the pain of Rwandans during the genocide, of Syrian refugees packed damp on boats, of the countless people in North America ravaged by fentanyl. On the talk circuit for his latest book, Yuval Harari comments that we’ve done the conceptual work required to construct and experience a common identity (and perhaps some sort of communal empathy) with people we’ll never meet, who are far outside the local perception of the tribe, in constructing the nation. And that this step from observable, local community to imagined, national community was a far steeper step function than the next rung in the ladder from national to global identity (8,000,000 and 7,000,000,000 are more or less the same for the measly human imagination, whereas 8,000,000 feels a lot different than 20). Getting precise on the limits of these abstractions feels like worthwhile work for a 21st-century ethicist. After all, in its original guise, the trolley problem was not a deontological tool for us to pre-ponder and encode utilitarian values into autonomous vehicles. It was a thinking tool to illustrate the moral inevitability of presence.
I received LinkedIn invites after the talk. One man commented that he found my thoughts about empathy particularly insightful. I accepted his invitation because he took the time to listen and let me know my commentary had at least a modicum of value. I’ll never know what he felt as he sat in the audience during the panel. I barely know what I felt, as two and a half days of experience have already intervened to reshape the experience. So we grow, beings in time.
 Loyal blog readers will have undoubtedly noticed how many posts open with a similar sentence. I speak at a ton of conferences. I enjoy it: it’s the teacher’s instinct. As I write today, however, I feel alienated from the posts’ algorithmic repetition, betokening the rhythm of my existence. Weeks punctuated by the sharp staccato of Monday’s 15-minute (fat fully cut) check-ins, the apportioned two hours to rewrite the sales narrative, the public appearances that can be given the space to dilate, and the perturbations flitting from interaction to interaction, as I gradually cultivate the restraint to clip empathy and guard my inside from noxious inputs. Tuesday morning, a mentor sent me this:
 This is a loaded term. I’m using it here as a Bayesian would, but won’t take the time to unpack the nuances in this post. I interviewed Peter Wang for the In Context podcast yesterday (slated to go live next week) and we spoke about the deep transformation of the concept of “software” we’re experiencing as the abstraction layer that commands computers to perform operations moves ever closer to the data. Another In Context guest, David Duvenaud, is allergic to the irresponsible use of the word “inference” in the machine learning community (here’s his interview). Many people use inference to refer to a prediction made by a trained algorithm on new data it was not trained on: so, for example, if you make a machine learning system that classifies cats and dogs, the training stage is when you show the machine many examples of images with labels cat and dog, and the “inference” stage is when you show the machine a new picture without a label and ask it, “is this a cat or a dog?” Bayesians like Duvenaud (I think it’s accurate to refer to him that way…) reserve the term inference for the act of updating the probability of a hypothesis in light of new observations and data. Both cases imply the delicate dance of generalization and induction, but imply it in different ways. Duvenaud’s concern is that by using the word imprecisely, we lose nuance, and with it our ability to communicate meaningfully, hampering research and beauty.
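 The two senses of the word can be contrasted in a few lines of Python. This is my own minimal illustration, not Duvenaud’s formulation; the cat/dog classifier is a stub, and the coin example is invented:

```python
# Two senses of "inference," sketched side by side.

def ml_inference(model, image):
    """ML usage: run a trained model on unseen data to get a prediction."""
    return "cat" if model(image) > 0.5 else "dog"

def bayesian_inference(prior, likelihood):
    """Bayesian usage: update P(hypothesis) in light of an observation.

    prior: dict mapping hypothesis -> P(h)
    likelihood: dict mapping hypothesis -> P(observed data | h)
    """
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Is a coin fair or biased toward heads, after we observe a single head?
prior = {"fair": 0.5, "biased": 0.5}
likelihood_of_heads = {"fair": 0.5, "biased": 0.9}
posterior = bayesian_inference(prior, likelihood_of_heads)
# The probability of "biased" rises from 0.5 to 0.45 / 0.70, roughly 0.64.
```

The first function makes a point prediction and discards uncertainty; the second returns a full distribution over hypotheses, which is exactly the nuance that gets lost when the two are conflated.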
 Franco Moretti once told me that similar areas of the brain light up when people read Finnegans Wake (or was it Ulysses? or Portrait of the Artist?) and the Bible (maybe Ecclesiastes?).
The featured image is Edouard Manet’s Olympia, unveiled in Paris in 1865. In the context of this post, it illustrates the utter impossibility of our empathizing with Olympia. The scorn and contempt in her eyes protect her and give her power. She thwarts any attempt at possession through observation and desire, perhaps because she is so distanced from the maid offering her flowers, deflecting her gaze out towards the observer but looking askance, protecting within her the intimations of what she has just experienced, of the fact that there was a real lover but it was not and will never be you. Manet cites Titian’s Venus of Urbino (1534), but blocks all avenues for empathy and connection, empowering Olympia through her distance.
I don’t know if Andrei Fajardo knows that I will always remember and cherish our walk up and down University Avenue in Toronto a few months ago. Andrei was faced with a hard choice about his career. He was fortunate: both options were and would be wonderful. He teetered for a few weeks within the suspension of multiple possible worlds, channeling his imagination to feel what it would feel like to make choice one, to feel what it would feel like to live the life opened by choice two. He sought advice from others. He experimented with different kinds of decision-making frameworks to see how the frame of evaluation shaped and brought forth his values, curtailed or unfurled his imagination. He listened for omens and watched rain clouds. He broke down the factors of his decision analytically to rank and order and plunder. He spoke to friends. He silenced the distractions of family. He twisted and turned inside the gravity that only shines forth when it really matters, when the frame of identity we’ve cushioned ourselves within for the last little while starts to fray under the invitation of new possibilities. The world had presented him with its supreme and daunting gift: the poignancy of growth.
I’m grateful that Andrei asked me to be one partner to help him think about his decision. Our conversations transported me back, softly, to the thoughts and feelings and endless walks and desperate desire for the certainty I experienced in 2011 as I waded through months to decide to leave academia and pursue a career in the private sector. I wanted Andrei to understand that the most important lesson that experience taught me was about a “peculiar congenital blindness” we face when we make a hard choice:
To be human is to suffer from a peculiar congenital blindness: On the precipice of any great change, we can see with terrifying clarity the familiar firm footing we stand to lose, but we fill the abyss of the unfamiliar before us with dread at the potential loss rather than jubilation over the potential gain of gladnesses and gratifications we fail to envision because we haven’t yet experienced them.
When faced with the most transformative experiences, we are ill-equipped to even begin to imagine the nature and magnitude of the transformation — but we must again and again challenge ourselves to transcend this elemental failure of the imagination if we are to reap the rewards of any transformative experience. (Maria Popova in her marvelous Brain Pickings newsletter about L.A. Paul’s Transformative Experience)
I shared examples of my own failure of imagination to help Andrei understand the nature of his choice. For hard choices about our future aren’t rational. They don’t fit neatly into the analytical framework of lists and cross-examination and double-entry bookkeeping. It’s the peculiar poignancy of our existence as beings unfurling in time that makes it impossible for us to know who we will be and what knowledge the world will provide us through the slot canyon aperture of our particular experience, bounded by bodies and time.
As Andrei toiled through his decision, he kept returning to a phrase he’d heard from Daphne Koller in her fireside chat with Yoshua Bengio at the 2018 ICLR conference in Vancouver. As he recounted in this blog post, Daphne shared a powerful message: “Building a meaningful career as a scientist isn’t only about technical gymnastics; it’s about each person’s search to find and realize the irreplaceable impact they can have in our world.”
But, tragically or beautifully, depending on how you view it, there are many steps in our journey to realize what we believe to be our irreplaceable impact. Our understanding of what this could or should be can and should change as our slot canyon understanding of the world erodes just a little more under the weight of wind and rain to bring forth light from the sun. In my own experience, I never, ever imagined that just two years after believing I had made a binary decision to leave academia for industry, I would be invited to teach as an adjunct law professor, that three years later I would give guest lectures at business schools around the world, that five years later I would give guest lectures in ethics and philosophy. The past self had a curious capacity to remain intact, to come with me as a barnacle through the various transformations. For the world was bigger and vaster than the limitations my curtailed imagination had imposed on it.
Andrei decided to stay with our company. He is a marvelous colleague and mentor. He is a teacher: no matter where he goes and what he does, his irreplaceable impact will be to broaden the minds of others, to break down statistics and mathematics into crystal clear building blocks that any and all can understand. He’ll come to appreciate that he is a master communicator. I’m quite certain I’ll be there to watch.
What was most beautiful about our walk was the trust I had in Andrei and that he had in me. His awareness that I wanted what was best for him, that none of my comments were designed to manipulate him into staying if that weren’t what he, after his toil, after his reflection, decided was the path he wanted to explore. It was simply an opportunity to share stories and experiences, to provide him with a few different thinking tools he could try on for size as he made his decision. We punctuated our analysis with thoughts about the present. We deepened our connection. I gave him a copy of Rilke’s Letters to a Young Poet to help him come to know more of himself and the world. Throughout our walk, his energy was crystalline. He listened with attention rapt into the weight of it mattering. The attention that emerges when we are searching sponges sopping as much as we can from those we’ve come to trust. The air was chilled just enough to prickle goosebumps, but not so much as to need a sweater. The grass was green and the flowers had started to bud.
Yesterday was the first snow. There are still flowers; soon they’ll die. The leaves over Rosedale are yellow and red, made vibrant by the moisture. Andrei is with his dogs and his wife. I’ll see him tomorrow morning at work.
I found the featured image last weekend at the Thompson Landry Gallery in Toronto’s Distillery District. It’s a painting called “Choisir et Renoncer,” by Yoakim Bélanger. I see in it the migration of fragility, hands cradled open into reverent acceptance. I see in it the stone song of vulnerability: for it is the white figure–she who dared wade ankle-deep in Hades to hear Eurydice’s voice one more time–whose face glows brightest, who reveals the wrinkles of her character, who shines as a reflection of ourselves, unafraid to reveal her seashell cracks and the wisdom she acquired with the crabs. She etches herself into precision. She chisels brightly through the human haze of potential, buoyed upon the bronze haze of the self she once was, but yesterday.
My mom has done business in over 180 countries, her passport tattooed with stamps and fat with extra pages. Her vagrant soul never seems restless for stability; her vibrant energy never seems to dwindle into entropy. She seems at home anywhere, yet nowhere. She has instilled in me the tendency to notice commonalities before differences, teaching through her example how to speak and touch and look so that others may let down the walls of propriety and open the levees of expression and feeling. We see what peoples share, see what’s common across humanity more clearly than cultural differences. She’s written emails from English manors, anxious to share what it felt like to hear the Goldberg Variations echo in a dewy church. Called us from hospitals in Singapore, worrying us that the meningitis would reappear. Sent photos from dry tents in Arabian deserts, hookah smoke billowing her digital folds.
Since childhood, Mom brought me on her business trips. She took my brother too, but not as frequently as me. He loved it, but didn’t live for it as I do. We went to Sydney and the Gold Coast and Moscow and Johannesburg and Mombasa and Madrid and London and Paris. Most of the time, she’d work all day as I occupied myself either alone or with a tour guide. Mom insisted I have a guide in places that were more risky and harder to navigate. In Moscow, Tatiana, who, miraculously, had taught Pushkin at the University of Chicago (where I went to college), held my hand as we left behind the blaring March sun in the Red Square and walked down into the utter darkness of Lenin’s tomb, pulled me to the left so I wouldn’t attract attention from the guard as I fumbled my feet in the darkness, and pressed the small of my back to keep me walking, moving, steady, around the light radiating off this small little man who looms so large in the Russian imagination alongside the staccato lyricism of Prokofiev and Ivan Tsarevich and the Wolf.
In Johannesburg, I told my guide Mandla I was more interested in seeing how people live than visiting tourist attractions, so we walked through the streets of Soweto and picked up his daughter at daycare, and, upon seeing me, the woman who ran the daycare center threw down the clothes she was hanging on the line and screamed at me in Zulu, screamed, pointed, accused, and I had no clue what was going on until I learned that she mistook me for the girl’s colored mother, and scolded me for having abandoned the girl she thought I had abandoned because her skin was a shade darker than my own.
At night, I would accompany Mom at business dinners. Her colleagues metabolized the initial strangeness of having a 14-, 15-, 16-, 17-year-old girl around relatively quickly. For I’d grown up in the company of technology executives and, given my proclivities for imitation, had learned how to behave. I’d absorbed the topics and mannerisms by osmosis and they sensed they needn’t adapt the topic of conversation to placate my interests; that I would listen, reason, and pose questions that, on a few occasions, enabled them to see problems they were working on in a new and different way.
It is only with the hindsight of moderate maturity that I appreciate how valuable these experiences were for my future career. I have never questioned my validity as a woman in business, for I had my Mom as a strong role model and example from the day I was born. She showed me what was possible. Showed that one could wake up at 3:45 am to catch the 6:00 flight to Chicago and nonetheless look stunning in a suit and stilettos, graceful in her power and resilience. Showed, on the flip side, that the second day of work, the family work, could start at 6:00 pm with laundry spinning and chicken dipped in egg yolk and flour, and the anticipation of saffron and bittersweet double-boiled chocolate for mousse at the weekend party. Showed that femininity and feminism need not be incompatible, that a woman could drink Japanese executives under the table and feel close to death when the 6:30 alarm went off and nonetheless have the wherewithal to get the deal done. And showed me that it’s ok to need silence, that we all need rest, that the energy required to sustain the ideal must fray, eventually, into daylong movie sessions on the couch so the synapses could recover. It’s because of her that I sit tall and grounded in the presence of C-Suite executives.
People meet her and say they understand where my dynamism and charisma come from. Meet her and are transfixed by her energy and presence. Meet her and are touched by the love she bleeds for her family.
It is only with the hindsight of moderate maturity that I was able to grow into loving myself enough to love her with ease. I’m happy about that, as I want to care for her, focus on her, give her more than I give to myself.
Yesterday was a remarkable day in the life of a mother and her daughter. The tables turned. This time, Mom accompanied me on a business trip.
I gave the opening keynote and was interviewed in a fireside chat at the INSEAD AI Forum in Paris. Asked to demystify AI, I spent 40% of the time explaining how machine learning systems differ from rules-based, deterministic systems (which boils down to reminding people what functions are and showing them how much more powerful it is to map Xs to Ys in 50,000 dimensions than 2 dimensions) and why this is cool, and 60% of the time walking step-by-step through the decisions interdisciplinary teams have to make when they build a machine learning system that solves a particular problem in a particular context (in this case, the revenue optimization application Kanetix is using on the integrate.ai platform). The most important thing to demystify right now isn’t what machine learning is or how it works, but what happens when people in businesses with processes honed over years to manage deterministic technology try to implement it. To expose the friction all enterprises face when they grapple with the probabilistic outputs of mathematical functions that look like intelligent systems but are really narrow optimization tools (this doesn’t diminish the remarkable questions machine learning is forcing us to pose about our thinking, language, and being). I focus on these topics because I want to empower people. I want to change the incessant dialogue about the “scarcity of ML talent” and create a place for more heroes than the computer science PhDs. Because, and forgive the cliché, it actually does take a village.
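The point about functions can be made concrete with a minimal sketch contrasting a hand-written two-variable mapping with a learned high-dimensional one. The premium rule, its coefficients, and the stand-in weights below are all invented for illustration, not from the Kanetix application:

```python
import random

# In 2 dimensions, a human can write the mapping from Xs to Y by hand.
def premium_rule(age, prior_claims):
    """A deterministic, rules-based function: every coefficient was chosen by a person."""
    return 500 + 8 * age + 120 * prior_claims

# In 50,000 dimensions, no one writes the function down. A learned weight
# vector (random stand-ins here for weights fitted from data) maps a long
# feature vector to an output; the power comes from the dimensionality.
DIM = 50_000
weights = [random.gauss(0, 0.01) for _ in range(DIM)]

def learned_function(features):
    """A machine-learned function: the same 'map Xs to Y' idea, at scale."""
    return sum(w * x for w, x in zip(weights, features))
```

Both are just functions; the difference the talk turns on is who (or what) chooses the coefficients, and how many of them there are.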
During the fireside chat, Subi Rangan and I spoke about larger societal questions around AI. We discussed the interdependence between privacy and economic power (and I shared my thinking about why privacy should shift from the rights of the data subject to the obligations of the data processor to better address the privacy risks of machine learning systems), how MBAs need to get used to the persistent anxiety of switching roles and contexts as algorithms automate specific, narrow tasks, and why the simple act of participating in an on-the-ground proof of concept is the surest way to leave a mark on how technology will shape our future.
After our performance, I told Subi he has a gift: his demeanor evinces a grace that provides a safe space for an interviewee–or student–to think, to speak as clearly as possible, and allow her mind to creatively unfurl. He was not antagonistic. He didn’t threaten. He didn’t seek spectacle from jabs or irony. He sought to present a structural hierarchy of concepts that could unite the particular and the general, enable the concreteness of embodied experience to ladder up to the big questions policy makers and executives are grappling with today. It was helpful. It was a framing, but one that invited rather than constrained. He was touched by my comments and, off to see his own daughter for lunch, said he would share his pride at having done something meaningful.
It was energizing to have my Mom in the audience. I didn’t seek her approval. I didn’t seek her pride. I just wanted to give back. To mention to the whole audience how happy I was that she was there, to show her how much I love her, to allow her heart to smile in seeing that she had done well and that I had turned out ok. And that we will have many more business trips to share, but that we must savour the delicacy and uniqueness of each one as our allotted prism in love and in life.
 I’ve felt ashamed of the fact that I don’t seem to perceive differences like others do. I think it ultimately stems from a strong identification with assimilation. From what I’ve observed, most people have a more solidified identity than I do. They self-identify as a man, as an American, as a taxi driver, as a piano player, as an X, and therefore have a measuring stick against which they notice that Y thing around them is different from their normal habits of perception. They self-identify as visitors, as tourists. When I come to a new place, I self-identify in becoming the other as soon as humanly possible. I want to mimic their language, mimic their gestures, eat how they eat, change how I hold my fork, eat with my hands, change how I walk, mimic how they acquiesce or disagree. I suppose I do have the internal mental model of practiced habits, but I prefer to absorb the differences as opposed to recognizing them as other from myself. I am quite like that in many aspects of my life: writing in the style of what I’ve just read, aligning what I say to the context of a conversation, adapting the introductory description of what my company is and does to the demands of a situation, to fit the model I presume is most meaningful to my interlocutor. For that reason, it’s difficult for me to concatenate the many particulars into a static meme that can scale to rout repetition.
 Mihnea Moldoveanu and Martin Reeves’s cogent article about this is well worth the read. Aspects of my thinking on this topic appear in this post.
The featured image is of the Place des Vosges, tucked away in the Marais in Paris. It is one of my favorite places in the world. I remember the first time I visited it in 2002, a spry yet hypersensitive 18-year-old who had just spent 3 months living abroad in Burgos, Spain and was on vacation in Paris with four female friends. We bunked together in a boutique hotel with sea-green walls. I remember the weight of my friends’ hair, how their nipples looked so different from my own, remember how it felt to inhabit my thin frame. I remember when Nicolas showed up at the restaurant, at the end of the meal, just as we were preparing to leave, past the disappointment, past the acceptance, after the hope wafts were snuffed under the tannins on the back of my tongue. He arrived. My heart accepted the recognition of the desire it had pretended to put aside–out of self protection–like a napkin stained with tomato sauce. We just barely moved on the dance floor, siphoned inside blaring Haitian music and off from the world around us as if we’d reduced our dimensions to the sacred simplicity of a Rublev icon. How fascinating that dimensionality reduction betokens the sacred and the sublime; while in information representation, we covet higher dimensions as the promised land for indexing knowledge. The next day, he took me to the Louvre. I could smell his body odor through his brown suede jacket as he showed me where to guide my eye along white marble arms and legs. I didn’t mind because it was him.
Years passed. Friday was the first time my mom saw the Place des Vosges. We ate goat cheese and steak tartare and crème brûlée. She had a cold. She listened without judgment.
Man is something that shall be overcome. Man is a rope, tied between beast and overman–a rope over an abyss. What is great in man is that he is a bridge and not an end. (Zarathustra, thus imaginarily reported by Friedrich Nietzsche)
The AlphaGo documentary is about Man qua Man, or, more precisely, about one man by the name of Lee Sedol, who has a soft, high-pitched voice, a wife, and a daughter. In March of 2016, Sedol went from being well known to Go fans to being well known to everyone after losing 4 out of 5 games to AlphaGo, a computer built by machine learning engineers at DeepMind.
Here is what the film beckoned me to see, feel, and infer.
1. Fear eats Man’s mind
And thus the native hue of resolution/ Is sicklied o’er with the pale cast of thought (Hamlet, Act III, Scene I) 
Sedol is a champion. He has cultivated excellence, put in his 10,000 hours of practice. Played game after game after game to get where he is today, working humbly and patiently with his coach. Playing Go the way he plays is an act of respect towards his elders, his nation, his family.
For Sedol, therefore, the match against AlphaGo was much more than a match. It was the appointed time to exhibit elegance, grace, and creativity above and beyond standard play. The moment when he left the hallowed halls of practice to squint into the harsh lights of the stage. When they applauded. When he bowed, and, lifting his head as slowly as possible to protract time into the infinite dilation of Cantor’s continuity, pupils dilating into eye drop blurs, seconds half-lived to infinitesimals, further, until he couldn’t stop it anymore, until, as he raised his head back up, he noticed his sense of self had changed, he observed himself being observed, knowing everyone was watching, rid himself of the caterpillar cloak called Lee Sedol to stretch his powdery wings as Man. He had become an allegory of human intelligence pitted against the machine.
No biggie. You got this. Just a little blip in history. Just a game. Underwear. Chickens in underwear with scraggly little legs hobbling under the weight of tubby guts bloated with donuts and Budweiser. Just like yesterday when no one was watching.
What a horrible place to be.
And yet, we honor it. We honor the resilience of the golfer who keeps his cool after a dud shot hooks way too far left. We honor the focus of the concert violinist who can make her way through the Mephistophelian haze of a Paganini caprice. We honor the ease excellent TED Talk speakers find when they share an idea they believe in. We honor it because we know how hard it is. Because we recognize that the difference between good and excellent is the fortitude of practice and the gumption to keep the mind in check, to settle its sabotage, to focus.
We are all Hamlet. Some of us more than others.
Sedol is also Hamlet. The documentary does a marvelous job eliciting our empathy as we watch him doubt, furrow, fear, apologize, strategize, wrestle with the pastiche reflection of what he could have done, who he could have been, how the narrative could have gone if only he had done this move instead of that move. We never hear the voices in his head but we can infer their clamor: “calm down, stay here, focus.” Sedol plays the game in context. He knows the stakes of the match and has no choice but to devote a portion of his brain to the everything else that is not the local task. It’s plausible that only 30% of his brain power could be devoted to the actual game.
AlphaGo has no voices in its head. It has no runaway probabilities. The only probabilities it calculates span the trees it searches to find the next move and win.
2. Man is a social animal who relies on nonverbal communication
Man is by nature a social animal; an individual who is unsocial naturally and not accidentally is either beneath our notice or more than human. Society is something that precedes the individual. Anyone who either cannot lead the common life or is so self-sufficient as not to need to, and therefore does not partake of society, is either a beast or a god. (Aristotle, Politics) 
AlphaGo has no hands. It has no face. Unless DeepMind decides to embody future versions in a robot somersaulting down the uncanny valley, it will never feel the silky lamination of a Go stone, never calm its nerves by methodically circling the stone between the pads of its right thumb and index finger as it contemplates its next move.
Like the infamous Godot in Samuel Beckett’s play, in the documentary, AlphaGo feels more like a prop than a character. It’s undoubtedly there, ubiquitous, but somehow also absent. Sedol engages with AlphaGo through a ventriloquist named Aja Huang, a Taiwanese computer scientist on the DeepMind team who is also an amateur 6-dan Go player. Sedol never engages with AlphaGo directly: only with its diplomat, its emissary.
Huang is no throw-away character. The ventriloquist could have been anyone: his task was to look at the digital display indicating AlphaGo’s move and translate this to the physical board by placing the stone in the right place. He could have carried out this task with zero knowledge of what it meant. Brawn without brains. Pure, robotic execution.
The positions of the stones mean something to Huang. He bridges two ways of seeing the game, like a computer scientist charting probabilities and like a Go player strategizing moves.
And this means that his face could have relayed emotional content back to Sedol, allowing the champion to plunder the emotional cues that are such an integral part of the game. In the first match, Sedol felt alienated because when he looked up at Huang to gather information from his temples, eyebrows, forehead, pupils, cheeks, lips, chin, elbows, freckles, arm hairs, face hairs, eyes, sweat beads, breath, aura, the signals were absent. Huang didn’t exhibit the weight of concentration or even the active restraint of a bluff. It was almost worse that he wasn’t just a robot man because he had enough knowledge to lead Sedol to anticipate emotional cues but fell short because his ego wasn’t engaged. He was, in the end, only an observer. The stage shifted to a theater of deliberate alienation, as in the movie The Lobster.
This inverted uncanny valley tells us something about how we communicate. It’s cliché to underscore the importance of nonverbal communication, but it was quite powerful to see how much Sedol typically relies upon emotional cues as animal, as mammal, when he plays against a normal opponent, and how the absence of those cues threw him off. I suspect some of the reticence we feel around trust and explainability stems from our brains processing the world as animals. We don’t actually require explanations from people to trust them and obey them. Power and persuasion seep through different seams.
3. Computer scientists and subject matter experts see the same thing different ways
While Huang speaks neural network and speaks Go, most of the DeepMind scientists lack the same bilingual subject matter expertise (I may be incorrect, but I’m pretty sure not everyone who worked on AlphaGo knows the game). Indeed, one fascinating aspect of contemporary machine learning is that the systems can learn what aspects of the data are relevant for a prediction or classification task rather than having a person apply their knowledge to hand-pick which aspects will be most relevant. This is not universally the case, and it’s not to denigrate the value of subject matter expertise: on the contrary, there is excellent research afoot to make it easier for people with subject matter expertise in some domain–be that cancer diagnostics or fashion taste or 50 years of experience tweaking knobs to offset the quirks of an office building in lower Manhattan–to represent their knowledge as distributions and parameters without needing to be a scientist to do so. But a characteristic of the deep learning moment is that a crafty scientist can consider a problem abstractly, move away from the particular details we observe as the problem’s phenotype (e.g., a move in the game of Go) and focus on the mathematical underpinnings of the problem (e.g., the number of hidden layers or some other architectural choices to make in a neural network). Add to this that what makes a machine learning problem a machine learning problem is that there is too much variance for us to deterministically write out all the rules: instead, we provide primers that enable the system to iterate fast (that’s where we need all the computational power) to map inputs to outputs until the mapping works well most of the time. It’s like selecting the yeast that will yield the best bread.
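The primer-plus-iteration idea can be sketched with a toy example. This is my own minimal illustration, nothing like the AlphaGo architecture: instead of writing the rule y = 3x + 1, we declare a family of candidate mappings and let iteration find the parameters.

```python
# A toy "machine learning problem": we never write the rule y = 3x + 1.
# We provide a primer (the parametric family y = w*x + b) and iterate
# until the input->output mapping works well on the observed examples.
data = [(x, 3 * x + 1) for x in range(-5, 6)]  # observed input/output pairs

w, b = 0.0, 0.0   # initial parameters of the primer
lr = 0.01         # learning rate

for _ in range(2000):          # fast iteration: this is where compute goes
    for x, y in data:
        error = (w * x + b) - y
        w -= lr * error * x    # nudge the parameters downhill on the error
        b -= lr * error

# The learned mapping now approximates the rule we never coded: w near 3, b near 1.
```

The yeast metaphor lands here: the human contribution is the choice of family (linear, in this toy), the learning rate, and the loss; the machine does the kneading.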
Programs for playing games often fill the role in artificial intelligence research that the fruit fly Drosophila plays in genetics. Drosophilae are convenient for genetics because they breed fast and are cheap to keep, and games are convenient for artificial intelligence because it is easy to compare a computer’s performance on games with that of a person. (John McCarthy and Ed Feigenbaum, Tribute to Arthur Samuel)
The AlphaGo documentary did a wonderful job juxtaposing how computer scientists tracked the game’s progress and how Sedol and the Go commentators tracked the game’s progress. The scientists viewed the game mathematically, as a series of abstract scores and probabilities. The players viewed the game phenotypically, as a series of moves on the board. It was two fundamentally different ways of viewing the same problem, illustrating the silos of communication that quickly emerge in any enterprise, like tectonic plates shooting up mountain sprouts. The endgame for the opponents was also quite different. The AlphaGo team was fundamentally interested in using Go as a testing ground for computational possibility, the particular use case required to explore the larger problem of building a system that can act intelligently. Sedol was fundamentally interested in playing perfect Go, and potentially abstracting lessons from play to other aspects of his life. These conflicting endgames are often at work in the dialectic of innovation, yin and yang dancing drunk through the discrete step changes of technological progress.
I do wonder if we could rewrite the narrative of Man versus Machine as one of two different ways of creating, encapsulating, and sharing knowledge. The documentary made this about Demis Hassabis and the AlphaGo Team versus Sedol, West versus East, traditional culture versus computer science, two ways of representing knowledge and viewing the world. It’s ultimately a more grounded narrative. In our HBR Ideacast episode, I suggested to host Sarah Green Carmichael that it’s helpful to reframe a supervised learning system as “one human judgment versus the statistical average of thousands of human judgments,” and then ask which one you’d rather rely on. Granted, the new AlphaGo Zero system is one of self play, not one that mines past human judgment. But the yeast primer is still coded and crafted by human minds with a particular way of framing problems as engineered mathematical models.
4. Algorithms change how Man makes sense of the world
In a 2011 TED Talk, Kevin Slavin explained how trading algorithms have reshaped the physical landscape (we build structures to transmit the fastest signal possible so our algos can outcompete one another by fractions of a second). In a 2018 phone conversation, my partner John Frankel at ffVC helped me crystallize my understanding that task-specific machine learning algorithms are poised to reshape, if not already actively reshaping, our cognitive landscape.
Much of the language used to describe AlphaGo betokens alienness and strangeness, facets of thought that are not only not human but antihuman. From a 2017 Atlantic article:
Since May, experts have been painstakingly analyzing the 55 machine-versus-machine games. And their descriptions of AlphaGo’s moves often seem to keep circling back to the same several words: Amazing. Strange. Alien.
“They’re how I imagine games from far in the future,” Shi Yue, a top Go player from China, has told the press. A Go enthusiast named Jonathan Hop who’s been reviewing the games on YouTube calls the AlphaGo-versus-AlphaGo face-offs “Go from an alternate dimension.” From all accounts, one gets the sense that an alien civilization has dropped a cryptic guidebook in our midst: a manual that’s brilliant—or at least, the parts of it we can understand.
AlphaGo makes a few moves in the match versus Sedol that flummox him. As a non-Go player, I couldn’t make sense of the moves myself, but relied upon the commentary and interpretation offered by the film. What I took away was the sense that AlphaGo did not rely upon the same leading-indicator heuristics that are the common tropes of seasoned Go players. If we think about it, it shouldn’t come as a surprise that a search space of 10^172 positions (according to the 11th-century Chinese scholar Shen Kuo) contains brilliance that has to date evaded master Go players. But what’s even more interesting is the cultural significance of knowledge transfer from generation to generation. If Go is a staging ground for life, then mastery can and should be measured by analogical transferability and applicability. It’s like a pedagogical philosophy that values critical thinking: teach them Shakespeare, teach them whatever nouns you want, but focus on enabling them to transfer the verbs so they can shape-shift to solve problems as they arise.
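Shen Kuo’s figure falls out of simple counting. A quick sketch (assuming the standard 19×19 board, and ignoring the rules that make some arrangements illegal):

```python
import math

# Shen Kuo's estimate comes from counting board configurations:
# each of the 19x19 = 361 intersections can be empty, black, or white,
# giving 3^361 possible arrangements.
intersections = 19 * 19
configurations = 3 ** intersections

# 3^361 is a 173-digit number, i.e. on the order of 10^172.
print(math.floor(math.log10(configurations)) + 1)  # 173
```

Later exact counts put the number of legal positions somewhat lower (roughly 2×10^170), but the point stands: the space dwarfs what any lineage of human players could ever explore.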
What comes off as alien is a system that is optimized for one task, regardless of analogy and transfer. And one need win Go by only one point, not many. The move with the highest likelihood of winning by the narrowest of margins will look different from the move that promises a lower likelihood of success but a larger cushion. Map this to making big choices in life: most people study the safe subject to keep options open rather than following the risky path of studying what they love. The tradeoffs are different. They’re optimizing for different types of outcomes and using a different calculus.
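To make the tradeoff concrete, here’s a toy sketch (move names and numbers invented for illustration) of the two objectives pulling in different directions:

```python
# Toy sketch: two hypothetical candidate moves, each with an (invented)
# win probability and expected winning margin in points.
candidates = {
    "safe_half_point": {"p_win": 0.92, "margin": 0.5},
    "big_territory":   {"p_win": 0.70, "margin": 15.0},
}

# A cushion-seeking heuristic prefers the comfortable margin...
by_margin = max(candidates, key=lambda m: candidates[m]["margin"])

# ...while a system optimized purely for winning takes the near-certain
# half point, since a win by 0.5 counts exactly as much as a win by 15.
by_p_win = max(candidates, key=lambda m: candidates[m]["p_win"])

print(by_margin, by_p_win)  # big_territory safe_half_point
```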
So, following John Frankel, I’d like to propose that our heuristics will change as our minds increasingly engage with tools that optimize ruthlessly against one task. Machine Go is different from Man Go because it’s not designed as a pedagogical tool to teach life lessons. It’s designed to win, designed to exploit the logic of one search space, one game, and one set of rules. But that need not be all bad. There’s something lovely in coming to terms with the fact that success only requires one point, that we need not rely upon the greedy heuristics that feel familiar as we navigate the world. What we deem alien is a means of coming to terms with our own predilection to generalize, when generalization may not always (or often) be the best bellwether of success. It’s the inverse interpretation of Bostrom’s paper-clip optimization monster: an invitation for us to ponder our values and ethical stance as we increasingly interact with algorithms geared to optimize without questioning whether that’s ultimately what we want and need.
The Pythagorean Cup is at once practical joke, physics lesson, and moral chastiser. If you are greedy and put too much wine into the cup, a siphon effect kicks in and all the liquid drains out. This kind of analogical triple meaning is the opposite of algorithmic thinking in its current form.
The AlphaGo documentary left me feeling empathy and admiration for Lee Sedol. Not as a Go champion, not as an allegory of Man’s Intelligence, but as a man. His humility was beautiful. His striving was admirable. His kindness towards his daughter was noticeable. His sense of duty to Korea was evident. He was many features cobbled into a being, with feelings and a heartbeat, and a mind. He learned something from the matches and lost gracefully, shaking Hassabis’ hand as he left the press conference, cameras flashing in his wake.
 Gender warriors, please do forgive me. There’s an implicit critique about who controls the AI narrative in keeping the reference to Man, and I find the capitalization lends a curious aura of allegory to this post, which is riddled with references to male heroes.
 Rainer Werner Fassbinder (whose name I always mistake for Rainer Maria until I remember that’s the other Rainer, the Rilke Rainer) has a marvelous film entitled Angst Essen Seele Auf, translated as Ali: Fear Eats the Soul (which should be translated as Ali: Fear Eat the Soul to better capture the grammatical error Ali, one of the film’s protagonists, makes when he speaks broken German without conjugating verbs) about an “almost accidental romance kindled between a German woman in her mid-sixties and a Moroccan migrant worker around twenty-five years younger.” While released in 1973, the lessons are all the more relevant today. The other eating metaphor on my mind is Andreessen’s software eating the world, and now Steven Cohen and Matthew Granade saying that models will run the world. What Cohen and Granade get right in their article is that AI systems are about much more than just Jupyter notebooks with models. You have to put models into production, use them as hypotheses to build closed-loop systems that get better as they engage with the world. So, so, so, so, so, so, so many companies still seem to miss this part. It’s hard, and requires work that isn’t deemed sexy by the cognoscenti and the rockstars (how awesome is the word cognoscenti?).
 I’ve been thinking a lot about the social value of work and the workplace, and have the early, what-my-mind-does-walking-and-running-level intuitions of a blog post about why work is an opportunity to experience positive, Aristotelian freedom (where self-actualization occurs through participation in a common, social goal) versus negative freedom (how we normally conceptualize freedom, as the absence of constraint for the individual), and what that means for teams and meaning and also for the intrinsic value of work (for the leisure promised by some UBI pundits rubs me the wrong way; not all UBI pundits believe self-actualization is an individual project, and the most sober ones think it’s a bump needed to become more socially connected (including Charles Murray, which is interesting…)). Stay tuned.
 John has an uncanny ability to understand and represent the heart of the matter in emerging technologies. It’s a privilege to learn from him. I’ve mentioned this before on the blog, but John also has the world’s best out-of-office emails, which have inspired my own (mine are far less sardonic and far more earnest, not by choice but by the ineluctable traps of my style).
The featured image is from an article the newspaper Korea Portal posted March 15, 2016. In the article, Sedol says: “I wanted to end the tournament with good results, but feel sad that I couldn’t do it. As I said before, this is not a loss for man, but a loss for me. This tournament really showed what my shortcomings are.” As in the documentary, Sedol interprets his loss as a personal failure. He doesn’t view himself as the representative of mankind. This isn’t man versus machine. It is one match. One man versus his opponent. But because the opponent doesn’t feel like Sedol does, doesn’t care if it wins, it becomes one man versus himself.