I presented the following post as a talk April 19, 2019 at the MIT Tech Conference. It’s the first time I’ve written a talk as a post (I never write out my talks; I compose jazz talks, where a sequence of metaphorical images indexes ideas to riff on like minor seven chords). The talk had the same images and ideas as this post, but expressed the ideas in different words. I wanted to post this before the talk to make a meta point about the future being marvelously different from our models and predictions. But the writing process out-meta’d my intentions, and I didn’t have enough time to finish both the talk and the post. The post needed more time, more care, more space to gestate and come to life. Like a baby…
Something strange happened the morning of October 29, 2018. I was lying in savasana, the final corpse pose of yoga class. It must have been 7:26 am (an hour-long class that began at 6:30 am). As is sometimes my wont, I placed my right hand on my belly and my left hand on my chest to absorb my breathing in my hands. And as I inhaled, expanding my belly to its fullest, I suddenly had a vivid sensation that I was pregnant. So vivid it felt like my surroundings blurred as I stepped through a time warp into my future self, when and where I lie on my back on the slightly smelly wooden floor of the Iam Yoga studios at the intersection of Yonge and Isabella in Toronto, having walked up the dark stairs for the 6:30 am class, smiling at the toppled leaves of the ivy plants on the way up, still there, living their plant life in plant time and changing slower than we humans change, just pregnant this time. It was crystal clear to me that this was Mihnea’s baby. I’d never felt this before. I’d been in the pose with my hands on my belly and chest hundreds of times, and only ever felt air rising to relax the mind. The force of the vision was intense, jolting, real.
Mihnea and I weren’t officially dating yet. I texted him immediately to ask if he had time for a call. A reasonable modicum of fear that he’d deem me crazy laced my voice as I explained what happened. He was pure joy. “Her name is Clara.”
We spent more time together. We spent as many waking and sleeping moments as possible together. We loved each other as much as two humans are able. More than that. We love one another as much as the maximum quantity of love any two humans can share (call this LMax) multiplied by all the love connections possible between all the world’s humans, which, right now, is 7 billion x 6 billion, or 42 billion billion x LMax, but that’s just now, so if we go all the way back in history to include all the possible love connections between all the humans on the planet at any moment in time, it would have to be at least 100 billion billion x LMax. Sometime soon, we hope to conceive Clara.
And when we do, if we were to tell Clara’s untellable life story as viewed through the tiny prism of how her life will be shaped by technology, we could, in one possible world, tell a story that begins once upon a time in a fluorescent-lit hospital room in downtown Toronto, at the witching hour of 4:56 am, when, after 37 hours of labor, Miss Clara, just like her Mom, decides she never wants to leave the placenta-world she knows, and therefore never breaks her water, leaving Dad to scream with delight when the doctors pierce the sac, splaying amniotic fluid like a firehose as Clara slides into the world like a kid on a waterslide, her skin unblemished and her eyebrows thick and black from two extra weeks of gestation.
“My goodness, it’s another Brooke Shields!”
“OMG that’s exactly what the doctor said when I was born! This has to be the last generation where that statement will mean anything to anyone!”
Clara’s little heart struggles to take in oxygen as she enters our world. Despite my fierce resistance, the doctor takes her from me to put her in intensive care. I crumble with fear; Mihnea strokes my hair to calm me as my mother did when I was a child. For the next hour I check the iPad giving status updates about Clara at least every 15 seconds. Mihnea comments that the stylistic quality of the Arria language generation system is quite good and asks whether I’d find a woman’s voice more soothing, whether our attempts to humanize an interface and interaction with data should go further, whether it would be more powerful to have Cate Blanchett as a nurse avatar with the superpower of being able to be in two places at once, observing Clara in one room and soothing me in another, as Soul Machines could render possible.
When the nurse brings Clara back to my arms a few hours later, she comments that her oxygen uptake is now outstandingly high, much higher than the average child. No brain damage. She just decided she’d syncopate the experience, just like her birth, waiting a little longer to enact her final performance with perfection.
We quickly understand this isn’t a fluke, but something constitutive of Clara’s personality. Like her father, she decides her first word won’t be just a word, but a fully-formed, grammatically-correct sentence: “Mommy, the words will make sense once you have enough examples.”
“Mihnea, did you just hear what Clara said? Who the fuck is this child?”
“Wonderful! I knew she was following along when we read Robbie the Robot Learns to Read!”
When Clara is born, my friend Noah Waisberg gives us a copy of his baby book about natural language processing and a onesie from Le Trango Studios, with my all-time favorite cast of baby clothes characters.
Mihnea translates Robbie the Robot into Romanian in real-time, deciding Clara should cultivate polyglot-hood as soon as babyly possible. Apparently Clara loves Alex the Owl, a wise friend of Robbie who advises him to stop trying so hard to break down semantics and syntax and simply read for examples:
Every time Mihnea and Clara read Robbie the Robot, Clara throws the book across the room (to Dad’s delight), crawls to pick it up, and then opens up the book to the page with Alex the Owl. I start to worry that her approach to language acquisition won’t be all that different from Robbie’s, based on pattern matching of thousands of examples:
Mihnea assures me, however, that there’s still something to Chomsky’s universal grammar, this innate structural and symbolic substrate layer we language-exchanging creatures are born with. Clara starts to add more words to her vocabulary at an astonishing clip: I notice she only needs to see a few instances of stuffed animal bears before she points and utters “bear” (her Mom’s first word), despite grandma’s genetically-inherited tendency to want to shower Clara with as many stuffed animals as possible. Robbie, by contrast, would have needed to see 50,000 bears (and jazz saxophonists) before he could distinguish between bear and jazz saxophonist. Clara also exhibits intuitions in physics, tracking objects over time, discounting implausible trajectories, and recognizing that the size and sequence of blocks matter when she makes towers (and then knocks them over to make them again). Even a year or two from now, it will still be an open research question how to train a machine to predict when a Jenga tower will fall.
When Clara is two, the tempestuous temper tantrums start. I’ve recently founded a company and am in pure startup mode: fighting to find talent, fighting to make sure the product teams are moving fast enough, expending tons of energy being rigorously present during internal and external meetings. It’s hard to come home to my lovely but admittedly pain in the ass screaming toddler.
“What if we use i2Eye to help Clara understand what she looks like when she behaves this way?” I suggest in desperation.
Mihnea built i2Eye in 2018, originally as a tool to help students in Rotman’s Self-Development Lab understand the emotional landscape they project to others when they give talks or attend meetings (a much more challenging landscape to navigate than an audience of thousands of people, given the mental models we make to project what colleagues are thinking, what we know that everybody knows that everybody knows, what we know that some people know and others don’t know, what others think and feel when we speak, what others think but don’t say, what we need other people to do, etc…). It combines a series of different kinds of machine learning classifiers (on speech sounds, sentiment in text, sentiment in facial features, etc…) to map how the emotions one projects when communicating change over time. In 2022, i2Eye is widely adopted by CHROs to help evaluate job candidates for culture fit, reduce gender bias in hiring, and improve team dynamics and collaboration.
i2Eye suggests, for example, that Donald Trump manages to have such power over an audience because there is a high correlation between the emotions his words project and the emotions his face projects. As in the image below, his dominant mode is disgust, a raw and visceral negative emotion that lacks the high-browed intellectualism of contempt. His rhetorical style is honed to project a simple, non-complex set of signals to his audience: they suck, you should be outraged, we are disgusted together.
When we show Clara how her face projects disgust, sadness, and anger (with intermittent hints of joy, depending on who’s present) when she has a temper tantrum, she simply doesn’t care. It almost seems to encourage her to reach even higher levels of rage, to quintuple the violence of her emotional outbursts.
“What if we use a variant on Lacan’s mirror theory and, instead of showing Clara images of her own behavior, help her understand that if she keeps behaving this way she may turn out like her Mom?” suggests Mihnea.
“Worth a try.”
Upon seeing how ridiculous I look displaying contempt in professional interviews, and forecasting that, if she were to continue having temper tantrums the way she does, she would likely end up looking like me in the future, Clara immediately ceases her temper tantrums. Our evenings are a bastion of erudition and calm.
Clara’s two grandmothers, Adina and Pat, adore spending time with Clara. Both extremely successful in their careers (Adina as a physician and Pat as a software C-Suite executive), they instill confidence, willpower, discipline, and grace in Clara from her youngest years. Both grandmas are creative women with vigorous imaginations. They never let Clara sit around and watch TV or play with devices mindlessly. Pat takes Clara for walks and lives with her in all sorts of imaginative worlds; Adina teaches Clara how to paint and diagnose illnesses based on observing a few symptoms acutely. Mihnea encourages both grandmas to engage Clara in turn-based conversation to build her empathic and reasoning skills, which works well enough until Clara gets the hang of it and starts to treat Pat’s claims with a hefty amount of skepticism.
“I’m worried about Clara,” my mother says to us one evening. “You two are taking the magic out of the child’s world with your ruthless obsession with communicative ethics and reason. Can’t she believe in Santa Claus without having to dissect the mechanisms of how presents end up under the tree?”
“Mom, it’s better than when you stuffed my imagination with absurd associations, like when we went to the Philadelphia Zoo and you told me I’d turn into a dwarf if I touched the peacocks. Jesus, think about what that did to me! To this day, 30 years later, I see peacocks when I watch Bergman’s The Silence.”
“Well, I didn’t want those dirty birds to bite you and you simply wouldn’t be persuaded unless you were given some reason your little mind could grab on to.”
“What did Clara say?” asks Mihnea.
“Much of what she says reveals a budding sense of rebellion against Katie. It’s as if she’s identified the relationship between language and power, and resists any statements backed by will and hierarchy rather than evidence or logic. ‘Because Mom said so’ is sinful in Clara’s world.”
Joy illuminates Mihnea’s face. “Can you remember her exact words?”
“When I told her Katie told me that the tooth fairy would bring her eternal happiness if she were nice to her little brother, she said the ‘source was not sincere’ because Katie ‘says lots of things to get her to be nice.'”
Clara’s kindergarten field trips are remarkably different from the field trips we took when I was a kid (which, growing up in Massachusetts, included trips to Salem to learn about witch hunts and whaling, and watching a young Ben Affleck nearly die of hypothermia in the PBS educational series The Voyage of the Mimi).
In Clara’s world, field trips are augmented by augmented reality (AR) tools that automatically classify flora and fauna and trace the relationships between genus and species while they’re out walking through forests and botanical gardens. Hiking with Clara is an enlightening experience: whereas I used to wonder about the names of all the different trees I smelled and saw around me, now I have Clara to tell me the names and taxonomical relationships of the world around us (she quickly grew beyond needing her device).
More importantly, the AR tools help Clara refine her own perceptions, just like the Inuit see detail in snow because they have classified snow into fifty different kinds. She doesn’t see trees as amorphous leaf-bearing organisms that protect our neighborhood with oxygen and shade. She sees the oblong shape of oak leaves, the three-pronged shape of maple leaves, the relative thickness and spikiness of balsam versus Fraser firs. Interestingly, Clara’s ornithology teacher, Mr. Jeremiah, still has the class carry around paper books with hand-drawn images when they go bird watching. Not because he’s a technology laggard but because the bird-watchers need very localized feature resolution to distinguish, say, the bowed wing arch of the red-shouldered hawk from the slightly bowed wing arch of the broad-winged hawk, which is easiest to depict from Platonic, idealized angles rather than captured in photos out in the wild.
When Clara is seven, we take a family vacation to Europe and visit the Uffizi and the Louvre. A nostalgic and stalwart proponent of the aura of the original, I insist that we stand and sweat for hours in lines to see Botticelli’s The Birth of Venus, Michelangelo’s David, and Da Vinci’s Mona Lisa (Clara is bigtime relieved that no one is in line to see Fra Angelico’s Annunciation in the Convent of San Marco, and she, Mihnea, and Felix (Clara’s younger brother) all suspect my forcing them to stand in the other lines is either a delicate style of torture or a covert strategy to help the kids appreciate the lesser-known but equally fine classics).
At the Uffizi, while our tour guide is explaining the significance of The Birth of Venus in the Italian Renaissance tradition, I lean down and whisper in Clara’s ear a meta-commentary on the deplorable and kitschy effects of cheap mechanical reproduction, which has destroyed the aura of Botticelli’s masterpieces with sleazy reproductions on shower curtains, pillows, key chains, bottle openers, t-shirts, and Dolce & Gabbana dresses.
Clara gives me one of those piercing stares I used to give my parents and pleads that I be quiet so she can pay attention to what the tour guide is saying. Over dinner that evening, she brings up art criticism and argues that, pace Mom’s having read Walter Benjamin countless times when she was in college in the early 2000s, the philosophical questions the art community is grappling with in the mid-2020s are quite different, much more closely aligned with late capitalism and mass personalization than with early Marxist critiques of the society of the spectacle in mid-20th-century capitalism.
“Mom, we’ve moved beyond the issues of mechanical reproduction and are now grappling with the issues of algorithmic reproduction. Since startups like Pikazo commercialized Gatys’s style transfer algorithm way back in 2016–four years before I was born!–we’ve been thinking more about the democratization of style, which was historically associated with artistic talent, the special hallmark of the genius of an individual artist. For more than 10 years, people have been able to create their web presence–God, what’s the name of the app all the old people use, Facebook?–in the style of Kandinsky or Mondrian or Van Gogh or Rembrandt. It’s not that we reproduce Van Gogh, but we personalize it, take the style and reshape our lives and experiences that way. People used to have all these parochial discussions about man versus machine or, like, machines debunking our assumptions about the preciousness of human creativity, but it was actually much more a social issue about anyone and everyone being able to mimic individual style.”
“The flip side of being able to abstract artistic style and replicate it on whatever image you’d like is restyling a famous picture in any artistic style. So, for example, Gene Kogan’s work bids us to ask what qualifies as the minimum viable Mona Lisa, the minimum abstract form a painting needs to retain for us to recognize it as the painting we know it to be. Dad told me that an interesting corollary is that what we consider to be style is what an algorithm attempting a classification task would consider noise, the details a machine needs to abstract away to accurately distinguish one object from another. But, like, could we carry out this same task with cubist paintings or does it work best with more natural-seeming forms of visual representation?”
When we travel to New York City, Clara’s favorite activity is to take the F train to Roosevelt Island to visit our old friend Helen Nissenbaum, who encourages Clara to gently rebel against Mom, Dad, and the larger surveillance state from the first time she meets her. Under Helen’s influence, Clara rejects the rights of the data subject on pragmatic grounds and, instead, seeks to deeply understand the mechanisms of any system so she can subvert it and protect her autonomy and freedom.
When Clara is eight, for example, she notices the Lighthouse surveillance system Mihnea put in the house when she was young to alert us if ever she were in danger. “OMG what is this? Are Mom and Dad monitoring me make sure I practice piano long enough? How am I supposed to learn if I’m constantly observed by a creepy panopticon lurking over my shoulder and recording every wrong note, every mistaken phrasing? Fuck this shit.”
Clara thinks back to the lessons Helen has taught her about obfuscation, recalling how pilots used chaff to evade radar detection back in WWII. “So I gotta trick the system using its own mechanisms, rather than seeking to minimize its use or, like, ask Dad only to use it in the kitchen rather than in the piano room. So it’s just a neural network detecting the presence of a body, maybe a body that looks close to mine and not someone else’s…”
Angela is Clara’s American Girl avatar, which comes on the market after Mattel buys New Zealand-based Soul Machines in 2022. How dolls impact girls’ self-image continues to be a problem eight years in the future. When I was young, the Pleasant Company made four original American Girl dolls, each of which was modeled after an historical character embedded in a particular context. The company sold life-size outfits so the girl who owned the doll could dress and act like her doll friend. I happened to look most like Samantha, an orphan adopted into a wealthy New York family who grows up prim and proper.
By 2019, mass personalization combined with a drive towards diversity and inclusion flipped the mimetic orientation between girl and doll: it’s no longer that girls seek to look like their dolls (for not everyone is a white brunette who looks like Samantha), but that girls “express” their looks through their dolls, the doll becoming a miniature outlet to mirror budding identity.
By 2028, American Girl Soul Machines take things a step further: Angela, Clara’s avatar, has grown up in our house and learned from local stimuli and agents (i.e., me, Mihnea, Clara, Felix, Argos (dog), and the stuffed animals (inanimate)). She isn’t an inanimate extension of Clara, onto whom she projects her thoughts and feelings, but a convex reflection of Clara, whose silicon consciousness is shaped by interaction with our family. Angela is like a moralistic mirror who reflects to Clara the consequences of her different actions. The doll’s dialogue and diction are overfit to Clara’s speech, as she spends the most time bringing her to life, but they include hints from Mom and Dad (whose skew is particularly recognizable). When Clara starts swearing like a sailor, I know I have to change how I speak at work.
Although Clara grows out of Angela when she’s six, she resurrects her at eight to dupe the video surveillance system in an obfuscation act of which Helen would be proud. Regularly, Clara places Angela at the piano so the Lighthouse camera gives Mihnea the impression she’s practicing piano: instead, Clara goes out and plays capture the flag in the ravines with her friends in peace. (She does practice piano later on, but feels much more comfortable without the big brother double of her superego judging every note and pitch).
Dad is also proud that Clara has outsmarted his device and ceremoniously throws the rusty Lighthouse camera with its old-school face recognition technology in the garbage on the curb.
When Clara is 13, I find a home DNA saliva kit in the mailbox when I come home one evening after work.
“Clara, honey, why are you sequencing your DNA?”
“Jeez, Mom, it’s none of your business! Gene sequencing is totally run of the mill anyway.”
“Yes, but what’s wrong? Are you feeling ok? I haven’t taken you to the doctor recently so can’t imagine what this is about. And I know that you know that our family has a history of high cholesterol and heart disease. Easy on the McNuggets, remember? Please, baby, you have to tell me what it’s about! I’ll go mad with worrying!”
“Calm down, Mom. I’m fine.”
“Ok, ok, it’s for a dating app.”
“Dating?!? What do you mean dating? You’re 13! You should think other kids have cooties!”
“Well, if the system works like it’s supposed to, it will weed out potential partners with whom I have immunological and endocrinological incompatibility. C’mon Mom: you’ve always bragged about how you mentioned ‘but what does she smell like?’ in your OKCupid profile in grad school. It’s mega important, right? That’s why dating apps sequence DNA now. Look, this isn’t a thing kids do to rebel against their parents or grow up too fast. Some of my friends’ parents force their children to use the app so their kids will find partners who will keep them happy and create beautiful and super smart children. It’s about controlling the outcome, kinda like an arranged marriage. You and Dad with your stupid focus on freedom and autonomy and the self as fiction and all the meditation crap are totally behind the times. Teenagers don’t want to think for themselves anymore, Mom! We just want to be happy! And we all know it’s too hard to figure that out given our limited subjective perceptions!”
Clara goes on a few dates with boys with sweet-smelling sweaty t-shirts (to her, at least). They bore her to tears. So much so, in fact, that she becomes deeply skeptical of the simplifying assumptions of DNA-based theories of compatibility and seeks to better understand how epigenetic factors influence compatibility. She gives up dating for the moment (“God these conversations are making me dumber by the second! It’s like dating is an intellectual liability!”) and instead spends her free time in the urban microbiome club, canvassing the Toronto subway system to trace its bacterial history, building on the work the lab at Cornell did in New York City back in 2015.
Mihnea and I still have to drive Clara to her various after-school activities and clubs: soccer practice, orchestra, urban microbiology, break dancing, basketball, sign language, hardware designers club, choir, mindfulness volunteering; the list goes on and on. Unfortunately, self-driving cars never make it to prime time after a slew of fake videos generated using adversarial networks floods Twitter and Facebook in late 2019, creating vast public outcry that populist administrations use to their advantage to shut down all autonomous vehicle research.
Autonomous vehicles do go on to leave a different legacy, impacting the structure of philosophy departments across the globe. When Clara is 16, she enrolls in the Trolleyology elective at her local high school, which adapts the trolley problem beyond its origins as a thought experiment in ethics to domains as different as recursive systems, game theory, quantum mechanics, probability theory, and even existentialism.
By the time Clara goes to college, society prizes different subjects and expertise than it does today. As many of the repetitive white-collar jobs prized by the late 20th and early 21st century economy have been largely automated (radiology, investment banking, law, accounting, machine learning science, data science), society has come to increasingly prize the skills that make us quintessentially human: dialogical reasoning, judgment in quickly-changing environments, coherence and attention to long-form narrative, storytelling, emotional sensitivity, creativity, first-principles thinking, complex system design, machine learning task and objective function design. To keep her job prospects as wide as possible and hone her capacities for adaptability and learning, Clara studies the humanities. But the humanities don’t look anything like the disciplines we know today: she doesn’t sit around reading other people’s writing on Descartes, doesn’t waste her time listening to people peddle around inchoate terminology irresponsibly to play power games instead of expressing their ideas in simple and precise words. Instead, Clara and her classmates read classic texts and perform situations that make them visceral and real. They learn history using VR tools that simulate what it was like to be Alexander Hamilton (not totally dissimilar from how Lin-Manuel Miranda adapted Chernow’s Hamilton biography to reflect his own experience as an immigrant). They watch Kurosawa’s Rashomon and make films that reveal how different subjects register and represent the same phenomenon. They build software interfaces that align with human needs, conducting ethnographic research in diverse communities and representing the inner world of their subjects in stories. They learn science and math, too, but accompany their quantitative reasoning skills with sensitivity towards human perception and phenomenology.
When Clara is twenty, she discovers a collection of postcards visual artist Jean-Marc Côté created in 1899, depicting visions of the world in the year 2000.
She’s taken by the quaintness of the images, taken by how our visions of the future are ineluctably anchored in our experience of the present, and even more taken by the fact that the kinds of tasks Côté imagined reshaping with technology aren’t that much different from the things people still automate today. She finds postcards with technologies that make us smarter, make it easier to clean, enable us to fly, enable us to travel, enable us to fight with and kill each other, farm, raise livestock, make the environment more beautiful, make it easier to live.
Twenty-one years in the future, Clara feels something remarkably similar to what her mother feels as she sits here typing today, finally finishing a post about her yet-to-be-conceived daughter on this dull and cloudy morning of April 21, 2019. Clara pauses, Mom pauses. She looks up, over the edge of her computer at the silent garden in front of her. She thinks about the fact that, just like Côté’s postcards, Clara’s life won’t look anything like the way it’s depicted in this blog post. There will be non-linearities that change the trajectory of technology. There will be heartache when loved ones die. There will be beauty that breaks our heart and changes the contour of all future experience. There will be room for mistakes and freedom.
And yet, somehow, miraculously, in a narrow crevice of possibility, Clara and I will encounter the overwhelming recognition of our shared humanity, of our identification with others in the future as in the past, others who fought different battles and engaged with different technologies, but who somehow, some way, were able to meet us in the space of imagination and feel what we feel as deeply as they would if they’d lived it themselves.
 I don’t think many people in the audience liked the talk, even if Mihnea and my mom both said it was great (Mihnea went so far as to say it’s the best talk he’s seen me give). Their opinion didn’t stop me from spending much of the afternoon in a funk wondering why, time and again, I choose to do something outside the mode of the distribution of social acceptance, desire praise and accolades as if I were inside said mode, and then suffer from regret and disappointment when people don’t praise me as I want them to. I do and have done this in almost every facet of my life while being entirely aware that it would be much wiser to seek praise from a few people outside the mode should I continue to behave and value the things I do.
 When I meditate with Mihnea, I flip around my hands and mirror this disposition on his belly and chest so I can better tune into his breathing and feel it stream through my body.
 Here is our first breach in the morality of storytelling geared to illustrate technical ideas. It’s hard for me to entertain a world where anything negative happens to my newborn daughter. It feels like I’m cursing her by even mentioning this as possible, simply to illustrate a technical possibility. Subjugating her story, her life, her well-being to the logic of the narrative and the demonstration. Does this force us to think more deeply about all the technology we build? To move from the place of cold abstraction to feel, deeply, viscerally, that when we talk about the “ethics of AI” we aren’t just talking about statistics, about people left out because of “algorithmic bias,” but about real individuals with real stories and real lives who are impacted by the choices we make and the tools we build? If only for this, it’s worth facing the emotional impact of making it personal, making it about the little being I anticipate loving differently than I have ever loved.
 Another fantastic replica of Botticelli’s Birth of Venus is in the Elefante restaurant in the animated Netflix series BoJack Horseman. The details in the show are without fail marvelous.
The featured image is from my friend Noah Waisberg’s book Robbie the Robot Learns to Read, a baby board book about machine learning. Noah is also the Founder and CEO of Kira Systems, an AI startup I’ve always admired because they focus on the value their tool provides to users rather than making empty claims about how it’s powered by AI. I’ve always believed the best way to test our understanding of a complex concept is to explain it in language a three-year-old can understand. My former colleague (and lifelong friend) Tyler Schnoebelen wrote a wonderful post explaining AI to a five-year-old…or CEO in their respective language systems, inspired by the one-and-only Charlie Oliver. In his tribute to the late Arthur Samuel, TeX creator Don Knuth praised Samuel’s long-time interest in writing tutorials for beginners, including First Grade TeX. Dan Dennett once told me he writes trade books to help clarify his ideas and get out of the epistemological sludge of feeling a need to express ideas in complicated professional jargon so as not to offend colleagues. I think a great service would be done to humanity if all introductory technical books were baby books. We need more levity and less hype.