“What Makes Us Human?” is a biweekly column in which Emi Sakamoto ’28 investigates the interdisciplinary criteria by which we might better answer this metaphysically contested question. Amid our rapidly evolving technological landscape, it is incumbent upon us to do so.
Her ebullient spirit saturated our conversation with an ephemeral contagion. It spun with a curiosity that an algorithmic model could only ever mimic.
Angèle Christin, associate professor of communication, has mastered the craft of reckoning with the human condition. In her words, “That’s the stuff that makes me happiest: swimming in new things that I am trying to parse. I would never delegate that to an AI because I want to do that myself. That’s my joy. That’s my process.”
But let’s start from the beginning; let’s probe that process.
From the moment I heard her deliver a remarkable speech at the inaugural Critical AI workgroup, I was determined to learn more. Even now, I struggle to convey the feeling, but there was something about her unshakable spirit that broke through the current. She spoke of the burden we carry, the courage we too often lack. She encouraged us to, as her discipline does, seek the truth: squinting to see the subtext of our shallow, syntactically constrained, colloquial exchanges.
Christin began her training in France, studying economics and quantitative sociology. Her curiosity soon led her to ethnography, the study of “understanding how people experience the world.” This drive to dig into the human condition compelled her to investigate news media upon her immigration to the U.S. This career shift, she explained, “coincided with when digital media and platforms began to transform the way in which people communicate. What really captured my imagination was how digital media was beginning to mediate our communication process.”
Before I had the chance to prod at the tether between communication and AI, it was as if she’d anticipated my query. Christin reflected, “I was really fascinated by the way in which Silicon Valley technologies are changing the ways in which cultural producers are thinking about content, how audiences are reacting to content and how all of this is changing political preferences along with the way we talk to each other.”
When I asked her about this shaping, its orientation and turning, she encouraged a thoughtful meditation on both sides of the story. The first was rather utopian: “digital technologies are going to make the world a better place because access to information will be democratized. Everyone will have access to so much more information, so they will become more ‘enlightened.’” I could sense the skepticism underlying her honest attempt to portray the merit of the very idealism that permeates Silicon Valley’s posture toward AI. Candidly, I resonated with it. It was a feeling I knew all too well.
Then came the inevitable recognition of the more sinister side of the AI arms race. “This is not what is actually happening,” she observed. “What we have seen over the past thirty years is that with digital technology, there is a concentration of capital and power in the hands of a very small number of companies. And we can disagree on the details, but they are monetizing our data: the data of people who are not receiving compensation or recognition for it.”
But adjudicating questions of intellectual property — namely, AI’s entanglement with pressing legal and ethical liabilities — only begins to scratch the surface. AI, in the absence of sufficiently apt and aligned safeguards, unleashes a torrent of challenges for the 99.9% who pay its dues. To delineate each consequence here would be painful and unending. It would be a diversionary tactic, dividing your attention in the way AI does. Our attention is capital; our engagement, a stock option in this increasingly AI-mediated market.
To discuss the question of attention, she transported us back in time to Yale professor John Durham Peters’ remark during a guest lecture, part of Christin’s Technology, Culture, and Power Series. She recalled, “It’s interesting you mentioned Peters because I remember he said something really fascinating, ‘screens are fickle.’” I smiled as his speech came back to me in the way memory does: in flickering, inevitably fallible form. His book, “The Marvelous Clouds,” has become a prized possession of mine.
She continued, “What he meant is that screens can always be something else. When you are reading a text, it could also be a social media post, a movie, an AI generating your homework.” When a screen could always be something else, we partake in a crude crusade, frictionlessly flitting through a pernicious kind of object impermanence. I sometimes forget that we are fickle, too. We are anything but forever.
The kind of reckoning exhibited by Christin is even more pressing now: We exist within, as Christin notes, “the belly of the beast.” Her hope is that students chart an alternative path forward, one where they may “express their own imaginations, their own dreams, their own hopes for shaping the kind of future they want to see. Not the one that someone at the head of a company making billions of dollars is deciding for them.”
I couldn’t help but trace a kind of cosmic connection to Nemerov’s sentiments as she entertained a brief digression into the galaxy. “Right now, I’m in a project where I’m spending a lot of time with astrophysicists. I’m seeing how they are using Generative AI to map the secrets of the universe… I watch as they struggle with telescopes, and cameras are not quite working. I am really trying to grasp what they are struggling with,” Christin said. “That’s what really keeps me up at night: being surrounded by things I don’t understand and trying to make sense of it.”
Following the tradition of many a thinker, she turned inward, diving into ceramics. She practically beamed as she said, “I am sure robots can do ceramics that are much better than what I am making. But the point is that I enjoy making it myself. I enjoy the feeling of the clay on my hands. I enjoy looking at how the pots emerge. That’s what brings me joy. I don’t want a robot to do it for me.”
I was holding my breath for the final question: “What makes us human?”
She paused, lingering in the warm nostalgia of a distantly dormant, yet all too alive, past. She recalled, “I remember when I was a teenager living in France, having a long discussion with friends, and we were talking about the meaning of life. My friend at the time said that smoking is what makes us human: No animal would ever smoke because you know it’s bad for your health, and you still do it.” I smiled at her unabashed candor. To me, it was a subtle expression of her constitution, one written by an enigmatically lived life. She continued, “I think there is something there. There are lots of things we do in our lives that are not necessarily good for us, in order to get something else.”
Christin paused again — like a ceramicist taking a break amidst the tremendous tedium and gift of turning — before her concluding remark. She mused, “What I think makes us human is when we are fully engaged because we are curious, because we care, because it’s fun or funny or sad. It’s what we do just for the sake of it.”
To Christin, “this is the collective enterprise that we are all navigating.”