Can Silicon Valley Find God?

By Linda Kinstler

“…AT A BASIC LEVEL, the goal of A.I. and Faith and like-minded groups I came across in Toronto, San Francisco, London and elsewhere is to inject a kind of humility and historicity into an industry that has often rejected them both. Their mission is admittedly also one of self-preservation, to make sure that the global religions remain culturally relevant, that the texts and teachings of the last several millenniums are not discarded wholesale as the world is remade. It is also a deeply humanistic project, an effort to bring different kinds of knowledge — not only faith-based, but also the literary, classical and oral traditions — to bear upon what might very well be the most important technological transformation of our time.

“There are people who spend their lives thinking about culture, religion and ethics. You should bring them into your funding universe if you actually care about an ethics conversation,” Robert Geraci, a religion scholar, told me. “Our government is currently poised to start pouring a bunch of extra money into A.I. … Why is it that people who understand culture, literature, art and religion are not part of the conversation about what we want to build and how we are going to build it?”

A.I. and Faith is trying to coax this conversation further along and broaden its range of participants. Its members do not have prescriptions for how A.I. should be built, or rigid policy goals; all they want is an opportunity to participate in a conversation that is already unquestionably and indeterminately altering all of our interior lives. The goals the group does have are classically liberal ones: They do not want to see advanced technology marshaled toward even greater surveillance, accelerated inequality and widespread disenfranchisement.

The group’s ad hoc network has rapidly grown around the globe. It did not take me long to discover that the conversations Mr. Brenner has been staging are also taking place, in different languages and cadences, among religious communities in Singapore, Saudi Arabia, Bangkok and many places in between.

In my conversations with A.I. and Faith members and others working toward similar goals, I often found myself marveling at their moral clarity. Each in their own way, they were working to use their religious traditions to advance social justice and combat the worst impulses of capitalism. They seemed to share an admirable humility about what they do not and cannot know about the world; it is a humility that the technology industry — and its political and legal offshoots — sorely lacks.

Over the course of my reporting, I often thought back to the experience of Rob Barrett, who worked as a researcher at IBM in the ’90s. One day, he was outlining the default privacy settings for an early web browser feature. His boss, he said, gave him only one instruction: “Do the right thing.” It was up to Mr. Barrett to decide what the “right thing” was. That was when it dawned on him: “I don’t know enough theology to be a good engineer,” he told his boss. He requested a leave of absence so he could study the Old Testament, and eventually he left the industry.

A few weeks ago, I called Mr. Boettcher to ask about the results of the study that I had participated in, posing existential questions to Alexa and Google. He was surprised, he told me, at how many of his respondents had immediately anthropomorphized the devices, speaking of the machines offering spiritual advice as if they were fellow humans. Across all religious backgrounds, exchanges with the virtual assistants triggered some of the participants’ deepest memories — going to church with their parents, for example, or recalling a father’s favorite line from the Bible — so much so that the experiment often veered into a profoundly “emotional mode.” The ease with which the devices were able to reach people’s inner worlds and most intimate thoughts alarmed him.

“There’s cautionary stuff here for me,” Mr. Boettcher said. “You’re getting into people’s memories. You’re getting into the way that they think about the world, some of the ethical positions that they take, how they think about their own lives — this isn’t an area that we want to let algorithms just run and feed people based on whether they … click on the ads next to this stuff.”

The nonreligious “nones” entered this emotional register more readily, Mr. Boettcher found. Several had come from religious families but had no faith practice of their own, and they found themselves thinking back to their childhoods as they re-encountered language from their upbringings. It signaled something like a longing, he told me. “There’s something that is wanted here.”

He is hardly the first researcher to wade into this territory. In her 1984 book “The Second Self,” Sherry Turkle, a professor at M.I.T., wrote about how computer culture was prompting a “new romantic reaction” concerned with the “ineffable” qualities that set humans apart from machines. “In the presence of the computer, people’s thoughts turn to their feelings,” she wrote. “We cede to the computer the power of reason, but at the same time, in defense, our sense of identity becomes increasingly focused on the soul and the spirit in the human machine.” The romantic reaction she described wasn’t about rejecting technology but embracing it.

In the decades since Dr. Turkle wrote that book, the human-machine relationship has grown ever more complex, our spirits and souls that much more intertwined with our data and devices. When we gaze at our screens, we also connect with our memories, beliefs and desires. Our social media profiles log where we live, whom we love, what we lack and what we want to happen when we die. Artificial intelligence can do far more — it can mimic our voices, writings and thoughts. It can cull through our pasts to point the way to our futures.

If we are to make real progress on the question of ethics in technology, perhaps we must revisit the kind of romanticism that Dr. Turkle described. As we confront the question of what makes us human, let us not disregard the religions and spiritualities that make up our oldest kinds of knowledge. Whether we agree with them or not, they are our shared inheritance, part of the past, present and future of humankind.”

+++

Originally published in The New York Times, 7/16/21.