In the Reading Room
I learned the term palimpsest — a manuscript written over an earlier text that has been scraped away — when I was a doctoral student at Yale working with medieval liturgical manuscripts. To read a palimpsest, you need to tilt the page rather than stare at it straight on, because the visible script is not the whole story. Beneath a clean column of notation, another hand sometimes lingers: a faint neume, the trace of a gloss, the abrasion left by a knife that once cleared the surface for reuse. Under ultraviolet light, the scraped surface often reveals what it once carried. The page records a decision: one generation judged what was worth preserving, and another judged what could give way.
I eventually left the reading room for the classroom, first as an English teacher and now as an administrator leading my school’s work on artificial intelligence. But no matter my role, I still employ the habits of reading that I developed in graduate school years ago. No matter how I look at AI, I keep seeing, beneath the urgent daily questions about chatbots and academic integrity, something comfortingly familiar: a palimpsest.
Taking AI to Task
Last fall, I convened a task force — teachers from every division — to draft our school’s AI core beliefs and guidelines. The sessions were rigorous, sometimes fierce. We argued about assessment and authenticity, about what students owe their own thinking and what we owe theirs. We debated whether certain tools should be permitted or prohibited, and on what grounds, and who decides. There was real intellectual tussling in that room, the kind that leaves you hoarse at the end of a Monday afternoon.
But the moment I keep returning to was a quieter one.
We had been talking about the barriers to AI fluency among our faculty. Concerns I take seriously came up: the environmental costs, the risk of cognitive offloading, the pace of change. And then a teacher who had been listening more than speaking said something that stilled the room.
He said that for many of our colleagues, this wasn’t really about technology. It was about identity. The need to rethink your practice in the classroom — to reconsider habits you’d built over a career, over decades — is a competence question, and competence is bound up with who you believe yourself to be.
He asked us to approach the work with patience and compassion.
No one argued with him. There was a pause, the kind where you can feel a room recalibrating. My colleague had tilted the page, and we were all reading something underneath.
That moment helped me see something I had been circling for months.
For half a century, one powerful layer on the surface of American education has been a vocational and instrumental understanding of schooling. Acquire knowledge. Demonstrate mastery. Convert mastery into credential. Exchange credential for economic placement. AI disrupts this sequence because the instrumental model depends on demonstration — essays, exams, problem sets — and machines can now produce competent versions of those performances quickly and cheaply. The signal that once connected performance to mastery has grown unreliable. Not every demonstration is equally vulnerable to this disruption, of course, but enough of our familiar proxies for learning are vulnerable now that the larger model has begun to wobble.
I watched the realization move through my own faculty in stages. In 2022, the question was detection: How do we catch it? By the following year: What are we actually measuring? And by now, the question I hear most often is the one with no procedural answer: What is all of this for?
That question is not new. It has been inscribed beneath the surface all along.
Just Beneath the Surface
My colleague had named something true about teachers. But the longer I sat with it, the more I realized the same word — identity — described what was at stake for students, too. Older inscriptions become legible once you know to look for them. What they carry, often, is a conviction someone before us thought was important to preserve.
For example, in an 1816 letter, Thomas Jefferson warned that a nation cannot expect to remain both ignorant and free. His concern was not employability but self-government — the formation of citizens whose cultivated identity includes the capacity to weigh arguments and accept responsibility for public decisions.
Beneath that civic layer lies a still older one: education as formation. In De Oratore, Cicero described the cultivation of humanitas — the shaping of a person capable of prudence and public responsibility. Two millennia later, John Dewey inherited that same concern, insisting in his 1897 My Pedagogic Creed that education is “a process of living and not a preparation for future living.”
Civic education assumes a self already formed enough to deliberate. Formation is how that self comes into being.
Lewinsville Road
I did not arrive at this by reading Dewey, though Dewey helped me name it. I arrived at it driving home one evening after a long meeting about whether our school should use AI detectors. We had spent an hour on accuracy rates, false positives, messaging to families — all important.
But somewhere on Lewinsville Road, the argument I’d been constructing in my head fell away, and what replaced it was simpler: This is not about detection. It is not even about integrity, really. It is about relationships, or rather, about the kind of relationships on which integrity depends.
It is about nurturing children who will thrive in a world we cannot yet describe. We are not learning to use AI to teach glitzy lessons or to streamline our workflows. We are learning about it, and in some cases learning to use it, so that we understand the world our children are living in and can teach them with honesty, prudence, and care.
And yet the tool itself complicates the very thing I was driving home to protect.
The challenge AI poses is not procedural. It is formative. A large language model can offer what feels like attention, but the encounter is asymmetric; a student who debates a classmate risks being changed by the exchange, while a chatbot risks nothing. Formation is forged through resistance, through friction, through the discomfort of one mind encountering another that will not yield.
When we surveyed our upper school students about AI last November, the thread that ran most clearly through the responses surprised me. They were not asking for the absence of limits. They were asking for trust within limits — which is, when you think about it, what good teaching has always offered.
This is what our school’s AI guidelines are for — not compliance checklists but shared commitments — and it is why the practices that best sustain trust turn out to be the oldest ones we know: oral examination, seminar discussion, writing that makes the process of revision visible, and structured exercises that ask students to argue, listen, and decide together.
None of these is an innovation. They are older inscriptions, long present though not always central. What has changed is the pressure placed upon them.
The Undertext: Humans
A palimpsest never resolves into a single clean text. The layers remain. So do the questions: What does it mean to be human? What does a community owe its young? What forms of learning require the presence of others and cannot be replicated by a system, however fluent?
In the archive, the most consequential moment is not discovering that an earlier text exists beneath the visible one. That possibility is always assumed. The consequential moment is recognizing that the earlier scribe was addressing a question later generations believed settled.
We appear to be in such a moment. The temptation will be to inscribe a new surface quickly, confident that the problem is technical and the solution procedural. But as my colleague reminded me, this work is about identity — his, mine, our students’. And if I have learned anything in three years of sitting with this question, it is that the oldest inscription on the page is not a pedagogy or a philosophy. It is the relationship: one person, in a room, deciding that another person’s becoming matters enough to stay present for — even when staying present is difficult. And even when the tools make absence easy.
That is what I saw the first time I tilted the page.
I am still reading.