In the last few years, artificial intelligence has consumed education. Teachers have turned to AI to craft curricula, plan lessons and generate exam questions. Students have used AI to complete problem sets, generate essays and code websites. In K-12 classrooms across the U.S., 85% of teachers and 86% of students reported using AI in the 2024-2025 school year. In less than a decade, AI has gone from a novelty to a default, prompting calls for educational reform to preserve critical thinking and reasoning within our academic system.
Despite its status as an exemplary institution, Dartmouth is no exception. A quick walk around the library tells you all you need to know. Blank Google Docs and textbooks have been replaced by chatbots, which assist students with tasks ranging from essay outlines to problem sets to relationship advice. Practically all traditional testing methods have come into question, prompting frequent conversations among professors about the best ways to fairly and effectively assess students.
In humanities courses nationwide, professors have adopted the in-class essay as an immediate response to the harms AI poses to student learning. The format ensures that writing occurs in a highly regulated and controlled environment, holding students accountable for thinking for themselves rather than relying on AI tools.
These professors have the right idea: In the age of AI, the in-class essay is the only way to preserve students’ writing and critical-thinking skills in humanities-based courses.
In humanities courses, writing is the primary mode of thinking and, thus, learning that occurs in classrooms. Writing allows students to critically analyze course material, consolidate key arguments and craft well-thought-out claims. AI allows students to evade the hard questions and outsource substantial amounts of work, limiting their intellectual growth. Students will inevitably use AI on take-home essays to varying degrees, regardless of course guidelines or institutional regulations, in order to conserve time and mental energy when writing. Therefore, the only way to set a standardized baseline for AI use is to mandate in-class writing, forcing students to craft a piece that is entirely their own.
Furthermore, in-class essays allow professors to build trust with their students, fostering an environment of mutual respect and intellectual accountability. Since there is no reliable way to assess whether AI has been used on an assignment, a cloud of distrust plagues classrooms when it comes time for take-home essays. This cuts both ways: Half of the K-12 students surveyed by the non-profit Center for Democracy and Technology agreed that using AI makes them feel less connected to their teachers. This data is particularly troubling in the context of Dartmouth, a small liberal arts school with small class sizes that seeks to cultivate a sense of genuine connection between students and professors. Dartmouth’s “Take a Professor to Lunch” program, which has been in operation since 2002, seeks to do exactly that: subsidize meals for students and professors to engage in extracurricular conversation and develop meaningful relationships. AI usage is simply antithetical to the culture Dartmouth aims to promote.
Additionally, in an educational system where work can be artificially generated and claimed as one’s own, credentials lose their meaning. A degree from Dartmouth is no longer a credential earned through four years of genuine hard work; rather, it becomes a credential of convenience, reflecting four years of inserting prompts into chatbots. Employers will thus have an even harder time differentiating between candidates, a task grade inflation has already made significantly more difficult.
Some may argue, “Just run the writing through an AI detector.” Unfortunately, current software is nowhere near accurate enough. Grayson University professor Dayna Ford describes in a Faculty Focus article how professors “need incontrovertible proof of cheating, and students have no way to prove they are not cheating.” Short of absolute certainty that content is AI-generated, professors cannot fairly penalize students, setting an incredibly high threshold for establishing an honor code violation.
Of course, there are challenges to the in-class essay approach. For one thing, building skills for in-class essays can hinder the development of long-term research skills. Students planning to write theses in departments such as history, government or sociology may be particularly affected by an inability to practice research skills and multi-phase draft writing.
While the research skill gap is a real concern, it is a temporary cost of an anomalous moment. Assisting with faculty research and participating in independent study and thesis work can fill that gap until this transitional period into an AI-dominated world has elapsed.
Furthermore, there are some scenarios for which my argument does not apply. In a creative writing course in the English department, for example, the primary educational value lies in consistently editing, revising and workshopping one’s work over a longer period of time, making intensive in-class writing ill-suited to the task. Another exception applies to senior thesis writers in humanities-based courses, who work alongside a faculty member over the course of a year to create a product that, at least in theory, possesses a degree of complexity that cannot effectively be generated by AI in its current form.
There is a genuine cost to depriving students of the ability to write long research papers that allow them to explore different articles from across academia. I recognize the importance of longer-term research projects in allowing for deep, comprehensive analysis based on a vast collection of nuanced sources. And it is a shame not to be able to take advantage of the practically unlimited information at our fingertips with the internet.
AI is an important tool with great educational potential, allowing for personalized learning, practice exam generation and increased resource equity. However, during the present anomalous transitional period, as our education systems integrate AI and learn to accommodate its accompanying disruptions, it is of the utmost importance that educational credentials retain their validity and that critical thinking prevails. Until reliable AI-detection technologies are developed, the in-class essay must be prioritized.
Opinion articles represent the views of their author(s), which are not necessarily those of The Dartmouth.