Will Patterson ’27
In recent years, EA has wrestled with the integration of ChatGPT and other AI tools in schools to aid, complete, and grade projects. Across the country, this revolutionary technology has fueled fiery debate, roiled precedent, and upended decades of educational practices.
Amid the accelerating pace of technological innovation, the role of technology in academic environments has long been a hotly contested topic. From phone access at school to taking College Board exams on computers, education has grappled with major questions about technology. None of those debates, however, approaches the importance of the one surrounding AI. The simplicity, availability, and capability of large language models such as ChatGPT and Gemini have made AI a broadly used tool. In particular, these platforms’ ability to replicate human writing and research has forced major changes to educational practice as a whole.

Photo courtesy of Will Newman
The advent of AI has caused substantial policy changes at EA, particularly in its humanities departments. In reaction to the rising prevalence of the software, the history department has moved the writing process of all student research papers into the classroom. History Department Chair Steve Schuh comments, “What we found early on was that the kids were using AI and didn’t really know what they were arguing about. They weren’t going through the struggle of having to write it down, to understand it, and then, again, be able to articulate whatever argument they were going to make. And so we shifted the writing of the papers into the classroom.” As a result of this policy change, the U.S. History research paper, a time-honored rite of passage for all EA students, is now written solely in the classroom.
Schuh believes the changes have been beneficial, saying, “I think by and large, it’s been successful in ensuring that students are going through what we regard as a necessary process of having to sit down and write the paper.” When asked about some of the potential tradeoffs or downsides to the adaptation, Schuh notes, “You’ve certainly lost some depth to those papers. The papers aren’t quite as polished, or maybe as far-reaching as they have been when they were being done over a course of weeks at home. But at least we’re ensuring that everyone is going through that [writing] process.”
In the context of historical research and writing, the use of AI undermines the very processes meant to teach students analytical skills they will carry beyond history papers. Schuh elaborates: “I think particularly in history, AI really strikes to the core of what we’re trying to teach, so we’ve had to be flexible—we believe that those (analytical) skills are so important that simply losing them to ChatGPT is not an option.”
However, the History Department has allowed students to use AI tools in certain situations. Teachers have experimented with allowing students to use AI on some assignments and to aid in the research process for papers. Caden Kropf ’27 explains that he and other students have been allowed to use AI to find sources: “Using AI to find sources can be really beneficial because it allows you to scan [a] more broad [set of sources].”
The English department has been similarly affected by the rise of AI, with essay assignments traditionally completed at home being transitioned to in-class assignments. Explaining the reasoning behind the change, English Department Chair Heather Dupont says, “Our most important quality that we want for our students is that they develop the ability to think independently and critically, and that they can write for themselves. We really want our students to develop confidence in their ability to read a text, to analyze a text, to identify the meaning of a text, and then to communicate that through their own written voice. And so for us, we don’t see AI falling into that bit of our goal—we don’t want students using AI to do that work for them.”
Dupont expands on changes made to further discourage the use of AI, sharing, “Although it’s not necessarily a new policy or a new way of teaching for us, one thing that is happening more consistently and more intentionally is just that so much of the work is being done in the classroom.” Explaining the rationale, she argues, “I think [moving the essays into class] serves a lot of purposes. For one, it really helps because teachers are there to coach students along as they’re doing the thinking and the writing. A lot of the outlining is being done in class as well, so that kids are really developing their own thoughts in the classroom with a teacher present—the teacher is there to coach alongside and help them out, answer questions, and offer immediate feedback. It also takes some of the pressure off when students have to go home and write an essay at night; it releases some of that temptation to just look at what ChatGPT has to say.”
Dupont, however, disagrees with Schuh’s belief that writing quality has declined. “I don’t think writing has declined. I think that students—especially considering the way we assess on final exams and on in-class essays—are getting a lot more practice and preparation,” she claims. “When it comes time for final exams, students know that they have two hours and they’ve got to write all these analyses. And, in a way, they’ve already done that in class several times, so I think [writing in class] helps build that confidence.”

Photo courtesy of Lilly Smolenski ’27
Across the humanities, the adjustments made to prevent the use of AI have been designed to continue developing the argumentative skills vital to students’ education. In other subjects, AI has had a very similar impact. Computer Science Teacher Matthew Davis discusses some changes made with regard to AI in the computer science department: “Because these are just high-school-level classes, all of the problems that we’re solving are things that AI can instantaneously solve and do. And so we are very cognizant of the fact that if it goes home, it could be AI-written. And so we do a lot more in-class assessing than we used to—take out a sheet of paper and show me how your brain works.”
However, Davis believes AI can be a helpful aid to students’ learning when used responsibly, stating, “I think AI can be very helpful for students in terms of being a study partner when you have an assessment coming up and you have content that you don’t understand or you just need review material—if you can prompt it correctly, which is a skill that the majority of people do not have.”
On the other hand, Mathematics Teacher Dr. Tom Goebeler insists that AI does not yet have a place in the teaching or learning of his math classes. He explains, “The point of these math courses, surprisingly, is not just to get answers. It’s not to solve problems. It is to develop problem-solving skills within your own mind….I want students to experience the pleasure of thinking. If you are not accustomed to the pleasure of thinking, it’s a surprise to notice that you are taking pleasure in grappling with things. If you remove that, then you’re removing a deeply human and important element of education.”
Students, now dealing with these changes, have responded in different ways. Siena Scungio ’27 sees both sides of the coin, acknowledging that in certain settings, AI is a great tool: “It is beneficial to help students learn, and it is a big study tool for a lot of students.” On the other hand, Scungio adds, “But I do see the side of the English Department and History Department who believe that it takes away from critical thinking.” Tensions can also rise when students are accused of using AI, a new issue educators everywhere are now facing.
Kropf also sees both the benefits and deficiencies of AI use. “I think it’s good for, if you’re studying for a test, giving a practice problem,” says Kropf. “You can get an infinite amount of practice problems from ChatGPT.” However, Kropf also wants guardrails on AI use, stating, “Obviously, you should be writing your paper…There’s a difference between finding sources with AI and having it write the entire paper.”
Lius Vasiliadis ’27, who says he has been falsely accused of AI use, raises concerns about false allegations. “It raises the problem that if your writing sounds a little off, maybe because you add punctuation or something, then the immediate thought is that you used AI. And there’s no real way to prove or disprove it.” Further touching on the problematic nature of AI accusations, Scungio shares, “There is no way for a student to prove that they didn’t [use AI] except for their version history.”
Some students, though, have voiced concerns about how clear AI policies are. “I don’t always really know what we are and aren’t allowed to do,” says an EA high school student who wished to remain anonymous due to the sensitive nature of the topic.
Moreover, AI use has begun creeping into work outside the classroom. Recently, The Academy Scholium has had to deal with AI-generated article submissions, prompting EA to affirm that improper uses of AI can occur well beyond the context of a research paper.
The emergence of AI has undoubtedly raised many questions about its ethical use and proper place in academic settings. Its capability to solve problems, research, and write makes it particularly disruptive to the goals of an EA education. Although many changes have already been made to the way students are assessed and taught, the topic is still evolving. With the introduction of something as foundationally consequential as AI, it will take time for the methods and philosophies used to educate students to fully adapt. Regardless, students’ ability to question, analyze, and argue will remain paramount.




