E-Scholium


Episcopal Academy


Teachers must approach AI-use with caution

Posted on March 5, 2026 By Lucia Forte
Editorials, Scholium

Darian Mihalakis ’27

The advent of AI has revolutionized institutions around the world, upending long-standing practices. Its reverberations have been felt everywhere, including at EA, where the technology is seeping into teaching. This embrace, however, is a major mistake, and EA would be wise to exercise extreme caution when integrating AI into teaching.

In recent years, EA teachers have increasingly incorporated AI into their jobs to varying degrees. For instance, many teachers now utilize AI in their email writing, although primarily for revisions. “If you’re communicating with other teachers, students, and parents, you want to make sure that you’re grammatically correct [and] that you’re using a really professional tone,” shares Erin Bilbao, chair of the modern languages department. “I [prompt ChatGPT to] ‘please soften this email’ often,” she adds. Computer science US teacher Matthew Davis says that he uses AI for similar purposes. “I’m not like, hey, write this email to this parent, but I might say, this is what I’ve got, can you please help me phrase this some nicer way,” he explains. Revising emails can improve the quality of communication. “If it’s an email, then as long as I understand what the email is saying, I don’t care [if teachers use AI],” says Nikolai Nawrocki ’26.

Douglas Parsons, Upper School English teacher, states that he doesn’t use AI much himself but that “some of my colleagues use it to help give students feedback on shorter pieces of writing.” He then elaborates, “It’s not the final piece of writing. It’s formative feedback in the writing process. Teachers frontload the criteria for what they’re looking for. Some of the students upload their paragraph, they get feedback which they can act on to revise the paragraph, and then the teacher will look at it and give a final grade.” Bilbao touches on similar uses, saying, “What I have seen is a teacher might use AI to help organize the creation of a rubric, or that they might input a rubric into AI and then have ChatGPT give feedback on an essay. Then the teacher can decide, does this fit?”

Even midterm comments—supposedly personal reflections of a teacher’s view of a student—have begun to be mixed with AI, due to their time-intensive nature for already busy teachers. When asked whether teachers use AI to write midterm comments, Davis answers, “Do I think people do that? Yes, 100%.”

On a broader scale, it’s plausible that AI-use exceeds the scope mentioned, as generally, the most controversial AI uses are unlikely to be openly discussed.

One of the most significant benefits of integrating AI into teaching is that it saves teachers’ time. “Instead of having to create extra materials for your students, AI can throw it together with the vocabulary and grammar and just make things for you, which is a huge time saver,” explains Bilbao. This leaves teachers with more time for their families or for personalized help with students.

A constant refrain from EA teachers was a belief in a “human-AI-human” approach. Davis explains, “We always say in the computer science department, it should always be human first, then AI, and then human again.” He then offers up the example of using AI to help revise emails. This approach is quite sensible. Leveraging it, teachers—who are in a very different position from students—can save time and improve how they convey ideas, while also avoiding common miscommunication errors and maintaining some humanity in their words.

However, as a student, I do wish that some guardrails were put in place to regulate how teachers use AI. EA educators have spent years honing their craft and possess extensive experience in their respective disciplines. In the case of AI feedback, I would much prefer that they provide feedback on my writing themselves, which will be of higher quality and more personalized than generic AI slop. In terms of midterm comments, I value my teacher’s thoughts and would always hope that the comments were written by my teacher. I completely understand the appeal of using AI on the teacher’s side, but I would in no way want to receive personal critiques from a machine.

TEACHER-GPT: A satiric AI prompt for ChatGPT demonstrating its potential usage.
Photo courtesy of Danity Pike ’27

Expanding AI usage by teachers also risks undermining (rightful) restrictions on student uses of the technology. Encouraging students to submit their work to Flint for feedback would undoubtedly send mixed messages about appropriate AI use. Why is using Flint OK but not ChatGPT? It would also invite further incidents of full-blown cheating, as students would simply use ChatGPT anyway. Furthermore, how can you decry AI use as dangerous to students while simultaneously encouraging them to use it?

With every further integration of AI, we risk blasting open the already damaged lid of Pandora’s Box. By allowing AI to provide advice now, we pave the way for all feedback to be AI-driven in a couple of years. By allowing AI to create rubrics now, we pave the way for grading to be fully powered by AI. Davis touches on this, predicting, “One day, there will be AI graders and the teacher will assign the point value in the gradebook, but the AI is determining if you completed the task.” As students mistakenly rely more and more on AI, we risk a future in which students submit AI papers to be assessed by AI graders, a future much nearer than most might think. 

Thus, the core question at the heart of this debate revolves around the point of education. Is it a process whose success is solely determined by the results? Or is it a uniquely personal and human undertaking in which the process, the trials and errors of constant practice, are much more important than the final output?

Ultimately, AI is a dangerous technology that must be treated with care. It has the power to profoundly shape the world for the better, but also the capacity to destroy human expression. Thus, EA must tread with caution as we walk the perilous cliffside path of AI integration.




Copyright © 2026 E-Scholium.
