
AI—The New Steroid Injecting Education

  • Patrice Hamilton
  • May 28
  • 6 min read

By Patrice Hamilton

Professor of English, CT State Tunxis



Illustration for the project "KI und Wikipedia" (AI and Wikipedia), text generation section. Creative Commons license viewable at <https://commons.wikimedia.org/wiki/File:Cyborg_typing_text.jpg>.


Let’s get this out of the way: Artificial Intelligence can do amazing things. It can save us hours of time, sift through a seemingly endless supply of research, chat with us when no one else wants to, and create outlines, press releases, and ad copy. It can rewrite our emails, structure our resumes, and, in the medical community, suggest diagnoses in record time that may save lives.


In education, students like it because they can finish a homework assignment or write an essay in minutes instead of hours. And for teachers, AI can generate lesson plans and rubrics, compose quizzes, design projects, and even grade student essays and exams. Those who use it say this gives them more time to be better teachers in the classroom and that the feedback AI provides can be very similar to the comments they would make themselves. I can see the argument that using an AI program isn’t all that different from having a teaching assistant do one’s grading, and who wouldn’t love to have a TA? And now Google is giving students a free one-year subscription to its premium version of AI, Gemini Advanced, which includes 2 TB of storage. Google is willing to do this because its goal is to increase users, much the way cigarette companies once used the cartoon image of Joe Camel to attract young, lifelong customers.


But I have some concerns.


First, higher education is under a microscope right now, which isn’t a bad thing because we could certainly make improvements. However, with the Department of Education being dismantled, now more than ever our society needs to believe in the importance and relevance of a college degree. We’re all familiar with what happened when every child who played a sport went home from the end-of-the-season banquet with a trophy. It was a well-intentioned idea, but the concept backfired because many children didn’t value the awards they received. Even at a young age, they needed to feel that they earned that ribbon or statue. What happens when college degrees are seen as another form of participation trophies? If students can get AI to write their papers, complete their math homework, fill out their lab reports, and answer their exam questions, getting a degree won’t be that big of a deal. Graduates need to feel that sense of accomplishment that comes with sweat and sacrifice, and professionals looking to hire should see graduation as a significant achievement as well.


Second, most of us know one of the problems steroids create in sports: once some athletes use them, the pressure increases on other athletes to use them just to stay competitive. I worry about this when it comes to education. Many students want to do their own work, and if they use AI or sources on the internet, they use them to enhance learning rather than replace it, but how will their efforts stack up against content created by an algorithm? After all, those students are only human.


I spoke with a colleague who teaches math, who told me about a math major who had taken all of their upper-level math classes as online asynchronous courses; yet when that student applied for a tutoring job, they were unable to pass a basic algebra test.


When I attend conferences, I listen to advice about what instructors can do to rewrite their assignments to discourage AI use, and yes, we should do what we can. But this puts the burden on the teacher to keep revising assignments, which is not only time-consuming but also very difficult, given how quickly AI keeps evolving and how the number of websites that host research papers and exams keeps growing. Of course, we could use AI to revise our exams, but this all starts to feel a bit Orwellian: AI writes the paper assignments and exam questions, AI is used to respond to those assignments and exams, and AI grades those papers and exams. The only thing missing is human beings. And if we do use AI to grade student work, we would have to inform students of that. How will this affect their motivation to turn in authentic work?


As a professor, I can find it dispiriting to read a batch of student work, especially in an online class. With asynchronous courses, the only way I get to know my students is through their writing because I don’t have the opportunity to meet them in a classroom. When reading discussion posts, I used to get a sense of their personalities from their word choices, their individual examples, and their unique perspectives. Each discussion post had a different voice. Sure, the writing was flawed at times, and yes, I used to complain about wordy sentences and vague statements, but I felt that I knew my students. I miss that now. The other day, a colleague said, “I asked my class to write a literary analysis, and many of them turned in nearly perfect papers, but when I asked them what literary analysis is, only one student could come up with an answer.” The students had supposedly done literary analyses, but they didn’t know what they had done.


There’s no such thing as a free lunch, and someday I think we will pay a price for the steroid we call AI. Someone who graduates with a college degree shouldn’t have to worry about the impression their writing will make when they can’t get a fix from an AI program.


It’s tempting to get on the AI train and see where it takes us. After all, we can’t be afraid of technological innovations. Maybe it’s not important that tomorrow’s leaders know how to write or calculate answers to problems. Maybe all they need to know is how to craft good prompts and edit the results. But I think we all can agree that they do need to know how to think, and AI is eager to do that for them as well. Should everyone who uses AI review it for errors and accuracy? Sure. Will they? Sometimes.


Another concern is whether business executives will want to pay top salaries to employees who completed their classes with the help of AI. Will those high GPAs carry the weight they used to? We’re all keenly aware that graduates need substantial salaries to pay off those student loans.


I’m not suggesting we get rid of AI, because the benefits it offers are important, and much like social media, we can’t put the toothpaste back in the tube. We have to learn to manage both AI and social media so that they add to our lives without creating dependency. Artificial Intelligence needs to be regulated. We need clear identification of all AI-created videos, text, images, and audio, and we need regulations that hold platforms responsible for hosting exams, student papers, and deepfakes. We need to require students to cite the AI they used just as they would any human source, and we need better tools that spot AI plagiarism, so that when we talk to students, we can refer to reliable evidence.


Teachers are advised to use their intuition when it comes to student work, but this is very difficult because with programs like Grammarly, we may never get to know students’ authentic writing voices, and the last thing we want to do is falsely accuse a student. Many instructors have students do more of their work during class, which helps, but at the end of the day, some work has to be done outside the classroom, and in an online course that solution is impossible to implement.


I also think that colleges need more limits on the number of classes a student can take online, and I would like to see more hybrid courses offered so that students are required to demonstrate their knowledge in front of an instructor once or twice during a semester. One major reason students choose asynchronous classes is the flexibility they offer, which is especially important to students who juggle their courses along with work and family responsibilities, but we can find a way to provide that convenience while also ensuring that students are learning what we think they are learning. I have heard that some math departments are leaning in this direction, and I hope that continues.


Social media companies like Meta and X are reducing fact-checking and putting the responsibility on the user to call out misinformation. We can’t rely on the gatekeepers of the past, which means we need a population with information literacy skills, and we all have to protect ourselves from becoming intellectually lazy. People tell me that they frequently use AI at work; it saves employees time and the company money. I think that’s great, but education is a different matter. We are teaching students not only to accomplish a task but to deepen their intellects. AI shouldn’t be a crutch, and we shouldn’t let it become an addiction. I would rather read my students’ work, warts and all, than read some artificial concoction that is bulging with muscles but has no beating heart.







Professor Patrice Hamilton teaches full-time at CT State Tunxis and specializes in English and Communications.




 
 
 


