Academic Pressure Drives Students to Cheat

Advika Singh & Kaveri Dole || STAFF WRITERS

In Andover High School’s competitive environment, students juggle demanding courses alongside extracurricular activities and personal responsibilities. When the workload becomes overwhelming, a growing number of them turn to Artificial Intelligence (AI).

As technology becomes ever more woven into everyday life, AI tools have grown increasingly accessible, and their presence in schools has sparked controversy. As more students begin to use AI, the need for clear guidelines, especially in an academic setting, becomes more obvious.

Patrick Benjamin, a sophomore at AHS, has noticed the impact of these unclear AI guidelines. “The AI rules for each class are different,” he said, reflecting how blurry the line between assistance and misuse can feel, which leads to inconsistent usage across classrooms. This uncertainty may be why AI use persists: students are unsure how to navigate expectations that differ from teacher to teacher.

Freshman Bhavika Sharma points to stress as the main reason students turn to AI shortcuts. “Most people just get very stressed and pressured because they have so much work,” Sharma said. For students with overlapping deadlines and long nights of homework, AI can appear to offer relief by helping them complete their tasks. That relief can range from streamlining tedious busywork to outright plagiarism, something teachers are becoming increasingly wary of.

Guidance counselor Kimberly Bergey has also noticed the impact of rigorous schedules on students. “A lot of students feel like there aren’t enough hours in the day,” Bergey said. At a competitive school like AHS, academic pressure runs high: the average AP score at AHS is 3.66, compared with a national average of 3.06. These heightened expectations often fall on the shoulders of students who strive for academic achievement.

However, irresponsible AI use is not the solution. “I think cheating prevents you from retaining information, and it can negatively affect you in the future,” said Latin teacher Laura Jordan. Irresponsible use includes having AI write entire essays or assignments and submitting them as your own, using it to answer test questions in real time, letting AI summaries replace the original text, and building off of AI research without fact-checking. Although many factors can affect a student’s decision to use AI, the choice is ultimately the student’s, and too often it results in abuse.

On the other hand, AI is not always a shortcut; some curious students use it as a resource rather than a means of plagiarism. “I will occasionally get AI to create practice problems for me resulting in me scoring better in tests,” said junior Selina Amere. Other responsible uses include, but are not limited to, getting explanations of difficult topics, asking for hints or feedback rather than full answers, summarizing notes you have already written, and using AI text-to-speech or speech-to-text tools.


  • Related Posts

    Steve Zrike Appointed MA Secretary of Education

    Avery Slaughter || ONLINE EDITOR

    Steve Zrike was appointed Massachusetts Secretary of Education by Governor Maura Healey on February 10. He will assume the position on February 13.

    Zrike will succeed Patrick Tutwiler in the role. As Secretary of Education, Zrike will oversee the Executive Office of Education, which is responsible for managing pre-elementary, K-12 and higher education across the state.

    “My responsibility is to the children of the Commonwealth,” Zrike said. “I just want to make sure that that is clear–that at the end of the day, the job is about improving the student experience in our public institutions across Massachusetts.”

    Zrike is currently the superintendent of Salem Public Schools. Previously, he also served as superintendent of both Holyoke and Wakefield. He has held various other positions in school districts across Massachusetts.

    “I feel like I’ve had a lot of experiences in a lot of different types of communities,” Zrike said. “I’m going to rely on the different relationships and different experiences I’ve had across all those places. Of course, I have a lot to learn, and I’m excited to get started with better understanding the many different types of education programs that exist across Massachusetts.”

    An Andover resident, Zrike began his career in education as a fifth grade teacher in Andover Public Schools. Before that, he studied at Dartmouth College and the Harvard Graduate School of Education.

    “Andover was an amazing place for me to start my career,” Zrike said. “I learned a lot from the people that I worked with. I’ve carried that experience with me since.”

    AI, Wikipedia Share Problem

    Samin Faiz & Avery Slaughter || STAFF WRITERS

    Recent developments in the AI industry have taken our generation by storm.

    With AI-generated responses taking over Google searches and a pocket-sized AI assistant just one click away for many, it’s not unreasonable to say that our robotic friends are becoming increasingly integrated into our daily lives. This is no surprise, considering the attractive list of benefits AI brings to students.

    “It’s helpful at times for quick answers,” explained senior Vignesha Jayakumar. “You don’t need to go to a site and get bombarded with ads and all that cookie nonsense.”

    Despite its brevity and user-friendliness, AI is notorious for its shortcomings. According to a 2025 study from Columbia University, as well as the past experiences of many, the majority of leading AI engines give answers that are either partially or completely incorrect. The fabrication of information is a phenomenon known as “hallucination,” and it’s a trap that many students fall into while researching.

    “It’s very inaccurate at times. Sometimes, AI just spouts out hallucinations when they feel like they’re expected to give an answer,” Jayakumar noted.

    The unreliability of AI seems oddly familiar to certain AHS faculty. Having taught courses that place an emphasis on research, history teacher Ruth Masters shares her experience with a well-known source that presents a similar threat to students.

    “It’s a place to start, kind of like when I’m teaching research and students ask, ‘Can I use Wikipedia?’ I’ll say, ‘No, but you can go to the bottom of an entry and use those sources.’ I think of it in the same vein as that.”

    Jayakumar sees the parallels between AI and Wikipedia, as well. “Every teacher has their own guidelines for research,” he added. “But Wikipedia is a one-stop shop for information; Wikipedia is unreliable, AI is unreliable, so maybe we should treat them the same.”

    Many newer computers come with a built-in button that summons Microsoft’s compact AI assistant, Copilot, a feature that cannot be disabled. Furthermore, search results and AI-generated responses seem to come as a package deal nowadays, with Google’s AI Overview lighting up your screen in place of the top results. So let’s face it: AI is nearly impossible to avoid, even for those who do not intend to use it. This raises an important question: is AI use considered a form of academic dishonesty if it’s unintentional or unwilling? To what extent should AI use be allowed before it crosses the line into cheating?

    “Sure, read it. But that’s not your source,” asserted Masters. “You’ve got to go digging for your information. So do I think it’s unavoidable? I think that’s a choice. And if you choose to use it in lieu of your own critical thinking, then yeah, I think that’s academic dishonesty.”

    According to a 2025 study from Pew Research Center, only one percent of people habitually verify linked sources when provided with a Google AI summary. Instead, they trust that the information is accurate—the exact opposite of academic honesty.

    “The thing is that students have to be proactive in making sure that what they’re taking from that information is accurate and reliable, just like how you would go to any news source and check whether they’re saying something biased or not biased,” reasoned Jayakumar.

    This easy access to often unreliable information may help explain its high rate of use, especially among students. AI does, after all, seem like a bottomless well of information. Accurate or not, it’s undeniably a temptation that countless students succumb to.

    “You know at some point, students need to learn self-control, right? There’s probably stuff in your house that is one step away from your use that you ought not be using,” Masters said.

    It’s easy to fall prey to the allure of AI searches in a world where, according to both Google’s AI Overview and a verifiable report from Forbes, the economy itself relies on companies like OpenAI. Teachers like Masters believe it still falls on students to use good practice when verifying their information, especially in the face of such temptations.

    “I think that our role as high school teachers is to teach critical thinking… and I have a lot of concerns with students’ decreasing abilities to intellectually wrestle with information.”
