AI’s hidden cost in higher education
When the teacher entered the class and asked a routine question about business models, half the students at the University of Engineering and Technology (UET) instantly bent their heads — not in thought, but to type a query into their phones. Within seconds, answers flashed on their screens. Some students confidently read them aloud, word for word. Others covertly murmured AI-generated responses, hoping to sound spontaneous. The teacher, well aware, could only sigh in disappointment.
“Students never prepare for their quizzes anymore,” said Eman Fatima, a student of BBIT at the Institute of Business Administration, UET Lahore. “Whenever the teacher asks a question, we just use ChatGPT or other AI tools, copy the answer, and read it out. Hardly anyone studies the topic in advance.”
Her observation is echoed across campuses. At the College of Ophthalmology and Vision Sciences, affiliated with King Edward Medical University (KEMU), a similar pattern is emerging. “Students reach classes unprepared, and when teachers ask them something, they either openly or secretly consult AI tools on their phones,” said Afshan Ahmad, a student of the college. “They don’t even bother to read the material later. The knowledge they gain this way vanishes quickly — and they need AI again to answer the same questions.”
Artificial Intelligence, once hailed as a tool to enhance learning, is now being used as a shortcut to bypass it. The convenience of instant answers is cultivating a new kind of dependency among university students — one that is quietly eroding their capacity for deep learning and independent thought. From business studies and computer science to medicine and engineering, students are increasingly outsourcing their thinking to machines.
“AI tools like ChatGPT are wonderful for brainstorming, but they are also dangerous if used uncritically,” remarked Dr Ayesha Naseer, Assistant Professor at UET. “Students no longer engage with the subject matter. They don’t analyse, reflect, or form opinions — they just reproduce what the AI tells them. That’s not learning; that’s parroting.” She noted that this trend accelerated after the COVID-19 pandemic, when online learning became the norm and digital tools became integral to education. “What began as support during remote education has now turned into an addiction,” she added. “Many students can’t even write a paragraph without seeking help from AI.”
Teachers across institutions have started to notice an alarming gap between the quality of written assignments and the depth of understanding displayed during classroom discussions. “At first, I was impressed with the improvement in students’ written work,” said a lecturer at the University of the Punjab’s Department of Mass Communication. “But when I started questioning them orally, they couldn’t explain even the basic concepts they had written in their assignments. It became clear that AI was doing the heavy lifting.”
He believes that this reliance is undermining critical thinking — the very foundation of higher education. “Education isn’t about just finding the answer. It’s about exploring why that answer matters, what alternatives exist, and how it connects to other ideas. AI can provide information, but it cannot replace human reasoning and curiosity.”
For students, however, the temptation to depend on AI is difficult to resist. University life is fast-paced, demanding, and often stressful. With tight deadlines, part-time jobs, and social commitments, many view AI as a time-saving companion. “It’s just faster,” admitted Hassan Raza, a computer science student at Lahore Garrison University. “If I need to write an essay or prepare for a quiz, I can ask AI to summarise the topic or even generate complete notes. It saves me hours.”
Yet even he concedes that over-reliance has consequences. “Sometimes I don’t remember what I submitted the next day,” he said with a laugh. “It’s like borrowing knowledge — not owning it.”
Teachers are increasingly frustrated as they struggle to balance the integration of AI into learning with the need to preserve academic integrity. Some have started conducting oral tests and surprise quizzes to assess genuine understanding, while others have resorted to AI detection tools — though their accuracy is still questionable. “Universities need clear policies,” argued Dr Rabia Malik, a senior educationist and curriculum expert. “We can’t ignore AI, but we must regulate its use. Students should learn how to use it responsibly — as a research assistant, not a crutch.”
She believes that institutions should introduce AI literacy courses to help students understand the ethical and intellectual boundaries of these technologies. “They must be trained to verify AI outputs, think critically about their suggestions, and use them to enhance creativity, not to replace learning. That’s the only way to coexist with this technology productively.”
The issue is particularly worrying in professional disciplines like medicine and engineering, where practical application of knowledge is vital. “If a medical student memorises AI-generated answers without understanding anatomy or pathology, that’s dangerous,” warned Dr Sajid Latif, a professor at KEMU. “You can’t Google your way through a surgery or a diagnosis.” He stressed that professional competence depends on deep understanding, not quick retrieval. “AI can assist professionals, but only if they already possess the foundational knowledge to judge its accuracy.”
Experts agree that banning AI altogether would be both impractical and unwise. Instead, the goal should be to encourage ethical and informed use. “AI should be used to enhance creativity, to analyse data, or to simulate complex systems,” said Dr Naseer. “But it should not replace reading, reflection, and reasoning.”
Back in her BBIT class, Eman Fatima acknowledges the dilemma many students face. “We all know it’s wrong to depend too much on AI,” she admitted. “But everyone’s doing it, and teachers don’t always stop us. Maybe if we had more interactive classes or practical projects, we’d study more on our own.”
Her words capture the heart of the problem. The issue isn’t technology itself — it’s how it’s being used. Without conscious effort from both students and educators, Pakistan’s higher education system risks producing graduates who can skilfully prompt an AI for answers but struggle to think, analyse, and create on their own.