Since the emergence of artificial intelligence (AI) onto the world stage, many people have marveled at the advent of a new age in which AI can perform tasks with unprecedented accuracy, efficiency and speed. Others, however, have raised alarms about the potential risks and threats that AI poses to humanity. Chief among these concerns is the threat of mass displacement of human workers by AI, which could lead to widespread job loss and financial instability, ultimately resulting in social and economic disaster.
Despite this skepticism, many individuals and governments wholeheartedly embrace AI, believing that the future of the world lies in it. Integrating AI into education (AIEd) is an emerging area that has gained tremendous traction in numerous advanced societies. Here in Hong Kong, the Education Bureau (EDB) has already rolled out comprehensive plans to incorporate AI across the primary, secondary and tertiary levels. For example, the EDB has assisted publicly funded primary and secondary schools in offering coding and AI education courses, which cover essential topics such as data security, privacy, the ethical issues of AI, and its impact on manpower and society, with some courses specifically tailored for junior secondary students and their teachers. All these initiatives are commendable strides towards nurturing a new generation of AI-literate individuals ready for a massively digitalized world in the age of AI. That said, I believe further efforts are needed to educate both educators and students, beyond foundational AI knowledge, on the importance of cultivating a balanced, realistic and vigilant mindset when using AI as a tool for human-centric tasks. At the end of the day, we as humans must remain in control of our own fate rather than cede it to machines or algorithms, and students should not regard AI as a panacea for all tests and exams, treating it like CliffsNotes from which they mindlessly copy the ‘model’ answers.
A recent incident involving AI ‘hallucination’ in Hong Kong has served as a rude awakening to the consequences of the mindless use of AI in education. In this incident, a social sciences professor at the University of Hong Kong resigned from his position as associate dean after unknowingly endorsing non-existent citations generated by AI in a research paper written by his graduate student for an academic journal. This incident carries a number of profound implications for how Hong Kong’s education sector should move forward.
Firstly, this incident has likely left many, including educators, questioning whether AIEd is overhyped and overused. The phenomenon of so-called “AI hallucinations” may represent a ticking time bomb waiting to explode. To guard against hallucinations, must individuals ultimately bear the responsibility of fact-checking data and information themselves? If so, does AI create more work for us rather than less?
Next, the HKU saga could be an ominous sign of the consequences of placing blind faith in AI within the realm of education. On one hand, the transformation that AI has brought to campuses and classrooms cannot be overstated. It relieves teachers of the burdensome workload of planning lessons, designing tests and grading assignments, while providing students with instant feedback on math problems and written essays. On the other hand, an increasing number of people have come to recognize that AI is not infallible, regardless of how rapidly generative AI and large language models (LLMs) have evolved to become increasingly powerful and sophisticated. This recognition is grounded in the good old adage of “garbage in, garbage out” (GIGO), which arguably remains highly relevant in the AI landscape; a single piece of faulty data used in training an AI model might become the proverbial bad apple that spoils the entire basket. Moreover, the predictive power of LLMs is only as robust as the quality of their underlying statistical analyses and algorithmic computations.
Last but not least, the HKU incident may reflect that we are growing complacent and lazy in the face of AI. In schools, students are often found offloading to AI the effort of learning and conceptualizing complex ideas. This trend is easy to understand. Take essay writing as an example: AI tools can produce an essay in mere seconds, as opposed to the hours a student would take to perform the same task. As a result, both secondary school and college students have become accustomed to using ChatGPT for essay writing, and over time this will cause their critical thinking and cognitive retention to diminish.
The points outlined above in fact present a dilemma to educators. While they seek to leverage AI to enhance teaching dynamics, foster classroom engagement and improve students’ learning outcomes, there is an underlying concern that as AI becomes increasingly intelligent and human-like, students may rapidly lose their cognitive abilities, which is certainly not what educators want to see.
Certainly, AI opens up unprecedented possibilities in the field of education. For example, teachers employ AI-enabled apps and AI-generated content to make classrooms lively and engaging, and to help weaker students regain their confidence and interest in learning. However, the prevalence of students using AI tools to complete take-home essays and conduct research has prompted our tertiary institutions to issue stern warnings against AI-assisted cheating. To address the misuse of AI on campus, I believe our universities should consider adopting more real-time, face-to-face oral tests and exams to assess students’ comprehension of key concepts and their ability to articulate their points of view eloquently and compellingly. This practice is already mandated by a growing number of educational institutions in North America. Admittedly, humans tend to do less rather than more when given the opportunity, a tendency captured by the linguistic principle of least effort. Therefore, the EDB might explore providing schools, alongside core AI courses, with assistance and incentives to design non-AI-assisted coursework and tests aimed at evaluating students’ critical thinking and problem-solving abilities, two critical attributes of modern-day educated individuals. In this way, we can fully realize the potential of AIEd.
Scott Cheng, public affairs consultant, and part-time lecturer teaching advanced business English writing at HKU SPACE