The world seems entranced by artificial intelligence. The recent controversy over ChatGPT and other apps such as Bearly.ai has prompted some academics to rethink teaching and testing. It is time to confront the question of whether our way of life will be at risk in future because of AI.
It is far too early to conclude that university education is outdated because of AI. In many ways, AI has already outsmarted human intelligence and it is futile to try to catch up with it. The real threat is not the AI technology itself but its potential impact on human behaviour and wider society.
As computer scientist and philosopher of technology Yuk Hui argued in a 2021 article, the very definition of AI may already imply its limit: there is no generally accepted theory of intelligence and data, and AI is only one way of modelling it. This view echoes Harvard psychologist Howard Gardner's theory of multiple intelligences, which challenged the then-prevailing assumption that IQ was the sole measure of human intelligence.
Why is essay writing still important? For any learning to take place, people need a firm grasp of language, which is the symbolic tool to mediate thinking and express meanings in a real social context. We are no longer in the Middle Ages, but there is more than a grain of truth behind the curriculums of medieval universities.
Scholars learned the trivium (grammar, logic and rhetoric) before the quadrivium (arithmetic, geometry, music and astronomy) - that is, they learned the word before the world. Also, the format of essay writing presupposes that students communicate ideas and argue their points rationally in public, just as Martin Luther wrote up his theses to defend his theological views against the Catholic Church.
However, the technology of ChatGPT assumes a very narrow view of learning, and universities have good reason to ban it. Why does this ghostwriting machine exist in the first place? The business model of ChatGPT is premised on two anti-education beliefs of which Elon Musk might be enamoured.
First, a student is utilitarian, lazy and selfish by nature. He or she desires good grades but is tempted to cut corners, outsourcing mental effort to an artificial ghostwriter. Second, students might think the knowledge required for an assignment is too trivial to be worth acquiring, and that it would be a waste of time to go through the painful steps of brainstorming, thinking, failing and trying again in the process of writing. The question is whether this is the moral compass we want to implant in our students' minds.
In fact, even before ChatGPT, the use of algorithms had sparked controversy around the world. In 2020, for example, the British government's use of an algorithm to determine GCSE and A-level grades provoked a public outcry over its bias and errors.
According to the BBC, the UK's Office of Qualifications and Examinations Regulation depended on two sets of data to estimate A-level grades: how the students were ranked within a school and how their school had performed in recent years. However, the algorithm was found to be inaccurate in predicting grades. In the end, the scheme was scrapped.
In Australia, the government launched an automated online welfare compliance system known as Robodebt in 2016. The system was found to have issued incorrect debt notices to welfare recipients, leading to several legal challenges and a public inquiry.
From such incidents, it is clear that the idea that algorithms alone can create a smarter future is just a pipe dream.
AI will not render traditional educational methods obsolete, just as the invention of calculators did not make school maths fade into oblivion. The alarm over ChatGPT is a symptom of educators' existential anxiety.
We must ask: do we give students the chance to scratch beneath the surface and appreciate the complexity of social reality, beyond black-and-white thinking? Do we impart knowledge that is challenging enough so a student sees it is worth the intellectual effort to produce a good essay on their own?
It is unwise to spurn the idea of using technology in class, but ChatGPT is forcing us to jettison the mechanistic mode of working in universities and diversify learning activities. After all, nobody wants to learn and work like a machine.