Murdoch Centre for Educational Research and Innovation

The Robots are Coming

Two Year 11s walked past me the other day. ‘House Debating coming up,’ said one. ‘ChatGPT?’ said his friend. ‘ChatGPT,’ was the reply.

As an avid fan of classic 1970s science fiction, I’ve been expecting this day to arrive for many years now. Together with flying cars, phones in watches – not to mention those groovy Logan’s Run jumpsuits – the rise of robots has long been anticipated. Well, the phone is now in the watch, although the car is still firmly on the ground. But what of the robots?

This year, 2023, will go down as the year that artificial intelligence (AI) went mainstream, mainly through the release of ChatGPT, and now GPT-4. Many of us have had fun exploring what it can do, and have been surprised by how good it is. But perhaps we haven’t quite grasped what a seismic change this is to the way we see work.

Broadly, work has been divided into four classifications:

1. Routine, manual labour

2. Non-routine, manual labour

3. Routine, cognitive labour

4. Non-routine, cognitive labour

The accepted wisdom has long been that routine work will be automated (if it hasn’t been already), while the non-routine will remain safely in the hands of us humans – and that is where the jobs of the future will lie. This accepted wisdom is now facing some serious challenges.

We can think about AI as part of a continuum that began with the Industrial Revolution. That saw the replacement of humans doing routine manual labour, such as weaving cloth, with machines performing the same function. This spread until almost all routine manual labour, whether it’s baking biscuits or manufacturing cars, had been automated.

The next target of automation was the routine cognitive category: the jobs that required brain power from a human, but were not overly creative. I remember banks full of tellers, stamping our bank books; now they have been replaced by one or two tellers and automated, online processes. However, for a long time, it was felt that non-routine cognitive labour, like non-routine manual labour, was safe from the machines. Just as a machine could never replace an artisan making a hand-crafted chair, so too it was felt that the jobs of writers, lawyers, and even teachers could not be replaced by a computer algorithm.

That reality is now seriously under fire. Jobs such as journalism, which have long been thought immune to an algorithmic take-over, are now vulnerable. Speaking of vulnerable jobs, I was writing a test for my Year 10s the other day. I asked ChatGPT – just as an exercise, you understand – to write a passage in Latin, about 100 words long and containing passive verbs, about Emperor Vespasian. In seconds a response came back. The Latin was correct, if a little dull, and quite within the range of a Year 10 student. But the content was dreadful: the program had taken the request as an invitation to write a bit of fan fiction about Vespasian, which bore no resemblance to reality. I reworded the request but still got rubbish – in quite good Latin. I went back to my own composition, based on the historian Suetonius. But it would be foolish to chalk this up as a win to the humans. ChatGPT is still in its infancy (already its successor GPT-4 is a major improvement), and while we might be winning a wrestling match with a toddler now, once it has grown up and matured it will be a different matter. I suspect that my ability to write in another language will not be a marketable skill for much longer.

We might now be looking at a cycle where students get ChatGPT to write their essays for them and teachers ask ChatGPT to mark them and provide feedback for improvement: labour-saving for all concerned, but rendering the activity totally pointless.

However, I’m not one of those advocating banning it in schools. First, this could never work; we might be able to control what happens in the classroom, but as soon as the student is out the door that control disappears. Second, a ban could lull us into a dangerous state of complacency, where we fool ourselves that AI doesn’t exist and therefore we don’t need to adapt to deal with it. We need to learn how to live with it, and maybe even make it work for us, because it is not going away.

We need to remind ourselves that it is still important to know things. Calculators did not render mathematical thinking obsolete, and knowing basic facts and skills helps the understanding of advanced facts and skills; likewise, it is still important that students learn to think clearly, and teachers need to see student thinking in detail and be able to offer bespoke, not generic, feedback. The essay as a ‘thing produced’ may well soon be imitable by the bot, and thus obsolete, but tracing through logic and reasoning is not.

At a basic level, this means that the handwritten examination is here to stay. The take-home essay or assignment, which has become increasingly vulnerable to cheating in recent years, will soon become untenable (this is a greater threat to universities; schools still have examinations for their high-stakes testing). Indeed the essay, which has been regarded as the hallmark of education in the humanities for over two centuries, may now finally have reached its end. We will need to think of other ways to test a student’s knowledge and skills, ways that test the unique ability of the student. Rather than simply teaching students facts, teachers need to be the interpreters of these facts – the go-betweens – for the students, and to emphasise the importance of social intelligence, individual agency (organisation, self-drive), leadership, and critical and strategic thinking.

Just as we can have a mass-produced chair, made by a computer program and cut by precision machines from manufactured wood, or a chair handcrafted by a skilled artisan, whose skill and artistry produce something bespoke and unique, so we need to come up with products of learning that are bespoke and unique to the student. That is the next great challenge for education.