Vol. 22 - teaching code in the age of AI
I'm embracing the challenge of teaching programming while navigating generative AI's impact on education.
Hey — it’s been a while. Time has been a flat circle, the kind that is spinning superfast, and I was hanging on for dear life. And by “was”, I mean “am still”, but I also decided to stop postponing a little writing session.
So I am teaching programming to biology undergrads this term, and generative AI is a thing that exists now. It exists so much that when you install VS Code, it will prompt you to configure it before picking a color scheme (Monokai Pro, of course) or a font (Iosevka).
And when something is so in-your-face, when a novice is presented with a shortcut before they are presented with the basic functionalities of the tools, it’s a good time to be an educator. Because generative AI is not making us obsolete: it is making us essential.
First: we need to teach the basics.
One of the things I find most frightening about LLM-assisted coding is that the generated code is terse. It’s good! But it relies on many abstractions and a lot of syntactic sugar, and unless you already know the language well, it will mean nothing.
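To make this concrete, here is a small, hypothetical example (the task and the sequences are mine, not something from a real class assignment): computing GC content for a few DNA sequences, once in the terse, comprehension-heavy style an LLM tends to produce, and once in the explicit style a novice can actually follow.

```python
# Hypothetical data: a couple of named DNA sequences.
sequences = {"gene1": "ATGCGC", "gene2": "ATAT"}

# Terse version, the kind of thing an LLM often generates:
# correct, idiomatic, and opaque if you don't know dict
# comprehensions and str.count yet.
gc_terse = {
    name: (seq.count("G") + seq.count("C")) / len(seq)
    for name, seq in sequences.items()
}

# Explicit version: the same computation, every step visible.
gc_explicit = {}
for name, seq in sequences.items():
    count = 0
    for base in seq:
        if base == "G" or base == "C":
            count += 1
    gc_explicit[name] = count / len(seq)
```

Both versions produce the same result; the difference is that the second one can be read line by line by someone who only knows loops, conditionals, and dictionaries.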
I once spent a 3-hour cooking class cutting carrots. Not because I was training to be a professional carrot-cooker (I’m pretty good, admittedly), but because carrots are good material for practicing the basics of cutting. How do you hold the knife, how do you not strain your wrist, what’s a brunoise? The act of cutting carrots didn’t amount to anything but the act itself.
It’s the same thing with programming: it will get very complicated eventually, but you can’t jump into the complicated thing before you understand the most basic aspects.
Second: we need to discover what is an acceptable use of these tools.
The reality is that students will use these tools, and so our role is to make sure that they use them responsibly. There are some situations where generative AI should never be used. But saying this is different from saying that it should never be used at all (which may yet be true, if only for its human and environmental cost).
But prosaically: there is a tool, it is available to learners, and our duty is to make sure that they understand how to use it. The way I am currently doing this is through assessment, asking students to share their chat sessions with whatever engine they use. Cultivating a critical sense, and seeing it in action when students decide what questions to ask and what answers to accept, is far more important than making sure they know how to write a for loop.
The only way to do this is to engage our students as fully formed individuals, understand why they are using these tools, and how, and engage in dialogue to figure out what the “right way” is. Because we do not have the answers yet, and this puts us in the (sadly) uncommon position of having to learn and figure it out, instead of having to teach and explain.
Conclusion: we need to teach the way to learn.
One thing I tell students with alarming regularity is that I'm not much concerned with how much data science, mathematical modeling, or network science they will remember in a year or ten, because they will be able to open a book and read about it again. But I care deeply about making sure that they develop two certainties: they can learn hard concepts, and they can use them to do hard things.
Everything else, the material conditions in which the learning takes place, is something we can put to the service of learning how to learn well. Maybe there is a role for AI there. Maybe there isn’t. Maybe there is a different answer for each learner and each instructor.
And this is where we, as instructors, become essential. Who else will do this hard work?