AI: Familiar Territory or Alien World?

Grush: Can we extend that model into the present, with AI issues?

Frydenberg: We can and we do. Now we teach students to apply these same literacy skills when they're using AI, or working in contexts where AI is involved: a bit of how AI works, how to create effective prompts, what to use it for and when it's appropriate to do so, and, more importantly, how to recognize when its results are biased or inaccurate. AI literacy is yet another skill to develop along a path that began with computer literacy in the 1950s. We later expanded the skill set to include information literacy and digital literacy, beginning with the widespread use of the World Wide Web, and technology literacy as our devices connected us to the Internet and to each other.

Grush: So, can learning from our experiences really help us out, or might we be truly stranded in an alien world?

Frydenberg: I don't think we're stranded in an alien world. We just need to learn how to navigate it better. And learning from our prior experiences with disruptive technologies will help.

Grush: How would you characterize the rate of student adoption of AI tools?

Frydenberg: ChatGPT is a good example. It was introduced in November 2022, and the following January, at the start of the spring semester, we asked first-year students at our university how they were using it. Nearly half of the students hadn't heard of it in January 2023; by September of that year, only 7 percent hadn't heard of it, and by January 2024 only 1.5 percent hadn't. The number of students using it for homework assignments also grew, from 3 percent in January 2023 to just under 40 percent in January 2024. All of this highlights how rapidly college students have adopted AI technologies.

Grush: How can you tell whether students are learning with the use of AI tools? How do they benefit?

Frydenberg: One of the best ways to tell whether students are learning in the ChatGPT era is to change the way we assess their knowledge. Reliance on multiple-choice quizzes to evaluate learning must be put behind us now that the Internet and AI are at our fingertips. Educators today need to embrace project-based learning and develop assignments that ChatGPT can't yet solve fully or easily. Educators may also talk more one-on-one with students to get a sense of what they are learning. In an introductory Python class I'm teaching in the fall, I'll replace two mid-semester quizzes with three or four short "check-in interviews" where I ask students about code they wrote or a project they completed. If they can't explain their work, they haven't fully grasped the concepts needed to complete it. And when students can explain the concepts, they're able to tackle more challenging problems.

Grush: If an important goal is to use AI to help students learn more, we're focusing on serving students. Serving our students well means knowing them well. How would you characterize students in relation to technology?

Frydenberg: An article I read last year by Antonios Karampelas on Medium.com classified learners as analog, digital, and now AI natives. Many of us who went to college before the era of PCs and cell phones bought printed textbooks and carried them around! Our teachers primarily lectured, and all our work was done by hand, on typewriters, and later on computer terminals. We were analog natives, with limited exposure to technology.

Today's students are different. Marc Prensky famously calls learners who grew up with electronic devices and the Internet "digital natives." They'll go to Google to do research and use Excel to analyze data. They learn by watching videos, and most of their resources are digital.
