7 Questions on Generative AI in Learning Design
Another example: I've used ChatGPT to generate course and training outlines, and to generate learning objectives that clearly communicate to learners what they're meant to be learning within a specific module. And the interesting thing about it is that you essentially have a co-writer who is never upset with your feedback. When I asked ChatGPT to generate some learning objectives, I had to tell it, "You're on the right track, but those objectives are not measurable in any way, shape, or form. I'm looking for an action verb that will indicate that learners have actually demonstrated that they've learned something." I gave it some examples of what those verbs might look like, and it rewrote the objectives in a way that was very competent. Having an AI companion that knows how to write in multiple formats means I don't necessarily have to spend my time trying to remember, what are the elements of a measurable objective? Are we framing these like SMART goals?
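That feedback loop can also be scripted. Here is a minimal sketch of that kind of back-and-forth using the OpenAI Python client; the model name, topic, and prompt wording are illustrative assumptions, not a prescription for any particular workflow.

```python
# Sketch: draft learning objectives, then push back until they use measurable verbs.
# Assumes the OpenAI Python client and an OPENAI_API_KEY in the environment;
# the model name and prompts are placeholders for illustration.
from openai import OpenAI

client = OpenAI()

messages = [
    {"role": "system", "content": "You are an instructional design co-writer."},
    {"role": "user", "content": "Draft three learning objectives for a module on data privacy basics."},
]
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# Feedback pass: ask for observable, measurable action verbs, as described above.
messages.append({
    "role": "user",
    "content": (
        "You're on the right track, but these objectives aren't measurable. "
        "Rewrite each one to start with an observable action verb such as "
        "'identify', 'classify', or 'demonstrate'."
    ),
})
revised = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(revised.choices[0].message.content)
```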
CT: How can AI tools be used to help improve accessibility when building course content?
Vaughn: AI transcription and subtitle generation have gotten consistently better. I love to use the transcription feature in a little-known app called Microsoft Word: I can upload an audio or video file into Word, and then it comes up with its best approximation of a transcript. It recognizes multiple speakers — this is pretty common for most tools like this now — and I can copy the transcript right into the document and edit it in an environment that I'm familiar with. And now I have a full-blown transcript, I have the audio or video file as part of the document, I can copy and paste that transcript into a video streaming service, and nearly every video streaming service will automatically turn that transcript into subtitles or captions for my video. It just saves so much time at this point that it's almost like, why wouldn't you do this?
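If a platform won't do that last conversion for you, the transcript-to-captions step is straightforward to do yourself. Here is a hypothetical sketch that turns a list of timed transcript segments into a WebVTT caption file; the segment data is invented for the example.

```python
# Sketch: convert timed transcript segments into a WebVTT caption file
# that most video platforms accept. The segments below are made up.

def to_timestamp(seconds: float) -> str:
    """Format seconds as an HH:MM:SS.mmm WebVTT timestamp."""
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{int(hours):02d}:{int(minutes):02d}:{secs:06.3f}"

def transcript_to_vtt(segments: list[dict]) -> str:
    """Build a WebVTT caption document from start/end/text segments."""
    lines = ["WEBVTT", ""]
    for i, seg in enumerate(segments, start=1):
        lines.append(str(i))
        lines.append(f"{to_timestamp(seg['start'])} --> {to_timestamp(seg['end'])}")
        lines.append(seg["text"])
        lines.append("")
    return "\n".join(lines)

segments = [
    {"start": 0.0, "end": 3.2, "text": "Welcome to the module on data privacy."},
    {"start": 3.2, "end": 7.8, "text": "Today we'll cover what counts as personal data."},
]
print(transcript_to_vtt(segments))
```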
I've also seen some excellent applications using image recognition software, if I'm having trouble coming up with an alt text description or a caption for an image. It's hard for me, as someone who's always been a sighted person, to think of how to describe an image to someone who cannot see. And having that extra bit of AI — being able to leverage those tools to dramatically reduce the amount of time that it takes to create accessible content by default — it's a wonderful gift. I don't think nearly enough people look at it that way. It's incredible that we're able to do this.
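As a rough illustration of that alt-text workflow, here is a hedged sketch using a vision-capable model through the OpenAI Python client. The model name and image URL are assumptions for the example, and the output is a draft for a human to review, not a finished description.

```python
# Sketch: ask a vision-capable model for a first draft of alt text.
# Assumes the OpenAI Python client; model name and URL are placeholders,
# and the result should always be reviewed and edited by a person.
from openai import OpenAI

client = OpenAI()

def draft_alt_text(image_url: str) -> str:
    """Request a concise, screen-reader-friendly description of an image."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Write one sentence of alt text describing this image "
                         "for a learner using a screen reader."},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    )
    return response.choices[0].message.content

print(draft_alt_text("https://example.com/course-diagram.png"))
```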
I will end with the caveat that we should not just let the AI do everything. While it's a competent writer, it's not a great writer. I've seen plenty of issues with generated captions and subtitles being accidentally inappropriate or misleading, and that would be very confusing to someone who can't also hear the audio. But you can certainly use it to do the bulk of the work.
CT: What's the best way to engage faculty in using these tools?
Vaughn: It feels like we have this conversation every year with whatever new thing emerges. But AI is one of those ed tech things that's come along that I don't think is a fad. I do think we'll still be talking about this in a year — we'll be talking about this for a long, long time. I don't see it going anywhere.
The example I love to give is that Instagram, one of the largest social networks on the planet, took two and a half years to reach 100 million monthly active users. ChatGPT did that in two months. It's one of the most rapidly adopted technologies in history. Two months to get to 100 million monthly active users is incredible growth. I don't see students dropping AI anytime soon. So there is an incentive over the longer term to learn about these tools, or at least how they work, even if you're not going to use them directly.
I do have a lot of empathy for instructors, though, because as instructional designers, as educational technologists, we seem to come to them every year and say, "Here's another thing you have to learn to use, on top of everything else going on in your discipline and in the field of teaching and learning." You're always going to have some folks who are understandably very resistant to anything new. The most realistic approach is to reassure faculty that a) you have tools in place to help address some of their fears about AI, and b) you are present and available to help onboard them into those technologies if and when they're ready.