Using AI for Course Development: A Chat with Aditi Garg

Aditi Garg

Niya Bond

Video transcript
– Hi, everyone. I’m Niya Bond, the faculty developer here at OneHE, and I’m excited to be bringing to you today Aditi Garg who’s going to be talking to us a little bit about AI and all of the many ways that we can think about using it as educators in whatever space that we’re in. Hi, Aditi. Welcome.
– Thanks, Niya. Thanks for having me. I’m glad to be here. My name, like you said, is Aditi. And I’m at the University of Saskatchewan and I am an educational development specialist. My focus is sustainability across the curriculum, and I also look at experiential learning and how we can help students be what the world needs through bringing the world in and going out into the world. So everything in between from bringing guest speakers into the classroom and how do we have students reflect on that and get feedback about it, and going out to do practicums and getting experiences where they are doing community engaged learning, taking action for sustainability and trying to really embody the heart, head, and hands for sustainability.
– That’s awesome. So tell us, if you would, a little bit about how what we’re doing together today fits with all of that.
– Yeah, so part of our role as a teaching and learning centre is that we support faculty in helping them come up with everything from learning outcomes to the activities that they can do to how they’re going to assess and validate that their students have met the outcomes of the course. So that constructive alignment, then, is the three points of what are students going to get out of the experience, how are they going to experience it, and how will we know? So what, now what, very much. So to get to that point, you know, it can involve many hours of collaboration and coaching and mentorship, and anything we can do to help cognitively offload some of that work and speed up some of those tasks helps us be more effective and helps get folks onto the same page, or at least become more familiar with an idea before we get into the personal one-on-one conversation or working with our disciplinary expertise. So how do we give folks just enough of a background or enough of a scaffolded structure so that they feel supported? And I know scaffold is a very teacher-y word, but, you know, how do we give them the foundations or just enough of a structure so that they can go in and fill in the rest of the pieces and build the rest of the structure for themselves of their teaching practice?
– [Niya] I like that. I like how it’s focused on agency, but with like careful, intentional balance.
– Yeah, exactly. And in some ways, you know, we get to, as the ed developers, we get to kind of direct the boat a little bit and it’s a sailboat, I make a lot of sailing analogies, but so we know where we want the boat to go, but we don’t know where the wind is gonna come from. So how do we adjust the sails and kind of keep folks on the track where we want them to go, and then they have to kind of take up all the rest of the control of the boat and figure out where the wind is coming from for themselves. So we try to set guidelines or boundaries for folks to navigate within, and then they get to decide how they’re gonna make that journey.
– [Niya] Alright well, I don’t know if there’s a sailing analogy for what we’re gonna do next that you’re gonna demo, but how does the wind fit into AI and what we’re gonna chat about?
– Yeah, well, AI has been changing higher ed over the past year for sure. It’s been a topic of conversation, I’ll say that much. I don’t know if it’s been changing empirically. But it’s definitely been on the forefront of folks in conversations both on the research side and the scholarly side. And so at our university, you know, setting everything from which tools do we allow students to use, which tools do we approve for faculty use and staff use? What are the guidelines that we have for how to record or report the usage of the tools, how to use them with integrity? These are all conversations that we’ve heard in higher ed over the past year. And a conversation that came up recently in another workshop I was doing was what is the climate impact? What is the ecological impact of using these tools?
And something I hadn’t given much thought to either was the economic impact. So there is a cost to using these tools. You’re paying for tokens to… Someone is paying for the use or the generation of these tools. So trying to balance that ecological and economic impact for ourselves and then, are we using this tool in a way that the benefits outweigh the effect? So do we know that we are having a net positive effect when we are creating courses that will help students develop those competencies for sustainability to address the earth’s greatest challenges, to address climate change? If a tool can be used to create better learning that results in students developing those competencies, then I think we need to consider using it.
– [Niya] So we’re back to that idea of balance again.
– Mm-hmm, yes, which might be a sailing metaphor: you’re trying to keep an even keel and keep things going in the right direction. So yes, absolutely. We’re trying to move forward on an even keel and make sure that we progress in a way that allows students a complementary suite of activities through their curriculum. And part of that is also aligning the curriculum so that we’re not replicating what’s happening between first and second year with what happens in third and fourth year. So how can we use the tools at our disposal to look for congruency, where there might be overlap, where there are gaps in their learning, and what are the skills that students will need to know how to do? We’ve heard a lot of talk about prompt engineering, so folks knowing how to use the AI tools in a way that results in actual usable material instead of, you know, oh, I’m using this AI tool to make a cat with six eyes. I mean, maybe there’s a use for that in some universe in some way, but, you know, what am I using the tool for? What’s my intentionality behind it, coming back to that ecological and economic impact?
– [Niya] Yeah, I think the economic impact is interesting. It’s not something that has come up for me in conversations around AI, so I’m interested to definitely learn more about that.
– Yeah, so at our university, we’ve created a tool based off of a backbone of another tool created at the University of Calgary. So ours is called the AI Learning Design Assistant or ALDA and our cost is approximately 5 Canadian cents per 10 prompts. So for the exchange of 10 prompts going out and 10 prompts coming back. So I don’t understand the computer science of it, but our developer has explained to me that any transaction of approximately 10 prompts that I write into the system costs 5 cents. So roughly about 5 cents American as well. And at that same cost for about… There’s been a study done that 10 to 50 prompts is equivalent to 500 millilitres of fresh water consumed. So that’s involved with the cooling, with the monitoring or the infrastructure related to computing. So, you know, how much water are we consuming by 10 to 50 prompts, which might be the prompts required to set up a new course? Which I’ll show you in a moment here. So if we’re setting up a new course and it takes us 10 to 50 prompts, that’s a bottle of fresh water. So I know myself personally, I’ve started thinking, “Okay, I need to start offsetting my water consumption.” So when I’ve been washing fruits and vegetables coming outta my garden,
I’ve been saving the water and watering all the plants, and then I overwatered a plant. My husband thought that the dog had peed on the floor. No, it was me just trying to offset my AI use. So, you know, trying to think about the behavioural change at an individual level is one thing, but then what’s the behavioural impact or the systemic impact of all of our behaviours collectively? So what are we each doing? And then what is the system going to do? Or how are we gonna design a system that accounts for our ecological and economic impact? So currently our centre is paying for the prompts being used on this tool and, you know, at a cost of 5 cents per 10 prompts, or 5 cents per user, there is an economic balance there. You know, that’s my time that is saved. That might be, instead of doing a three-hour workshop, if I can use this tool at 5 cents per 10 prompts, maybe I’m saving that three-hour workshop and that time for better one-on-one conversations, for helping people get a step further. So if I can do this foundational piece, maybe I’ll be able to spend my time more on the richer end of conversations and helping folks get a step further in their philosophy and teaching design.
– [Niya] Well, it’s interesting to think about how your use and your knowledge of that economic and environmental impact has altered your behaviours maybe, but also how you’re considering that, as you said earlier, in the context of what’s happening beyond the classroom and just how it’s not only an educational tool like it impacts other elements of humans’ lives.
– Absolutely and, you know, our students who we design our curriculum for, they’re impacted by our use of the technology. And if we don’t show them responsible use or what we think is responsible use of the technology, how will they proceed and how will they carry forward in their careers and how are we preparing them for whatever their future lives entail? Or encompass, yeah.
– [Niya] Well, on that point of use, I know you have some things to show us. So I’m excited to dig in and talk more about that.
– So this is ALDA. This is the AI Learning Design Assistant that we’ve developed based off of SMARTIE at the University of Calgary. And so on our main page, we’ve made it intentional and explicit that we want folks to know that there is a cost to this and that we think that there are ways that they can use it responsibly, but we want them to be aware of that. We also don’t save any of their responses, to save ourselves that data storage cost. So a tool like ChatGPT would store the material if we were using it directly. So how can we minimise some of those ecological impacts in that context in particular? And then we have each of the tools that have been designed for specific reasons. So there’s a little description of what each tool is designed to do, and I’ll show you a little bit about how each of those works. In particular, the course structure assistant is pretty neat, how we’ve set it up. So I have permission from one of my colleagues, Ulrich, to use his course as a test course. This is his course, it’s Health Studies 310. Students get to do a lot of really neat projects. They actually go out into the community and pick different organisations to learn more about and sometimes have a huge impact on. One of his former students took the knowledge that they learned from the course and actually went to her place of worship, and had them set up a new composting programme.
So they were able to divert a significant amount of waste out of the landfill and also put in measures to reduce their use of lights and electricity, and any money that they saved from turning off the lights and putting lights on timers went towards other green initiatives. So they’ve created a kind of green economy fund in their place of worship just because of her experience in this course. So it’s a neat course because of the way that he set it up, but it’s also a neat way to show that community learning can have some really cool ripples. So I’m just gonna put this into the tool. I’m giving the official course description, I’m gonna give it the name of the course, and I’m trying to put as much information as I can at once. Now it’s going to come up with some learning outcomes. What we’ve done just recently, this is hot off the press, we have included the four domains of the medicine wheel, which in the culture of this land where I am on Treaty 6 territory, there are these four domains, physical, emotional, intellectual, and spiritual. So it’s creating outcomes based on those four quadrants in particular. So that’s another way that we can bring indigenisation into the curriculum and help folks think about how they are going to help students manifest each of those in their practice. So it’s doing that, it’s aligned, and it’s come up with outcomes tied to Bloom’s Taxonomy. Higher-order thinking.
– [Niya] Amazing.
– Yeah, so I wonder, do you think there… Like, is there anything we would tweak if I was the… Maybe I might ask them to… Some of these look like they’re double-barrelled. They have the word and in them, so maybe I might say, “Can you simplify each outcome into more specific and measurable outcomes?” It’s refining them a little bit. It’s giving me some more options. And now if I was Ulrich and I was an expert in health studies, this would be an opportunity for me to use my knowledge as a practitioner to try to break these out into smaller pieces. That’s something that you can do with that. And I always recommend for our instructors to have a Word doc ready to go.
– Yeah, it doesn’t save them. So I’m gonna open up a Word doc and I’m gonna copy and paste these so that I have some ideas to gather after. And something that actually I’m trying to do a better job of myself is to copy my prompts. So one of the ways that we are helping students and instructors think about how they are going to keep that academic integrity is to say, “What are the prompts that you provided to your AI assistant?” so that we can know, okay, what was your input that resulted in the output? So I’m trying to get better at that. So I’m gonna copy and paste these into my document so that I have those, so that I model that for anyone who’s watching this little video. So having that recorded for them is a good practice overall. So now that I have these, I’m actually going to go to the course learning activities assistant. And this is the biggest tool, and this is where we’ve incorporated some ideas of sustainability. So this is where it’s actually going to come up with some activities I could do.
And this is really your thinking buddy. And that’s where we say cognitive offloading. This is a way like instead of having a one-hour consult with a colleague, maybe I send them this ALDA and I say, “Hey, go through the prompts in ALDA, spend a couple hours with it, and then we’ll connect for a 30-minute chat and we can refine any ideas that you generated.” So if I was doing this with Ulrich, I might say, “Okay, you’re going to take the intervention plan idea and put that into here.” I want them to focus on this. There are 40 students in health studies. Third year undergrad. Ah, and now it wants to know what the modality is going to be. Is it gonna be online, hybrid, face-to-face? I’m gonna say hybrid, just for fun. I will use Canvas. That’s our LMS. And I want to focus on community engaged learning. And now it’s asking me if there are specific competencies that I’d want to use. So I’m gonna say I don’t know what they are because if I was an instructor, probably I wouldn’t know what my university competencies are. Although Ulrich would, but that’s ’cause he’s a rockstar. So now it’s gonna give me the competencies. So that’s kind of nice, so it’s being my buddy. Yes, I like what it’s telling me. Here are my three ideas. Maybe I’ll say just nurturing successful relationships. I don’t want it to focus on the other ones. I’ll give it a minute to generate and then we’ll go through it all if that works for you.
– [Niya] Sounds perfect. Yeah, I can see it’s very detailed.
– Yes, and I almost feel like we’ve given folks too much with this, but it was a great spot to say, “What can we give them to give them just a tidbit of an idea?” So, okay, this is the outcome we want to make. Here’s a suggestion for how we might do it. If we wanted it to be super experiential, maybe there’s a discussion, maybe there’s breakout rooms where people are actually coming up with plans, they’re going to create a plan and get some feedback about it. And so if you’re an instructor who has only done lecture-based instruction up until this point, this might be very novel for you and this might be enough to say, “Oh, there’s a different way I could do this type of project.” In terms of gen AI, should my students be allowed to use AI in this project? Okay, in this case, because the plan requires theoretical work, I think we’re going to say let’s not have students use it for the main portion of the work. And we’ve given a link to an AI assessment scale so that folks can compare and say, “Okay, how should my students be allowed to use it or not? What is the depth of use that they could have?” So this is a really helpful table to say, “Should AI be allowed or not?” And then if we can direct or nudge instructors towards that, that would be helpful.
Okay, this is how students are gonna get feedback. This is how you might create differentiation. Students could pick their own topics so having some agency and voice there. What are ways that we can make this authentic? Oh, you could reach out to a local health organisation or a community partner, get some input on it. How does this connect to sustainability? Oh, health is SDG 3, good health and wellbeing. So this is a great way for students to see what real world problems are and maybe take action if they’re actually working with that authentic partner. So this is a general way that you might be able to use the course learning activities assistant or your own institution might want to develop a tool like this that they could then have instructors think through some of these and maybe do this in a group setting, discuss it with your buddy, with your department. What did it give you? I did this with a group in the STEM disciplines and we had everyone at the same table, like, three different tables working on theirs together and comparing what the AI tool generated for them.
And it was great for them to be able to compare, “Oh, what did you get? What did you get? Would you actually do that activity? How would you change it?” And it was a really rich learning activity to stimulate conversation. Not only did we have a virtual thinking buddy, we also had actual thinking buddies that we could then collaborate with. So that’s the gist of what the tool can do. There’s a few different ones where you can upload your syllabus and it’ll tell you if it’s experiential or not, or in what ways you might make it more experiential. It can help you design a rubric. These tools I think still require a bit more nuance and development, but really it’s that course learning activities assistant that I think is the great beginning spot. So again, this would be where I would want to make sure that I copy everything that it generated for me into my Word doc and also go back and probably say, “What did I put into it?” Yeah, so that’s the gist of what it can do.
– That’s awesome.
– Questions?
– [Niya] Well, first I wanna say I appreciate how careful you were with your prompting as you considered what elements you should add so that you weren’t having to prompt unnecessarily or extravagantly. Second, I love the idea of your recent workshop where you had AI and human interaction and so humans were still first and that conversation and dialogue was still seemingly most important but this was still like a useful and valuable way to jumpstart those conversations and get the brainstorming going, which I thought was really ingenious.
– Yeah, that jumpstart idea, you’re priming the pump, you’re getting things moving just a little bit faster, yeah.
– [Niya] Yeah, and then I wonder if you have any tips for someone who’s never used this or any other AI interaction with this kind of prompting: how should they get started, or what should they think about as they begin this task?
– That is a very good question. I think whenever you’re encountering something new, it’s okay to play, and the good thing about these tools is that if you don’t like the answer it gives you, you can try again, knowing that yes, there is an ecological and economic impact. But you can still play and try a few different wordings. It took us many attempts to get language that we felt confident was going to produce something reliable. So being open to play I think is a great way to start. There are some courses online that you can take or videos you can watch around how to do prompt engineering, how to write prompts. Folks have told me that if you give it a role, like an identity, so if you say, “You are an instructional designer and you’re an expert in Bloom’s Taxonomy,” you’re more likely to get an output that is aligned with someone who would have that expertise.
And so that’s part of our course structure assistant: that’s what it’s told, you know, “You’re an expert in creating learning outcomes and you know what the medicine wheel is, that it is comprised of these four elements.” So if you can feed it the right information, it’s more likely to create the response that you’re looking for. So really think in depth about what prompts you want to give the tool before you start generating, that would be another tip. I recently came across this toolkit from York University, and they’ve got AI tools tied to sustainability for a variety of disciplines. They have general AI information and then they have material for different topics and different disciplines. So it’s a great collection worth reviewing, and I’ll give you the link to include, so there’s some good material in there. And your institution might have something similar, where they’ve compiled information around AI and the tools that they support. At our university, we have a learning technology ecosystem and we have approved tools for AI use. So always connect with your institution or with a body that knows what’s available in your context.
– Amazing. Well, I so appreciate your time today. I appreciate this demo and all the hard work that you and everyone at the institution have put into this tool. And thank you for your time.
– Yeah, thank you for asking. It’s always exciting to hear folks trying to, you know, help make the world a better place for people, planet, and prosperity and whatever we can do to help support that. So thank you.
In this video Niya Bond, OneHE Faculty Developer, talks to Aditi Garg, who is an Educational Development Specialist, Sustainability & Experiential Learning, at the University of Saskatchewan, Canada. Niya and Aditi discuss some economic, sustainability, and creative considerations of generative artificial intelligence (GAI) use.
In the second part of the video Aditi demonstrates their institutional AI agent called ALDA, an AI Learning Design Assistant. ALDA is a tool designed to help university-level educators create comprehensive and inclusive course outlines by helping to generate different course components, such as course descriptions, learning outcomes, learning activities, actions to advance UN Sustainable Development Goals, etc. ALDA was developed by University of Saskatchewan staff based on SMARTIE, an open-source application developed by Soroush Sabbaghan and published under a Creative Commons Attribution 4.0 International License. It is powered by Streamlit and the GPT-4 API (Application Programming Interface).
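For readers curious about the plumbing, below is a minimal, hypothetical sketch of how a Streamlit front end wrapped around the GPT-4 API could be structured, including the role-prompting technique Aditi describes (“You are an instructional designer…”). It is not the ALDA or SMARTIE source code: the system prompt, page layout, and all names in it are illustrative assumptions only.

```python
# Hypothetical sketch of a Streamlit + GPT-4 API assistant in the spirit of ALDA.
# Not the actual ALDA/SMARTIE code: the prompt text, layout, and names are assumptions.
import streamlit as st
from openai import OpenAI  # expects an OPENAI_API_KEY environment variable

client = OpenAI()

# Role prompting: give the model an identity and framing, as described in the video.
SYSTEM_PROMPT = (
    "You are an instructional designer and an expert in writing learning outcomes "
    "aligned with Bloom's Taxonomy. Frame outcomes across four domains of the "
    "medicine wheel: physical, emotional, intellectual, and spiritual."
)

st.title("Learning outcomes assistant (illustrative sketch)")
course_name = st.text_input("Course name", "Health Studies 310")
description = st.text_area("Official course description")

if st.button("Generate outcomes") and description:
    # One request-response exchange; nothing is stored server-side, mirroring
    # ALDA's decision not to save users' prompts and outputs.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {
                "role": "user",
                "content": f"Course: {course_name}\nDescription: {description}\n"
                           "Draft four to six specific, measurable learning outcomes.",
            },
        ],
    )
    st.markdown(response.choices[0].message.content)
```

Saved as app.py, a sketch like this would run with `streamlit run app.py`; an institutional version would add its own approved model access, cost tracking, and guidance text around responsible use.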
If you are interested in developing a similar tool for your institution, please contact Aditi at [email protected], who will connect you with the creators on Smartie.dev, as well as advise on the prompt writing the University of Saskatchewan used to develop their custom version.
Useful resource:
- AI Resources, York University, Canada – A collection of AI resources and tools that educators can use to plan, deliver, and share with students.
- The AI Assessment Scale: Version 2 – The AI Assessment Scale which encompasses different levels of AI integration in courses.
DISCUSSION
What is one way that you can imagine collaborating with AI to intentionally design and develop courses or learning experiences?
Share your thoughts and questions in the comments section below.