Authentic Assessment in the GenAI Age

Caleb Curfman

– Well, first of all, thank you for joining me in this and as was being talked about, as we look at the One Higher Ed, I found out about this opportunity through one of the things that I do, and that is a podcast, Assess Without the Stress, and all of a sudden, I started hearing things from OneHE and that got me connected, and it’s been a joy to work with you guys for this as well. I am a history faculty member, as well as the Faculty Development Coordinator for Northland Community and Technical College in Minnesota in the United States, and something that really got me thinking about this AI stuff, of course, as a historian, we need to start with a story. And so the story is the anxiety that comes with this, right? The concern about AI and what does it mean for us as educators? What does it mean for our students? And I’m gonna paint a picture I’m sure you’re familiar with, you are sitting and looking at papers, maybe you have your favorite tea or coffee with you, and students are turning in some great work, you’re going through it, you’re reading it, you’re realizing how great it is, and then there’s that little part of your brain that starts to say, “Did they actually write this?” I think this is where a lot of people have been as artificial intelligence has grown and become a part of our everyday lives in many ways, but especially for educators, it can be, in some ways, almost an identity crisis of what is our role? Where do we fit in? How do we make sure our students know what they need to know? But it’s a natural piece of it, and I just want to recognize right off the bat that this does cause a lot of angst, a lot of worry. And my goal in this 20 minutes is to offer some ideas, offer some tips, and most importantly, some things you can do tomorrow that hopefully will help you navigate this, but I think everyone that talks about AI should have a disclosure that says, “We don’t know what’s gonna happen tomorrow with this,” right?
I mean, no one can truly be an expert on where this is going. The changes that have happened in just the last couple of years, and there are some other wonderful webinars that have just happened or are coming up through One Higher Ed, which really is getting to those new tech things. This is really stepping back and saying, “What can we do?”
So I have three goals for us. The first goal is I’m gonna challenge you to reframe how we look at assessment. At kind of a higher level, we need to think about what we’re actually doing, and maybe switch from policing students’ work to making that learning visible over a period of time. The one thing that has changed, a lot of things haven’t changed with assessment, one thing that has changed with AI is that we know that it’s harder for the AI to work over a span of time, and so we’re gonna look at that. I want to give these practical strategies. So how do we document the process, building continuity within our assignments across the semester? Again, kind of the long game if you’re looking at it, whether it’s an eight-week course, 16-week course, or even longer. Designing ways to have assignments that AI can’t shortcut. Notice I’m not saying AI can’t do. Another thing when people are talking about AI: it is very hard to have something that is AI-proof. In fact, there’s always gonna be that question, and so if you were coming here for the absolute answer of how to do things and not have AI be involved, unfortunately, you’re probably not gonna get that, but I hope you do get the idea that you can empower your assessment and empower your voice, and more importantly, your students’ voices moving forward with this new style. And finally, I mentioned that in my capacity at a community and technical college working in faculty development, I work with people who teach welding, who teach nursing, who teach math, and so I hope to have some adaptable ideas. No matter what you teach, no matter what your discipline or area is, I hope there’s something that will work for you. One of the things that really is that story at the beginning, right? That anxiety that comes with did the student use AI, and what I would throw out there is that is kind of the old question, and I think we really need to start thinking about a better question.
The old question, when we say did the student use AI, it immediately brings some things forward.
First of all, it brings us to this detection arms race. I’m sure you see it, I see it everywhere, as many times as there’s new AI ads, there’s also ads about detectors: now we have the answer. Now we have the solution. It is an absolute arms race to see if the detectors are better than the AIs and back and forth. So if we only focus on can we catch or will we catch a student, we’re gonna end up in that arms race. We also end up having very much a surveillance culture, and what I mean by that is if we’re looking at and always sitting there thinking did the student do this? Did something else do it? Did a bot do it? I don’t know about you, but that’s not why I wanted to be an educator, right? I wanted to see student growth. I wanted to help them, not spend my time in that barrier, right? Not spend my time second-guessing. And finally, we don’t see a signal on the actual learning. When we are only focused on AI and was it used, we don’t really get to look at that process. So a better question that I would pose for us is, “Can I see how this student’s thinking was developing,” right? Can I see how the student’s thinking is developing? Because it leads to seeing evidence of real growth, something that we really hope assessments will do. Meaningful feedback loops, the ability to try something and try it again, and also, it’s learning you can actually see, so when I’m sitting there not having that question of was AI used, I can look at it and say, “Well, based on the first drafts that were turned in and the other process stuff that they were doing, they have developed. Look at how far the student has gone in this assignment.” And the key to this is if the answer to the second question is yes, I can see students’ thinking develop, that first part becomes far less urgent, right? We don’t have to worry so much about that. So what do I mean by authentic?
Well, authentic assessment is really asking students to apply their knowledge in a meaningful, real-world context, but what it’s not doing is creating a whole new idea. Authentic assessment has decades of research showing the benefits of having students do things that are authentic to the discipline, connected to the discipline. AI really made it more urgent perhaps. We start thinking about the thinking process behind the final product, not just the assessment product itself. And then the goal of assessment, and I hope this is a goal for you as well, is we should be making thinking visible. We wanna see the thinking. Now, how do you see thinking, right? The vulnerability that is being exposed with AI-created work is the lack of seeing how people learn. And so if we only look at the final product, the final paper, the final report, the final project, we’re missing out on a lot of information. And I provide some examples of those one-off type assignments here to get you kind of thinking about it, right? You know, so an essay that you turn in, the first draft is the only draft, and there’s a grade and there’s no revision. That’s an example of kind of just seeing the product, not seeing the actual learning. Other examples would be not having continuity between your assignments. So if each assignment is basically starting at step zero, how do you actually see if students are learning something? In my context, if students were to write three papers in my class, in a history course, I don’t know if they’re learning over time the skills of history if I’m asking them a very specific question about this time period, then the next, then the next, for example.
So the reframe, what I think we really should focus on doing is going from “submit the paper, receive the grade, done” to finding ways to document the process, reflect and revise as students are working on assignments, have multiple due dates, build on prior work in a semester, and then through that process, we’re going to see demonstrated growth. So now, the real practical tips, right? That sounds all great, but how do you do it? To document the process, some examples: ask students to submit evidence of their thinking before they turn in assignments, right? So have them annotate an outline, even by hand, of kind of what their process is, or a brainstorming document, pose a question or a problem to them and have them kind of think through it, right? Not necessarily get to that answer, but think through it first. Have some sort of a draft where there is revision on an assignment. Have the students track their changes, whether they use Microsoft Word or Google Docs so that you can see, if you go into that log, you can see that process in real time. You can see where they were changing what they were writing.
Another great way of doing this is having a process memo. Ask the students, at the end of a project when they turn it in, what was your process as you were working on this? What were the things that were easy? What were the things that were difficult? Ask them things about how they did it, not just show what they did. And something I have found to be very effective, especially in an asynchronous course, is to have a brief voice or video reflection when turning in an assignment, having the students talk about their project. Again, it’s bringing some of that agency in, and this practice, as you do it, it’s changing the nature of the task. It’s telling the students, even if you aren’t saying it straight out in front of them, we want to see what you are learning. We wanna see how you are growing as a student, right? And so those are some of the ways we can try to document the process. Second tip I would give you is try to build continuity across assignments within a class, and I don’t just mean scaffolding an assignment. You’ve probably heard about the idea of having multiple drafts, right? What I mean is have students take a position on something, even if it’s in a discussion board, if it’s an online course, depending on your discipline, have them write a position they have in week two on an issue that’s central to the course, and then have them revisit their initial position and kind of complicate it in week 10 or week 11, right? So you get to see them connecting and thinking, “Oh, yeah, I did change how I was thinking, I thought this before the class started, now I’m here,” right? So it’s finding a way to make that meaningful across the semester, not just the one-off. Another option would be a Running Course Journal, having students post a certain number of journal entries talking about their learning process, what they’re finding difficult, what they’re finding interesting as the course goes along. And these don’t need to be worth a whole lot of points. 
It’s a way to see that growth over time, or to see where there is struggle, and then you can intervene and kind of help with different interventions as they’re going through the process, not just once that final grade is coming due. I also encourage Peer Commentary Loops, not necessarily group projects in my case, but have students review each other’s work, talk about that work, and then make revisions based on that work. And so we’re seeing this connection across the entirety of the semester.
Tip three is make the student the expert, and this is where the authentic piece comes in, right? One thing that we know about AI is that it is general. I mean, even in nature, it’s general, but the idea of having students talk about their local communities, finding ways to connect that into the course, and that can be in many different disciplines. If you are in welding, for example, maybe do some research about the local needs. What type of welders are needed? What types of jobs are out there? Some outlooks for career paths, for example, right? So finding local or community lenses for your assignments instead of larger national, or even global considerations. Discipline-specific skills, these are the things that make our disciplines tick. So think about what are those in your discipline and how can you have assignments that show and demonstrate a student’s ability to do that task. Again, not so much the content knowledge piece, but doing the task. Again, in my discipline, can a student look at a source written in 1865 and can they describe what was going on? Can they explain why that source is important? Can they do these things to build their knowledge? That would be a historical thinking skill, which fits in with the specific skills of a discipline. Tip number four is having low-stakes check-ins throughout the semester. Not everything we do has to be AI-resistant, right? Having these low-stakes chances also gives us a chance to see growth over time. And so whether it’s a weekly one-paragraph response, one thing with AI, it tends to get very verbose. It talks a lot, it tells a lot of information, and so I have found that with some of the shorter things, it doesn’t do very well at being concise, right? And so having students just do a quick check-in, what was the most interesting thing you learned this week? What is something that you can apply to your future career based on our materials?
If you have the ability to have an in-person course, those in-class quick writes or the exit tickets, muddiest points, some of these classroom assessment techniques that have been good well before AI are also very helpful in the age of AI. They’re connecting the students to that learning. Again, the short voice or video reflections, again, just small check-in assignments to see where your students are at. And the fifth and final kind of tip for you is have the AI conversation openly, right out in front of students.
The first thing I talk about in my first day of class is usually, as we’re going through the syllabus, other materials, we talk about AI and what benefits and what kind of problems we see with AI in the discipline, in my case, of history, and we talk about it as a group. And there are certain times in the class where I invite students to use AI on assignments, but I have them respond to those assignments by asking questions, evaluating how well the AI did. So for example, if they were to have it write a short summary of something, where did it get it wrong? Where was AI wrong? Where did it oversimplify things? Where did it miss the point completely? But the one I really like is asking students to create their own version, maybe based off of an initial draft from AI, and I ask them the simple question of, “What does the AI version lack that your version has?” And the number of times students, just anecdotally, will write, “I realized that AI was really boring, it didn’t really make a point, and it didn’t use any evidence,” right? Or it didn’t use very good evidence, and so they’re seeing for themselves how valuable their voice can be, even within a discipline that they may not be very familiar with. They’re seeing their value added. So an example, we look at the paper, a topic proposal, you could do that in week two, annotated source list week four, have the drafts. This is more the traditional-style assignment that you might see as a scaffolded assignment. Include those revision journals so they kind of say how they’re changing things. Incorporate some peer feedback. And then that final paper is not as much of the main assessment tool, it is a piece of the larger puzzle. But I wanna make sure I have a little bit of time to talk about how do we do this outside of writing, right? Because many of us work in disciplines where writing is maybe not at the core.
If you are in, just for example, nursing or clinical, have these opportunities for a pre-lab prediction, some sort of procedure log, talking about what was actually happening during your experience, and some sort of debrief reflection, and make it unique to that experience. What was unique about the hospital you were at this week versus the one you were at before? It’s making it very local, making it very practical for the students.
The lab sciences, same kind of thing. Having a hypothesis log, “What are you thinking right now? What is the data you found? And then how do you interpret that? What is your interpretation based off of the data that you found?” And again, as the instructor, we could make that data very local, we could make it very specific. For math and other problem sets, things of that nature, having students annotate a worked example, so do a problem out on paper or do it on their screen, but then have them do a video explaining how to do the assignment. Don’t do this for every assignment or every single problem they do, right? But in place of some of the exams, try having this idea of having them teach it back to you in a short video, because by explaining it to somebody else, as we all know as educators, you’ll learn a whole lot more when you can explain it to somebody, right? And so it’s letting us see that process. With studio design and performance, again, having some sort of portfolio where you are going through and documenting first drafts, continued drafts, and grading that as a whole, the whole group together. But I know there’s a section of us out there where we may have a very fixed or standard curriculum. In other words, this all sounds great, but I can’t really change my assignments all that much. What I would encourage you to think about is: is there a way you could have some type of portfolio that goes along with the course that shows progression over time? As far as your grading opportunities, your assessment opportunities, is there a way that you could have students show what they did at the beginning, what they did in the middle, and what they did at the end, and find a way to evaluate them based on that growth over time? Again, what this is gonna do is take away a lot of that concern that comes with being too heavy on the large assessments. So where to start tomorrow?
My final slide here. What is something that you can do tomorrow? Pick a strategy, try it out. Ask for a process reflection at the end of an assignment. So maybe you have an assignment coming up, just add a question, “In 150 to 200 words, tell me about your process. Tell me about the most interesting part of what you did. Tell me the challenges that you had.” But we also need to voice that we wanna hear those challenges, right? I always tell my students, “Don’t just tell me it was all easy or it was really fine and there was nothing especially important,” right? The idea is I wanna see what was challenging, because then I can be a resource for you as you do the next assignment. Ask for one early draft. If you tend to have more of a one turn-in essay, ask for something a little early. And then the third thing you could do tomorrow is think about a way to build in one callback assignment. Have an assignment where you say, “Okay, go back to week five and what you did and tell me how your thinking has changed, or tell me how your interpretation is different, or how you have shown improvement from that point.” These are all ways we can try to, again, show process over product, which, at the end of the day, is going to be one of the best tools we have when working with AI in assessment. And so thank you, and I wanna make sure we have plenty of time to talk about some of these questions. So what are some of the questions? Let me see here.
– Thank you so much, Caleb, for the presentation. I’m gonna help you out a little bit with the Q&A. We’ve got a lot of questions, so we can maybe start with a question from Amanda first.
– So Amanda’s question, “What are your thoughts on students who use AI for all of it?” And here is my real candid response to that: we can do as much as we can as instructors, the tools that we have to try to help all of our students, to explain how important it is to be doing the work, right? But unfortunately, sometimes for my own good and for my own sanity, I had to say, “You know what? There is a certain amount where I can be really worried about that. Instead, I need to think about how can I support the students, first and foremost, who are wanting to do it in a certain way that they are comfortable with, that they are doing it themselves.” I will add that I have included an AI disclosure agreement for many of my assignments, where if they do choose to use it completely, we can have a conversation about that, but that is gonna be an ongoing question and challenge, and maybe it’s easy to just say, “I do wanna focus there, but I also wanna make sure that doesn’t take away from the authentic assignments, the authentic connection I can have with all the students.” And that was just somewhere I had to go in my own head. So another question that came in, “Do you have any tips for scaling these techniques for large classes?” Yes, so some of the classes, let’s say it’s 100 or so in the class, I utilize some classroom data software, different ways, you used to call them the clickers, right? Some different ways to do some of this in class where students can use their phones to type in some of these responses for some of those kind of quick check-ins. As far as having multiple steps to a project, this is where I really rely heavily on that peer feedback loop, right? So obviously, we wanna be part of that feedback as well, but having peers to work on this together, and then when you get this assignment at the end, you can now see it almost in this whole long form instead of just seeing the paper, for example.
You can have all these documents together and you kind of reference back to them. So your own grading practice will decide how much you put on some of those pieces, but again, it’s just another way to try to scale it up a little bit. Maybe you can only do a few of these. One of the tips I shared is you don’t need to make everything, quote, “AI-resistant,” but you always wanna make sure that you are having some assessments where you feel confident that you are truly demonstrating growth over time, and so maybe it means only one or two of these in a class versus four or five. Okay, question, “How do you handle equity of access? Not all students have access to paid AI tools.” Absolutely, and this is why I do not require students to ever use AI. I think personally, from an ethical standpoint, I’m not in a place where I wanna say yes, you have to use it for this assignment, but from the equity piece, I am fortunate in Minnesota, Minnesota State, we have a subscription through Microsoft to have Copilot actually available to all our students in a protected server. So that is how I do it; of course, that’s not a great answer other than the fact that I have that as an opportunity. What I would consider when looking at equity and AI is also having that conversation with students about what is the difference between a pro version and the free version. We’ve done that in my class, where we just pull up and we see how different is it, is it actually that much better? But absolutely, a good question. Let me see. Oh, yes, go ahead.
– Yeah, I was just gonna say we’re at our time, we just want to be mindful of everyone’s time, and so I can see an active chat and we haven’t answered all the questions from the Q&A, but we want to respect people’s time and your time as well, Caleb, so I think I’m gonna wrap it up if that’s okay with you.
– Absolutely.
– Brilliant. So thank you everyone for being here with us today, and thank you for engaging in the chat. It’s still going ahead. You can answer each other’s questions, but what I wanted to say is this has been recorded, so we’ll send out the recording in the next two weeks and I will post a link to our next webinar on the 10th of March, so the link is in the chat. And also, as you were answering the questions, or through your presentation, you mentioned exit tickets, so I kind of made a note of one of the courses that we have on exit tickets, which is one of our most popular courses. So I’ll post that link in the chat as well. So thank you everyone for sharing your time with us today. I hope you got what you needed out of today, some ideas or reflections of what you are already doing. And Caleb, thank you so much as well for sharing the tips with us and your energy.
– Yes, thank you.
In this webinar recording, Caleb Curfman (Northland Community and Technical College, USA) explored the complexities of authentic assessments in the AI era—not by policing student work, but by reimagining what “authentic” means when artifacts alone can no longer guarantee learning.
Below are the key discussion points with timestamps from the recording. Hover over the video timeline to switch between chapters (desktop only). On mobile, chapter markers aren’t visible, but you can access the chapter menu from the video settings in the bottom right corner.
- 03:01 – Session Goals
- 07:42 – What “Authentic” Means
- 10:16 – Five Tips for Authentic Assessment
- 18:36 – Example: The Traditional Research Paper
- 22:07 – Where to Start Tomorrow
- 23:50 – Q&A
Useful Resources:
- What is an Authentic Assessment? – interview with Caleb Curfman
- Teaching for Authentic Student Learning in an AI Age – webinar with Flower Darby
- Ethical AI Use in Assessment – webinar recording with Vincent Granito
- Exploring the Ethics of GenAI – webinar recording with Cate Denial
DISCUSSION
What was your key takeaway from the session?
Share your thoughts in the comments section below.