Beyond Detection: Finding Out Isn’t the Same as Fixing It

By Carter F. Smith | carterfsmith.com

When I first started looking into how students were using AI, I approached it like a good investigator. Observe. Document. Verify. It felt like a case to crack—something sneaky was going on, and it was my job to figure it out.

Turns out, I wasn’t wrong. But I also wasn’t ready for just how fast things were moving—or how much my instincts as a teacher needed to catch up with my instincts as a cop.

The first time I caught an AI-written paper, I wasn’t sure what tipped me off. The language was too clean. Too structured. Paragraphs that felt more like pre-fab housing than student writing. It didn’t sound like the kid. Not even close. What sealed it wasn’t some tool or scanner—it was a conversation. I asked, “How’d you come up with this idea?” The blank stare said it all.

That moment changed how I thought about the work. This wasn’t just cheating. It was a signal that our assignments, our expectations, and maybe our grading were overdue for a little scrutiny of their own.

So I put together Beyond Detection—a workshop for faculty who are tired of playing whack-a-mole with plagiarism software and want to start redesigning the work itself.

We talked about how AI reveals weaknesses in our assignments, not just in our students. About the difference between writing to express and writing to impress. About how some of the best student work isn’t polished—it’s messy, reflective, human. Which is to say: real.

Now, I’m not saying every paper needs to be handwritten in blood and notarized by a grandma. But I am saying that if all your prompts ask for is regurgitation or surface-level argument, AI will beat you at that game every time.

What we need now isn’t better detection. It’s better design. Assignments that build in process, personal reflection, in-class interaction, even moments of uncertainty. Things AI just can’t fake (yet).

And let me be clear—I’m not anti-AI. I’m just not ready to hand over the keys to the classroom and call it progress. The same way a detective doesn’t blame the tool, I don’t blame the tech. But I do want to know who used it, how, and whether the learning still happened.

So what’s next?

If you joined me for Rewrite the Rules, we took it further—into grading, feedback, trust-building, and transparency. We added an “AI use disclosure clause” to prompts. We rethought rubrics. We stopped pretending students can’t write well and started asking whether our expectations reward things that matter.

But that’s another post. This one is about not stopping at detection. Because knowing something’s off isn’t the same as fixing it. And we’ve got fixing to do.

You can download the full Beyond Detection slide deck, packet, and templates [here]. Or just start small: revise one assignment. Ask your students where they’re tempted to cheat. You might learn more from that than from any AI scanner on the market.

Let’s get better, not bitter.

—Carter

carterfsmith.com
