Two Truths and a Lie About AI

Here are three statements. Can you figure out which two are true?

  1. Students will try to cheat with AI.
  2. Students need help reasoning about their AI usage.
  3. Students talk to their teachers about their AI usage.

OK, ok, the game is unfair. These kinds of absolute statements always have some truth and untruth nested within them.

But to start the conversation, I’ll say that the first two are true and the last one is a lie . . . unless teachers do something about it.

Students will try to cheat with AI.

    The truth is, students will try to cheat in general. This has always been true. And it’s not every student. It’s not all the time. But the temptation to find a shortcut that evades intellectual effort and misrepresents the work as a student’s own . . . it’s a tale as old as time, my friends.

    We can read articles about this on both ends of the spectrum. Some argue rates of cheating are about what they’ve always been, even as the methods have changed with AI. Some argue that everyone is using AI all the time to cheat their way through an education. And the truth is probably somewhere in the middle.

    Cheating is essentially a moral issue. When there is a new technology that makes cheating easier (and other technology that makes attention and focus more challenging), we owe it to our students to talk about what defines cheating.

    In fact, with every class I teach, I use a short, 20-minute lesson called “Is It Cheating?” in which I demonstrate several AI interactions live, based on a hypothetical assignment that parallels a real one, and students write reflections by hand before we talk.

    Which brings us to the next point . . .

Students need help reasoning about their AI usage.

    This one is so, so true. And many students report that they are not getting this experience in the classroom.

    Put yourself in the students’ place for a moment. They have this huge, powerful, unexplored thing on their hands that can quickly construct text that (they feel) sounds better and more authoritative than their own voice. The easiest and earliest use they discovered for it was to replace their own thinking and labor and voice.

    If schools, curricula, and teachers are not hosting enough conversations about AI, providing some reading about how it works, how it impacts the environment and the world, when to use it and when to reject it, and how to stay skeptical and verify facts, we are setting students up to use it poorly.

    What can a teacher do tomorrow?

    Take small steps. As a class, look at some AI-generated text about a topic you have already studied. Where does it lack the depth that our class materials or discussions reached? Where does it push our thinking by proposing a subtopic we never considered?

    More importantly: How can exploring the answers to those questions help us to be better writers on this topic?  

Students talk to their teachers about their AI usage.

    If adults are not comfortable talking about something, young people are not apt to open a conversation about it.

    If we wear blinders about the impact AI is already having on our students’ learning and just avoid the topic in our whole-class, small-group, or one-on-one conversations, student use will continue to evolve without our wisdom.

    So this statement can be another truth, but it requires that teachers initiate conversations.

    Preface a conversation with, “No judgment, I just want to learn,” and then try one of these questions or prompts:

    1. How do you use AI with your schoolwork?
    2. How is your use of AI evolving over time?
    3. Tell me about a time you used AI in a way that didn’t feel right in the end.
    4. Tell me about a time you used AI in a way that felt great in the end.
    5. If you hate AI and avoid it as much as you can, tell me about why that is.

    Don’t forget #5. I have students as adamantly opposed to using AI in their education as some adults I have met. I never see my role as someone who should talk them into it.

    When we show students that we are brave and humble enough to stay curious, we foster trust. This encourages young people to initiate conversations about AI with us when they discover new uses or ambiguous boundaries of academic integrity.

    We can say the same thing to our students that we say to our own children when they approach us with something tough: “I’m so glad you came to me about this.”

    As you probably know from the title of my new book, Artful AI in Writing Instruction: A Human-Centered Approach to Using Artificial Intelligence in Grades 6-12, I believe and have found that it is possible to use AI artfully, but it takes some thinking, talking, and tinkering to figure out how to make that happen and to demonstrate the difference between detrimental and artful use for student writers.

    We are all, willing or not, on that journey of discovery now. How is it going for you?

    NOTE: This piece is entirely human-crafted. Artificial intelligence suggested relevant hashtags below using a two-sentence description of this piece as its prompt.

    #AI #ArtificialIntelligence #ArtfulAI #Writing #Teaching #Education #EducationLeadership #EdTech #WritingInstruction #ProfessionalDevelopment #Learning #Innovation #AIinEducation #Literacy #TeacherPD #GenerativeAI