What to do with student work that uses GenAI irresponsibly?

Sep 25, 2024 | News

Even though many of my UAB colleagues and I distrust and choose not to use GenAI detection software, we still want our students to take on the challenge of learning, to produce work that represents exactly where they are in their educational journey, and to demonstrate incremental growth over time. We recognize that all students now have the easy option to offload the cognitive burden of learning to GenAI—maybe we call this cheating, but I’m still not sure. In the English department, our strategy to combat the threat of cheating is to craft exciting, interesting courses that involve some in-person, real-time graded tasks. The courses incorporate complex, particular assignments that engage students on a personal level, require lots of process work and revision, involve collaborative components, and may ask students to cite or include material that they could only understand if they’ve attended class.

While these strategies are pedagogically sound and succeed at motivating many students to learn, some students still turn in work that is clearly AI-generated, whether on homework assignments, project drafts, or even high-stakes finals. My colleagues and I have begun to notice common features that indicate a student has used GenAI in an unsavvy and thoughtless way.

Traits of student writing that signal to instructors that students used GenAI:

  1. includes content that is clearly false (describes events, concepts, ideas, characters, plots, processes incorrectly; attributes fake quotes to real people, etc.)
  2. demonstrates a level of understanding of some topic that far exceeds what we normally see in work from the average to advanced student
  3. uses an authorial voice that deviates from the student’s prior writing
  4. employs a rigidly formulaic structure akin to the tripartite formula that basic ChatGPT prompting yields
  5. has incredibly generic content/lacks specificity
  6. includes excessively flowery or unnecessary jargon
  7. uses tedious, long-winded, but mostly grammatically correct syntax

How should instructors respond to student work that demonstrates any of these characteristics? If your course policies prohibit the use of GenAI on any class work, you have, at best, an awkward and time-consuming situation on your hands in which you must confront the student, ask whether they used GenAI, and hope they admit it so you can offer some kind of warning or penalty. At worst, you face the possibility of mounting a losing plagiarism case in which there is ultimately insufficient evidence to determine the truth one way or the other; this can lead to the very trust violation you were hoping to avoid by resisting GenAI detection software.

Even if your course policies allow for GenAI use with attribution and the student discloses that they used the tool, how do you fairly grade the work when you have a hunch that the student did not spend much of their own brain power while others in your class put in a good faith effort?

This semester, I am piloting a strategy to deal with this issue that:

  1. Engages students in explicit conversation about the kind of writing they SHOULD and SHOULD NOT produce for the class, and
  2. Uses assessment checklists and rubrics to enforce those expectations

Before I describe the new strategy, be aware that my solution is no panacea for dealing painlessly and simply with unauthorized student use of GenAI. At best, this approach gives instructors justification for deducting points from work that exhibits the ugliest features of unedited GenAI output without having to explicitly accuse students of using GenAI.

Explicit Expectations in the Positive and Negative

Instructors at all levels know about the importance of explaining grading expectations to students. Usually, we offer checklists or rubrics that articulate the kind of work we want students to produce, but there have always been some assumed expectations that we can no longer take for granted. The first three traits in the list above represent standards for writing that we rarely needed to say aloud before GenAI but that we must attend to now.

This semester, part of my checklists and rubrics for student work assesses how true and accurate the information is and how closely the work aligns with the student’s personal writing style and current level of education and experience. At all levels of academia, we encounter students who worry that their own writing proficiency is insufficient and that they must perform complex acrobatics to write in a more high-brow, academic manner. We can be up front with students at the beginning of our courses that we don’t just hope for but require work that is true to who they are and that falls within their zone of proximal development (see Vygotsky). Having students respond to a writing prompt on the first day of class and asking for lots of short writing early in the semester gives instructors a baseline against which to compare later work.

Furthermore, my checklists and rubrics formally codify traits of GenAI writing that are unacceptable in student work (#s 4-7 above) and build in an option for assessing a penalty if a student’s work includes any of these characteristics. This looks different from common assessment tools that articulate positive traits of writing and grade students against that standard.

Deducting Points Regardless of GenAI-Use

This semester I will experiment with checklists and rubrics that reserve the right to deduct a significant number of points if the writing demonstrates any of the seven features listed above. In other words, I will never be in the position of saying to a student, “You used GenAI to write this. Therefore, I will deduct points.” Instead, I will be able to point to the presence of one or more of the seven features to justify deducting points.

My first draft assessment checklists look almost exactly the same as before, but now they include the following statement: “If your work (1) includes content that is clearly false (describes events, ideas, characters, plots, processes incorrectly; attributes fake quotes to real people; etc.), (2) demonstrates a level of understanding of some topic that far exceeds what we normally see in work from the average to advanced student, (3) uses an authorial voice that deviates from the student’s prior writing, (4) employs a rigidly formulaic structure, (5) has incredibly generic content/lacks specificity, (6) includes excessively flowery or unnecessary jargon, or (7) uses unnecessarily complicated syntax, your draft will lose between 10 and 30 points. The point value will depend on how many of these characteristics are present and the extent to which they are present. You will have an opportunity to revise for the final draft.”

If the student draft bears all the hallmarks of GenAI writing and truly departs from the kind of work that student has done in class, I could deduct the maximum number of points, which means the student makes, at best, a 70 on the first draft. In my feedback, I will flag which of the characteristics were present and urge the student to meet with me or take their work to the writing center to improve. Regardless of whether the student used GenAI, this is their opportunity to try the task themselves, to actually do some critical thinking and tough work to revise, or to seek out the support they need so that their next draft meets my basic expectations.

The same statement appears on my final draft rubrics, except this time it indicates that “you may lose 50 points.” Here, I have the option to give the student a failing grade if they did not take the opportunity to revise or if their revision suddenly demonstrates these unwanted characteristics.

Answering objections

The biggest objection to this experimental method is that it may unfairly penalize students who are in the very earliest stages of practicing specific, thoughtfully structured writing. If first-year composition (FYC) students turn in generic or formulaic writing, will they fail their first or final drafts? Is this method just as discriminatory against English language learners or developing students as GenAI detection software?

No. Professionals in FYC are trained to assess a learner’s current competencies and offer feedback that invites incremental growth. My colleagues and I have read and commented on writing from students of varying levels of English-language proficiency and writing abilities and have a baseline understanding of what growth looks like.

Furthermore, I do not intend to use these assessment tools to punish students but to:

  1. Clarify my expectations for good writing
  2. Help them navigate an academic environment in which other professors may accuse them of using GenAI to complete their work
  3. Challenge them to work hard to respect their own current abilities while also pushing themselves to the next level

My intent is not to police student writing for these bad traits but to have an option for addressing the most egregious instances of writing that deviates from a student’s prior writing style. I am certain that many readers here are busy formulating additional objections to the strategy or articulating its many flaws. Leave them in the comments and anticipate future posts where I address your concerns and review how students have reacted to this policy in my class over the first several weeks of the semester!

