On April 2nd, the Multicultural, Diversity, and Education Council (MDEC) hosted a panel of five Central Michigan University undergraduate and graduate students from across campus. Facilitated by Dr. Matt Johnson, the students gathered to talk candidly about how they use artificial intelligence (AI) in their academic lives. Their message to faculty was both refreshing and disconcerting: AI is here, students are using it, and the way we teach matters more than ever.
What follows are key takeaways and faculty-facing lessons that emerged from this timely conversation. The students’ insights revealed far more than surface-level behaviors; they spoke to motivation, perception of course relevance, emotional and intellectual engagement, and the evolving definition of academic integrity in an AI-saturated world.
“I Use It Pretty Much Every Day”
The panel began with students describing their daily use of AI. Their tone was casual, confident, and, at times, reflective. AI, they said, is not just a last-minute tool for cheating but a study partner, a second brain, and a brainstorming buddy. They use it to summarize lengthy readings, generate outlines, understand complex ideas, and, in some cases, solve coding problems or write rough drafts of discussion posts and assignments.
One computer science graduate student shared how AI had become integral to their problem-solving and learning workflow. Another student admitted they’d used AI to draft portions of essays and assignments, which they later edited and submitted. They viewed this as efficiency, not laziness.
None of the students on the panel had ever “been caught” using AI, though one shared the feeling that faculty “know” when students use it. One panelist, however, had experienced a false accusation, which they described as both frustrating and alienating. That experience underscores the need for faculty to approach suspected AI use carefully and with consideration.
When and Why Students Turn to AI (Ethically or Otherwise)
From the discussion, a pattern emerged: students were more likely to use AI unethically (e.g., copying answers) when they perceived assignments as “busywork” or the course itself as irrelevant to their future.
General education courses, especially those with generic assignments and minimal instructor engagement, were common culprits. “If a class feels like it doesn’t matter,” one student said, “then I don’t feel bad using AI to just get through it.”
On the flip side, when students saw the relevance of course content or felt the instructor was invested in their learning, they were far less likely to misuse AI. They wanted to learn, they said, but needed help connecting that learning to real-world contexts.
This mirrors long-established research on academic dishonesty, where perceived relevance, meaningful relationships with instructors, and intrinsic motivation all reduce the likelihood of cheating. As Miles et al. (2022) summarize in their literature review, teaching environments that emphasize connection, purpose, and authentic engagement are foundational to academic integrity.
Peer Norms and Social Influence
When asked about their peers’ AI usage, the panelists were blunt: Everyone uses it, at least to some degree. They acknowledged this as a new norm, unspoken but widely accepted. Yet they also expressed concern. Some felt disheartened watching peers bypass learning entirely. Other students worried that the normalization of AI use might compromise long-term skill development.
Their reflections align with academic misconduct research, which highlights peer behavior and culture as strong predictors of individual choices (Resurreccion, 2012; Pan et al., 2019; Moss et al., 2018; Tee & Curtis, 2018). As the use of AI becomes increasingly normalized, our role as educators must include modeling ethical behavior, fostering community standards, and creating spaces for open discussion.
“Students Need to Be Challenged to Do Things AI Can’t Do”
One panelist directly challenged “all faculty listening” to run their assignment descriptions through AI and, if AI can complete the assignment, to change it. The students seemed to share the view that if a generative AI tool like ChatGPT can complete an assignment quickly and convincingly, then the task may not be promoting the kind of deep learning or critical thinking we aspire to. Faculty shouldn’t panic, but we do need to evolve.
The students weren’t advocating for easy grades or AI-powered shortcuts; quite the opposite. They called on instructors to design meaningful assignments that push students to analyze, create, reflect, and synthesize: skills that AI can support but not replicate.
Teach Us How to Use It—Don’t Just Police It
The loudest call from the panel was for guidance. Students do not want AI to be the elephant in the classroom, present but unacknowledged. They want instructors to help them learn how to use it responsibly, ethically, and effectively for the careers they’re preparing for. They said ignoring AI or treating it purely as a threat felt like a disservice.
Every student agreed that AI literacy should be part of their education. They urged faculty to lean into the technology, not necessarily to require its use, but to engage with it alongside students. A few students even suggested AI be treated as a collaborative learning tool, used together in class to model prompts, evaluate responses, and discuss credibility and limits.
The Policy Gap: Students Feel the Inconsistency
One frustration shared by the panelists was the lack of a clear, campus-wide policy on AI use in coursework. Students described a patchwork of expectations that varied dramatically from class to class. In some courses, AI use was encouraged as a productivity tool; in others, it was strictly forbidden without much explanation. This inconsistency created confusion, anxiety, and, in some cases, unintentional policy violations when instructors did not clearly convey their expectations. One student shared that this ambiguity had led to what she felt was a disproportionate penalty: a friend used AI on an assignment and failed the course as a result.
Don’t Use AI to Grade Our Work
Interestingly, while students were enthusiastic about their own AI use and encouraged faculty to use the tool to clarify lectures or assignments, they were unequivocally opposed to faculty using AI to grade their assignments.
“It feels impersonal,” one said. Another added that if students put time into an assignment, faculty could reciprocate with authentic feedback that draws on the expertise they went to school for.
The takeaway? Students are aware of the human element of education. They crave interaction, guidance, and the expertise of their instructors. While AI can speed up processes, it cannot replicate the mentorship and nuanced insight that students value.
“Sometimes, I Forget How to Think for Myself”
Students also opened up about the downsides of AI. They acknowledged a creeping dependence that made it harder to think independently or retain information. Paraphrasing one student: “I know I won’t always have AI at my fingertips, and I need to learn how to solve problems myself.”
This is our window. Students are not blind to AI’s cognitive costs; they need help navigating the tradeoffs. If we want to preserve students’ capacity for deep thought, we need to articulate explicitly when not to use AI, and why. We must design assignments and in-class experiences that reinforce critical thinking and relevance, not just performance.
Implications for Faculty at CMU
So, what can we do with this information? Below are five recommendations grounded in the student panel’s insights and the research base on academic integrity and motivation.
- Design for Relevance
Make it clear how your course and each assignment connect to students’ goals, lives, and future careers. Students who feel like they’re learning something that matters are far more likely to engage authentically. (Ashworth et al., 1997; Rabi et al., 2006; Bretag et al., 2019)
- Reimagine Assessments
Use AI as a design partner. Run your own assignments through ChatGPT or Claude and see what it generates. If it can complete the task too easily, consider revising it to include more process-based work, personal reflection, in-class collaboration, or multimedia elements. (McGowan, 2016; Bretag et al., 2019)
- Talk About AI—Openly and Often
Normalize discussion of AI tools. Talk about what they can and can’t do, when to use them, and when not to. Model ethical use and make space for students to share their own experiences. Make your expectations explicit and link them to learning outcomes. (Cole & Kiss, 2000; Broeckelman-Post, 2008; Moss et al., 2018)
- Teach AI Literacy
If your discipline uses AI in the real world, teach students how to do so responsibly. Help them understand prompting, verification, bias, and the limits of AI-generated content. This is career readiness, not just academic policy.
- Preserve the Human Touch
When it comes to feedback, grading, and mentorship, students still want you. While AI might help draft rubrics or provide simplified explanations of complex topics, students overwhelmingly prefer instructor-generated comments and personal attention.
Final Thoughts
The students on this panel didn’t come to complain. They came to help. Their insights were nuanced, grounded, and forward-thinking. They didn’t romanticize AI, nor did they vilify it. They simply asked for faculty who are curious, thoughtful, and willing to adapt.
Let’s take that challenge seriously. At the end of the day, students don’t need us to be perfect AI experts. They just need us to care enough to engage with them and with the tools that are shaping their futures.
References
Ashworth, P., Bannister, P., Thorne, P., & Students on the Qualitative Research Methods Course Unit. (1997). Guilty in whose eyes? University students’ perceptions of cheating and plagiarism in academic work and assessment. Studies in Higher Education, 22(2), 187–203. https://doi.org/10.1080/03075079712331381034
Bretag, T., Harper, R., Burton, M., Ellis, C., Newton, P., van Haeringen, K., Saddiqui, S., & Rozenberg, P. (2019). Contract cheating and assessment design: Exploring the relationship. Assessment & Evaluation in Higher Education, 44(5), 676–691. https://doi.org/10.1080/02602938.2018.1527892
Broeckelman-Post, M. A. (2008). Faculty and student classroom influences on academic dishonesty. IEEE Transactions on Education, 51(2), 206–211. https://doi.org/10.1109/te.2007.910428
Cole, S., & Kiss, E. (2000). What can we do about student cheating? About Campus: Enriching the Student Learning Experience, 5(2), 5–12. https://doi.org/10.1177/108648220000500203
McGowan, S. (2016). Breaches of academic integrity using collusion. Handbook of Academic Integrity, 221–248. https://doi.org/10.1007/978-981-287-098-8_36
Miles, P., Campbell, M., & Ruxton, G. (2022). Why students cheat and how understanding this can help reduce the frequency of academic misconduct in higher education: A literature review. The Journal of Undergraduate Neuroscience Education, 20(2). https://pmc.ncbi.nlm.nih.gov/articles/PMC10653228/
Moss, S. A., White, B., & Lee, J. (2018). A systematic review into the psychological causes and correlates of plagiarism. Ethics & Behavior, 28(4), 261–283. https://doi.org/10.1080/10508422.2017.1341837
Pan, M., Tempelmeyer, T. C., Stiles, B. L., & Vieth, K. (2019). Everybody’s doing it. Advances in Higher Education and Professional Development, 117–136. https://doi.org/10.4018/978-1-5225-7531-3.ch006
Rabi, S. M., Patton, L. R., Fjortoft, N., & Zgarrick, D. P. (2006). Characteristics, prevalence, attitudes, and perceptions of academic dishonesty among pharmacy students. American Journal of Pharmaceutical Education, 70(4), 73. https://pubmed.ncbi.nlm.nih.gov/17136192/
Resurreccion, P. F. (2012). The impact of faculty, peers and integrity culture in the academe on academic misconduct among Filipino students: An empirical study based on social cognitive theory.
Tee, S., & Curtis, K. (2018). Academic misconduct – Helping students retain their moral compass. Nurse Education Today, 61, 153–154. https://doi.org/10.1016/j.nedt.2017.11.030