Candid Conversations about Teaching: Right-Size Your Respondus Use: Assessment Security at What Cost?
As the world around us continues to shift, it is increasingly clear that our roles and activities as educators are also forever changed. Since we re-entered our classrooms this year, we’ve gained experience and understanding of just how different things are now. In fact, some of the strategies and techniques we prioritized before the pandemic aren’t as well suited now. As we realize that re-establishing our practices might not be as straightforward as we’d hoped, we may be finding it necessary to re-evaluate our tools and strategies, and even redefine some core elements of our teaching and learning philosophies and practices. Inspired by recent publications from educators like you, this blog series will explore topics reflecting these ongoing paradigm shifts in our modern classroom. Through these “Candid Conversations,” we’ll examine current pedagogical priorities as we wrestle with upholding the standards and strategies of the past.

 

Right-Size Your Respondus Use: Assessment Security at What Cost?  

One major outcome of the pandemic-prompted pivot to online learning was a redefinition of how technology, particularly Blackboard, is used by many instructors. In the spring of 2020, converting paper-and-pencil exams to online tests was, for some, a brand-new undertaking. Gaining insight into best practices for online test administration, understanding configuration settings, and determining how to administer an exam securely were all top concerns. Now, for many courses, paper-and-pencil exams are a thing of the past. During this transition, the use of two Blackboard assessment security tools increased significantly at CMU: Respondus LockDown Browser (LDB) and Respondus Monitor. In brief, LDB deters cheating by removing the ability to access outside resources during an exam, and Monitor adds a further layer of security by recording the testing experience for later review by the instructor (Swauger, 2020).

 

As with any instructional technology, there are both tangible and intangible benefits and costs associated with using Respondus tools in courses. Now, two years after the rapid rise in popularity of LDB and Monitor, we aim to offer some additional insight into these technologies to better inform how we might continue incorporating such tools into our post-pandemic teaching. The benefits that accompany these tools are straightforward: the software helps prevent academic dishonesty, a valid and vital concern in any valuable educational experience, especially in virtual spaces (Carrasco, 2022; Rettinger & McConnell, 2021; Swauger, 2020).

 

The burdens associated with these tools are a bit more complex. Most notable is the background effort required to apply and use such programs effectively: the additional cognitive load and legwork that these tools demand of students (and instructors) represent the largest challenge associated with these technologies (Anderson, 2021). While the time, resources, and cognition needed to install, access, and use Respondus tools at exam time may not seem significant to us, when placed on already stressed and anxious students, these extra demands can be detrimentally substantial (Son et al., 2020; Swauger, 2020; Woldeab & Brothen, 2019). Another important cost that accompanies these tools is financial. While they may be free for students to use, LDB and Monitor are not free to the University. Due to current budget realities, CMU's Respondus Monitor license will no longer be unlimited in the upcoming academic year (2022-2023); the current unlimited license, necessitated by the required increase in use during the Spring 2020 semester, costs the University nearly $28,000 annually.

 

As we close out this academic year and prepare for the next, given the imminent reduction in Respondus Monitor licensure and the student burdens described above, it is time for us to carefully consider our use of these tools. How? Any assessment currently using LDB or Respondus Monitor should be evaluated to determine whether these tools are, in fact, the best solution. While such honesty-enforcing software is useful and meets important, valid needs, it is imperative to verify that the benefits outweigh the costs. Of course, it is not our intention to recommend or prevent the implementation of these tools; instead, we emphasize the need to balance secure, effective testing with the investments needed to achieve that security. To help in this process, we suggest some alternatives to using Respondus LDB and Monitor and offer potential solutions to reduce the cognitive load and additional stressors that accompany these tools.

 

Consider these alternatives:

  • Turn them off. Here are two cases where LDB and Monitor may not be needed (and may especially hinder students):
    • The test is lower stakes or of a relatively low point value and/or consists of a small number of questions.
    • Students are allowed the use of the course textbook and/or notes. With more and more textbooks being offered digitally, open-eBook exams are difficult (and sometimes impossible) to administer using LockDown Browser.
  • Instead of defaulting to Respondus LDB and Monitor, you might also consider other security strategies for the assessment that are already built into Blackboard, such as:
    • Incorporate pools of questions, randomly pulling from each pool so that every student receives a different collection of questions.
    • Randomize test questions, so students see them in a different order.
    • Use open-ended questions to elicit student knowledge.
    • Use alternate test versions, swapping out different versions from one semester to the next.
  • Change that assessment up. Last but not least, this may be an opportunity to forego an objective assessment altogether and instead utilize a case study, scenario-based assignment, or another assessment type.

 

Combat the cognitive load that use of these tools may inadvertently place on students:

  • Offer clear, stepwise instructions for installing and practicing with these programs well in advance of a test in your class.
  • Provide practice opportunities for students to use the software and confirm that it works with their device and operating system before the test.
  • Make yourself available to answer questions (via office hours or email) and direct students to the CMU OIT HelpDesk for specialized help whenever they need it.
  • Direct students to where they can access devices with the software already installed (such as Park Library or the Certified Testing Center) in case their device is not compatible or malfunctions.
  • Build flexibility into your policies so that it is reasonably straightforward (for you) to accommodate situations where devices malfunction, the internet stops working, or other “emergencies” occur.
  • Contact us for assistance. We can talk through your assessment with you and help ensure that your settings and other configurations are appropriate and functional.

 

As always, the Office of Curriculum and Instructional Support is here to help. If this reflective process ends with a need to change up your Blackboard tests, let CIS's CoursePro help, or email us to schedule a consultation.

 

References:

Anderson, J. (2021, August 31). Out of Crisis, Compassion: Using Instructional Technologies to Alleviate Student Stress. Educause. https://er.educause.edu/articles/2021/8/out-of-crisis-compassion-using-instructional-technologies-to-alleviate-student-stress

Carrasco, M. (2022, January 28). Concerns About Online Cheating Decline. Inside Higher Ed. https://www.insidehighered.com/news/2022/01/28/instructors-express-fewer-concerns-about-online-cheating

Rettinger, D., & McConnell, K. (2021, September 22). Is Cheating a Problem at Your Institution? Spoiler Alert: It Is. Inside Higher Ed. https://www.insidehighered.com/views/2021/09/22/cheating-problem-your-college-spoiler-alert-it

Son, C., Hegde, S., Smith, A., Wang, X., & Sasangohar, F. (2020). Effects of COVID-19 on College Students’ Mental Health in the United States. Journal of Medical Internet Research, 22(9), e21279. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7473764/

Swauger, S. (2020, April 2). Our Bodies Encoded: Algorithmic Test Proctoring in Higher Education. Hybrid Pedagogy. https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/

Woldeab, D., & Brothen, T. (2019). 21st Century assessment: Online proctoring, test anxiety, and student performance. International Journal of E-Learning & Distance Education / Revue Internationale Du E-Learning Et La Formation à Distance, 34(1). http://www.ijede.ca/index.php/jde/article/view/1106
