
How Gamification Changed My Assessment Strategy

 

By Lorenzo Brancaleon, Ph.D.
UTSA 2022-2023 Next-Gen Leadership Fellow
Professor, Department of Physics and Astronomy

When I started writing this second blog post, I wanted to discuss the many ways in which one can leverage course gamification to deploy experimental or, more simply, unfamiliar pedagogy. There is much to write about how gamification can be used to change assessment strategies, stimulate students with activities that are effective but underused in STEM, and create badges on steroids (or any other favorite performance-enhancing drug banned by the International Olympic Committee). However, when I was done writing, I realized that what I had produced was a dissertation on STEM education when I was supposed to write a blog post. So, I decided to separate the various topics into posts of more manageable format and length.

In this post, I will attempt to briefly (ha! as if that were even possible for me) outline how gamification has changed my strategy on assessment. In future posts, we’ll get into other topics, including concept maps and how to rethink the use of badges.

What spurred my experimentation with gamification:

As an instructor, I fall firmly into the tinkerer camp. I have always made adjustments to my teaching from one semester to the next – why do you think I experimented with gamification in the first place?!

Adjustments may range from small changes to nearly complete overhauls, and they are triggered by self-evaluations, conversations with fellow instructors (especially my favorite partners in crime: the College of Sciences Faculty Champions) and, dare I say, students’ evaluations (my apologies if this statement drew a loud gasp).

Why gamification and experimental pedagogy? Think of gamification as a sophisticated system of extra credits that:

  • are built into the course,
  • require considerable effort by the students, and
  • offer additional opportunities for them to engage with the course material.

Thus, adopting a model in which gamification is entirely voluntary incentivizes students’ participation and creates a space to implement pedagogical tools, approaches, and methods a faculty member may not have tried before.

In other words, it lets you deploy a test run, or beta version, of these pedagogical novelties.
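
To make the “extra credit built into the course” idea concrete, here is a minimal sketch in Python of how voluntary gamification points can raise, but never lower, a grade. The function name, point values, and weights are hypothetical illustrations, not my actual grading schema:

    # Hypothetical illustration of voluntary gamification as built-in extra credit.
    # All weights and caps below are made up for the example, not an actual schema.
    def final_grade(base_grade, gamification_points,
                    max_bonus=5.0, points_per_bonus=20.0):
        """Return a final grade out of 100.
        base_grade          -- grade from the regular (required) assessments
        gamification_points -- points from optional gamification activities
        max_bonus           -- cap on the bonus, in percentage points
        points_per_bonus    -- gamification points needed per bonus percentage point
        """
        bonus = min(gamification_points / points_per_bonus, max_bonus)
        # Opting out leaves the base grade untouched; participation can only help.
        return min(base_grade + bonus, 100.0)

    print(final_grade(82.0, 0))    # opted out        -> 82.0
    print(final_grade(82.0, 60))   # some activities  -> 85.0
    print(final_grade(82.0, 500))  # many activities  -> 87.0 (bonus capped)

A student who opts out simply keeps the grade earned from the required assessments, which is what makes the voluntary model a safe space in which to experiment.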

Low-Stakes Assessments (LSAs)

LSAs are forms of assessing student knowledge in which no single test greatly affects the grade. Traditional assessments, particularly in STEM courses, tend to be high-stakes and use tools such as midterm and final exams that, individually, may account for 30% or more of a student’s final grade.

Often, high-stakes assessment creates anxiety in learners and, I would argue, limits the ability of an instructor to probe students’ understanding of the material. Incidentally, high-stakes assessment, particularly in STEM courses, has been linked to higher education inequity (see resources listed below) – but that’s another blog post.

LSAs are not uniquely defined. I know several colleagues at UTSA, including some teaching STEM courses, who have adopted different versions of LSAs such as the extensive use of clickers or other response systems, quizzes (both in person and online), group work, escape rooms, and more.

LSAs are not limited to short, multiple-choice questions. They can be deployed as reasoning problems or even projects. In general, strategies that use formative assessment are often considered LSAs, whether they involve quizzes or long reasoning problems.

There are many ways to implement LSAs, especially in STEM courses. For instance, they can be deployed as a set of more frequent assessment activities that one can use to probe students’ understanding more deeply. Alternatively, they can be leveraged to increase the formative component of more traditional assessment activities by giving students feedback on their first attempt and letting them retry the same assessment, or a different assessment designed to measure the same learning outcomes. By the way, formative assessment should be used to give students a chance to meet a desired learning outcome (i.e., to demonstrate true comprehension of a topic) and not simply to increase their grade.
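
To illustrate the “feedback, then retry” flavor of LSAs, here is a minimal sketch in Python of one possible retry policy: keep the better of the two attempts, with a small discount on the retry so the first attempt still matters. The function name, discount, and example scores are hypothetical, not a prescription:

    # Hypothetical scoring policy for a low-stakes assessment with one retry.
    # The retry discount and the example scores are illustrative only.
    def lsa_score(first_attempt, retry=None, retry_credit=0.9):
        """Score an LSA out of 100 when students may retry after feedback.
        first_attempt -- score on the initial attempt (0-100)
        retry         -- score on the retry, or None if there was no retry
        retry_credit  -- fraction of the retry score that can be credited,
                         so the formative first attempt still carries weight
        """
        if retry is None:
            return first_attempt
        # Keep whichever is better: the original score or the discounted retry.
        return max(first_attempt, retry * retry_credit)

    print(lsa_score(55.0))        # no retry            -> 55.0
    print(lsa_score(55.0, 90.0))  # improved on retry   -> 81.0
    print(lsa_score(70.0, 60.0))  # retry scored lower  -> 70.0

Other policies (averaging attempts, full credit on the retry, unlimited retries) are equally valid; the point is that the rule is transparent to students and keeps the emphasis on meeting the learning outcome rather than on the grade.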

LSAs and Gamification

What does this have to do with gamification? I used my “minimally-invasive” gamification approach to test LSA activities before deciding whether to adopt them as part of my grading schema.

In my case, I tested several LSA concepts including:

  1. mini exams that tested only one chapter at a time using reasoning problems,
  2. interactive video assignments (online) using Panopto and PlayPosit,
  3. interactive quantitative applets (Physics-specific applets),
  4. surveys, and
  5. post-activity self-reflections.

While some are still being evaluated, others, such as the mini exams, are now part of my grading schema (as you see in the video), where each mini exam contributes no more than 6% to the overall grade.

Assessment Diversification

There is one last pitch I would like to make for the merits of LSA diversification. LSAs allow an instructor to diversify the format of each assessment and, as a result, be more inclusive of the different ways in which students approach their learning and perform during assessment activities. Assessment diversification may also help instructors identify students’ strengths and weaknesses and design approaches to intervene.

Looking at it in a different way, LSAs give instructors a broader arsenal with which to properly probe learners’ proficiency and identify areas for intervention. For instance, through LSAs one can identify a gap between understanding a concept and deploying that concept to solve a problem, which informs the instructor on how to help the student fill the gap.

Tips and Ideas:

If you are memory-challenged like I am, it is useful to actually write down notes during the semester so that you can remember which pedagogical strategies have worked, which can be improved, and which should be abandoned (and why). At the end of each semester, I review these notes and strategize how and when to implement changes.

References:
  • Malespina, A., & Singh, C. (2022). Gender differences in test anxiety and self-efficacy: Why instructors should emphasize low-stakes formative assessments in physics courses. European Journal of Physics, 43(3), 035701. https://doi.org/10.1088/1361-6404/ac51b1
  • Bickel, E., Bunnell, L. M., & Heus, T. (2021). Utilizing peer teaching and reflection on low-stakes quizzes to improve concept learning outcomes in introductory calculus-based physics classes. European Journal of Physics, 42(5), 055701.
  • McNeil, L. M. (2005). Faking equity: High-stakes testing and the education of Latino youth. In A. Valenzuela (Ed.), Leaving Children Behind: How “Texas-Style” Accountability Fails Latino Youth (pp. 57-111). Albany, NY: State University of New York Press.