October 7, 2010

Testing Knowledge: An Interesting Alternative

In: Educational Assessment, Teaching Professor Blog


Sometimes we get stuck in ruts: we use the same kinds of test questions, such as multiple-choice, short answer, maybe a few fill-in-the-blank, some matching, and an occasional longer essay question. We forget there are other options. Here's an example, initially proposed in 1990.

Students are given a prompt (a short sentence or phrase, maybe even a formula) and then asked to write down as many distinct, correct, and relevant facts about the prompt as they can. They receive credit for facts that are correct, distinct, and relevant. It might be a good idea to introduce this question format in class so students understand what they are supposed to do and see an example of what does and doesn't count.

A group of chemistry profs who have written an article about using this strategy recommend that you brainstorm a set of facts that would be considered correct. You can then use them to develop a grading rubric, identifying the number of facts a student needs to list to receive full or partial credit. Once you use a prompt, the student responses will help you enlarge the list of possibilities. These profs also recommend that you post the grading rubric or share it in class so students can see which facts counted and so you can illustrate why others were not distinct or relevant. Of course, this is an open-ended assessment technique, so the rubric will include those facts listed by the majority of students, but not an exhaustive list of possibilities.
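For instructors who like to automate bookkeeping, the rubric logic described above can be sketched in a few lines of code. This is only an illustration, not anything from the article: the fact list, the credit thresholds, and the exact-string matching are all simplifying assumptions (in practice you would judge each response by hand).

```python
# Hypothetical sketch of the rubric: count distinct, correct, relevant
# facts in a student's response and map that count to a credit level.

# Facts brainstormed ahead of time (illustrative placeholders only).
ACCEPTED_FACTS = {
    "water is a polar molecule",
    "water has the formula h2o",
    "ice is less dense than liquid water",
}

def score_response(statements, full_at=3, partial_at=1):
    """Return (number of distinct accepted facts, credit level)."""
    # Normalize and deduplicate, so a repeated fact counts only once.
    matched = {s.strip().lower() for s in statements} & ACCEPTED_FACTS
    n = len(matched)
    if n >= full_at:
        credit = "full"
    elif n >= partial_at:
        credit = "partial"
    else:
        credit = "none"
    return n, credit

n, credit = score_response([
    "Water is a polar molecule",
    "water is a polar molecule",   # duplicate: counted once
    "Ice is less dense than liquid water",
])
print(n, credit)  # 2 partial
```

The thresholds (`full_at`, `partial_at`) are the knobs the article's rubric idea suggests tuning once you see how many facts students typically produce for a prompt.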

These profs report their experiences using this assessment strategy on homework assignments and on in-class exams. There's no reason it couldn't also be used on quizzes, in small groups, as a way to summarize at the end of class, or as a review at the beginning of the next class.

The strategy has the advantage of allowing students to show their complete (or incomplete) knowledge of some important course content. Exam questions, like multiple-choice, "provide only a partial picture of a student's knowledge in a course … a student may know relevant information that was not included on the exam." (p. 49) Responses to the prompts also make very clear which misconceptions students are holding, as well as any inappropriate conceptual connections they may be making.

It's a straightforward approach, simple to implement, and not one that requires extra time to grade. If you decide to give it a try, we welcome you to share your experience in a response to this blog entry.

Reference: Lewis, S. E., Shaw, J. L. and Freeman, K. A. (2010). Creative exercises in general chemistry: A student-centered assessment. Journal of College Science Teaching, 40 (1), 48-53.




Comments

Bethany Usher | October 27, 2010

I teach human osteology, and I organize my final exam like this. I set up "stations" around the room and put out a set of materials at each one. For example, one might have the bones of the arm, some measurement tools, and a reference guide. The students are told to tell me as much as they can about the station. Some might approach it from an evolutionary perspective, talking about the origin and development of arm bones. Others might look at it forensically, using the tools to talk about potential age, sex, and size of the individual, with a critical assessment of the tools. Others might approach it as a model of growth and development. All are correct. The only restrictions are that they can't take the same perspective on any two of the stations (usually, they get to pick 4 of 6), and I will give them 1 point for each significant fact or observation, up to 25. I get amazing essays.



