October 7th, 2010

Testing Knowledge: An Interesting Alternative


Sometimes we do get stuck in ruts—we use the same kinds of test questions: multiple-choice, short answer, maybe a few fill-in-the-blank, some matching and an occasional longer essay question. We forget there are other options. Here’s an example, initially proposed in 1990.

Students are given a prompt (a short sentence or phrase, maybe even a formula) and then asked to write down as many distinct, correct and relevant facts about the prompt as they can. They receive credit for facts that are correct, distinct and relevant. It might be a good idea to introduce this question format in class so students understand what they are supposed to do and see an example of what does and doesn’t count.

A group of chemistry profs who have written an article about using this strategy recommend that you brainstorm a set of facts that would be considered correct. You can then use them to develop a grading rubric—identifying the number of facts a student might need to list to receive full or partial credit. Once you use a prompt, the student responses will help you enlarge the list of possibilities. These profs also recommend that you post the grading rubric or share it in class so students can see which facts counted and so you can illustrate why others were not distinct or relevant. Of course, this is an open-ended assessment technique, so the rubric will include those facts listed by the majority of students, but not an exhaustive list of possibilities.

These profs report their experiences using this assessment strategy on homework assignments and on in-class exams. There’s no reason it couldn’t also be used on quizzes, in small groups, and as a way to summarize at the end of class or review at the beginning of the next class.

The strategy has the advantage of allowing students to show their complete (or incomplete) knowledge of some important course content. Exam questions, like multiple-choice, “provide only a partial picture of a student’s knowledge in a course … a student may know relevant information that was not included on the exam.” (p. 49) The prompts also make very clear the misconceptions students are holding, as well as any inappropriate conceptual connections they may be making.

It’s a straightforward approach, simple to implement, and not one that requires extra time to grade. If you decide to give it a try, we invite you to share your experience in a response to this blog entry.

Reference: Lewis, S. E., Shaw, J. L. and Freeman, K. A. (2010). Creative exercises in general chemistry: A student-centered assessment. Journal of College Science Teaching, 40 (1), 48-53.