Making the Grading Process More Transparent

College teachers are always on the lookout for ways to help students better understand why their paper, essay answer, or project earned a particular grade. Many students aren’t objective assessors of their own work, especially when there’s a grade involved, and others can’t seem to understand how the criteria the instructor used apply to their work.

As the author Matthew Bamber notes, grading is not a transparent process to students, even if they have been given the criteria or rubric beforehand. He devised an exercise for his master’s-level accounting and finance students that they found “eye-opening.” In the UK, students “sit” for lengthy exams—in this case, a three-hour, closed-book essay test. In the exercise, students began by answering one lengthy essay question, which included a problem to solve and a written analysis. When finished, they were given a suggested answer to the question, a marking guide, and a set of grade descriptors. Then they were given an anonymous answer to the same question and told to grade it using the materials provided. After completing that step, students were given a teacher-graded copy of the anonymous answer. The exercise concluded with students being told to grade their own answers to the question.


A Collaborative Midterm Student Evaluation

Can students collaborate on the feedback they provide faculty? How would that kind of input be collected? Both are legitimate questions, and both were answered by a group of marketing faculty who developed, implemented, and assessed the approach.

The first argument, supported by research cited in their article, establishes the value of collecting midterm feedback from students. Students tend to take the activity more seriously because they still have a vested interest in the course, and teachers have the rest of the course to make changes that could improve students’ learning experiences. There’s also research documenting that when midcourse feedback is collected and the results are discussed with students, end-of-course ratings improve. And they don’t improve because teachers are doing everything students recommend—sometimes a policy doesn’t need to be changed so much as it needs to be better explained.

The faculty involved in this project reasoned that having students collaborate on feedback for the instructor might have several advantages. It could increase student engagement with the process. Almost across the board now, there are concerns about the low response rates generated by online course evaluations. In addition, students don’t generally put much effort into the feedback they provide. In one study cited in the article, students self-reported taking an average of 2.5 minutes to complete their evaluations. Because doing an evaluation collaboratively was unique and happened midcourse, faculty thought that maybe students would get more involved in the process.

They also wondered if the quality of the feedback might be improved by the interactive exchange required to complete it. And along with that, they thought the process could increase students’ feelings of accountability by virtue of providing feedback in a public venue. Perhaps it would be harder for students to get away with making highly critical, personal comments.


What Do Students Do When They Study?

An article in a recent issue of the International Journal of STEM Education has got me thinking about study habits and how little we know about how students study.

The article is open-access, and I encourage you to read it whether you teach in the STEM fields or not. But first, a synopsis: The research team used “a practice-based approach to focus on the actual study behaviors of 61 undergraduates at three research universities in the United States and Canada who were enrolled in biology, physics, earth science and mechanical engineering courses.” (p. 2) In small focus groups students responded to this prompt: “Please imagine for a moment how you typically study for this course—can you describe in as much detail as possible your study situation?” (p. 4) What these students reported is a good reason to read this article.

Another reason this research merits attention is the concern the researchers have with how we think about and research study behaviors. We tend to focus on parts of the study process—when students study, how long they study, what strategies they use when they study, and what strategies they should use. Hora and Oleson believe that studying is a collection of behaviors and that thinking about them in isolation obscures the complex ways they interact. Their results support that belief. “Results indicate that studying is a multi-faceted process that is initiated by instructor or self-generated cues, followed by marshaling resources and managing distractions, and then implementing study behaviors that include selecting a social setting and specific strategies.” (p. 1)

Because the cohort consisted of students reporting on how they studied in STEM courses, the researchers note, “We are not suggesting that this account of studying is generalizable to all students but is a heuristic device for thinking about studying in a more multi-dimensional manner than is common at the present time.” (p. 15) So what your students would say about how they study may well be different, but that’s another reason this is such a good article. As you make your way through it, you are constantly considering what you do and don’t know about how your students study.

Hora, M. T., & Oleson, A. K. (2017). Examining study habits in undergraduate STEM courses from a situative perspective. International Journal of STEM Education, 4(1), 19 pages.


Peer Assessment: Benefits of Group Work

With the increased use of group work in college courses, exploration of the role of peer assessment has broadened, as has its use. In one survey, 57 percent of students reported that their faculty had incorporated peer evaluations into group assignments. We’ve done articles on this topic before, but mostly we’ve highlighted resources, specifically good instruments that direct peers to provide feedback in those areas known to influence group outcomes. Recent literature includes a variety of peer assessment systems (find three examples referenced at the end of this article), many of them online programs that expedite the collection, tabulation, and distribution of the results. Here’s a list of the benefits of making peer assessment part of group learning experiences.

Peer assessment can prevent group process problems. Several studies show that it helps, and sometimes virtually solves, one of the most egregious group problems: free riding, as in students not doing their fair share of the work. One study found that the very possibility of having peer evaluations improved the performance of group members. Of course, that benefit is enhanced when peers receive feedback from each other as they are working together as opposed to when the project is finished.

Formative peer assessment also improves individual and group performance. Even if the group is not experiencing major problems, formative feedback from peers can help individual members fine-tune their contributions and help the group increase its overall effectiveness. Some of the processes faculty are using to achieve this benefit include individual and group responses to the feedback. Individual students comment on feedback from the group via an email to the teacher, and groups use the feedback to develop an improvement plan. They also make note of what the group is doing well. Online peer assessment systems make multiple exchanges of formative feedback possible, which is helpful when the groups are working on complex, course-long projects. The Brutus and Donia system resulted in measurable individual improvement during a second semester when the system was used. In other words, students took what they’d learned about their performance in the group and acted on it the following semester.


Improving Student Evaluations with Integrity

Oh, how the tables do turn! Each semester, after we quiz, test, and otherwise grade our students, they get to return the favor and rate their professors, and some of them can be harsher than we are on our most critical days. Because administrators incorporate these ratings in their evaluations of us, they can’t be ignored. Rather than wallowing in the sorrows of negative reviews, we must accept them for what they are: feedback. And although we should not in any way compromise our principles or the course content to get better ratings, there are actions that don’t undermine our integrity and do positively influence end-of-course ratings. I’d like to suggest several that have improved my ratings.

Be transparent about your grading methods. It’s my opinion that students should never be surprised by their grades in a course. Whenever I give an assignment, no matter how small, I provide instructions in writing, a point value, and a due date. I’m a huge fan of rubrics and always take time to help students understand and interpret them. Examples posted on the course website can demonstrate what you’re looking for in assignments.

I work hard to return papers in a timely manner and share my deadlines with students so that they know when to expect the feedback. Most online grading systems make it easy for students to monitor their progress throughout the semester. By removing the mystery from my grading system, I have consistently received high scores from students on the applicable questions on the evaluation form.


A Quiz That Promotes Discussion and Active Learning in Large Classes

Educational research is full of studies that show today’s students learn more in an active-learning environment than in a traditional lecture. And as more teachers move toward introductory classes that feature active-learning environments, test performance is improving, as is interest in these classes. The challenge for teachers is finding and developing those effective active-learning strategies. Here’s a take-home quiz activity that I’ve adapted and am using to get students interested in my course content.

I teach a large, non-major chemistry course. I try to include topics such as pollution sources, alternative fuels, nutrition videos, and hometown water supplies that are relevant to students in different majors. I give a five-question quiz assignment several days before the topic comes up in class and then use it to facilitate class discussion. I want students thinking and applying course content.

The first thing I ask for is a link to a recent article or video of interest to the student within the designated topic area (e.g., Find a recent article that describes an alternative energy source). Question two asks for a general understanding or definition (e.g., Is this energy source renewable or nonrenewable? Explain.). Next are questions that encourage students to interpret what they’ve read and assess its reliability (e.g., How does this energy source compare to oil and coal? Or how will this energy source help meet our current and future energy needs?). The quiz wraps up with a question that asks for the student’s opinion on the topic (e.g., Burning garbage to produce electricity is an alternative fuel—would you be happy to see your town adopt this method? Explain.).


Three Learning Assessment Techniques to Gauge Student Learning

A learning assessment technique (LAT) is a three-part integrated structure that helps teachers to first identify significant learning goals, then to implement effectively the kinds of learning activities that help achieve those goals, and finally—and perhaps most importantly—to analyze and report on the learning outcomes that have been achieved from those learning activities.

LATs are correlated to Fink’s Taxonomy of Significant Learning, such that there are about 6–10 techniques for each of the learning dimensions, including techniques to help students learn the foundational knowledge of the subject and help students apply that foundational knowledge to real situations so that it becomes useful and much more meaningful to them.

There are techniques that help students integrate ideas—different realms of knowledge—so that the learning is more powerful. There are techniques to help students recognize the personal and social implications of what they are learning, which is what Dee Fink calls the human dimension. There are techniques to help students care about what they are learning so that they’re willing to put the effort into what they need to learn. And finally, there are techniques to help students become better and more self-directing learners (learning how to learn).


Group Exams and Quizzes: Design Options to Consider

Although still not all that widely used, there’s long-standing interest in letting students work together on quizzes or exams. When teachers first hear about the approach, their initial response is almost always negative. Here are the most common objections.

  • Grades are measures of individual mastery of material. With a group exam or quiz, some students may get a better grade than they’ve earned. Group grades do not measure individual learning.
  • A group can settle on wrong answers and thereby lower the score of the single bright student in the group who knows the right answer.
  • Group exams and quizzes make it too easy for students. They don’t have to think for themselves but can rely on others in the group to do the thinking for them.
  • It’s cheating. Students are getting answers they don’t know from other students. They’re consulting another source rather than putting in the work and developing their own knowledge.
  • Certifying exams (professional exams such as those in nursing and accounting, as well as the MCAT and GRE) are not group exams. Group quizzes and exams do not prepare students for these all-important assessments.

On the other hand, those who do allow group collaboration on exams and quizzes may respond to the objections with a corresponding set of advantages associated with their use.

  • Group exams and quizzes reduce test anxiety. Pretty much across the board, students report that anticipating and participating in group exams and quizzes makes them feel less anxious. And for students with exam anxiety, that can be a significant benefit.
  • Collaborative quizzes and exams show students that they can learn from each other. Many students arrive in courses believing the only person they can learn from is the teacher. But as they talk about test questions, share answer justifications, and discuss what content the answer requires, they get to experience what it’s like to learn from peers.
  • Group quizzes and exams provide immediate feedback. Students don’t have to wait to get the exam back. They get a good indication from those in the group why the answer is or is not correct.
  • Working together on test questions teaches students how to identify credible arguments and sources. Having the opportunity to change answers based on what someone else says confronts students directly with the tough issues of whom to believe and when to trust their own judgment.
  • Collaborative quizzes and exams model how problem solving in professional contexts usually occurs. Professionals collaborate: they have access to resources, they can contact experts, they argue options, and they evaluate possible answers. Collaborative testing gives students the opportunity to see how and why that results in better decision making.
  • Group quizzes and exams can improve exam scores and sometimes, but not always, content retention. The improvement in scores is an expected outcome of collaboration, but the improvement is also present when students collaborate on exam questions and then answer questions that deal with the same content on a subsequent exam taken individually. Effects of collaboration on retention are mixed. See the following references listed at the end of this article for examples: Cortright, Collins, Rodenbaugh, and DiCarlo (2002); Gilley and Clarkson (2014); Leight, Sunders, Calkins, and Withers (2012); Lust and Conklin (2003); and Woody, Woody, and Bromley (2008).
