November 28th, 2012

End-of-Course Evaluations: Making Sense of Student Comments


At most colleges, courses are starting to wind down and that means it’s course evaluation time. It’s an activity not always eagerly anticipated by faculty, largely because of those ambiguous comments students write. Just what are they trying to say?

I think part of the reason for the vague feedback is that students don’t believe the evaluations are taken all that seriously. They’re also in the middle of the usual end-of-semester stress caused by big assignments coming due and final exams looming. It’s just not the best time to ask for feedback, so students dash off a few comments that instructors are left to decipher.

In most end-of-course evaluations, students tend to comment on the same aspects of instruction. They frequently address issues of organization, whether students were treated fairly, and the challenging aspects of the course. Carol Lauer wondered whether faculty and students define these common terms similarly, so she asked faculty and student cohorts to say what they meant when they saw or used the terms on course evaluations.

Would you be surprised to learn that faculty and students define the terms differently, or that students themselves don’t agree on definitions? Probably not, I’m thinking. Even so, some of the specifics are interesting. Take “not organized,” for example. Almost a third of the faculty think students use that term when the teacher changes or doesn’t follow the syllabus; just over 11% of students said that’s what the term meant to them. Seventeen percent of the students equated it with the instructor not being prepared, 15% said they used it when the teacher had no apparent plan for the day, and almost 13% equated it with getting student work graded and returned slowly.

“Not fair” refers to problematic grading according to almost 50% of the faculty surveyed, but to just over 2% of the students. To students “not fair” gets written on an evaluation when the teacher plays favorites and doesn’t treat all students the same way. Students and faculty are closer in their understanding of what “challenging” means when it’s applied to a course. It means hard work and lots of it.

The point here isn’t terribly profound, but it merits a reminder, especially at the end of courses when teachers are tired. Many of the terms used to describe teaching on rating forms and in student comments are abstractions. “Organized” is something teachers are, but deciding whether a teacher is or isn’t organized depends on what the teacher does. Various behaviors, actions and inactions can be what any given individual sees as the presence or absence of organization.

There is good news here. If you’re interested in improving something like organization, and you define it behaviorally, you can change what you do, which is a lot easier than changing what you are. Organization has never been one of my strong suits, and I didn’t make much progress trying to “be” organized. But when I started putting a skeleton outline on the board, when I stopped five minutes before the end of the period and used the outline to summarize, and when I began class by working with students to create a list of points to remember from last class, I was seen by students as being more organized.

But it isn’t all good news. A collection of comments dashed off at the end of the semester doesn’t easily translate into an action-based improvement agenda. What the student comments mean is probably not what you think they mean. Communication about the impact of teaching policies and practices on efforts to learn needs to be ongoing, so there’s an opportunity for clarifying feedback, adjustments, and then more feedback. We can and should make efforts to change the way our institutions collect student assessments, but, until that glacier melts, we need to take matters into our own hands and solicit a different kind of feedback at different times during the course.

Reference: Lauer, C. (2012). A comparison of faculty and student perspectives on course evaluation terminology. To Improve the Academy, 31, 195-211.

  • Greg

    End-of-course evaluations can be a very constructive tool, but a number of things need to be in place for success. Maryellen is bang-on in saying that the comments can be more confusing than clarifying, especially if the evaluation questions are not succinct. And it won’t help matters if an academic administrator looks at 100 evaluations, overlooks the comments or results indicating that 98% of your students are generally happy, but zeroes in on the 2% who aren’t and asks: “What are we going to do about this?” If the results and comments are confusing, it will be even worse if they are misinterpreted when a faculty performance evaluation is on the line.

    As both a teacher and an instructional technology/instructional design supporter of faculty for over twenty-five years, I have been blessed to see things from a variety of perspectives and have learned a few lessons the hard way! Here’s a brief overview of a few course-evaluation strategies that have worked for me:

    1. Ask the questions clearly.

    2. Ask the right questions. Want to know whether one of your course’s learning outcomes was achieved? ASK IT!

    3. Ask for written comments as well. I have always found these can be the MOST insightful, if interpreted fairly. As Maryellen points out, two students can take two different meanings from the same question, so keep it simple.

    4. Four questions I use are:
    – “What is one thing I like about this course (so far)?”
    – “What is one thing I do not like about this course (so far)?”
    – “What is one thing that could be improved in this course?”
    – “I have one additional comment, not covered in the previous questions.”

    5. Once you get the comments, interpret them very carefully and take them with a grain of salt. Faculty whom I have assisted with this activity initially get very worried, since these comments can be blunt. My advice? Treat them like judges scoring skaters in an international competition: throw out the high one and the low one, and what’s left over might be a good overall indicator!

    6. Use an LMS or some other way of automating the process of administering the survey and processing and summarizing the results.

    7. Share the results with your students.

    8. Survey more than once. Perhaps one-third or halfway through the semester, and once more later. Assess any progress you made on the comments from the first survey. Don’t wait until the department’s formal end-of-course evaluation – by then it is too late to improve anything.

    9. Don’t survey in the first place unless you are prepared to deal with the feedback. But if you ask your students for feedback and implement their constructive suggestions, your students will feel empowered. They will take a different look at the course and become part of the growth process, yours included.

    10. Keep your perspective (and humor) about things. It might not be all positive feedback the first time, but as long as it is constructive, you have something to build upon. Taking things into your own hands, as Maryellen says, can also be a very empowering experience for you!

    In the interest of saving space, I’ll stop here. If anyone would like to see examples and results of my surveys over the years, please ask. They are in the public domain, as I promised my students!

    Great article Maryellen. Thanks!

    • Carla Cooper-Letiec

      Hi Greg, these are great comments. I find it all very interesting, as my IR team and I are currently launching our very first college-wide course evaluations. We would love to see some of the examples and results you mentioned above.


      • Greg

        Hello Carla, thanks kindly for your feedback. You can see some results at:

        PART ONE OF TWO….

        1. Click on the “student feedback from AAP2460 survey 1 | survey 2 | survey 3” link on the homepage – written comments only.

        2. For surveys in which I used Likert-scale questions plus written comments, click on the “Excellence” tab. I also ran some basic stats on these results; strong positive and negative trends were highlighted and commented upon. After the first survey, I would add one more question about changes or improvements made since the last survey. You will also see that specific course outcomes are the basis of several questions.

      • Greg

        PART TWO OF TWO…

        3. In more recent semesters, I simply polled my students, and frequently. (These results are not posted.) This would give me a quick sense of how an assignment went or about the pace of the class, or preparedness prior to a test. I would also summarize and share these results with the students with a comment or two.

        You will see that there are some excellent comments and suggestions from my students, especially as the semester progressed. The results weren’t always rosy, but I learned a lot about teaching and personal growth! I hope you and your team will find this information helpful. Feel free to contact me if you have any questions.


  • cognitioneducation

    Thanks for this post – it is important to keep all this in mind. Despite knowing to take things with a grain of salt, I usually do perseverate on the smattering of negatives anyway. That said, I completely agree with the perspective that we (faculty) interpret things differently than students do. I've said again and again to our administration that the evaluation process needs to be handled with the same kind of attention one would pay to research. When conducting research with human subjects we pay close attention to making sure that participants interpret questions in the same way, but for some reason that kind of care is lost with evals. When I mention it here, some folks think of it as "coaching," that is, "tampering" with the process. I couldn't disagree more strongly with that perspective, and I find it frustrating. The other issue that is difficult to deal with in the evaluation process is the halo effect that comes with highly engaging, funny teachers. When students like a teacher personally, they are very forgiving.

  • Anastasia

    Interesting and informative post. Having served on a faculty evaluation committee and looked at final teaching scores, I’ve seen that quick student evaluations can be high-stakes for professors. When professors are encouraged to study their own practice with colleagues as critical friends, deeper and more meaningful change and transformation can result.


  • Doug

    You might find this dissertation, with lots of current research, very helpful:

    Small group instructional feedback: A student perspective of its impact on the teaching and learning environment
    by Mauger, Douglas, Ed.D., GEORGE FOX UNIVERSITY, 2010, 137 pages; 3407167
    Small Group Instructional Feedback (SGIF) is an instructor-driven, structured mid-term activity. Its purpose is to solicit specific student feedback on instructor effectiveness and provide this feedback immediately back to the instructor. Does SGIF bring about changes to the teaching and learning environment? Do students’ perceptions of the course, the instructor, and their motivation toward learning change after participating in SGIF? Do instructors make changes to their teaching after receiving mid-term feedback? An end-of-term survey was taken of 352 college students (response rate of 95%) representing 20 courses from all departments. The survey (alpha .911) consisted of 33 items on a seven-point Likert scale and three open-ended items that collected student comments. Of the respondents, 81% indicated that the SGIF improved the teaching and learning environment. Strong positive correlations existed at the .001 level between students’ attitudes toward the course, the instructor, and their motivation, with 83% indicating that they became more aware of their role in improving the teaching and learning process. Ninety-six percent of the students indicated they observed that the instructor acted upon their feedback. These findings reflect the ability of structured mid-term feedback to act as a catalyst in bringing about changes to the teaching and learning environment.
