January 21, 2011

Assessing and Developing Metacognitive Skills

In: Learning Styles


Metacognition is easily defined: “[It] refers to the ability to reflect upon, understand and control one’s learning” (Schraw and Dennison, p. 460) or, even more simply, “thinking about one’s thinking.” Despite these straightforward definitions, metacognition is a complicated construct that has been the object of research for more than 30 years.

Research supports theories that separate metacognition into two major components: knowledge of cognition and regulation of cognition. Knowledge of cognition “describes an individual’s awareness of cognition at three different levels: declarative (knowing about things), procedural (knowing about how to do things), and conditional (knowing why and when to do things)” (Cooper and Sandi-Urena, p. 240). Regulation of cognition relates to how learners control their learning; relevant regulatory activities include planning, monitoring, and evaluating.

Metacognition has been studied in students from grade school through college, and this research has produced a number of interesting and important findings. Schraw and Dennison report that “recent research indicates that metacognitively aware learners are more strategic and perform better than unaware learners” (p. 460). When learners use regulatory metacognitive skills, they pay attention more effectively, use learning strategies to better effect, and are more aware of when they are not comprehending something they are trying to learn. Surprisingly, the research has also shown that metacognitive awareness is not a function of intellectual ability, and that metacognitive skills are not domain specific: they are remarkably consistent across different fields.

Two of the references below (Cooper and Sandi-Urena, and Schraw and Dennison) report on the development of instruments that can be used to assess a learner’s level of metacognitive awareness. The Schraw and Dennison instrument, the Metacognitive Awareness Inventory (MAI), contains 52 items, among them “I am good at organizing information,” “I summarize what I’ve learned after I’ve finished,” “I am a good judge of how well I understand something,” and “I change strategies when I fail to understand.”

The Cooper and Sandi-Urena instrument, the Metacognitive Activities Inventory (MCAI), was “designed specifically to assess students’ metacognitive skillfulness during chemistry problem solving” (p. 240). It contains 27 items, including “Once a result is obtained, I check to see that it agrees with what I expected,” “I spend little time on problems I am not sure I can solve,” “I try to double-check everything: my understanding of the problem, calculations, units, etc.,” and “I attempt to break down the problem to find the starting point.”

As these examples illustrate, even though the MCAI was developed for use in chemistry, its items are relevant to many kinds of problem solving. For both instruments, students respond on a Likert scale, rating how characteristic each statement is of them. Each instrument was carefully developed, and the articles referenced include empirical results verifying both reliability and validity.

The research makes clear that metacognitive skills can be developed, and an instructor could use either of these instruments to help accomplish that goal. Having students complete such an instrument helps instructors by providing data on how metacognitively aware a given group of students is and by identifying students whose metacognitive skills are not well developed.

Administering an instrument like this can also be a learning experience for the student who completes it. It forces reflection (what do I do when I confront a problem?) and it describes actions a student might not know about or use regularly. Neither instrument is time consuming to complete, and both were developed for use by faculty in classrooms. Having students complete one of them after an exam on which they did not do as well as they (or their teacher) wanted is an effective way to provide feedback, with the potential to improve subsequent performance.

References:
Cooper, M. M., and Sandi-Urena, S. (2009). Design and validation of an instrument to assess metacognitive skillfulness in chemistry problem solving. Journal of Chemical Education, 86(2), 240-245.

Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science, 26, 113-125.

Schraw, G., and Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460-475.

Excerpted from Assessing and Developing Metacognitive Skills, The Teaching Professor, 23.10 (2009): 3, 6.



Comments

jones@mathshelp | April 21, 2011

The problem in the UK is one of accountability. If lecturers fail to deliver, there is very little the student can do. Standards have suffered further with the import of overseas staff who may cost less but clearly have language problems; effective communication is not one of their attributes. As a consequence, students suffer, often receiving a class of degree they do not merit. Accountability for poor teaching practice is long overdue!

analyn | February 21, 2013

Sir Schraw, I would like to ask permission to use 30 items from the Metacognitive Awareness Inventory instead of all 52 items. Is that OK, sir?

bhuvi | March 7, 2013

Good morning, sir.
I am a research scholar in India, and I would like to see the MAI that you prepared. Please send me a copy; my email id is bhuvi.vish@gmail.com. I will be highly obliged to you.



