By Kendal Toll, Michelle A. Drefs, and Jennifer Lock
University of Calgary
Kendal Toll, Werklund School of Education, University of Calgary; Michelle A. Drefs, Werklund School of Education, University of Calgary; Jennifer Lock, Werklund School of Education, University of Calgary.
What is student engagement? How do we identify cognitive engagement? This article provides a brief overview of what student engagement is and the methods by which it is measured, with an emphasis on measures of cognitive engagement. Findings from a case study highlight how educators can use self-report measures to better identify students’ levels of cognitive engagement in the context of using speech recognition technology for writing tasks. From the research, three factors were identified to guide the selection of an instrument for monitoring cognitive engagement: (1) how the instrument measures cognitive engagement; (2) the appropriateness of a general versus subject-specific measure; and (3) the length of the measure.
Student engagement has been described as “one of the hottest research topics in the field of educational psychology” and the “holy grail of learning” (Sinatra, Heddy, & Lombardi, 2015, p. 1). The prominence of this topic is highlighted by the fact that entire journal issues (e.g., issue 50 of Educational Psychologist) and books (e.g., the Handbook of Research on Student Engagement) have been dedicated to it. Why should educators be interested in student engagement? First, students who are engaged are more likely to demonstrate positive current and future outcomes in such areas as academic achievement, school graduation, and transition to post-secondary learning (Bulger, Mayer, Almeroth, & Blau, 2008; Fredricks, Blumenfeld, & Paris, 2004; Greene & Miller, 1996; Wilms, Friesen, & Milton, 2009). This relationship between engagement and academic outcomes appears to matter most for students on the extremes of the engagement continuum, those being either highly engaged or highly disengaged (Lovelace, Reschly, Appleton, & Lutz, 2014). Second, teachers can influence students’ level of cognitive engagement (Christenson, Reschly, & Wylie, 2012). Conversely, research suggests that, if not attended to, levels of student engagement decline notably beginning in the junior high years (Appleton, 2012; Eccles et al., 1993; Wilms et al., 2009).
Recent conceptualizations emphasize the multicomponent nature of student engagement (e.g., Christenson et al., 2012). Two commonly cited aspects are behavioural engagement (e.g., work behaviours and persistence) and emotional engagement (e.g., interest and feelings towards learning; Zyngier, 2008, p. 1769). A third and less researched aspect (Appleton, Christenson, Kim, & Reschly, 2006) is cognitive (or psychological) engagement: the investment students make in their learning, their strategic thinking, and the relevance they see between their schoolwork and everyday life (Appleton, 2012; Wilms et al., 2009). It is associated with such internal processes as deep processing, cognitive strategy use, self-regulation, motivation, and effort (Zyngier, 2008). Cognitive engagement has been found to meaningfully influence positive learning outcomes in students (Fredricks et al., 2004; Greene, 2015; Greene, Miller, Crowson, Duke, & Akey, 2004; Wigfield et al., 2008).
Although all three aspects of engagement are important, cognitive engagement presents certain measurement issues for teachers because it is largely an internal process and is not readily observable within the classroom. In contrast, for example, behavioural engagement focuses on such observable and external actions as voluntary classroom participation and time on-task. While many educators feel that they can sense when cognitive engagement is present (Darr, 2012), student self-report surveys may be essential to the assessment of cognitive engagement given its internal characteristics. With self-report survey measures, students reflect on and respond to a series of questions designed to target key aspects of engagement – such as “Even if I think a task is boring I will try to do my best.”
The present study was conducted as part of a broader study examining the necessary conditions to support technology utilization (e.g., speech recognition technology) within the classroom (Toll, 2014). A key component of this work was the assessment of student cognitive engagement during writing tasks using both student self-report measures and student and teacher interviews. Findings from this study have important implications for best approaches to the assessment of cognitive engagement.
The study was conducted in a specialized school (Grades 1-12) for students with learning disabilities. Eight Grade 7 students, each with a minimum of one year of experience using speech recognition technology, were selected for inclusion in this study. As our focus was on cognitive engagement during writing tasks, each student completed the Survey of Motivation to Engage in Writing (SMEW; Hawthorne, 2008) and answered a series of interview questions on the benefits of speech recognition technology for their levels of cognitive engagement. The Grade 7 teachers (n = 2) and teaching assistants (n = 2) were also interviewed to further examine their perceptions of cognitive engagement resulting from the use of speech recognition technology.
By applying established cut scores to the SMEW total scores, three students were classified as engaged (Eric, Steph, and Scott), three as neutral (Vanessa, Steve, and Nadina), and two as non-engaged (Finn and Glenn; see Figure 1). However, when students were directly asked whether they perceived themselves to be cognitively engaged, all but two of the participants (Steve and Glenn) self-identified as being cognitively engaged. Further defining and identifying cognitive engagement proved to be a difficult task. For example, only two students were able to provide specific examples of how they were cognitively engaged in their learning (i.e., “develop ideas more” [Vanessa] and “more aware of your writing” [Finn]).
Figure 1. Classification of students into engagement groupings based on total Survey of Motivation to Engage in Writing scores. Lines represent the cut-off scores for engaged (≥130), neutral (100-129), and non-engaged (≤99).
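The cut-score classification described above is a simple thresholding of SMEW totals. As a minimal sketch (the function name and the example scores are illustrative, not part of the published instrument):

```python
def classify_smew(total):
    """Classify a SMEW total score into an engagement grouping
    using the cut scores reported in Figure 1.
    (Illustrative helper; not from the instrument itself.)"""
    if total >= 130:
        return "engaged"
    elif total >= 100:
        return "neutral"
    else:
        return "non-engaged"

# Hypothetical totals, chosen only to show each grouping:
classify_smew(135)  # "engaged"
classify_smew(115)  # "neutral"
classify_smew(92)   # "non-engaged"
```

Note that the boundary scores (130 and 100) fall into the higher grouping under this reading of the cut-offs; any instrument's manual should be consulted for how it treats exact cut scores.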
None of the Grade 7 teaching staff identified the Grade 7 students as cognitively engaged. Rather, they viewed cognitive engagement as not developmentally achieved until around Grades 8 and 9: “I think [speech recognition technology] promotes it but our grade sevens are not there yet.” Similar to the students, the teaching staff were not able to provide specific examples of cognitive engagement.
Despite the difficulties in directly describing or identifying “cognitive engagement,” when asked more general questions regarding the necessary conditions to support technology utilization, both the students’ and teachers’ responses provided evidence that students were demonstrating cognitive engagement in the Grade 7 classroom. Both groups reported speech recognition technology as supporting increases in task initiation and completion, collaboration, confidence, value of learning (goal setting), control and relevance of school work, relevance of school work to future aspirations, and strategizing—all key features of cognitive engagement (examples provided in Figure 2).
Task initiation & completion
Student: “If I didn’t have Dragon I’d be at the first question and be like I’m bored… bye.” (Steve)
Teacher: “I mean there is still the odd student who getting started on things is part of their issues whether they have technology or not but I would say that the majority of them, it allows them that freedom to get started and go from start to finish on their work for sure.”

Value of learning (goal setting)
Student: “Now those writing assignments are being finished and I can focus on math and other more important things.” (Eric)
Teacher: “And by grade 8 they are starting to realize that the technology allows them to produce much higher quality work that they are capable of.”

Control & relevance of school work
Student: “It is easier for me to invest ideas onto paper so yeah I guess I spend less time thinking about how I am going to make this thing and just actually make it a story.” (Finn)
Teacher: “The students having a positive attitude and that buy in and wanting to do it because they see the success that they have.”
Figure 2. Examples of students’ and teachers’ responses to general questions regarding conditions to support technology utilization within the classroom. Responses were categorized to highlight alignment with aspects of cognitive engagement.
The above case study serves to underscore that what cognitive engagement is and what it looks like within the learning environment may not be readily apparent to either teachers or students. Part of the issue may simply be that it is not yet a commonly used term, with teachers and students having received insufficient instruction in this area. However, even with increased teacher training and attention to this area, it is likely that a key component of assessing cognitive engagement will be student self-report measures. The results from the study support the self-report instrument (SMEW) as a more sensitive measure of student engagement than either student or teacher interviews (e.g., students at all three levels of engagement were identified using the SMEW). Educators interested in assessing cognitive engagement in support of learning will want to adopt a multi-method approach, rather than relying solely on observational information, given the less observable (intrinsic) aspects of cognitive engagement.
This case study focused on the assessment of cognitive engagement at only one time point. The assessment of cognitive engagement could also be undertaken in a more formative and ongoing manner. When used along with academic achievement measures, student engagement scores obtained through self-report measures could provide an additional growth index that educators can use in monitoring student learning. For educators interested in assessing cognitive engagement, a helpful resource for reviewing available engagement instruments (aimed at the upper elementary through high school levels) can be found at: http://ies.ed.gov/ncee/edlabs/regions/southeast/pdf/REL_2011098.pdf. This listing is not exhaustive and other measures, such as the OurSCHOOL|TTFM (Tell Them From Me) available at: http://www.thelearningbar.com/solutionsp/school-improvement-p/, have been used by schools to assess and monitor cognitive engagement.
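One way to operationalize the growth index mentioned above is to track change in engagement scores across repeated administrations of a self-report measure. A minimal sketch, assuming hypothetical scores and an illustrative function name (neither is from the study or any instrument):

```python
def growth_index(scores):
    """Average change per administration between the first and last
    engagement scores in a series (illustrative; hypothetical data).

    scores -- list of total scores in chronological order
    """
    if len(scores) < 2:
        return 0.0  # no change can be computed from one administration
    return (scores[-1] - scores[0]) / (len(scores) - 1)

# e.g., three hypothetical administrations across a school year
fall, winter, spring = 98, 110, 121
growth_index([fall, winter, spring])  # average gain of 11.5 points per administration
```

A simple difference or average change like this is only one option; where many administrations are collected, a fitted slope over time would serve the same monitoring purpose.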
Three key factors need to be addressed when selecting an instrument for use in monitoring increases in cognitive engagement. First, there is large variation in how cognitive engagement is defined and measured across various self-report surveys (Fredricks & McColskey, 2012). Care and attention are needed to examine how cognitive engagement is measured by the selected instrument. For example, how many questions relate specifically to cognitive engagement? Are key aspects of cognitive engagement addressed (e.g., intrinsic motivation, strategic thinking, relevance of learning to future endeavors, goal setting, and self-regulation)? In some cases, an instrument designed to measure another learner characteristic may provide a suitable measure of cognitive engagement. For example, while the Survey of Motivation to Engage in Writing (Hawthorne, 2008) is identified in its name as a measure of “motivation,” many of the questions assess key aspects of cognitive engagement (see Figure 3 for sample questions).
Figure 3. Sample of questions from Hawthorne’s (2008) Survey of Motivation to Engage in Writing that measure cognitive engagement. Copyright 2008 by S. Hawthorne and reprinted with permission of author.
Second, consideration needs to be given to whether a general measure versus a subject-specific measure is most appropriate. General measures are best suited when teachers are looking to broadly assess cognitive engagement across curricular areas, while subject-specific measures provide for a more targeted assessment of engagement in relation to a particular academic domain (reading, writing, math).
Third, consideration should be given to the length of the measure in relation to how frequently cognitive engagement will be assessed. At present, there are no established guidelines for the frequency of assessing cognitive engagement to detect change. Educators may wish to adopt an assessment schedule that coincides with the academic monitoring of intensive interventions, which may vary from bi-weekly to bi-monthly (Kovaleski, 2007). From a practical perspective (scheduling, time allocation), shorter surveys are better suited for frequent monitoring, provided they have adequate reliability and validity. In contrast, longer surveys may be more practical to administer on an infrequent basis. For example, information collected through the bi-yearly administration (fall, spring) of the 30-plus items on the Student Engagement Instrument (Appleton et al., 2006; see also Betts, Appleton, Reschly, Christenson, & Huebner, 2010) has been used to inform a number of intervention efforts (Appleton, 2012).
Teachers who decide to collect cognitive engagement data are not only faced with deciding how and when best to measure it, but also with what to do with the data collected. We conclude by offering a few cautions and suggestions with respect to the latter. When students are found to be cognitively engaged, teachers have evidence that current practices are appropriate. Less clear is what to do in those instances where students are not cognitively engaged. This lack of cognitive engagement could signal a need for changes within the classroom environment. However, cognitive engagement is complex. Teachers must consider a number of broader individual and task-level factors that interact together to influence student engagement levels. Individual variance in engagement results from the influence of a broad number of student-level (gender, grade, self-efficacy beliefs, self-regulation), family (e.g., academic and motivational support for learning, goals and expectations, learning resources in the home, socioeconomic status, family structure), and peer (e.g., educational expectations, shared common school values, attendance, academic beliefs and efforts, peer aspirations for learning) factors (Appleton, 2012; Appleton et al., 2006; Chiu, Pong, Mori, & Chow, 2012; Greene et al., 2004; Te-Wang & Eccles, 2011; Wilms et al., 2009).
Consideration must also be given to the impact of task difficulty on cognitive engagement. Csikszentmihalyi (1997) has argued that flow, the deep absorption in an activity that is intrinsically interesting to a student, occurs when student abilities and the challenges of the tasks they are asked to perform are in balance. He describes four general relationships between skill and challenge: high skill and low challenge, high skill and high challenge, low skill and low challenge, and low skill and high challenge (Csikszentmihalyi, 1997). Related to cognitive engagement, the level of challenge of the task needs to be considered. For example, students who are confident but not challenged are less likely to be engaged (Wilms et al., 2009). By attending to both student and task factors, educators are likely to have better insight into why particular students may not experience improved engagement levels when technology is implemented to support learning and how to best address and intervene.
Appleton, J. J. (2012). Systems consultation: Developing the assessment to intervention link with the Student Engagement Instrument. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 725-741). NY: Springer.
Appleton, J. J., Christenson, S.L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44(5), 427-445.
Betts, J. E., Appleton, J. J., Reschly, A. L., Christenson, S. L., & Huebner, E. S. (2010). A study of the factorial invariance of the Student Engagement Instrument (SEI): Results from middle and high school students. School Psychology Quarterly, 25(2), 84-93.
Bulger, M. E., Mayer, R. E., Almeroth, K. C., & Blau, S. D. (2008). Measuring learning engagement in computer-equipped college classrooms. Journal of Educational Multimedia and Hypermedia, 17(2), 129-143.
Chiu, M., Pong, S., Mori, I., & Chow, B. (2012). Immigrant students' emotional and cognitive engagement at school: A multilevel analysis of students in 41 countries. Journal of Youth and Adolescence, 41(11), 1409-1425.
Christenson, S. L., Reschly, A. L., & Wylie, C. (Eds.). (2012). Handbook of research on student engagement. NY: Springer.
Csikszentmihalyi, M. (1997). Finding flow: The psychology of engagement with everyday life. New York: Basic Books.
Darr, C. W. (2012). Measuring student engagement: The development of a scale for formative use. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 707-723). NY: Springer.
Eccles, J., Midgley, C., Wigfield, A., Buchanan, C., Reuman, D., & Flanagan, C. (1993). Development during adolescence: The impact of stage-environment fit on young adolescents' experiences in schools and families. American Psychologist, 48, 90-101.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59-109.
Fredricks, J. A., & McColskey, W. (2012). The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 763-782). NY: Springer.
Greene, B. A. (2015). Measuring cognitive engagement with self-report scales: Reflections from over 20 years of research. Educational Psychologist, 50(1), 14-30.
Greene, B. A., & Miller, R. B. (1996). Influence on achievement: Goals, perceived ability, and cognitive engagement. Contemporary Educational Psychology, 21, 181-192.
Greene, B. A., Miller, R. A., Crowson, M., Duke, B. L., & Akey, K. L. (2004). Predicting high school students' cognitive engagement and achievement: Contributions of classroom perceptions and motivation. Contemporary Educational Psychology, 29, 462-482.
Hawthorne, S. (2008). Engaging reluctant writers: The nature of reluctance to write and the effect of a self-regulation strategy training programme on the engagement and writing performance of reluctant writers in secondary school English. (Doctoral dissertation). Retrieved from ResearchSpace@Auckland.
Kovaleski, J. F. (2007). Potential pitfalls of response to intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.). Handbook of response to intervention: The science and practice of assessment and intervention (pp. 80-92). NY: Springer.
Lovelace, M. D., Reschly, A. L., Appleton, J. J., & Lutz, M. E. (2014). Concurrent and predictive validity of the Student Engagement Instrument. Journal of Psychoeducational Assessment, 32(6), 509-520.
Sinatra, G. M., Heddy, B. C., & Lombardi, D. (2015). The challenges of defining and measuring student engagement in science. Educational Psychologist, 50(1), 1-13.
Te-Wang, M., & Eccles, J. S. (2011). Adolescent behavioral, emotional, and cognitive engagement trajectories in school and their differential relations to educational success. Journal of Research on Adolescence, 22(1), 31-39.
Toll, K. A. (2014). The role of speech recognition technology in fostering cognitive engagement in students (Unpublished master’s thesis). University of Calgary, Calgary, AB.
Wigfield, A., Guthrie, J. T., Perencevich, K. C., Taboada, A., Klauda, S. L., McRae, A., et al. (2008). Role of reading engagement in mediating effects of reading comprehension instruction on reading outcomes. Psychology in the Schools, 45, 432– 445.
Wilms, J. D., Friesen, S., & Milton, P. (2009). What did you do in school today? Transforming classrooms through social, academic and intellectual engagement. (First National Report). Toronto: Canadian Education Association.
Zyngier, D. (2008). (Re)conceptualizing student engagement: Doing education not doing time. Teaching and Teacher Education, 24, 1765-1776.