
Improving First Nation Student Achievement through a District Writing Assessment: An Action Research

by Ryan Anderson

EdD Candidate, University of the Cumberlands

 

Abstract

This study employs action research principles to determine the extent to which Nipisihkopahk Education Authority, a First Nation school authority in Maskwacis, Alberta, Canada, has made positive impacts on teacher practice and student achievement for writing in English language arts. This study uses teacher email surveys and student assessment information as data sets to make these determinations. The data indicate positive impacts on teacher practice and student achievement, as well as a direct correlation between the two data sets. Recommendations are made for further investigation of large-scale writing assessments and the extent to which they impact teacher practice and student achievement, as well as for future research on First Nations education in Canada.

 

Introduction

Nipisihkopahk Education Authority (NEA) is the First Nation school authority for Samson Cree Nation in Maskwacis, Alberta. I started working for NEA as a literacy coach in the 2013-2014 school year. Part of my initial assignment was to review student assessment data and determine what literacy practices were being used by teachers across the district. In reviewing the district’s historical data for the Alberta Provincial Achievement Tests (PATs) in grades three, six, and nine as well as conducting several classroom observations, it became clear that the district needed to focus on improving student writing.

Once that determination was made, and in collaboration with appropriate members of district and school teams, we focused our efforts on developing and implementing a large-scale writing assessment that could span multiple grades. The process involved reviewing recent publications on literacy instruction and assessment as well as establishing working groups of teachers to develop something that would help us improve as a district. The assessment would incorporate what we had learned about effective writing instruction and large-scale writing assessments, and it would fulfill what Breakspear (2016) suggests are the two most important considerations for educational leaders – maximizing impact on teacher practice and student learning.

 

Description of the District Writing Assessment

Until the development and implementation of the District Writing Assessment (DWA), the district relied solely on the PATs for student achievement data to determine where students were in their learning. However, there were three primary limitations to relying on the PATs:

  1. they only provided student achievement information for students in grades three, six and nine (and later only grades six and nine);
  2. teachers in the district felt the information provided by the PATs did not have a direct impact on their professional practice; and
  3. both students and teachers felt it was difficult for students to enter into the writing tasks presented in Part A of the PATs. 

To alleviate these limitations, it was determined that the DWA would be rooted in Cree history and culture by having students learn how to be effective storytellers. The assessment would have students write narrative stories from a picture prompt that visually communicated aspects of Cree life in Maskwacis. Additionally, the DWA would span grades one through nine and would involve teachers in its development and implementation.

 

Foundational Aspects of the DWA

The emergent best practices described below, drawn from our literature review on literacy instruction and large-scale assessment, became the foundational aspects of the DWA.

 

Teacher Working Groups

Professional learning communities and their impact on teacher practice and student achievement are well documented (DuFour, Eaker & DuFour, 2005; Schmoker, 2005), which is why we chose to use teacher working groups for the development of the DWA. The working groups were organized into grade-level groupings so that teachers of the same grades from each school across the district worked together. As the literacy coach, I conducted working group sessions in which teachers developed, provided feedback on, and finally approved the contents of the following:

  • test administration procedures outlined in the General Information Bulletin (GIB);
  • the format and design of the student test booklets;
  • appropriate graphic organizers used for each grade level grouping (grades 1-3, 4-6, and 7-9); and, finally
  • the scoring guides that were to be used to assess student responses.

By having teachers participate in this process, the district was able to administer the DWA successfully. Not only had teachers put their personal stamps of approval on the work that had been done, but they also had the necessary understanding to administer the assessment successfully to their students.

 

 Figure 1 – Example of Picture Prompt for the 2015 Administration of the DWA

 

General Information Bulletin

Developing a GIB was a critical piece of the DWA process. In our literature review, we learned that our GIB would need to be “presented with sufficient clarity and emphasis so that it is possible for others to replicate adequately the administration conditions under which the data on reliability and validity, and, where appropriate, norms were obtained” (AERA, APA, NCME, 1999, p. 47, as cited in Thompson, Johnstone & Thurlow, 2002, p. 12). We determined that, if the assessment were to be considered valid and the results reliable, each teacher in the district would have to administer the assessment in the same way to their students. As such, we all worked together to come up with the best way to administer an assessment of this nature.

The most important outcome of this process was the standardization of teacher-led pre-writing activities, so there was consistency in the amount of support students received before completing the writing task. From teacher feedback, we made minor adjustments to improve the overall quality of the assessment and the efficiency with which it was administered. The primary adjustments came in the form of the pre-writing activities conducted by teachers and students, preparing students to produce their best writing on the DWA. The assessment structure allowed for two key pre-writing activities – a teacher-led discussion about the picture prompt to generate ideas for a story and a teacher-led discussion on taking those ideas and planning how they could come together to form a plotline structure.

 

Administration Window

From the beginning, teachers were adamant about providing students an adequate amount of time to complete the writing task. To ensure this on a large scale, we looked to the province and the work it was doing on the pilot of the Student Learning Assessment (SLA), the assessment model taking the place of the PAT in grade three. One primary feature of the SLA that distinguished it from the PATs, and that improved the way students could complete the assessment, was the administration window. Referencing this, Thompson, Johnstone, and Thurlow (2002) state, “tests can be designed for comparability with other accommodations as well. For example, two common accommodations are the provision of extended time and the provision of extra breaks in the testing session. These accommodations are more compatible with tests that are not timed and that can be easily broken into brief sessions without compromising validity and security” (p. 12).

We determined the most student-friendly way to ensure all students would get an appropriate opportunity to complete the assessment was to establish a two-week administration window. This window would give teachers the professional discretion to dissect the components of the assessment and administer them in a way that best suited their students’ needs. For the most part, teachers decided to break the assessment into three sections – a teacher-led discussion about the picture prompt, a teacher-led discussion on planning, and then using that preparation to have students complete the writing task. Teacher feedback on this was positive, so the administration window became a defining feature of the assessment.

 

Standards and Exemplars

Leithwood (2008) indicates that student performance standards are being set in most districts and states across the US. Additionally, districts with high levels of student achievement had their student performance standards set by district-level administration. Taking this into consideration, we decided to set our own student achievement standards for the DWA. To do this, however, we needed to consider the standards that had been historically used by the district – the provincial standards for the PATs.

The primary consideration in setting our own standards was to ensure they would be aligned with the provincial standards. The intent was to ensure we could say with confidence that students writing at grade level on our assessment would be comparable with the other standards to which our students are held. To make this a reality, a working group was established to take the standards documents from Alberta Education, along with the curriculum documents for English language arts, and create our own standards for each grade included in the assessment.

 

Figure 2 – Example of Standards Set for the DWA

 

Figure 3 – Example of Exemplars Used to Assess the DWA

 

Scoring Guides and Teacher Marking Sessions

Once each of the previous steps of the development process was complete, we needed an effective marking system to apply the standards, through exemplars, to each of the student samples completed as part of the DWA. In the same way we modeled our standards after Alberta’s PAT program, we modeled our assessment structure after it too. To do this, we looked to Graham Foster’s work on writing assessment in the province of Alberta. A retired teacher from Calgary, Foster has written and published on the tenets of good writing assessment, and his work reflects the province’s assessment methods. Foster (2010) outlines that rubrics used in assessing writing should be reflective of student achievement in the areas of content, organization, sentence structure, vocabulary, and conventions.

 

Figure 4 – Example of Scoring Guide – Grade 1-3 Rubric on Content

Another important consideration was to determine the best way to mark the student samples and to ensure the marks were truly reflective of the standards set. This process included marking sessions where all the teachers in grades one through nine gathered to collectively mark the student samples for each grade. Once the sessions were complete, I, as the literacy coach, marked every student booklet for all grades as a form of sober second thought and to ensure students received accurate scores reflective of the standards set. This process involved flagging any discrepant student booklets – booklets where the score from the teacher marker and my score differed by at least two levels within any one marking category (content, organization, sentence structure, vocabulary, and conventions) or booklets with an overall score differential of more than three raw score points.
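
To make that flagging rule concrete, the following is a minimal sketch in Python of how such a check might be applied. The category names mirror the scoring guide, but the function, data structures, and scores are hypothetical illustrations rather than the district’s actual marking system.

  # Hypothetical sketch of the discrepancy-flagging rule described above.
  # Category names follow the DWA scoring guide; scores and structures are illustrative only.

  CATEGORIES = ["content", "organization", "sentence_structure", "vocabulary", "conventions"]

  def is_discrepant(teacher_scores, coach_scores):
      """Flag a booklet when any category differs by two or more levels,
      or when the overall raw scores differ by more than three points."""
      for category in CATEGORIES:
          if abs(teacher_scores[category] - coach_scores[category]) >= 2:
              return True
      teacher_total = sum(teacher_scores[c] for c in CATEGORIES)
      coach_total = sum(coach_scores[c] for c in CATEGORIES)
      return abs(teacher_total - coach_total) > 3

  # Example with made-up scores: content differs by two levels, so the booklet is flagged.
  teacher = {"content": 4, "organization": 3, "sentence_structure": 3, "vocabulary": 3, "conventions": 3}
  coach = {"content": 2, "organization": 3, "sentence_structure": 3, "vocabulary": 3, "conventions": 3}
  print(is_discrepant(teacher, coach))  # True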

 

Pilot Administration

During his presentation at the Winter Leadership Learning Network for the College of Alberta School Superintendents in January 2016, Dr. Simon Breakspear presented the concept of blueprinting and piloting educational initiatives. This included the use of action research methodologies to make evidence-based decisions in school systems. Breakspear’s recommendations reflect the process we used to blueprint and pilot the DWA.

The work associated with developing the assessment model and its accompanying materials comprised the blueprinting phase of the project; once that foundation was laid, we piloted the assessment in May 2014. Because the pilot administration went so well, and because we determined that it gave us useful student achievement data, we decided to move forward with the DWA as an annual assessment tool in the district.

 

Research Question

Since the initial pilot of the DWA in May 2014, we have had three successful administrations of the assessment. We have also collected and collated entire cohorts of student samples for each administration. As such, we have reached a place where the district needs to determine the extent to which the DWA is having a positive impact in the areas we sought to improve – teacher practice and student achievement for writing in English language arts.

Therefore, the purpose of this study was to answer the question:

To what extent has the District Writing Assessment had a positive impact on teacher practice and student achievement?

 

Procedure

Johnson (2012) provides guidance for performing teacher-based action research. He states that action research should include more than one data set, but should be limited to two to four so that the study remains simple and straightforward. I decided to focus on two data measures to answer my research question: (1) teacher email surveys and (2) student assessment data. Both approaches fall within Johnson’s list of appropriate data measures for action research studies.

 

Teacher Email Surveys

The teacher email surveys were designed to collect data on the extent to which the DWA impacted teacher practice. To make this determination, I created a set of questions shared with teachers through a Google Form that was sent to them via email. These questions correlated with the ways the DWA was designed to benefit teachers and potentially have a positive impact on their practice:

  1. Have you used DWA Scoring Guides (Rubrics) when marking student writing in your class?
  2. Have you used the DWA Exemplar Booklets when teaching or marking student writing in your class?
  3. Have you used the DWA images or the process of using an image to write narrative stories when teaching writing in your class?
  4. Do you feel that the district marking sessions have improved your understanding of what grade level writing looks like for the grade(s) you teach?
  5. Have the district marking sessions provided you the opportunity to engage in dialogue with your colleagues about teaching and marking writing in your class?

 

Student Assessment Data

With each administration of the DWA, we collected student samples and their corresponding scores according to both the scoring guides and the standards set when the DWA was developed. To best determine the extent to which the DWA has had a positive impact on student achievement, we decided to look at two important pieces of data from each administration: (1) the total number of students achieving the acceptable standard and (2) the total number of students achieving the standard of excellence.
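
As an illustration only, the sketch below shows one way these two measures might be tallied for each administration. The records, field names, and the assumption that students at the standard of excellence also meet the acceptable standard are hypothetical and do not represent the district’s actual data system.

  # Hypothetical sketch of summarizing DWA results by administration.
  # Each record pairs an administration with one student's achievement level.
  from collections import defaultdict

  records = [
      ("May 2014", "acceptable"),
      ("May 2014", "below"),
      ("May 2015", "excellence"),
      ("May 2015", "acceptable"),
      ("May 2016", "excellence"),
  ]

  summary = defaultdict(lambda: {"participants": 0, "acceptable": 0, "excellence": 0})

  for administration, level in records:
      counts = summary[administration]
      counts["participants"] += 1
      if level in ("acceptable", "excellence"):
          counts["acceptable"] += 1   # students achieving the acceptable standard
      if level == "excellence":
          counts["excellence"] += 1   # students achieving the standard of excellence

  for administration, counts in sorted(summary.items()):
      pct = 100 * counts["acceptable"] / counts["participants"]
      print(administration, counts, f"{pct:.0f}% at or above the acceptable standard")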

 

Findings

Information for both data sets was collected in December 2016. The teacher surveys were issued to all teachers who had taught grades one through nine and had been employed by the district since the time the DWA was developed and implemented. The total number of teachers who qualified to participate based on those criteria was fourteen. Of the fourteen teachers invited to participate, ten responded to the survey. Figure 5 is a graph that represents the data for each of the survey questions.

 

Figure 5 – Teacher Email Surveys

 

With regard to student participation and achievement in the DWA, the graph in Figure 6 represents the information collected for this study. The graph displays total student participants for each of the three administrations as well as the total number of students that achieved both the acceptable standard and the standard of excellence.

 

Figure 6 – Student Assessment Data

 

Conclusions

The results for both data sets provide interesting insights into the impact the DWA is having on both teacher practice and student achievement. The teacher email surveys demonstrate that teachers who participated in the study have adjusted how they teach writing in their English language arts classes. The data show that teachers are now using scoring guides and exemplars, as well as teaching writing and having students engage in writing activities using picture prompts. Teachers also indicated that the opportunity to come together to mark student samples with their colleagues improved their understanding of what grade level writing looked like for the grades they taught. They also felt that marking sessions created opportunities for them to engage in meaningful dialogue about teaching and assessing writing in their respective classes.

The student assessment data also indicate positive impacts on overall student achievement. The percentage of students achieving the acceptable standard for their respective grades has not only remained above half of all students participating in the assessment, but has also shown an improvement trend since the DWA’s inception. The May 2016 administration indicated a slight drop in the overall percentage of students writing at grade level; however, this drop is offset by a dramatic increase in the percentage of students achieving the standard of excellence. It is likely that this increase was due to a small number of students who improved their writing from one year to the next, earning a score at the standard of excellence instead of remaining at the top of the acceptable range.

Regardless of the reason for the increase, the most important finding is the correlation between the two data sets. As noted previously, the teacher surveys demonstrate an overwhelming impact on teacher practice for those who qualified for and participated in the survey. Student achievement trends directly correlate with these improvements to teacher practice. It is therefore possible to surmise that gains in overall student achievement have occurred, to some degree, because of improved teacher practice in teaching and assessing writing in English language arts.

 

Recommendations Moving Forward

It is recommended that further investigations be done on the DWA and the extent to which it has impacted teacher practice and student learning. A comparative analysis of the DWA and the PAT that compares student achievement before and after the inception of the DWA would provide further insight into the impact the DWA has had on student achievement. To get a more accurate picture of teacher practice, a more organic study of how teachers teach writing might further confirm the impact the DWA has had. Additionally, further investigation might be done to determine the extent to which the development of the writing assessment is in accord with universal design principles (Thompson, Johnstone & Thurlow, 2002); the extent to which it is a consequentially valid large-scale writing assessment (Slomp & Sugimoto, 2014); and the extent to which it addresses the lack of overall research being conducted in Aboriginal or Indigenous education (Ottman, 2010). Each of these areas merits further investigation of the DWA, the impact it has had on teacher practice, and the extent to which it has improved student achievement for writing in English language arts.

 

For more information on Nipisihkopahk Education Authority’s District Writing Assessment please feel free to access http://scnealiteracyhub.weebly.com/district-writing-assessment.html

 

About the Author

Ryan Anderson is a doctoral candidate at the University of the Cumberlands in Williamsburg, Kentucky. He is completing a Doctor of Education Degree in Educational Leadership with a specialization in District Level Administration. He is also the Director of Instructional Services for Nipisihkopahk Education Authority in Maskwacis, Alberta, and the Research Coordinator for the Maskwacis Education Schools Commission – a not-for-profit organization that is overseeing the amalgamation of the four First Nation school authorities in the traditional Maskwacis territory of Alberta, Canada.