As most school systems have now learned, assessing learning in a pandemic poses all sorts of challenges, including, but not limited to, how (and where) students should be assessed and whether the data collected is even accurate. As assessment and curriculum leaders for two different districts, we don't have all the answers. What we do know, however, is that assessment can still provide a lot of vital information about student learning, and despite the challenges, it is still worth doing.
Our two districts, Imagine Schools and Carbondale Elementary School District 95 (CES), don't seem to have much in common at first glance. Imagine is a public charter school system with more than 30,000 K-12 students in seven states and the District of Columbia, while CES is a more traditional district with four elementary schools in southern Illinois.
What we share is a commitment to data-driven teaching and, despite the challenges of the past year, a desire to assess our students accurately enough to help them regain the learning they may have lost during the COVID-19 shutdown. Here's how we made key assessment decisions and how we are handling the data we collect.
Is remote assessment accurate?
During last spring's COVID-19 shutdown, one of Imagine's many concerns was the effectiveness of remote assessment. After all, what good is an assessment if it doesn't tell us what students actually know? To get a definitive answer to that question, Bill Younkin of the Biscayne Research Group conducted research for us comparing our in-person and remote assessments from the 2019-2020 school year. Sixteen of our schools and approximately 5,000 students agreed to participate. Because we use Renaissance Star Assessments for reading and math at all of our schools, whether students test on campus or remotely, the assessment tool was consistent throughout the study.
His results showed that, with a few minor caveats, remote assessment was just as effective as in-person assessment. Exceptionally low scores were slightly less common in remote assessment, and exceptionally high scores were slightly more common. Both effects appeared more frequently in the lower grades and almost disappeared in the higher grades.
According to Younkin, the scarcity of extremely low scores suggests that students take the assessments seriously at home and that parents are close by. On the other hand, the slightly higher prevalence of extremely high scores is likely due to parents helping their children with the assessment. In response, we reminded families that the purpose of assessment is to give teachers and school leaders the data they need to make important decisions about student learning. Helping students answer test questions doesn't help them at all.
We were encouraged, when comparing the data from the past four years to this fall's first assessment results, collected mostly remotely, to find that they follow a very similar pattern. This echoes Younkin's conclusion that, for the vast majority of Imagine students, remote tests give results similar to traditional tests, and provides even more evidence for it.
Adjusting assessment strategies by age group
At CES, the key takeaway from this year's assessments was less about student performance than about the importance of good assessment practices. We had our youngest learners come to school to be assessed individually. For older students, whom we assessed remotely, we spoke to our teachers about the importance of testing in small groups. They, in turn, spoke to their students about the importance of the assessments.
Overall, the tests went very well, but we did have some issues, especially with remote second-grade tests. Some teachers reported seeing parents leaning over their students' shoulders, or a hand occasionally reaching across the screen to help a child. Our parents are committed to positive change in our district, so in the future we will work more closely with them to ensure the validity of the assessments when our students are remote.
Making the most of the assessment results
Another thing that will be slightly different in our two districts this year is the way we put the assessment results to use. With the disruption of in-person learning over the past two school years, we have seen even more variation in student abilities, skills, and knowledge than in most years.
To give teachers the data they need to help students progress as quickly as possible, our assessment provider, Renaissance, has released a number of free tools built around "focus skills." Every teacher knows that some skills are more important than others. Learning to compare and contrast the written fairy tale of Cinderella and the Disney movie is worthwhile, but learning what sound each letter makes is a fundamental building block students must have before they can learn to read. This exceptional year, we are using assessment to identify gaps in these essential skills so we can get students back on track as soon as possible.
Ultimately, our basic answer to the school disruptions is to keep doing what we do every day: find out where students are and help them learn what comes next.