With its endless hours of mind-numbing timed drills and banal essays, standardized testing was, once upon a time, exclusively the bane of the downtrodden No. 2 pencil-toting student. “Suck it up, kid,” the adults would say, proudly bearing the hard-earned Iowa Basics scars of their own childhoods.
But in 2015, when the state of Illinois rolled out the Partnership for Assessment of Readiness for College and Careers (PARCC) test as the new state assessment and accountability measure for Illinois students in grades 3 through 8, standardized testing suddenly became a headache for everyone—from teachers to administrators to parents. And, all mathematical evidence to the contrary, “PARCC” became a four-letter word.
The test was overly complicated. It took too much time away from classroom instruction. The results were arbitrary or unreliable, and they took far too long to arrive.
To dive into the deep well of discontent that has been the PARCC experience over these past three-plus years is to plumb a seemingly bottomless pit of consternation and handwringing from almost every corner of the public education arena. But large-scale educational efforts—particularly those involving assessment and accountability—tend to foster this kind of heated debate and rhetoric; as long as the data is accurate and actionable, the common wisdom goes, the pain will certainly have been worth it.
Yet three, going on four, years later—and with what should be a celebratory, relevant, and rock-solid dataset in the books—the testing tussle has yet to subside. If anything, the calls for change have only intensified, and the state is indeed reconsidering how PARCC is administered and how its results are interpreted. Here, then, at something of a crossroads for the test (at press time, plans for the spring 2019 version of the test were very much up in the air), two local administrators weigh in on a few standardized questions related to PARCC and its future in their districts and schools.
How would you characterize the general attitude of teachers and parents surrounding PARCC in your district or school?
If they’re not exactly rolling out the balloons and brass bands, it seems that most of these important local constituencies have at least accepted the inevitability of PARCC, and have made a genuine effort to make the most of it.
“I think there’s been a learning curve for everyone—from the students who take the test to the staff members who manage it—but the sentiment here is really positive,” says Patrick Nolten, the assistant superintendent for assessment and accountability in Naperville Community Unit School District 203 (where PARCC scores are among the highest in the area; see sidebar). “I know there are some districts where they’re bemoaning the state of PARCC and the testing process, but this district is behind it, and I really think appreciates the opportunity to show the amazing work our teachers do every day and the learning our students demonstrate.”
But while most people have indeed adjusted to PARCC as a mandate, it often fails to generate much enthusiasm in the bigger picture as a key educational tool.
“Teachers and families understand that PARCC is required,” says Arin Carter, director of the John C. Dunham STEM Partnership School, which draws students in grades 3 through 8 from the districts of Batavia (101), East Aurora (131), Indian Prairie (204), and West Aurora (129). “But our teachers go out of their way to make sure that our students are learning based on the standards—not based on PARCC.”
How has PARCC impacted curriculum changes in your district/school over the past three years?
This is one of the big ideas behind PARCC: its ability to guide administrators toward data-supported changes to help students learn better or more efficiently. It’s an idea that has perhaps not seen as much traction as hoped for over the past three years, but in District 203, at least, Nolten sees definite progress.
“Although PARCC is only a one-time-a-year summary measure, we really try to use it to look at whose needs we’re meeting instructionally, and whose needs we may need to take additional steps or measures toward meeting as a district,” he says. “There’s a targeted focus on achievement gap reduction to really meet the needs of all students. So all of our schools incorporate into their school improvement planning a metric from the PARCC results to see if we’re seeing progress and growth in terms of meeting those students’ needs.”
Has PARCC provided a largely accurate snapshot of student performance?
Nolten says that kids in his district who meet or exceed standards as they transition from grade level to grade level taking PARCC assessments are generally those best poised for college and career readiness. Test scores, of course, are not the only measure of success, and Carter believes it’s important to keep that in mind when assessing the whole student.
“Our curriculum goes well beyond the skills that are assessed on PARCC,” she explains. “While we teach the standards that are required, the integrated way in which students are taught at our school allows them to make connections among the content areas that I’m not confident PARCC assesses. Further, there are many other skills that are extremely important today—such as interpersonal skills, collaboration, and design process understanding—that are not measured at all.”
What changes would you like to see in the design or administration of PARCC?
Changes are certainly coming, but for now, no one is exactly sure what form those changes will take. If the thoughts of many teachers and administrators figure into the next version of the now 12-hour test, however, one word will certainly define the new PARCC: shorter.
“My biggest concern is the amount of time that is spent on PARCC administration,” Carter says, echoing comments she’s heard time and again from teachers and parents alike (not to mention students, of course). “There’s just a lot of time that should be spent on instruction that is instead spent administering a test.”
“The criticism I think almost everybody shares has to do with the length of time,” Nolten agrees. “In terms of a wish list, a shorter test that takes less time instructionally would be great. We’d also like more diagnostic-like information so we can get more student-level analysis about skill development—less global or general, and more specific around exactly some of the skills that we need to focus on for growth at the student level.”
The question that will need to be answered by the state, then, is how to address all of these concerns while still delivering a standardized test that not only assesses student proficiency, but also allows for continuity of comparison with the data that has been compiled over the past three years.
That test is to come. But for now, pencils down.