Literacy Now

The Influence of Mandated Tests on Literacy Instruction

By Gay Ivey
 | May 12, 2016

In their recent Reading Research Quarterly article, “Practices and Commitments of Test-Centric Literacy Instruction: Lessons From a Testing Transition,” Dennis Davis and Angeli Willson set out to illuminate the relationship between literacy instruction and a mandated achievement test in Texas. At the time, schools were undergoing a transition to a new test, the State of Texas Assessments of Academic Readiness (STAAR), a context Davis and Willson believed would magnify the complexities of the teaching–testing dynamic. They interviewed 12 teachers twice each, over a period spanning the first and second years of test implementation, and conducted a focus group meeting with teachers drawn from the larger sample. They also examined documents publicly available on the Texas Education Agency’s website intended to explain the transition to the STAAR and to provide teachers and parents with information about the new tests and their links to the state standards, which had not changed from the previous test.

Here is a summary of their findings:

First, instructional practices favoring the items, language, and limitations of the tests were pervasive. “Strategies” for test taking (e.g., prescribed annotations, acronyms for analyzing poems) were frequently substituted for cognitive and metacognitive reading strategies and were legitimized as comprehension processes despite a lack of research supporting their use. Writing instruction was tailored to the tests’ short length requirement and to particular genres. Study participants questioned the time-consuming benchmark testing in terms of both item quality and adherence to good practices in measurement design. They worried that a percentage-passing metric used to evaluate and compare classrooms and schools failed to account for individual student growth over time or differences in prior achievement across groups of students.

Second, although existing standards did not change when the STAAR was introduced, new uncertainties arose about how and what to teach. Specifically, there was confusion over what it meant to increase rigor: whether that meant merely teaching harder, assigning more difficult tasks, or something else entirely. It was commonly understood that the new tests would require students to understand passages holistically rather than simply read for retrieval, and that students would be expected to read a wider range of texts. However, teachers felt they needed sample test items to guide decisions about teaching particular standards.

Third, Davis and Willson theorized about why these test-centric practices were perpetuated even among teachers who found them to be problematic. Their analysis led them to the following understandings:

  • Teachers were compelled, for students’ sakes, to minimize the differences between what students experienced in class and what they would encounter on the tests.
  • Teachers broke down reading and writing processes into small pieces so they could publicize them (e.g., written objectives on the board) for administrators’ approval, particularly the skills most likely to be included in STAAR items.
  • Inappropriate inferences from benchmark test data had become normalized and accepted, for instance, using a single test item to make inferences about a student’s competence with a standard, or evaluating a teacher’s quality with no reference to students’ starting points.

The authors describe a phenomenon that is far more consequential than “teaching to the test.” They sum up their perspective on the test-centric instruction teachers reported in this way: “Instead of instructional practices bending to align with a test, we see the test being allowed to enlarge and encircle all aspects of instructional practice” (p. 374).

How can teachers, feeling professionally and morally compromised by such a trend, regain a sense of agency in their work? Because these practices have become normalized and entrenched in schools, Davis and Willson say the first step is to notice and name the indicators of test-centric practice: (1) use of test-like passages for instruction, (2) time spent teaching students how to document evidence of prescribed test-taking strategies, (3) use of test-like questions as the basis of classroom discussion, and (4) discussions of data from test-formatted practice tests. Awareness of these and similar practices, they suggest, opens the way to principled resistance (Achinstein & Ogawa, 2006).

Gay Ivey, PhD, is the Tashia F. Morgridge Chair in Reading at the University of Wisconsin-Madison. She is a member of the ILA Literacy Research Panel and currently serves as vice president of the Literacy Research Association.

The ILA Literacy Research Panel uses this blog to connect ILA members around the world with research relevant to policy and practice. Reader response is welcomed via e-mail.

 

References

Achinstein, B., & Ogawa, R.T. (2006). (In)fidelity: What new teacher resistance reveals about professional principles and prescriptive educational policies. Harvard Educational Review, 76(1), 30–63.

Davis, D.S., & Willson, A. (2015). Practices and commitments of test-centric literacy instruction: Lessons from a testing transition. Reading Research Quarterly, 50(3), 357–379.

 