NCLB Outrages
Instant Read on Reading, in Palms of Their Hands
Ohanian Comment: The military metaphor in the blurb for the article ("target instruction") is just the beginning of the offenses here.
Teachers in a rural New Mexico district use hand-held computers to assess students’ reading progress and target instruction accordingly.
By May 6, this was the most-read article of the issue at Education Week.
This method of delivering reading skills is called a "paradigm shift" for teachers. Indeed. And what kind of shift is it for young readers? Consider the destructive message about reading foisted on the third grader featured at the beginning of the article.
Scholastic Administrator also pumped the use of Palm hand-helds to deliver DIBELS information in Moriarty.
Here, Moffitt, "the coordinator of federal programs for the school district and the driving force behind its reading efforts," says that their district has a great way to start the new year. Warning: The following quotes from the article are vomit-inducing.
Teachers receive a mini stipend for coming in the week before school starts and administering the popular early literacy assessment, the DIBELS (Dynamic Indicators of Basic Early Literacy Skills), to each student on their classroom roster. “It has become a really nice beginning to the year. It’s meet your teacher. It’s one-on-one,” says Moffitt. . . .
After every student takes the initial benchmark test, teachers at each school get together to group students according to how each one did on the test. These results are organized into overall support recommendations—benchmark, strategic, and intensive—that indicate a student’s likelihood of achieving reading proficiency by the third grade. These categories allow for easy grouping so that targeted instruction can begin immediately. . . .
One of the greatest advantages to having a standardized assessment that yields data immediately and is accessible to multiple stakeholders is that it allows everyone to speak the same language.
Fascinating, isn't it, that both Scholastic and Education Week would "discover" rural Moriarty, New Mexico, population 1,808? Moriarty, the largest community in Torrance County, is located at the junction of I-40 and NM 41. Much of the economy of Moriarty is related to providing services to truckers and travelers on I-40. The Moriarty School District includes other villages, but it is still remarkable that two national publications would focus on this rural district.
In this Education Week article, a reading coach says she thinks the massive data collection "will catch some of those kids" that previously stagnated in a low reading group. There doesn't seem to be any worry about what DIBELS does to kids. I worry about the kids I learn about in messages from desperate parents and grandparents, young children pushed into meltdown by all the DIBELS pressure.
This article includes the notice: Coverage of new schooling arrangements and classroom improvement efforts is supported by a grant from the Annenberg Foundation. Maybe this explains why the piece reads like a press release for the mCLASS DIBELS assessment system. Usually news articles contain a paragraph or two from those offering criticism. But as Kathy Emery and I observed in Why Is Corporate America Bashing Our Public Schools?, Annenberg devotes big money to developing public support for standards-based reform. Not community standards. Corporate standards.
Surely it's just coincidental that there is not one word of criticism for mCLASS DIBELS assessment in this article. There is a sidebar titled DIBELS Involved in 'Reading First' Controversies [see below], but someone who pulls up the main article online will not know this.
With the online article, one gets an ad for Professional Development with iPod. The hard copy of Education Week contains ads for:
Edison Schools
Cambium Learning (including Sopris West, publishers of DIBELS)
ASCD books
The Broad Center
College Board
Solution Tree Assessment Events
Seton Hall University doctoral program
ACT College Readiness system
Pearson assessments
Exemplars assessment
Aspire Assessment System
Riverside Edusoft Assessment and Assess2Know
ETS ("NCLB Renewal an Opportunity for Education Alignment")
I would say that Education Week delivered to its advertisers.
By Lynn Olson
Moriarty, N.M.
Grace, a tiny 3rd grader with long black hair and wide-set eyes, peers over her teacher’s shoulder at the results of the oral-fluency test she’s just finished, which appear on the screen of a hand-held computer as a tiny green triangle up in the right-hand corner.
“So, you read 147 words,” says teacher Laura Wallin of Mountainview Elementary School. She taps the screen and another graph pops up that plots Grace’s reading fluency over time. “At the beginning of the year, you started out at 92 words per minute, then 105. What do you think is helping you the most?”
“When we practice reading to our partners, it helps me,” Grace says, citing times when her partner told her not to skip words as she read out loud. “And it helps me when we have this time to go over it; it helps me know what to work on.”
Mountainview Elementary's reading coach, Tonya S. Newton, talks with 6-year-old Alyssa Adams about the oral-fluency gains that the 1st grader has made on a one-minute reading test given on a hand-held computer. "She blew my socks off!" Ms. Newton said about Alyssa.
—Kitty Clark Fritz for Education Week
Similar encounters between teachers and students occur on a regular basis in the five elementary schools in this far-flung rural school system, which is about the size of Rhode Island and enrolls some 4,000 students in kindergarten through 12th grade. But that wasn’t always the case.
Before the school system began using the Dynamic Indicators of Basic Early Literacy Skills, or DIBELS, assessments, the district had no common reading program in the early grades and no way of regularly checking students’ progress, district officials say.
“There were as many different programs going on as there were teachers,” recalled Joshua S. McCleave, the principal of the 330-student South Mountain Elementary School, another school in the Moriarty school district.
Three years ago, with the help of a federal Reading First grant, the district began using the DIBELS assessments across its elementary schools along with the mCLASS: DIBELS assessment and reporting system. The district’s experience helps illustrate the ways educators can use one popular assessment tool to better inform and shape classroom teaching and learning—a concept often labeled “formative assessment.”
The assessments, developed by researchers at the University of Oregon, include thrice-yearly benchmark tests to help screen and group students, as well as more-frequent, one-minute “progress monitoring” measures. Those measures track youngsters’ responses to instruction on a particular literacy skill, such as initial sound fluency, phoneme segmentation, reading of nonsense words, or oral-reading fluency. As early as kindergarten, the probes identify children considered at risk for not learning to read.
Using the mCLASS: DIBELS assessment system, developed by Wireless Generation, a for-profit company based in New York City, teachers are able to give and record DIBELS results on hand-held computers and get instant feedback. Teachers can then upload the results to the Wireless Generation Web site to track class, school, and district performance over time using online reports.
Naomi Hupert, a senior research associate with the Boston-based Education Development Center, which is evaluating New Mexico’s Reading First program, said the approach lets teachers collect and instantly access their own data without feeling they are just “administering something that was supposed to be sent off to the state or the district.”
“The greatest advantage we see,” Ms. Hupert said, “is that teachers, for the first time, felt like the data was for their own purpose.”
Flexible Reading Groups
Nationally, about 100,000 teachers in 49 states now use Wireless Generation’s software to conduct and analyze DIBELS assessments with some 2.3 million students.
Before adopting the program, every K-3 teacher in the participating Moriarty schools agreed to a list of “non-negotiables.”
The list includes an uninterrupted 90-minute reading block for students using the Harcourt Trophies reading program. During that time, students are flexibly grouped based on their needs, with the groups shifting regularly, depending on test results.
Schools also carve out additional intervention time outside the reading block: 30 minutes for all students and 60 minutes for those who need the most intensive help. In the three elementary schools receiving Reading First grants, the latter may include individual or small-group instruction from a reading coach or specialist.
“We were truly asking teachers to make paradigm shifts, and in asking them to do that, we were asking some of them to change the way they had been teaching for 20 years,” said Laura E. Moffitt, the coordinator of federal programs for the school district and the driving force behind its reading efforts.
Teachers at every school meet at least once a month, by grade level, to analyze DIBELS and other data, adjust instructional groupings, and decide what to do next to support individual children. Teachers monitor children’s progress using DIBELS subtests as often as once a week, or at least once every six weeks, depending on whether youngsters perform at the “intensive” (high risk), “strategic” (some risk), or “benchmark” (low risk) level on the assessments.
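The grouping-and-cadence logic described above is straightforward to sketch in code. The cut scores and week counts below are illustrative placeholders, not actual DIBELS benchmark values:

```python
# Hypothetical sketch of the risk-level grouping and monitoring cadence
# described in the article. Cut scores are invented, NOT real DIBELS values.

def risk_level(score, benchmark_cut=40, strategic_cut=20):
    """Classify a subtest score into the three support categories."""
    if score >= benchmark_cut:
        return "benchmark"   # low risk
    if score >= strategic_cut:
        return "strategic"   # some risk
    return "intensive"       # high risk

def monitoring_interval_weeks(level):
    """Per the article, at-risk students are monitored as often as weekly,
    and all students at least once every six weeks."""
    return {"intensive": 1, "strategic": 3, "benchmark": 6}[level]

# Example roster with made-up scores
students = {"A": 47, "B": 31, "C": 12}
for name, score in students.items():
    level = risk_level(score)
    print(name, level, f"monitor every {monitoring_interval_weeks(level)} week(s)")
```

The interval lookup is the key design point: monitoring frequency is driven by risk category, so a child who moves categories automatically changes cadence at the next grade-level meeting.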
“If we notice a child has fallen below their target score, we talk about it then,” explained Tonya S. Newton, the reading coach at Mountainview Elementary School. “What I’m trying to do is get the teachers to come up with the interventions during the meeting, because if you wait three weeks for a child, that’s a lot of time lost.”
Moriarty schools also share the information generated by the assessments with parents during back-to-school nights and parent-teacher conferences. And they provide homework packets and suggestions for parents to work with their youngsters on specific skills.
‘So Much More Information’
A recent analysis by Wireless Generation, based on results from roughly 200,000 students in nearly 1,300 schools in 31 states, found that frequent progress monitoring—about once a week—yielded the greatest reading gains for students at all reading levels, as measured by changes on the benchmark exams.
The mCLASS: DIBELS system in use in Moriarty provides Web-based reports on individual students, as well as reports for whole schools and districts. Its charts use colored bars to show changes in students’ reading “risk” status over the course of a school year.
Ms. Newton, who’s been teaching for 17 years, is a passionate defender of the approach, based on her own experiences. “When I taught 6th grade, I always had that lower reading group, and it just tore me up,” she said, “because they couldn’t read the textbooks, and I didn’t know how to fix it.”
“I think this,” she added, pointing to the data-covered walls of her office, “will catch some of those kids.”
At Moriarty Elementary School, for example, 2nd grade teachers began a grade-level meeting by looking at the performance of pupils who had recently moved from one reading group to another because of their DIBELS scores, to see if they were continuing to make progress. They also reviewed the work of two children who had temporarily been moved from the “strategic” to the “intensive” group so they could get more help.
Each school decides how to organize the instructional groups during the morning reading block. But generally, the schools use a “walk to read” model, in which students leave their homeroom teachers to go to another classroom composed of students at a similar reading level. Because students change reading groups frequently, based on new data, all of the teachers get to know every child.
‘Makes You More Aware’
During the meeting, teachers also pulled out detailed summaries for individual students to look at where they were struggling, and discussed what types of activities might address their needs.
“I think it makes you more aware of areas you need to delve into,” said 2nd grade teacher Carlotta Ballard. “Sometimes, you’re so used to looking at the big picture.”
“Plus, we all have different expertise,” added teacher Rebecca Cavasos, “and so we’re able to go to one another.”
Typically, teachers will look for trends across at least three different progress-monitoring reports before deciding to move a child’s reading group. They’ll also look at class work, homework, and results from other assessments, such as the Teacher Primary Reading Inventory and the tests that accompany the Harcourt Trophies series. Sometimes, teachers will call an emergency meeting if they feel they need to make a decision about a child right away.
Although teachers initially gave DIBELS as a paper-and-pencil assessment, many say it was too cumbersome to use regularly. Now, said 1st grade teacher Jaree Whitworth of the 410-student Edgewood Elementary School, “I just feel like I get so much more information.”
The Web-based graphics help. Colors—red, yellow, and green—highlight whether students’ performance on any particular subtest is at the “intensive,” “strategic,” or “benchmark” level. A little running man across the bottom of some screens shows students where they are in relation to end-of-year goals. And scatter plots highlight whether each progress-monitoring result puts a youngster at, above, or below his or her target performance.
Larry Berger, the founder and chief executive officer of Wireless Generation, said use of the hand-held devices saves teachers the equivalent of four or five instructional days per year. “It’s pretty rare in education that anybody automates something that teachers actually do all day and thereby saves them some time,” he said.
Sustainability an Issue
In Tammy Satterfield’s “strategic” reading group at Mountainview Elementary, about a dozen 1st graders do jumping jacks as they recite the vowel sounds that they’ve been working on from the day’s word chart. Once they’re seated, Ms. Satterfield points to the word wall and says: “We have some new words. Raise your hand if you can tell me some of our new words.”
Soon, the class moves on to breaking up words into their sounds and then sliding them back together to form whole words. “Why do I do this every day?” asks Ms. Satterfield. “Because it helps you read your words,” the pupils chant back. “Is it our only strategy?” the teacher asks. “No,” they chorus.
Later, the teacher acknowledged that she’s putting in more hours planning instruction, but says it’s worth it. “Before, you just hoped you were getting to everyone,” she said, “but with the progress monitoring and DIBELS, it’s less likely that children will fall behind because you’re in contact with them all the time. It’s such an interactive program.”
When 1st grade teachers discovered, for example, that most children were having trouble with oral fluency on the midyear assessments, they added a program during the afternoon intervention time called “Reading Naturally” that focuses on fluency. They also began frequent monitoring of 1st graders’ oral fluency earlier in the year.
In the fall of 2004, when the program began, 29 percent of kindergartners in the district were reading at the intensive level and 28 percent at the benchmark level on the DIBELS benchmark assessments. By this winter, only 2 percent of kindergartners performed at the intensive level, while 93 percent read at the benchmark level.
About 70 percent of 1st and 2nd graders now read at the benchmark level on the assessments. And special education referrals in the Moriarty district have dropped as regular education teachers have felt more comfortable addressing individual youngsters’ needs. Both the special and bilingual education teachers take part in grade-level meetings at each school.
But the biggest change, according to teachers and principals, may be the responsibility that educators now feel for students other than their own. “It becomes a way of creating a community,” observed Ms. Hupert of the Education Development Center, “where there’s a sense that every student’s reading development is the responsibility of every teacher.”
Yet Moriarty educators admit that challenges remain, particularly in figuring out specific strategies or activities based on the data.
“DIBELS is a screen,” Ms. Moffitt noted. “It doesn’t tell you how to change your instructional strategies. Pushing that is where we are now.”
District officials also worry about how to sustain the program once federal Reading First money goes away. The district has put together a working group to look at the issue of sustainability and make recommendations.
For many educators here, though, there’s no going back.
“We used to say, ‘Oh, that child is a late bloomer or developmentally not ready,’ ” said Principal Julie C. Roark of Edgewood Elementary School. “Now, it’s more about what else can we do, what else can we try? You don’t accept that children are just not going to get it.”
Sidebar
DIBELS Involved in ‘Reading First’ Controversies
By Lynn Olson
Although teachers in the Moriarty, N.M., public schools report positive experiences with the Dynamic Indicators of Basic Early Literacy Skills, or DIBELS, the assessments have generated a lot of controversy nationally.
The assessment tool, developed by researchers at the University of Oregon, is now approved for use under the federal Reading First program in 45 states to monitor student progress on reading fluency and other measures.
But a contentious hearing before the U.S. House Education and Labor Committee probed allegations that the widespread use of DIBELS may stem, in part, from inappropriate promotion of the tests by federal officials as part of the rollout of the $1 billion-a-year Reading First program. ("House Panel Grills Witnesses on Reading First," April 25, 2007.)
A report by the U.S. Department of Education’s inspector general, released in March, suggested that a federal contractor did not appropriately screen consultants, some of whom had financial ties to DIBELS, for conflicts of interest. An earlier IG report concluded that the Education Department appeared to promote DIBELS over other assessments during workshops designed to help state officials complete the rigorous Reading First grant application.
The University of Oregon researchers who developed DIBELS served as advisers on the design of three department-sponsored Reading Leadership Academies in winter 2002 and the resource materials that were handed out at them. They also presented sessions at the events and later were consultants on implementing Reading First.
In addition, the inspector general found some evidence that officials in several states may have been directed to adopt DIBELS rather than the assessments they’d initially selected for use in Reading First schools. The inspector general cited conflicts of interest involving several federal consultants with financial ties to DIBELS who were sent to advise states that ended up including those products in their grant proposals.
Federal officials and consultants have said that they acted properly in their decisions involving DIBELS. At the April 20 hearing, however, House Democrats charged that the developers of DIBELS profited from the advice they gave to federal and state officials for Reading First.
Meanwhile, critics also charge that DIBELS’ ability to measure students’ reading skills is being oversold.
“First of all, it’s a very narrow instrument,” said Samuel J. Meisels, the president of the Erikson Institute for Advanced Study in Child Development, in Chicago. “It measures all kinds of fluency: initial-sound fluency, nonsense-word fluency, oral-reading fluency, word-use fluency, and then combinations of those fluencies, which they call comprehension.
“Essentially, what we’re talking about is speed,” he continued, “and, in the case of the nonsense words, reading out of context,” which he said is especially a concern for beginning readers who do not come from literacy-rich environments.
Comprehension Compromised?
At least one study questions whether the assessment tool is a good measure of whether students understand what they read. The study, by researchers at Michigan State University, focused on the use of DIBELS among 3rd graders in one small school district in the Midwest.
It found that DIBELS Oral Reading Fluency scores did predict performance on the TerraNova, a standardized achievement test, although students’ performance on DIBELS accounted for less than 20 percent of the variability in those scores. The study also found that students scored poorly on their ability to retell stories they had read, suggesting the tests may be sending a message that reading rapidly is more important than reading for comprehension.
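The “less than 20 percent of the variability” figure is a statement about r-squared: the square of the correlation between the two test scores. As a rough illustration with an invented correlation value (not a number from the study), a correlation of about 0.44 would explain only about 19 percent of the variance:

```python
# Illustrative arithmetic only: the correlation value is hypothetical,
# not taken from the Michigan State study.
r = 0.44                     # assumed DIBELS-TerraNova correlation
variance_explained = r ** 2  # proportion of score variance explained
print(f"{variance_explained:.0%}")  # prints 19%
```

The point is that even a correlation that sounds sizable leaves most of the variation in achievement-test performance unaccounted for.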
The authors, G. Michael Pressley, Katherine Hilden, and Rebecca Shankland, suggest a need for more studies of DIBELS by scholars not associated with the test, including research on how well it predicts performance on measures reflecting the full range of reading skills.
Though studies have found that DIBELS predicts students’ scores on state reading tests, Mr. Meisels argued that the studies he’s seen involve too few students to generalize from the results.
So why do teachers like DIBELS? “There’s always a great deal that’s gained by a one-on-one assessment,” Mr. Meisels said. “You do get teachers looking at kids and listening to them and using analytical skills. All that’s really great.”
Lynn Olson
Education Week
2007-05-02