My beef with IWBs—a tool that many district technology directors and school principals seem to love—is a simple one: Interactive Whiteboards do little to change instruction in today’s classroom.
They facilitate a presentation culture in a time when our students are craving opportunities to participate. They encourage and enable teachers to hold on to traditional practices that need to be buried if our schools are going to stay relevant to kids.
Oh yeah—and they cost a heaping cheeseload of cash, consuming already limited technology budgets that could be better spent elsewhere.
And in case you aren’t familiar with my background, I’ve got plenty of experience with IWBs. In fact, for the better part of an entire year, my classroom was outfitted with a full set of whiz-bang gizmos from Promethean—student responders, slates, and a real-live Interactive Whiteboard.
I spent that year writing professional development courses for Pearson Achievement Solutions, trying to detail ways that teachers could use Promethean’s tools to create student-centered classrooms.
What I found was disturbing to me: While it was entirely possible to write student-centered lessons for the IWB, it was a pain in the keister. Truly great lessons took hours and hours to plan and develop, making the tool pretty useless to me simply because I didn’t have hours and hours to plan.
What’s more, the IWB became passé pretty darn quickly to my kids. The “your students will learn more because they’re motivated by the technology” argument that the IWB salesmen push on y’all at conferences proved to be patently false. Like any tool, the IWB neither created nor discouraged motivation in my kids.
Translate my experience across entire schools outfitted with thousands of dollars of Promethean kits and you’ll understand why I’m convinced that we’re wasting our cabbage on IWBs.
That’s why a Tweet from Sonny Mangana, Promethean’s Director of Global Research, caught me by surprise last week. He pointed to a new study on the impact that IWBs have on student achievement, released by Promethean and conducted by research superstar Bob Marzano.
You see, I really don’t need any research to convince me that IWBs aren’t worth buying! I’ve got first-hand experience to fall back on.
Sonny was a pretty persistent fella, though, and several hundred Tweets later—no joke, literally hundreds of Tweets later—I decided to read the report.
And honestly, I’m glad that I did. The main finding—that IWBs can produce a 15.9 percentile gain in student achievement—is so darn juicy that school leaders are likely to use this report to justify purchasing even more IWBs for their schools.
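For context: Marzano-style reports typically translate an effect size into a “percentile gain” using the normal curve. If that convention holds here, a 15.9 percentile gain works out to an effect size of roughly 0.41. Here’s a quick sketch of the conversion (the 0.41 is my own back-calculation, not a number from the report):

```python
from statistics import NormalDist

def percentile_gain(effect_size: float) -> float:
    """Percentile gain implied by an effect size, assuming normality:
    where the average treated student lands in the control distribution,
    minus the 50th percentile."""
    return NormalDist().cdf(effect_size) * 100 - 50

print(round(percentile_gain(0.41), 1))  # roughly 15.9
```

In other words, the headline number is a statistical translation, not a promise that every kid jumps 16 percentile points.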
Before you whip out your school checkbook and run off to the Promethean store, though, please note:
That Year 1 of this Promethean study took some flak in the educational research world.
Bob Marzano is an amazing researcher who has literally changed education by putting numbers behind practices and helping educators to identify the changes that hold the most potential for reforming schools. I respect him greatly.
But he’s only human. And the methodology that he used in the first year of this study for Promethean met with criticism—including this five-part critique from VCU educational research professor Dr. Jon Becker.
Spend some time reading through Becker’s critique and you’ll find Promethean’s claims about the positive impact that IWBs have on student achievement a bit harder to swallow.
That the current iteration of the Promethean research only looked at the work of 46 classroom teachers.
Seriously. 46 teachers working with 1,575 students.
While I’m not an expert in things like the impact that sample sizes have on educational research results, I do know that I wouldn’t feel comfortable spending thousands of dollars of a limited technology budget based on the findings of a study that looked at the work of only 46 teachers.
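For a rough feel of why 46 teachers is a thin base, here’s a back-of-the-envelope standard-error calculation. The spread of per-classroom effects is a number I made up for illustration; nothing here comes from the report:

```python
import math

n_teachers = 46
sd_of_classroom_effects = 0.5  # hypothetical spread of per-classroom effect sizes

# The standard error of the mean effect shrinks only with the square root
# of the sample size, so small samples leave a wide band of uncertainty
# around any reported average.
se = sd_of_classroom_effects / math.sqrt(n_teachers)
ci_half_width = 1.96 * se  # approximate 95% confidence interval half-width

print(round(ci_half_width, 2))  # about 0.14 effect-size units either way
```

And quadrupling the sample only halves that band, so with 46 classrooms there’s a lot of wiggle room left around any average effect.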
That the teachers involved in the current study intentionally chose to participate.
Think about that for a minute. If you asked the teachers in your building whether they wanted to be a part of a research project studying the impact that IWBs have on student achievement, who would volunteer?
If your building is anything like the ones I’ve worked in, the chances are that you’d get a bunch of high fliers, right? Super motivated types that are into the idea of researching their practice?
And while that’s great—I love high fliers who are into researching their practice—wouldn’t that influence the results of Promethean’s research?
I think I’d trust the study more if it included a random sample of teachers simply because it would be more reflective of the instructional reality of the average building.
That the teachers involved in the current study already had a high comfort level with Promethean tools.
Of the 46 teachers studied in Year 2 of Promethean’s research on the impact IWBs have on student achievement, 32 expressed a high degree of confidence in their ability to utilize their Promethean tools effectively.
Again, that’s great—I’m jazzed that there are teachers finding ways to effectively integrate the tools that their districts are providing them with—but is that distribution an accurate reflection of the confidence levels that you’d find across an entire building?
Think about your own school. Are YOU confident that 70% of your teachers would know how to make the most of IWBs?
If not, then don’t expect to see Promethean’s results replicated in your own building.
That the teachers involved in the current study were told that one objective of the study was to “differentiate those teachers who obtain positive effects from Promethean ActivClassroom from those who do not” (p. 56 of the PDF).
While this quote may seem innocuous at first, I want you to think about the potential impact that it might have on participating teachers.
Essentially Promethean is saying, “We’re looking for teachers who use these tools well. Show us what you’ve got.”
Wouldn’t that change the amount of time and energy that the teachers spent developing their IWB lessons? And if so, doesn’t that skew the results of the study?
From where I sit, Promethean’s sample of teachers was a self-selected group of high fliers who were very comfortable with IWBs and who had extra motivation to craft quality IWB lessons.
While that’s all fine and dandy, you’ve got to decide whether that’s an accurate description of your teaching staff before you can be confident that you’ll get the same results from IWBs installed in your own building, don’t you?
There’s one more thing I want you to keep in mind about Promethean’s current study before you start buying IWBs:
That the students involved in this project knew whether they were a part of the “in” or the “out” group.
In an honest—and honorable—attempt to keep this research project from disrupting instruction, participating teachers were asked to teach the same unit to two different classes of kids. One group was taught using Promethean technologies and the other wasn’t.
Now think about your students for a minute. If they knew they were a part of an important research project studying the impact of IWBs and they were in the class learning with the IWB, how would they respond?
More importantly, if they knew they were a part of an important research project studying the impact of IWBs and they were in the class learning without the IWB, how would they respond?
Isn’t it possible that at least some of the students in the treatment class—the class using the IWB—might work harder simply because they were excited to be a part of an important research project?
And isn’t it possible that at least some of the students in the control class—the class who sees the IWB in the front of the room but doesn’t get to use it because they’re in “the other” class—might not work as hard because they’re hacked off about being left out?
If your kids are anything like mine, that’s EXACTLY what they’d do.
And considering that the sample sizes of students in each group were somewhat small—700-800—wouldn’t that have an impact on Promethean’s results?
So what does this all mean?
It’s actually pretty simple: Promethean’s study—like any study on instructional practices—provides some valuable insights into the effect that IWBs have on classroom instruction.
As Sonny said in one of his dozens of Tweets, “Some information is always better than no information.”
But it would be irresponsible to believe that every student in your school is going to make 15 percentile point gains in student achievement as long as you spend a bit more cash on Promethean’s tools.
There are too many factors that go into successful teaching—and into the results of this research report—to think that the right tool is a silver bullet for helping your kids learn.
PS: I’ve intentionally chosen not to address most of Promethean’s statistical, researchy findings in my critique simply because I struggle to count! Far be it from a guy like me to wade into those waters.
I do want to mention one finding, though, simply because it sticks in my craw. You’ll see it on page 24 of the PDF:
“Again, 15 out of the 46 studies (or 32.6%) were considered statistically significant (p < .05). Conversely, 67.4% were not considered statistically significant.”
Now, I know nothing about educational research, but this seems to say that of the 46 teachers who participated in the study, 15 saw statistically significant gains in student achievement.
That’s 1 out of every 3, right?
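Here’s the one bit of arithmetic I do trust myself with, plus a benchmark worth knowing: at p < .05, pure chance would hand you roughly two “significant” results out of 46 even if IWBs did nothing at all. Fifteen is well above that, but it still leaves two-thirds of the classrooms without a significant gain:

```python
n_studies = 46
alpha = 0.05        # significance threshold used in the report
significant = 15    # studies reported as statistically significant

# If there were no real effect, chance alone would produce about
# n_studies * alpha false positives at the p < .05 threshold.
expected_by_chance = n_studies * alpha
share_significant = significant / n_studies * 100

print(round(expected_by_chance, 1))  # 2.3
print(round(share_significant, 1))   # 32.6
```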
Sonny tried to explain this to me in a Tweet—something about “correcting for attenuation” and “incredibly complex statistical models.” Marzano even addresses it in his report, saying:
“Using meta-analytic techniques, aggregate findings might be statistically significant even though a number of the individual studies are not significant.”
But that all feels like statistical hocus-pocus to a guy like me.
I’ll have to rely on someone with more experience in educational research to make that call, though. I’m just plain not qualified to determine whether this is something to be concerned about.
Hint, hint, Jon Becker!