Solution Seeking or Finger Pointing?

I’ll never forget a principal I met a few years back whom I’ll call Tommy Weiss*. Tommy was the principal of a middle school who was convinced that his teachers were lazy, careless, and ignorant of their own instructional shortcomings—and he’d decided to use data to batter them into submission.

“I’m going to take all of their common assessment scores and post them on our building’s data wall for the whole world to see,” he said.

Knowing that making data transparent is important for driving change in buildings, I could kind of understand where ol’ Tom was coming from.  Something in his tone made me question his intentions, though, so I figured I’d offer him some advice.

“Tom—are you at least going to post the scores anonymously?  What about school-wide conversations?  How will your building reflect together about what these numbers show?  As a classroom teacher, I’d feel a bit threatened by my name and numbers being splashed across a wall without any surrounding conversations or explanations.”

His reply caught me by surprise: “Posting data without names? Are you kidding? I want these teachers embarrassed about how poorly their students are doing. How else are they going to change? In the business world, they call this ‘confronting the brutal facts,’ and it’s time we got brutal in teaching. If you don’t want your name attached to low scores, then do a better job.”

Shocking, huh?  But as I found out a few months later, not totally unusual.

That’s when I was sitting in our school’s auditorium during a data analysis staff development day and saw our end-of-grade test scores projected in 248-point font on the 70-foot screen in front of the entire faculty. “If you’ll look carefully at the sixth grade reading results,” droned the county speaker, “you’ll notice that the teachers at this grade level are decidedly average at best. But not to worry: once your students get to seventh grade, those teachers will get them caught up to speed. They’re much better.”

I’d never been called decidedly average in front of my entire faculty before.  It was humiliating at best.

Heck, it was borderline brutal.

And it was completely unproductive.  I’ve walked around with a chip on my shoulder about data AND those who work in untested subjects ever since.  “Put me in a position where there are no assessments to evaluate my performance and I’ll be a star too!” I grumble.  “And whatever happened to the myth that we’re a team working together to improve teaching and learning?”

Now, I’d like to hold each of these guys responsible for their actions and humiliate them here on my blog. It certainly would make me feel better to publicly call them out on their mistakes in the same way that they seem so committed to publicly calling out the teachers of tested subjects.

But to do so would overlook one of the reasons for their poor decisions:  We really haven’t been data driven for all that long!

While it’s embarrassing to admit, for about a hundred years, there was little in-depth analysis being done on the work in our buildings.  As long as the influential members of a community were satisfied with their schools, questions weren’t asked and teachers were trusted to provide feedback about student learning.

The result: The struggles of poor and minority students were hidden—right along with the incompetence of lazy and inept teachers who’d earned tenure and just didn’t care anymore. Combine that with an interconnected new world economy that has fewer manufacturing jobs, and it is no longer enough for schools to hope that they’re reaching every student.

We simply must become more systematic about our work—and data analysis is a part of that process.

The challenge is that few educators—principals and district-level leaders included—are all that great at using numbers to change conversations in schools. Tommy Weiss was wrong in a million ways—but not because he’s an evil dude. He’s just a guy trying to make sense of new demands that he’s been poorly prepared for.

Which is why we need more Charlie Colemans.  Charlie is a middle school principal in British Columbia and a Solution Tree Professional Learning Communities associate who writes about his practice on the All Things PLC blog.  In a recent entry, he wrote about how he deals with the brutal facts in his building:

First, the teacher and I had a private conversation. I told her that while reviewing the Term 1 results, one of her classes really stood out as a cause for concern. Before she could feel attacked or defensive, I suggested that this must be a very tough class with a number of students who had obvious learning challenges.

This set her at ease, and she was able to share with me a number of her concerns and challenges with this particular class. I apologized for not noticing the challenges sooner and asked her how I could help support her and these students. Together we brainstormed some possible solutions, and I promised to work toward some of them.

Charlie goes on to emphasize that responsible conversations about data are “solution seeking opportunities, not finger pointing exercises.”

This simple truth has got to find its way into more of our schools if we’re ever going to see teachers embrace data analysis—and change begins with a school-wide commitment to the idea that results are a reflection of the strengths of our practices instead of the strengths of our practitioners.

* Name changed to protect the identity of the clueless.

18 thoughts on “Solution Seeking or Finger Pointing?”

  1. K. Borden

    Mr. Heiny:
    Your blog is chock full o’ great information for the mom of a kiddo with the need to embrace technology.
    I wonder how online testing would be greeted. I read a statistic on the Census Bureau back-to-school data sheet that there is one computer in the schools for every four students.
    Where there is enough will, there is a way. (Corny and old-fashioned, but I believe it.) The will is the issue.

  2. K. Borden

    Mr. Ferriter:
    Thank you for the very helpful response.
    At the open house this year I noticed a great deal more discussion by teachers about the EOG’s and particularly noticed trend data (ex: students generally performed less well on multi-step problem solving). Hearing it I thought back to the rather scant reports we are given as parents.
    The Blue Diamond is actually something I am familiar with a bit. I will confess it deeply bothers me that these assessments are done and parents are not given information. (Sidebar: We could likely talk for a while about the legal issues of NC Wise and compliance with Federal Law requiring parents have access to all permanent records on request).
    The absolutely infuriating thing is that the reports that do break down results by skill level and such are available; they’re just not provided to parents and, from your answer, not to teachers either. You should not have to manually attempt to compile/reorganize readily available data.
    I do empathize. When my daughter started school and paper started coming home I took a box and after looking over it with her collected it. At first it was a matter of postponing deciding what memories to keep till later. But I began to notice trends. (Ex: She always missed x multiplication fact, y word types she didn’t spell correctly.) Being a glutton for punishment :), I started a list/notes. I can now see gaps in curriculum coverage, where she needs practice beyond what she received in class…. Doing this has been incredibly valuable but the concept of how overwhelming it would be to do it for many students is not lost in the least on me.
    One trend I noted saddened me: papers were often graded by other students (I assume for review purposes), and comments from teachers were absent, most especially positive/encouraging ones or ones that suggested skill-improvement opportunities. This has improved greatly in some cases this year and last as we receive more rubrics and state curriculum references along with returned work, which allows me to have more productive conversations with my daughter.
    It is one of the reasons I so strongly support more planning/non class time for teachers. Hopefully with more time they would be better able to provide the communication, reflection and analysis that would benefit students and parents.

  3. Bill Ferriter

    K. Borden asked:
    Once the results come in each year for NCLB compliance testing (EOG’s..), do teachers receiving students who have been tested receive a full report of the student’s scores for all previous years? Does that report include any input or analysis from previous teachers?
    Hey K,
    The reporting for NCLB testing results is actually pretty limited in my opinion. While we can access a student’s final scores on the exams from any previous year, all we see is the final scale score and percentile.
    There is no disaggregation of that information at all, which makes it somewhat meaningless and hard to respond to.
    What I’d love to see is the average of all of my students’ performance by skill. For example, I’m a language arts teacher. I’d like to know how my students did as a group on main idea questions, on author’s purpose questions, on cause-and-effect questions, etc.
    That would help me to find gaps in my teaching—and to identify potential groups for instruction in subsequent years—but we don’t get that information.
    Counties like Wake often provide teachers with other multiple-choice assessment tools to use during the year that have the potential to provide some of this information. In Wake, this tool is called Blue Diamond.
    Unfortunately, the same problems exist. Reporting is only done at the objective level—so I can tell which of my students struggle with a certain objective, but not which specific skills within that objective they struggle with.
    That’s a huge problem in Language Arts where one objective might cover anywhere from 4-8 individual skills.
    What teachers are forced to do if they want disaggregated information is go through individual tests and track question responses by hand. I’ll do that every now and then because I know it’s the right thing to do—but with 85 kids, it takes me about 2.5 hours to just write down every child’s responses.
    And the entire time I’m cursing because I know that the tool should be able to give this information to me automatically!
    Does this make sense to you?
    One of the bits of resistance that teachers have to using data is that the tools we have for analysis are not particularly effective. Coupled with our already limited analysis capacity, this is a recipe for data disasters!
    Bill

  4. K. Borden

    A question:
    Once the results come in each year for NCLB compliance testing (EOG’s..), do teachers receiving students who have been tested receive a full report of the student’s scores for all previous years? Does that report include any input or analysis from previous teachers?

  5. K. Borden

    Mr. Ferriter:
    I suppose you could characterize my response as cautionary: a warning not to throw the baby out with the bathwater. Abusive use of data to create walls of shame really has no positive purpose. However, I sincerely hope that by sharing our experience I can highlight the positive potential of data collection and use.
    Too often I read things like “testing and accountability are ruining education.” I too rarely read things like “testing and accountability can improve the experience for students and teachers by…” (fill in the blank)
    Our experience is one example; there must be more. I want to believe that with enough teachers and enough parents working to find ways to use these tools, things can get better.
    It wasn’t too late for my daughter, and it doesn’t have to be too late for others.

  6. Bill Ferriter

    Wow—this post has drawn some great conversation, huh?!
    And my original purpose was only to draw attention to the fact that data is often used REALLY poorly in schools by undertrained professionals.
    Here are some reactions:
    K Borden wrote:
    Honestly, we should not have had to seek private testing to find out my daughter had a handwriting disability. It is a readily observable disability. We as parents missed it because we had no basis for comparison. Three years of teachers missed it with ample opportunity to compare and actually observe her during her productive hours. She was written off as disorganized and lazy.
    While this isn’t exactly what I was trying to tackle in the original post, I couldn’t agree more—teachers have failed in many ways to give parents reliable feedback about the successes and struggles of their students.
    There are dozens of reasons for this—Little training and even less time top the list—but no excuses.
    We’ve got to find efficient ways to assess kids and report to parents about a handful of key indicators.
    Period.
    Mike wrote:
    The clear thrust of my comments here is that the experience of professionals, particularly experience earned over time, is valuable, commonly more valuable than the data produced by any single instrument or measurement taken at one point in time. For those of us who have spent decades reading the research (to say nothing of actually doing the work), this is hardly a controversial assertion.
    I always appreciate Mike’s reminders that the knowledge and skills of teachers are the best form of assessing student strengths and weaknesses—and I think that most parents would trust the feedback of the most accomplished teachers over any single indicators of student ability any day.
    The problem is that there aren’t enough accomplished teachers to go around—and the feedback given by the vast majority of teachers is mediocre at best and downright negligent at worst.
    Mike and I have disagreed about this before—I don’t trust teachers and administrators to police our profession because we haven’t done that job well over the past several decades.
    If we ever get to the point where we know that every teacher is top-notch, then I’m with you, Mike.
    Until then, I’m afraid that external indicators are an equally important part of the assessment equation.
    I’d even go as far as to argue that external indicators can be valuable tools for the most accomplished teachers in identifying strengths and weaknesses.
    It’s the ability of a system with over 3 million teachers to provide the kinds of accomplished feedback that you suggest, Mike, that I doubt. Not that there aren’t handfuls of us who can.
    Bob wrote:
    Please, public educators, show what you know as measurable fact; don’t just opine. You never know who might read your comments. Don’t give them reasons to discount your good ideas.
    I hope this isn’t pointed at me, Bob!
    I’m not opining on anything in my original post—I’m stating a fact:
    Teachers and school leaders are just beginning to learn to use data effectively. This leads to careless applications and assumptions that hurt progress in education.
    That was the gist of my thinking!
    Rock on,
    Bill

  7. K. Borden

    Mike said, “But I’m a bit distressed as I detect what seems to be a personal impetus in your arguments. A decidedly anti-teacher stance that seems to devalue their input and contribution to education in favor of testing. Am I mistaken?”
    You are both correct and incorrect in your assessment. Is there a “personal impetus” in my arguments? You betcha! I am moved to care about issues surrounding education, deeply and profoundly moved. Being a mother will do that to you. Living a life made possible by the opportunities that becoming well educated provided also compels me to want to see as many people as possible reap those benefits.
    However, you are gravely mistaken when you assert I have “A decidedly anti-teacher stance that seems to devalue their input and contribution to education in favor of testing.” Great teachers and effectively used testing and data collection can coexist harmoniously. However, as Mr. Ferriter stated, “We really haven’t been data driven for all that long!
    While it’s embarrassing to admit, for about a hundred years, there was little in-depth analysis being done on the work in our buildings. As long as the influential members of a community were satisfied with their schools, questions weren’t asked and teachers were trusted to provide feedback about student learning.”
    Mr. Ferriter once chided me a bit for not calling him Bill. I am likely older than he is, but you see I believe teachers deserve respect. So, as a model of behavior for my daughter, I call teachers by their last names, or at least by their first name preceded by a title if they express that preference. In your case, the name I have to use is Mike, but I will call you Mr. Mike if that works for you. The point being, if anything it is not an anti-teacher stance you detect; it is a desire to see teachers, with the power they have, rally for the changes that will benefit students and improve their profession.

  8. Mike

    Well now. I was under the impression that forums like this are not peer refereed professional journals, but opportunities for (hopefully) informed commentary (opinions, if you will) on issues relating to education. Am I mistaken? Do my comments need citations, footnotes, sources? Are experience and reason no longer enough? Of course, it’s for each reader to determine the validity of a given argument, as it should be.
    And to K. Borden, with whom I seem to occasionally fence, I’m a little perplexed. For the final time, I’m not discounting the value of data obtained through competent instruments and measures, merely observing that it must be examined and weighted properly. The clear thrust of my comments here is that the experience of professionals, particularly experience earned over time, is valuable, commonly more valuable than the data produced by any single instrument or measurement taken at one point in time. For those of us who have spent decades reading the research (to say nothing of actually doing the work), this is hardly a controversial assertion. It is, in fact, common sense, common sense that underlies virtually all of human experience and every profession. Thus we value professional education, training and experience in any field because we know this yields the best results. Is it perfect in every case? Surely not, as we recruit for every profession solely from the human race. But it is the best we have and is very good in application in the real world.
    But I’m a bit distressed as I detect what seems to be a personal impetus in your arguments. A decidedly anti-teacher stance that seems to devalue their input and contribution to education in favor of testing. Am I mistaken? After all, please recall that I’m not saying that testing–or the data generated from it–should be abandoned or utterly discounted, merely that it should be used properly, carefully, and the results analyzed with equal care to determine if it will be of value. Perhaps my educator’s brain is so indoctrinated that I just can’t see the truth, but to me, this seems like common sense and has stood my students, and me, in good stead for many, many years.

  9. Bob Heiny

    Kudos to K. Borden. You’re reciting as a useful reminder what every teacher learned in a required tests and measurements class to earn certification.
    Another reminder from the same class: datum is and data are.
    I’m going to step out of line here, Bill, with a heartfelt plea.
    Please, public educators, show what you know as measurable fact; don’t just opine. You never know who might read your comments. Don’t give them reasons to discount your good ideas.
    Thanks for the post, Radical.

  10. K. Borden

    Mike:
    You said “Your own story of your offspring indicates that you too recognize that relying on a single source of data can lead to bad outcomes, thus did you acquire additional, independent evaluations.”
    Actually, our experience demonstrates that normed testing, even with its flaws, offers valuable information to parents and teachers.

  11. K. Borden

    Mike,
    Again, I call on teachers to make the data relevant. Simple calls for allowing teachers to have the full and final say on assessing students won’t cut it.
    Honestly, we should not have had to seek private testing to find out my daughter had a handwriting disability. It is a readily observable disability. We as parents missed it because we had no basis for comparison. Three years of teachers missed it with ample opportunity to compare and actually observe her during her productive hours. She was written off as disorganized and lazy.
    I become sad when I consider the students who do not have parents who can afford to have private assessment.
    Those tests given in school for placement and accountability gave us the clues that we needed more information. They demonstrated she was far more able than she had been credited for by teachers. They (tests) prompted action that resulted in change without which she would have been left to fail despite her incredible ability.
    You said “So once again, data, if produced as a result of competent assessments and research, is just fine, as long as one doesn’t imagine it to be some sort of educational holy grail that should immediately determine or mandate broad and sweeping action.”
    NCLB? Testing and data collection that prompt the opportunity for a student in a school to transfer or be tutored? Those changes?

  12. Mike

    Dear K. Borden:
    Allow me to clarify a bit, though I thought my point was reasonably clear. I am not suggesting that all data is bad, nor am I suggesting that all testing is bad, merely that all too frequently, bad testing and bad research produce bad data. Your own story of your offspring indicates that you too recognize that relying on a single source of data can lead to bad outcomes, thus did you acquire additional, independent evaluations.
    Sadly, much of what passes for brilliant, transformational “research” in education these days is poorly designed, poorly done, and produces unreliable results, often sparking horrendous in-service presentations to auditoriums full of long-suffering teachers who can spot the flaws in the research and the theories that underlie it in an instant, and who know far more about their students than any data set can possibly reveal.
    So once again, data, if produced as a result of competent assessments and research, is just fine, as long as one doesn’t imagine it to be some sort of educational holy grail that should immediately determine or mandate broad and sweeping action. Your story, again, indicates that you understand this. Good for you.

  13. K. Borden

    Mike wrote:
    “”Data” often tells us little or nothing, and is all too often meaningless, accomplishing little more than the production of raw material for those administrators in the central office who live for the generation and manipulation of data.”
    Data is not the problem, nor is testing, surveying or assessing to obtain it. Knowing how to use the information gained is where the system works or fails.
    Data gained by testing earned my family’s support. In our system, testing begins the first week of 3rd grade with Pre-EOGs. 3rd grade is also the year that screening for gifted services occurs, using the COG-AT and ITBS. It is the first year these normed references become available to parents via the schools.
    Mike, the data we received as a result of those three tests changed our child’s life and prospects for the future. Why? Because it helped prompt us to learn more and seek answers. In the years before we had been told our daughter was not using her time wisely, failing to consistently complete assignments and displaying poor work habits. Her mastery of subject matter was noted. Then we began to receive test results. Data began to reveal the need for answers.
    Here is the key. Had we taken the data thus collected and not used it to ask questions and seek explanations, I shudder to think what outcomes we would have experienced. We chose to research, learn and try to make sense of it all. That led us to realize she should be further tested. Yes Mike, that we should gain more data.
    Thank goodness we had the resources to do so and we did! Jaded by so much, we were determined to be thorough. We had her independently and simultaneously tested and examined by two different private providers using two different sets of data collection.
    We learned from that data and the observations of the testers that she needed to be evaluated by an occupational therapist. We knew she was a creative, intelligent child; testing perhaps added perspective and prompted us to learn more about the needs her abilities present. Testing and the resulting data set us on the path to better understand the challenges she faced and what we could do to help her meet them.
    She has a handwriting disability. Her experience with handwriting has been compared to having an incredibly high-functioning computer with a faulty, unreliable printer. The process of writing excessively consumes mental and physical energy and creates a distraction. Had we known earlier, the prognosis for correction would have been great. At the age we discovered it, however, the prognosis was very poor.
    We could have chosen to be angry, display the data to point fingers and blame. We didn’t. We chose to use the data we gained to address her needs and her abilities. She and we now know what to do to help her use time more wisely, complete assignments and employ work habits that will serve her well. We know how to encourage her strengths and accommodate her weaknesses.
    I can still remember those early weeks of third grade when she came home and said how much she enjoyed the bubble tests. Now I know why she enjoys them: she can show what she knows without writing.
    Data, and testing, serve a purpose. They provide information. The failures come not in collecting the information but in failing to use it to generate effective change.

  14. Susan Graham

    Isn’t it great? Data as a weapon! Data was supposed to be a tool to help accomplish the goal of student learning. Instead, student learning has become a tool to accomplish the goal of the right data outcomes.

  15. Mike

    There is a very old axiom commonly taught in any good Supervision 101 class: Public praise, private criticism. This has held true for centuries because people respond positively to those who try to help them as long as that help is given in a non-threatening environment. And of course, most people like to have public pats on the back. The axiom reflects nothing more than a very basic understanding of human nature.
    What you’ve described is all too common in education circles. It’s a very bad, indeed, incompetent, supervision technique known as “shotgun supervision.” Instead of focusing on the problem like a well aimed rifle shot, one blasts in the general direction of the problem with a shotgun, perhaps missing the problem entirely, hitting people who aren’t problems, and causing entirely new and unrelated problems.
    You have a principal who is afraid of confrontation, who has no common sense or courage, and who takes work issues personally. “How dare those teachers fail to excel! Don’t they know they’re working for the finest principal on Earth? Don’t they know their incompetence reflects on me!?” So he tries to dump on everyone, expecting everyone to read his mind and immediately fix, fix, make it all better! Very bad supervision, and doubly so in that it will not engender the change he has in mind, will likely have the opposite effect, and will surely tick off his subordinates.
    If a principal has an underperforming teacher, it is their job to discover why the teacher is underperforming. Do they lack training? Are they using ineffective techniques? Were they gifted with a bunch of knuckleheaded students (more so than usual–and this does happen with alarming frequency)? Are the tests being relied upon to produce the data valid and actually competent? It then falls to the principal to actually speak with the teacher, clearly and positively identify specific areas that they need to improve, suggest specific ways that they can accomplish that improvement, establish a time frame for improvement, and frequently stop by to assess and assist. In this way, a principal actually deals with those who need improvement in a way likely to produce the desired results, and they don’t tick off those who are doing well. If the teacher does not or cannot improve, the principal has done all that they can reasonably do and is able to document that.
    “But teachers should know when they’re not doing well! That’s not my job!” cries the incompetent principal. Why do you think you get paid 3-5 times more than teachers? Because you’re so pretty? If it’s not your job, to whom does it belong? The custodians?
    Posting raw scores on a wall accomplishes three primary things:
    (1) It demonstrates a principal’s incompetence and lack of regard for his subordinates;
    (2) It ensures that those who really need help won’t recognize it and get it;
    (3) It really ticks off those who are doing the job properly.
    Let’s not get too excited about “data,” by the way. Competent supervisors understand the job and through actual observation–often of the many small things that good teachers should automatically do–can correctly diagnose and help correct teaching deficiencies. “Data” often tells us little or nothing, and is all too often meaningless, accomplishing little more than the production of raw material for those administrators in the central office who live for the generation and manipulation of data.
