The Data Crush…

Brian Crosby—one of the great minds blogging over at In Practice—wrote a great piece recently about a shift in schools that has run amok:

Now gathering data to drive your instruction is a good thing. So good a thing, in fact, that each of our little programs generates its own set of data and we get to compile and organize it all – whether it is important data as far as informing our teaching or not…much of this is being done on computers, so therefore it’s been decided that it takes little time … it’s easy to input and output the data … and here’s a chart to write all the data down in columns so we have all the data in one place. So much so that it is taking a lot of what we used to use as planning time to do all this bookkeeping of data.

Teachers are therefore cutting back on other aspects of their jobs that require time. Like planning.

Brian's thoughts really resonate with me—and I don't work in a high-poverty, program-driven community! Even working in a building where teachers still have the professional flexibility to make instructional decisions based on their own knowledge and experience, I struggle with collecting, manipulating and reporting results.

It’s a labor-intensive process in the best of places. In high-poverty schools where everything has to be carefully documented, I'll bet it could be crippling.

But the effect here is the same as it is in Brian's school—I spend less time grading, planning and providing feedback today than ever before because I'm spending so much more time collecting and manipulating (not analyzing) the piles of numbers that are required to make us appear "data-driven."

That's kind of shocking, isn't it? 

Given the documented evidence that Marzano has laid out in What Works in Schools about the impact that curricular design and effective feedback have on students, you'd think our schools would be far more inclined to free teachers to do the knowledge-based work of using information to identify and react to trends, rather than bogging them down in a bajillion random programs that require a bajillion random reports.

Until we automate the process of collecting data in every school—using handhelds, convergence devices or spreadsheets to make this process easier—and until we set priorities on the kinds of data that we want to collect, we’re going to be crushed.

Formative assessment—the key to continuous improvement—depends on taking action. When teachers are pushed to the point where simply collecting information prevents follow-up, formative assessment is dead.

7 comments

  1. Gregory Louie

    Hi Bill,
    I totally agree.
    1. The push for data-driven instruction is unproductive when teachers do not have time to reflect on the meaning of the “data.”
    2. There is a tacit assumption that the assessments driving the “data” collection are well thought-out and appropriate.
    A few years back, I was influenced by a popular PD phrase: “Aligning the written, the taught and the assessed.”
    In my opinion, this type of alignment can only be accomplished by an individual teacher in an individual classroom. Trying to align written materials produced by textbook writers from one part of the country with assessments written in another part of the country and delivered in a single classroom is an exercise in frustration and doesn’t make any sense. It only makes sense if all parties involved remain in communication through several iterations of the development of materials. Even then, you would need an army of graduate students, psychometricians, curriculum designers, etc., to develop one fully aligned curriculum that would be quickly outdated and could not possibly fit every student.
    On the other hand, if teachers design their own materials, create their own assessments and deliver both in their own classrooms, such an alignment, however imperfect, is possible.
    What happens in most public school districts demonstrates a deep lack of trust in the professionalism of teachers.
    On a more positive note, the work of the Berkeley Evaluation and Assessment Research Center provides methods for creating formative assessments that are embeddable within a curriculum. These assessments are designed to reveal student thinking and help a teacher create developmentally-appropriate learning progressions that might serve as a foundation for curriculum designs.
    The learning progressions movement, when fully developed, will allow all parties to identify what level of proficiency a student has achieved, along with prescriptive suggestions for improvement. This will allow for fully individualized learning systems.
    For more details, I suggest you “google” the web using the search term “BEAR assessment system”.
    Happy surfing!
    Greg

  2. sweber

    The December/January issue of Educational Leadership features many insightful articles on the topic “Data: Now What?”
    Several of the articles are available online at:
    http://www.ascd.org/publications/educational_leadership/current_issue.aspx
    An article similar to Bill’s recent post was written by Douglas Reeves. Some educators may disagree with the viewpoints in this edition of Educational Leadership, but it provides more insight into data analysis, data overload, and the need for meaningful data.

  3. Mike

    Dear Matt Johnston:
    Allow me to clarify my comments just a bit. I am not reflexively against the generation and use of all data. The essential thrust of my comments is and remains that those in positions above classroom teachers have such different priorities and day-to-day concerns that what they mandate is seldom of value to teachers and is often an unnecessary and onerous burden. A few examples:
    In Texas, where I teach, we have mandatory, high-stakes testing in English, social studies, math and science. In English, the test is constructed in such a way that we must spend six to eight weeks drilling kids in the specific formats and nuances of the test (losing that time to real learning). Much of what is required does not dovetail with the English curriculum in any way except that the test is written and answered in English. The state would reply that all one need do is teach the state standards and every student will be more than adequately prepared. Again, their distance from the realities of the classroom puts them in the position of being woefully uninformed, or of being liars. The entire testing process generates mountains of data. Yet, knowing that a given student failed the test and the particulars thereof tells me nothing at all that I don’t already know about that student, and virtually nothing at all about how they can do better next time. And all this at the cost of tens of millions per year, merely for this single test.
    In my district, one elementary school stands out, always scoring far above the others in academic achievement (as measured by the state, of course). Mountains of data exist, of course, to determine this, but to what effect? This school is outstanding because its students are outstanding. A quirk of geography has ensured that virtually every student attending the school is from an upper-middle class (or higher) home with two parents. These are kids with every human advantage, including parents who care deeply about the education of their children. Data avails us nothing in this case because demography does matter, however politically incorrect it might be to say it. No, this does not mean that I treat kids from lower socioeconomic backgrounds differently than those from higher groups, merely that I’m capable of recognizing reality and dealing appropriately with it.
    My high school is in the top rank of high schools in the state. Is knowing that my school scores 2.3% higher on a high stakes test than Generic High School in a nearby community of use to me? It might be “fun to know and tell” information, but is it something that I’d feel worth even ten minutes of my time to generate and collect? Not in this lifetime.
    So while I don’t dismiss data and its benefits out of hand, in recent years, I’ve seen very little from the data process that has been of value to me or my students. Shouldn’t what is of value to teachers and students be the focus of our efforts and concerns?

  4. Matt Johnston

    Bill,
    I am a big believer in data that can be used and turned into information (and there is a very big distinction). However, I wonder how much of this “data” is actually being used.
    I know federal and state mandates require the collection of mountains of data. So of this information, how much is used in actual classroom practice? Is there any valuable crossover?
    But in reaction to Mike’s comment, does the “insightful feedback” necessarily preclude data collection and analysis? In short, is it one or the other, or can the two coexist?
    Often, for those tasked with practice-level data collection, the problem is not collecting and collating the data, but its relevance to the task at hand: teaching. Most dissatisfaction with data collection and analysis centers on relevance: what data is relevant to the classroom teacher, and what is not?
    I can see and appreciate the argument that all the focus on data collection and analysis, the quantitative side of matters, may detract from the qualitative side of teaching, but I still wonder what the proper balance is. Can there be a balance?

  5. Mike

    Bill:
    You’ve pointed out one of the fundamental divisions between teaching and administrating. The more distant one becomes from the classroom, the greater the divide. From building principals to district administrators to state-level educrats to federal-level educrats, the gap in knowledge and common sense grows ever wider, until at some level (state educrats in many states, and particularly federal educrats) all that exists is the generation, collection and crunching of data, plus the legislative and public relations machinery necessary to mandate the production and transmission of data and to convince the public of the unprecedented and magical effects occurring as a result.
    At some point these massive constructs exist primarily to do whatever is necessary to ensure that they not only continue to exist, but that their data-crunching, public-relations-producing machinery continues to grow and, hence, to maintain and increase its power. After a while, such people start believing their own press clippings, and the means becomes the end.
    Meanwhile, back in any classroom USA, teachers shut the door when the bell rings and do what they can. Until every level of education practice and policy returns to asking one simple question (what does the teacher need to provide the best possible learning opportunity?), things will not improve. When we begin to think that data is the point of our work, no matter how efficiently it is gathered and presented, no matter how relatively little time that process takes away from actual planning, grading, assessing and teaching, we have lost track of the fundamental nature of teaching and learning. Instead of teaching kids, we’re manipulating data.
    I generate all of the data I’ll ever need in the process of making, grading and teaching. I can see, clearly, each and every day whether Johnny is making progress in each and every state-mandated standard (don’t get me started) because I can compare his work, day to day, week by week, month by month. If another teacher can’t do this well, we need to help them; we need to teach them how, not flood them with data.
    In essence, it may be a matter of prioritizing. We have very limited time. If my choice is between having an extra hour each day to grade papers in more depth and detail, to give insightful feedback to kids, or to gather and input data, terribly sorry, but I’ll always choose the former. That a great many of those in positions of power over my pay grade would reflexively choose the latter, and spend untold millions trying to convince me that I am wrong, is perhaps the best statement of the problem I can imagine this Thanksgiving day.