Thursday, December 21, 2006

I was goaded by my second cousin (a high school student in Canada) to update my blog. I have to dedicate some time to this important exercise!

Monday, January 23, 2006

Teachers Issue Memo on MAP Testing

The teachers at one of the schools in my district issued a memo on MAP testing at the end of last week. The teacher who wrote the memo first handed it around to her colleagues and then, this morning, contacted the school principal to discuss it. The following is the opening paragraph of the memo:

“Maps [sic] testing is really bad for our students at [school name] because it does not provide valid predictions of CSAP proficiencies nor improvement strategies. Therefore, three rounds of Maps testing per year is an ineffective allocation of [school name] resources. To paraphrase a famous quotation, all that is necessary for the triumph of damaging educational policies is that good people in education keep silent (Practical Strategies)." (This citation is from Alfie Kohn, a known polemical figure opposed to any standardized testing.)

The memo includes two pages of quotes from anti-standardized-testing figures and research data taken out of context. However, in this post I do not intend to point out the errors in the reasoning, but rather to reflect on a bigger set of disappointments.

1. MAP testing has never been used in a high-stakes manner in our district. Student promotion, access to high-level classes, and graduation have never depended on performance on NWEA MAP tests. MAP testing has always been promoted as an opportunity to understand where a student is in terms of skill development and trajectory.
2. MAP testing was never intended as an assessment that the student takes and then never hears about again. MAP results are meant to be used in collaborative discussions between colleagues, with parents, and with students. Students should be part of the greater discussion of performance, progress, and goal setting. Students should not be assessed, receive a score, and then never hear about it again.
3. MAP results are powerful for students and teachers because of the Learning Continuum, not because they accurately predict performance on the state-mandated assessment. The Learning Continuum provides a profile of the skills that a student has mastered, is learning, and will be learning in the future. It is explicit and detailed with respect to the skills and vocabulary that are in the student's zone of proximal development. Teachers who focus only on the projection associated with a test like this will always be disappointed.
4. MAP testing is not intended to replace teacher collaboration on standards-based assessments. Teachers should always work together as a group to understand student proficiency, develop assessments, and analyze the results. These practices will have a significant impact on student performance in the long run. MAP should play the role of an additional data point and a reliable, valid assessment for tracking student growth longitudinally.
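The skill-profile idea in point 3 can be sketched in a few lines of Python. The score bands and skill names below are invented for illustration only; the real Learning Continuum data comes from NWEA reports.

```python
# Hypothetical sketch: group skills relative to a student's current RIT score.
# Bands and skill names are made up for illustration, not actual NWEA data.
SKILL_BANDS = {
    (181, 190): ["identify main idea", "use context clues"],
    (191, 200): ["summarize a passage", "compare two texts"],
    (201, 210): ["analyze author's purpose", "evaluate evidence"],
}

def skill_profile(rit_score):
    """Classify each skill band as mastered, developing, or future."""
    profile = {"mastered": [], "developing": [], "future": []}
    for (low, high), skills in SKILL_BANDS.items():
        if rit_score > high:
            profile["mastered"].extend(skills)
        elif rit_score < low:
            profile["future"].extend(skills)
        else:
            profile["developing"].extend(skills)
    return profile
```

A student scoring 195 would see the 181-190 skills as mastered, the 191-200 skills as what they are working on now, and the 201-210 skills as what comes next.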

Data analysis takes practice and collaboration. However, discounting an assessment simply because it is standardized and norm-referenced is irresponsible. Teachers, students, and parents need to work together to understand student performance and trajectory.

Friday, January 13, 2006

"The educated man is the man with the best bullshit detector." I attribute this quote to Lewis Binford, my mentor in graduate school. He always attributed the quote to R. Maynard Hutchins the former University of Chicago president from 1929-1945. Lew described the context as President Hutchins being asked in a press conference to describe the educated man and he thought for a moment and then said the bullshit detector quote.

Taking classes from Lew Binford was a bit like reading a John Fowles book. When John Fowles writes a book, you get an amazing story and a clinic in how to write, simultaneously. Lew always seemed to take us on that same journey: we sat at rapt attention, hanging on each detail of his amazing stories and incredible knowledge of archaeology, but we were really learning how to learn, not just how to consume information as passive "learners".

What does all this have to do with data-driven classrooms? It is my opinion that students need to own their data and lead the analysis. I doubt many progressive-thinking educators would disagree with me on this point. Research shows that when students see and own their data and progress, achievement increases. With this knowledge comes a question: are we effectively teaching students how to learn, or just what to know? I believe that our current model for educating students and educating teachers ignores learning how to learn and focuses on what to know. The result is a lack of probing data analysis and a misunderstanding of the purpose of data.

I submit these posts for publication reluctantly because I do not have the answers. I hope someone out there is interested in engaging me in this conversation.

Tuesday, January 10, 2006

This morning I visited a 5th grade class that was working on analyzing their own NWEA MAP data. The class was led by a technology TOSA in our district, and the students were shown how to look up their scores, plot them on a graph, and set a goal for the Spring testing session. The students were engaged and excited. Here is a great film loop of the trainer (Dave Tarwater, my colleague) working with the students.

The students hand-plotted a bar graph, which included estimating approximately where to draw the lines. They took their data from a table printed in sticker form on the front of their personal folders (the sticker report is really cool; see the example in the film loop). The students set an overall goal within a subject area for Spring testing.

film loop of activity (download the freeware if you haven't already)
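The goal-setting step the students did by hand could be sketched as a small calculation: project a Spring goal from the Fall-to-Winter trend. The simple linear projection and the minimum-growth floor below are my own illustration, not NWEA's growth methodology.

```python
def spring_goal(fall_score, winter_score, min_growth=2):
    """Project a Spring goal from the Fall-to-Winter trend.

    Assumes roughly equal time between testing sessions and a simple
    linear projection. The min_growth floor (an invented convention)
    keeps the goal from dipping below modest growth when a student's
    Winter score was flat or lower.
    """
    observed_growth = winter_score - fall_score
    return winter_score + max(observed_growth, min_growth)
```

For example, a student who grew from 205 in Fall to 210 in Winter would set a Spring goal of 215; a student whose score slipped would still aim at least a couple of points above Winter.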

What I really want to see teachers and students do next is look at the Learning Continuum from NWEA and start to talk about the individual skills they are developing. What does the assessment mean for their development of necessary skills? In this way students would own their learning all the way down to the skill level. The goals would focus on developing the skills, not just targeting a test score.

That said, I thought this activity was motivating for the students and for me. It is a giant step in the right direction.

Monday, January 09, 2006

I have been pondering the issue of data-driven competency a lot lately. (Listen to this podcast if you want to hear the source for some of my ideas.) (Also see this site for more of Dr. McLeod's work.) I have come to the conclusion that many teachers are not prepared to analyze and act on student achievement data, and few administrators are effective at coaching Data-Driven Decision Making. I believe the following factors contribute significantly to this issue:

1. Teachers/administrators believe that data come with meaning attached. In other words, there is a belief that if we could only decipher the meaning of the information, what we need to do in the classroom would be self-evident. When the instructional responses are not obvious, there is a belief that something is wrong with the assessment.
2. Teachers/administrators want assessment to result directly in student activities. This is a variation on #1. An example of this is computer-based assessment that spits out data and also spits out worksheets for the student to complete to improve. This approach creates the false sense that the data have informed instruction and that the student must be learning.
3. Teachers/administrators are not trained to look at multiple sources of data to make decisions. When we are faced with data that seem to contradict each other, we are perplexed and blame the assessment.

I think we fool ourselves into thinking that assessment data are most similar to a gas gauge in a car. We know that when a gas gauge hits empty we must stop and get gas or the car will stop running. Student assessment data are not that diagnostic; they are much more subtle. The metaphor that works best for me right now is that of a stock broker. A stock broker uses a variety of data to decide whether to sell or buy, including the P/E ratio, the stock price trajectory, and recent news about the corporation. The stock broker must look at all these different types of data and make an informed decision. Likewise, teachers must weigh data from a variety of assessments, the student's trajectory, and the student's affective characteristics to make instructional decisions. Do we train teachers to do all that?
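The stock-broker metaphor suggests a decision that triangulates several signals rather than reading one gauge. A minimal sketch of that idea follows; all thresholds, field names, and the two-signal rule are invented for illustration, not an actual district policy.

```python
def flag_for_intervention(interim_percentile, growth_trend, classroom_avg):
    """Combine three signals; no single one is decisive on its own.

    interim_percentile: national percentile on an interim assessment
    growth_trend: points gained (or lost) since the last testing session
    classroom_avg: average on teacher-made, standards-based assessments (0-100)
    All thresholds are hypothetical placeholders.
    """
    signals = [
        interim_percentile < 40,   # low standing relative to norms
        growth_trend <= 0,         # flat or declining trajectory
        classroom_avg < 70,        # struggling on classroom work
    ]
    # Require at least two signals to agree before acting,
    # rather than reacting to any single gauge hitting "empty".
    return sum(signals) >= 2
```

The design choice worth noticing is the agreement rule: a low interim score alone does not trigger action if the trajectory and classroom work both look healthy, which is exactly the kind of judgment a single "gas gauge" reading cannot support.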

The tone of this post may seem blaming in some ways. In no way do I intend to cast blame on our teacher preparation institutions, the teachers, or the building administrators. In fact, as a district leader, I must accept responsibility for improving our ability to analyze data.

Sunday, January 08, 2006

Welcome to the Data-Driven Classroom Blog!

Disclaimer: First, I am the co-owner/developer of an ASP that is aimed at classroom-level use of interim assessment data. Second, I am the assessment director for an urban school district in the Denver metro area. While the application my partner and I are developing is not yet complete and we are not actively selling it, I think it is important to make clear to readers that I have this bias. I will not try to hide my belief that there are good products for teachers and really bad ones, but I also will not actively sell our product in this space. Since you are at the blog, you know the web address for the product.

I look forward to many posts and an engaging conversation about data.

Joe