Wednesday, January 31, 2007

Alan November Presents like it is 2005

I had the pleasure of attending an Alan November (he also blogs here) lecture today as a pre-session of the CASE conference. I attended with a number of colleagues (the Superintendent, COO, Chief of Ed Programs, CAO, Director of ELA, and Director of Technical Ed). Alan November has long been considered a guru of educational technology and technology integration. He has been a beacon for many school districts and has held the attention of leaders in and out of technology in a very positive way.

That said, the presentation I attended today was uninspired and lacked a discernible focus. It started with the typical Friedman approach of pointing out that the world is shrinking and there are more gifted students in China than there are children in America. This is a tried and true mechanism for grabbing our isolationist/nationalist attention. He dropped one other shocker on us: electronic whiteboards, which we all love, have been proven to have a negative effect on education by dumbing down the curriculum. I know that got the attention of the attendees because I later heard our COO relaying that fact to some community colleagues who are leading the design of our new high school/community college campus. Alan November wanted to get our attention with the whiteboard research and simultaneously point out that teaching with technology is not about the technology; it is about good teaching. That makes great sense, right?

Here's the rub: (1) the attention-getter wasn't exactly what it appeared to be, and (2) after arguing that teaching is the key, he went on to demonstrate his superior web-searching ability.

First, the BBC news report on the whiteboard evaluation was vague. It did not provide a link to the actual study, did not reveal what metrics were used, and ignored the relationship between professional development and classroom use of this (powerful) tool. What is really compelling is that the BBC site where this article was published has a "comment" component. The comments are thorough and point out the flaws of the study. If Alan November had been the least bit thoughtful he would not have used this study as an attention grabber, but would have pointed out that the read/write web is amazing at giving both sides of a biased story. When readers comment they point out the shortcomings of the research or the journalism (or both). Alan November sadly missed an opportunity to make a point and instead went for the cheap crowd pleaser. Now people all over Colorado will be saying, "whiteboards have a negative effect on education, they dumb down the curriculum, Alan November says so." This is an example of Web 1.0, where information flows in one direction and the consumer simply accepts it.

Alan November spent the three and a half hours I attended his session showing us what a great internet searcher he is. Did you know that if you use "host:uk" you get only sites from the United Kingdom? Okay, we get it, now move on. Nope. "host:tr", "host:za", "host:ma" and on and on. We did a Skype call to New Orleans to an employee of November Learning...hey, that was cool in 2005. Wikipedia...neat. He even cited research, which he attributed to the NY Times, that Wikipedia has an average of four errors per article while Britannica has three. No real difference, right? However, I think the research was actually done by Nature, and Britannica disputes the results.
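For anyone who wants to try the country-restricted searches at home: the operator Google documents is "site:", while "host:" was the syntax on some older engines. A throwaway sketch (the function name is mine, purely illustrative):

```python
# Build a web search query restricted to one country-code top-level domain.
# Google's documented operator is "site:"; November demonstrated "host:".
def country_query(terms: str, tld: str) -> str:
    """Append a site: restriction so results come only from the given TLD."""
    return f"{terms} site:{tld}"

# Restrict a query to United Kingdom sites:
print(country_query("school whiteboards", "uk"))  # prints "school whiteboards site:uk"
```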

The world is different. Neat. Unfortunately, the technologies he focused on (Wikipedia, searching, podcasting, Skype, wikis, blogs, RSS) are all pretty old at this point. Frankly, I expected an impassioned presentation on the vision for a new world, with real, practical strategies for us to achieve this vision.

Monday, January 29, 2007

Measure What Matters

Here is a basic premise: great teaching is one of the inputs that will lead to student learning. In fact, it is one of the only inputs educators have control over. Because of this we invest time and money into providing high-quality professional development so that teachers will improve and student academic achievement will likewise improve. Districts focus on proven strategies, like Marzano's classroom strategies that work.

Then, we assess the students and wait. Here are the possible outcomes:
1. Student achievement improves
2. Student achievement declines
3. Student achievement shows no discernible growth

If student achievement improves, we pat ourselves on the back and continue doing what we did before. If it declines we immediately try something new. If there is no discernible growth or loss we can either stick with what we already started or try something new.

The problem is that we never took the time to figure out if the changes we expected in the classroom ever occurred. That's right, we never measured what mattered...whether the teaching practice actually changed. I know people in education think that student achievement is an appropriate measure to determine the effectiveness of professional development, but it isn't. If the point of professional development is to change the input (teaching practice), then that is what needs to be measured.

The challenge that I am motivated to accept is to develop a method to rapidly, repeatedly, and continuously measure the quality of the input (teaching practice), so that my district can evaluate the change following professional development, pinpoint areas of concern, and differentiate professional development for our 1000 employees.

Wednesday, January 24, 2007

Second Lifers meet your Child's Second Life

I read this morning that the BBC is working with an affiliate to release a virtual world for children (ages 7-12). It is described as having an emphasis on safety and responsibility. This seems to be an emerging market of sorts. What is the potential to jump the curve and begin developing simulations for children that deepen student thinking?

Monday, January 22, 2007

Web Sense

The recent install of a new web filter in my district forced the discussion of best practice in web filtering, acceptable use policies and enforcement, and training on digital discipline. This topic has been discussed in many publications and on more blogs than can be aptly linked to here. However, this post in particular attempts to capture the range of sentiment and tenacity of arguments regarding web filtering. The key is coming up with solutions.

A team of district and school employees came together to try to address this issue for our district in the short term, and hopefully for the long term. We debated from our various polemics for a while and then settled in to stake out some common ground. In the absence of effective teacher supervision of students, any accountability for students signing the acceptable use policy, and any real discussion of digital discipline in or out of school, we resolved to do the following: (1) adopt conservative parameters for filtering, (2) design a lesson for teachers and students on the acceptable use of the internet, (3) begin enforcing the policy with consequences, and (4) raise the urgency to implement the K-12 I-Safe curriculum.

We are not naive enough to believe that we will prevent students from accessing objectionable material on campus. By Wednesday of last week (one day after the new filter was installed) students had a reliable method for circumventing the filter. However, we limit access to pornography and other blocked sites for most students and send the message that they are violating a rule (even if we have never fairly explained it to them or followed through with effective enforcement).

We believe that to be effective the district has to couple the clarification and enforcement of the acceptable use policy with the teaching of pro-social use of the internet and other modern media. Improving the effectiveness of the AUP and teaching pro-social web skills is going to be a work in progress.

Saturday, January 20, 2007

Keeping Children Safe On Line

I work in a relatively small district where the families and, to be frank, the district itself are on the losing end of the ever-expanding digital divide. The disadvantages of this situation are numerous and appalling. On the other hand, being slow to adopt technology means that the district has never really been confronted with the issues related to the read/write web (Web 2.0). Where other districts have had to struggle with censorship and MySpace, my district has simply blocked the site and moved on (we know that web filtering is rarely successful in blocking sites and does not develop digital discipline among kids). As I mentioned in a post earlier this week, we recently ran into a problem when it was discovered that our new web filtering system was blocking relevant and appropriate web sites. As a result we called a group together to review our protocol and examine best practice from both a technical and an instructional approach.

When I was stuck in traffic in downtown Denver Friday night I heard a story on NPR about Club Penguin, an online social networking site for tweens. That's right, students/children from the age of, say, 8 to 12, 13, or older might participate in this virtual world. What was really compelling about the story and the site is the lengths they go to in order to ensure that it is a safe place for children to "play" with their "friends". In fact, Club Penguin says it is proud to be one of the few sites that has ever qualified for the Better Business Bureau Kid’s Privacy Seal of Approval. They monitor the discussion online, screen for key words, prevent children from entering personal information, and use paid memberships to identify all community members.

The Club Penguin model made me think of the presentations I have seen (online) by Dick Hardt, who speaks of Identity 2.0. You can see Dick's presentation here (even if you don't care about identity, Dick Hardt is one of the best presenters around, so watch it). Dick Hardt is leading the technology community to think and work on the next frontier for web identity: verifying that you are who you say you are and that you deserve to be treated accordingly. He is leading the effort to ensure that identity is portable and can be authenticated.

Now back to the original discussion piece...internet filtering and ensuring student safety. First, we have an obligation to filter the internet for inappropriate material as required under the Children's Internet Protection Act (CIPA). Second, we have an obligation to teach students, families, and the community to be disciplined in the digital world. There are threats, and we cannot simply "block" them away from our children. We need to encourage responsibility. Finally, the model put forth by Club Penguin and the work of Dick Hardt and others encourage me that the world of Web 2.0 is evolving and could become increasingly safe.

Thursday, January 18, 2007

Education Reform in Colorado

Based on the audience response to the comments of Marc Tucker and William Brock at Wednesday's forum in downtown Denver, at least some people believe they are onto something. One remark that seemed to garner a particularly impassioned response from the people around me was made by Marc Tucker. He said, "we need fewer and more meaningful assessments." He went on to suggest that multiple-choice assessments were not authentic measures of student learning and that we should depend on assessments that require constructed response and challenge students to access higher-order thinking skills.

No doubt this would be a popular statement from anyone who has thought about the authenticity of our assessments. In fact, challenging students to apply their knowledge, perform thought experiments, or synthesize is what we expect in the 21st century, and these skills cannot be measured by instruments like CSAP.

However, I take issue with the assertion that we need fewer assessments. In fact, I would submit that we may not assess students enough. If we are to adopt a more efficient management model for our public schools, then we must agree to measure our success continuously. That means that while the more authentic assessments Tucker described are desirable, they are lag indicators. The results of those tests arrive after the fact. The equivalent would be a CEO arguing that because the year-end measure of actual profit is flawed and needs to be fixed, the weekly, monthly, and quarterly measures should be retired.

Businesses depend on "lead" indicators. Lead indicators are those measures that point to the health of the company or the likely success of the financial objectives. Lead indicators are necessary to determine when to change course, re-double efforts, or to eliminate unprofitable ventures. Education needs lead indicators too (in the form of assessments) to make decisions regarding student interventions and organizational management. Education would be remiss to replace high quality lead indicators in favor of a more authentic summative assessment.

Wednesday, January 17, 2007

How to Persuade (and fudge the truth) with Data and Graphs

I went to a presentation today by the authors of "Tough Choices or Tough Times," Marc Tucker and former Senator and Labor Secretary William Brock (R-Tennessee). This report is being hailed as a potential catalyst for reform in Colorado's schools. No matter who you are or where you work, if someone from the outside proposes massive and unrecognizable reform in your business you feel a little unsettled. That said, I feel like all of the recommendations the authors are making are worth examination. Other education bloggers have taken issue with the practical aspects of the report; I don't intend to do that here, but I encourage you to read this post.

I have two issues with the Tough Choices presentation. First, the authors used a number of scary statistics and scenarios (all our jobs will be in India before long). My philosophy TA in college called that "argument by scary pictures." The argument made was that more students must achieve higher education for the US to remain competitive. That assumes that US colleges are adequately preparing students to be competitive in the new creative fields (the ones that require analytic thinking).

Second, the authors used a graph to argue that while spending has increased over the past 30 or so years, student achievement has not. I did not get a copy of the graph and did not have time to jot all the numbers down, but I did get the first number and the last. In the first year they showed a value of $3,400 and in the last year $8,977. On the other hand, student performance rose only a few points, from 208 to 217. However, the authors did not appear to account for the change in real dollars. Unfortunately, I did not jot down the beginning or ending year, but I think it began around 1970 and ended in 2005. See the graph here.

So, they argued that we have spent more dollars and that "clearly" hasn't worked. If the initial year was 1970 and the initial amount was $3,400, then in relative dollars the per-pupil expenditure in 1970 was $17,096 according to this calculator. It is discouraging that there was not an opportunity to ask questions, and the presenter did not address whether the figures were real or relative dollars. The way it was presented felt slightly fraudulent, as if they wanted to pull one over on us. As if a bunch of educators would look at a graph, be shocked, and beg for change.
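For the curious, the inflation adjustment is just a ratio of price indexes. A minimal sketch, using approximate annual-average CPI values (my assumption; the calculator linked above evidently used slightly different figures, hence the small gap from its $17,096):

```python
# Convert a nominal dollar amount to another year's dollars via a CPI ratio.
# CPI values below are approximate annual averages (an assumption for illustration).
CPI = {1970: 38.8, 2005: 195.3}

def to_relative_dollars(nominal: float, base_year: int, target_year: int) -> float:
    """Scale a nominal amount by the ratio of the two years' price indexes."""
    return nominal * CPI[target_year] / CPI[base_year]

# The $3,400 per-pupil figure from 1970, restated in 2005 dollars:
print(round(to_relative_dollars(3400, 1970, 2005)))  # roughly 17,100
```

With that adjustment, the left end of the presenters' spending line would sit well above the right end, which is exactly why the real-versus-relative question matters.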

I do not know where the reforms are headed, but when a presenter overlooks a key piece of information (like real versus relative dollars) and appears to be trying to "lie" with data and visuals, their credibility is severely damaged.

Tuesday, January 16, 2007

Internet Filtering in the Age of Insanity

I like to follow the writing of Seth Godin, Kathy Sierra, and Guy Kawasaki through their blogs. None of these writers work in education (though Kathy Sierra was a trainer with Sun Microsystems), but we have a lot to learn from what they have to say about the world of customer service, software development, and entrepreneurship. Kathy Sierra recently posted on those little things that businesses do to make you smile (see this post to smile along), but the message I glean repeatedly from her writing is that the role of any organization (business, non-profit, or government agency) is to ensure that our users, clients, or customers can kick ass by using our product or service. In the end our users, clients, or customers need to feel like it is all about them, not about us. We in public education should learn from the sage musings of these leaders. We in administration should pay close attention to what we do that allows our customers (schools, students, and the community) to kick ass, and what more we can do that will bring that little smile.

Today was one of those days when it all went bad. Our district installed a new firewall and with it new web filtering. The web filtering expands on what was previously used (Websense) and immediately caused me a headache. I start my day by spending 30-60 minutes reading my email, catching up on my RSS aggregator, and making a to-do list. My aggregator was blocked. Add to that Flickr, Technorati, any Blogspot site, Google and Yahoo images, iTunes, Podcast Alley, StumbleUpon, Digg, Slashdot, Wikispaces, PBwiki, and the high school student council page.

I would have laughed if I had not been so annoyed. The Educational Technology advisory committee was not alerted prior to this change taking place, and when we complained we were told to provide a list of sites to unblock. What we and every other school district in the country are facing is the constant push and pull of open, authentic environments versus closed, artificial environments. We have a moral, ethical, and legal obligation to monitor and sanitize some Internet traffic. That said, we also have a professional obligation to work with our colleagues to create reasonable parameters for this cleansing. For the most part my district has been open (with the exception of blocking MySpace and Facebook), but it suddenly and detrimentally decided to turn closed without the slightest consideration for what that means for users (including students) in the schools.

Several education bloggers have previously examined this subject in depth and have touched on a variety of the issues. Blue Skunk and Moving at the Speed of Creativity have good discussions of the issues.

If you are within my district firewall you cannot read this blog or any of the sites listed above.

Monday, January 15, 2007

Online Data Collaboration

I learned from Emily Chang's blog about a new online data analysis/visualization web application called Swivel. I loaded some fake student performance data to see how useful this might be to teachers. If you want to see the datasets I loaded, search for "millerjtx".

Swivel has potential, but is too early in development to be truly useful to the time-crunched classroom teacher. However, the potential for collaborative exchange and discussion of data and visualization is encouraging.

Ski Lessons and Data Analysis

My friend's daughter Lily took her first ski lesson on Saturday at Copper Mountain. She is five years old and could not have been more thrilled to be on skis for the first time. As we watched her ascend the magic carpet from afar we could see her going through the motions of stops and turns. They looked a bit like dance moves to me, but this simple motion indicated to me that she was having a blast and learning a lot. At the end of the lesson the parents are given a report card and the instructor gives a short update (maybe a minute). I haven't pursued exactly what was on the report card, but the instructor said she needed one more lesson before getting on a lift. I guess she needed to improve her turning ability.

I have been thinking about the role of data in the ski lesson industry. I wonder if instructors are creating data or looking at data before starting a lesson. Would it benefit an instructor to know before they start that a child entering their class took a lesson one month ago and never mastered turns? Would it benefit an instructor to know more about that child's turning ability, like whether the child is crossing skis, catching an edge, or just not attempting turns yet? Would it benefit the instructor to know if the child has taken the Highpoint lift twice this month (and presumably skied down)? I am unfamiliar with the business of winter resorts and particularly unfamiliar with the practice of teaching children and adults lessons (I am a below-average snowboarder who hasn't taken a lesson in years), but I am curious about their use of data. I am also curious about the relationship between a high-quality learning experience and the likelihood of returning to the resort (I am not sure if the instructors collect student satisfaction data).

Copper Mountain should be collecting and using two types of data. First, they should collect and distribute high-quality learner data to their instructors prior to every lesson. These data would include previous lesson report cards and information regarding the number of visits to Copper and lifts used (this is clearly available in their system). Second, they should analyze the relationship between student learning and visit behavior. In other words, what is the relationship between visits to the mountain and the experience in the lesson? They should be interested both in how much the student learned and in how satisfied the customer reported being.
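The first idea, a pre-lesson briefing built from existing records, is mostly a join. A hypothetical sketch (all field names and the data are invented for illustration; a real resort system would pull these from its lesson and lift-ticket databases):

```python
# Hypothetical pre-lesson brief: join a learner's lesson history and lift usage
# so the instructor starts the lesson already knowing what to work on.
lessons = [
    {"guest": "Lily", "date": "2006-12-16", "skill": "turns", "mastered": False},
]
lift_rides = {"Lily": 2}  # e.g., rides on the Highpoint lift this month

def pre_lesson_brief(guest: str) -> dict:
    """Summarize a guest's history: visits plus skills not yet mastered."""
    history = [lesson for lesson in lessons if lesson["guest"] == guest]
    unmastered = [lesson["skill"] for lesson in history if not lesson["mastered"]]
    return {"guest": guest, "visits": lift_rides.get(guest, 0), "work_on": unmastered}

print(pre_lesson_brief("Lily"))  # shows 2 visits and "turns" still to master
```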

In the K-12 education arena we expect teachers to respond appropriately to variance in their classroom with a differentiated approach. We do not want teachers to march through a lesson as if completing the material in a timely manner was the most important goal of the class period. We understand that the most important aspect of the class is that students master the expectations and those students that achieve mastery quickly are given ample opportunity to take on new challenges. Teaching is not the most important aspect of a lesson, but instead learning is the key.

Just like in the world of K-12 education, an instructor becomes empowered when they have access to data. We know Copper Mountain collects customer data, but do they collect and use data on teaching and learning? In a business as competitive as the ski industry, if Copper Mountain were to become known as having the top instructor team in America, it could truly set them apart from all the other choices we have in Colorado. This would be a sophisticated way to jump the curve: take an existing product and make it better.

Thursday, January 11, 2007

Data and Interventions

The principals in my district meet monthly for discussions regarding best practice in the schools and use this as an opportunity for collegial conversation. This meeting constitutes their Professional Learning Community and is devoid of announcements and bureaucratic interruptions that are not focused on instruction. In addition, the meetings are planned by principals and the majority of the content is delivered by principals. Outsiders like me are occasionally invited when topics are particularly relevant to our work in the district.

This morning was one of those days when I was invited to join the principals, and it was inspiring and exciting. Three different principals presented on how they are using data to make change in their schools. Hollene Davis, the principal at Central Elementary School, gave an insightful and thought-provoking presentation on the use of multiple achievement measures to group and re-group students. Hollene walked us through a process she worked through with her teachers. She presented us with a class of students, sans names, and their scores on CSAP, NWEA MAP, DIBELS, CELA (the English Language Acquisition test), and an additional comprehension measure. It was a real group of students selected for intensive intervention. The scores were contradictory and mixed. Many students did not seem to be struggling in reading based on several measures. When the teachers evaluated the data they came to the same conclusions we did, and they began to ask for more data or to dig into the data a little more. The teachers were compelled by the data to get answers because there was no emotional attachment to the information (there were no names, no student faces, no biases in the data). Teachers were using data and asking for more.

In the past, student selection for intervention was based entirely on teacher recommendation and had only a loose connection to data. After examining the data sans names, the teachers began to realize the compelling nature of multiple measures and to accept responsibility for using these data to change the trajectory of individual students.

The work that is going on at this school under Hollene's leadership will undoubtedly lead to change for the school and for each and every student that attends.

Monday, January 08, 2007

Graphics that Work for Data Analysis

Edward Tufte's fourth grand principle regarding the effective visual display of quantitative information is to "completely integrate words, numbers, and images". Tufte is imploring report writers to ensure that the text that explains a graphic is on the same page as the graphic. He is also arguing that the person observing the graphic should not be required to learn a system to understand the meaning. In some cases background knowledge is required to make sense of the information being displayed; the amount of background required should be the bare minimum for the likely readers. In creating effective graphics for use by teachers and administrators, I can assume the minimal understanding required to perform those jobs. Teachers will know what a "scale score" and a "proficiency level" are, while the average member of the public may not.

Here is a sample of a simple Excel graph that takes the fourth grand principle into account. It displays the scores in a simple way with the data table below, completely integrating the visual and the data. The visual leaves something to be desired, so I have two more graphics to discuss.

The next two graphics include a good example and a poor example of the fourth principle. Both were the result of projects where I designed the visuals. The first example comes from a custom data analysis portal designed by Adams County School District 14 staff to analyze NWEA MAP data and the results of the Colorado Student Assessment Program (CSAP). The portal has been a smashing success with our targeted end users (teachers). We hear frequently how access to data in a convenient and user-friendly format has enabled teachers to make decisions informed by student data. However, one complaint we have received is that where teachers can drill down to look at sub-scores for a test period, the goal areas are not defined (see graphic below). It simply says "goal 1". Users have to open a PDF document that translates the goal into language such as "number sense". Tufte articulates that these words should be completely integrated; teachers should not be forced to open a new document and toggle between the two.
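In software terms the fix is a trivial join: fold the goal-name lookup into the report before it reaches the teacher. A minimal sketch (the lookup table and field names are hypothetical; "number sense" is the one real example from the PDF key mentioned above):

```python
# Replace opaque goal codes ("goal 1") with plain-language names before display,
# so the report itself carries the words (Tufte's fourth principle).
GOAL_NAMES = {"goal 1": "number sense"}  # hypothetical key, normally from the test blueprint

def label_scores(scores: dict) -> dict:
    """Swap each goal code for its plain-language name; pass unknowns through."""
    return {GOAL_NAMES.get(code, code): value for code, value in scores.items()}

print(label_scores({"goal 1": 214}))  # prints {'number sense': 214}
```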

The graphic below displays the results for an individual student on the Colorado English Language Acquisition assessment (CELA). The test returns seven total scores and many sub-scores. A reader of this report knows instantly how a student performed overall and in each of the sub-areas. There is little question and absolutely no necessity to toggle to a key. The words, colors, and symbols clearly direct the reader to a usable description of student performance. See more examples of CELA reports here.

The CELA graphic more clearly allows the user to kick butt at what they do best, teaching. The portal display example encourages users to kick butt at toggling between documents or memorizing the goal area descriptions. Which would you rather have happen in your schools?

Wednesday, January 03, 2007

How Do We Improve Data Analysis Skill?

Coming from education, it seems we should have a rather simple answer to this question (it would likely include a rubric and a self-assessment). The reality is that practice is the only sure way to improve analytic skill among administrators and teachers.

In grad school I would work feverishly to collect data, scrub the data, run descriptive and multivariate tests, and generate as many scatter plots as my Pentium I computer could handle. I would print my "results" and stare at them, black pen in hand, to mark the compelling patterns. Without fail my advisor would come and join me for these analytic sessions. The story seemed the same every time: (1) he would set aside the statistical analysis, suggesting that there was nothing to see there; (2) he would look at the patterns I identified (usually strong positive or negative correlations) and politely point out that there was nothing interesting in the autocorrelations I observed; and (3) finally, he would draw odd-shaped circles around groups of data on the scatter plots. These circles usually looked like amoebas, and definitely not like the patterns I had learned about in statistics class. He would then say, "I want to know more about these cases; try running this same relationship, but hold some variable constant." He found a compelling pattern on almost every printout. When I returned for more analysis I almost always found something complex, but compelling, in the way he suggested I re-explore.

My advisor (Lewis Binford) never taught me to analyze data. He engaged me and encouraged me to be curious. I learned by watching and practicing.

A culture of high-quality data analysis depends on leadership. It depends on high expectations for all members of the culture (teachers and administrators). Administrators must accept the mantle of data analysis leadership and be more prepared than their staff. Administrators need not come with the answers, but with the questions. Administrators need to nurture teachers and challenge them to be curious. Administrators should lead individuals to be kick-ass analysts, not simply facilitate the group to agreement on data analysis and cause.

Multiple Variable Visual Displays

The third grand principle from Edward Tufte is "use multiple variables". As Tufte says, the world that we are trying to understand is multivariate, so our displays should be too. Tufte refers to the Napoleon march poster as a great example of the integration of multiple variables. In that example you have temperature, the changing size of the army, direction of movement, and time. It is an extraordinary display of the relationship between variables. Although I disagree with Tufte's assertion that this visual "shows causality", it clearly shows important and relevant relationships.

Teachers and administrators should accept the challenge to use multiple variables in their visuals during the exploration of achievement. This principle is espoused by many leaders in education (Schmoker, Love, Reeves, Stiggins, DuFour, and others), with Victoria Bernhardt being the most recognized champion of multiple measures. Bernhardt advocates exploring the relationships between two or more of four dimensions that are important to school reform (student achievement, school processes, demographics, and the perceptions of students, staff, and parents), and probing the intersections of these dimensions to get at the root of a problem.

We are challenged to ask two-dimensional questions like, "What is the relationship between state assessment scores (student achievement) and grades (school processes)?" and "Do students with positive attitudes toward school (student perceptions) perform better on the state assessment (student achievement)?" Three-dimensional questions might include, "Do grades (school processes) have any relationship to state assessment performance (student achievement) for language learners compared to non-language learners (demographics)?" What is key is that as teachers and administrators review these data they have access to appropriate visual displays or know how to create usable charts. In addition, teachers and administrators must be compelled to ask the next question and manipulate the data.

Let's take the first question: "What is the relationship between state assessment scores (student achievement) and grades (school processes)?" This crudely drawn diagram is similar to the relationship we found in our district for high school students. The relationship (shown in a scatter plot) appears completely random; no slope explains it. By adding a third variable/dimension, the graphic takes on a dramatically different look. We can now see that students who are not language learners show a positive relationship between grades and state assessment performance. In other words, for non-language learners grades appear to measure something similar to what is being measured on the state assessment.

Although the visual display clearly reveals a pattern (a made-up one here), it does not tell me what causes it. It does, however, spark a far deeper and more engaging discussion of grading and student performance once the data are placed in three dimensions.
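For readers who want to poke at this pattern themselves, here is a minimal sketch in Python with entirely invented numbers (no real district data). It fabricates two groups of students so that the pooled grades-versus-assessment correlation is much weaker than the correlation within the non-language-learner subgroup; in our actual data the pooled relationship looked essentially random, which this toy example only approximates.

```python
import numpy as np

# Hypothetical data: everything below is invented for illustration.
rng = np.random.default_rng(0)
n = 200

# Non-language learners: assessment scores track grades closely.
grades_non_ll = rng.uniform(60, 100, n)
scores_non_ll = grades_non_ll + rng.normal(0, 5, n)

# Language learners: assessment scores unrelated to grades
# (e.g., the test measures language proficiency more than content).
grades_ll = rng.uniform(60, 100, n)
scores_ll = rng.uniform(40, 80, n)

# Pool the two groups, as a two-variable scatter plot would.
grades = np.concatenate([grades_non_ll, grades_ll])
scores = np.concatenate([scores_non_ll, scores_ll])

r_all = np.corrcoef(grades, scores)[0, 1]
r_non_ll = np.corrcoef(grades_non_ll, scores_non_ll)[0, 1]

print(f"all students pooled:   r = {r_all:.2f}")
print(f"non-language learners: r = {r_non_ll:.2f}")
```

Plotting the same arrays as a scatter plot colored by group reproduces the visual version: one cloud that looks shapeless until the third variable separates it.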

Show Causality?

The second Grand Principle that Edward Tufte espouses is to "show causality" in visuals. Tufte illustrates the point with the poster of Napoleon's 1812 march. The poster shows multiple variables interacting and effectively links the military loss to temperature rather than to Napoleon's opponents: as temperatures plummet, the army is decimated. Tufte argues that this visual effectively shows causality.

For our purposes we need to contrast visual displays used for argument with those used for exploration. In data analysis for data-driven decision making the analyst is not “showing causality”, but instead wants a visual display that lets the data talk back. Student achievement data (when arrayed with a comparative dataset) do not encode causality, but they are useful in describing reality. In analyzing student achievement data we are not juxtaposing data to establish cause, but rather looking for patterns. For example, a teacher might notice that all her students performed lower than average on vocabulary on the recent district benchmark assessment when compared to the rest of the students in the school. Nothing about that visual shows causality, but it is an elegant and appropriate display with which to begin exploring the reality of student performance. Thus, for exploratory data analysis “show causality” is not a necessary condition for a high-quality visual display.

Only when teachers and administrators begin attributing cause to an observed pattern does Tufte’s principle of “showing cause” become a necessary condition for a high-quality visual.

Again, excuse my scribbles. They lack the design and attractiveness that I would prefer, but they illustrate the point nonetheless.

Tuesday, January 02, 2007

Enforce Comparison

Seven years ago I attended a seminar by Edward Tufte on effectively displaying quantitative data. I recently ran across my notes from that seminar and was reminded of a few key points that are relevant to data in the K-12 setting. One of Tufte’s Grand Principles is that any data display must “enforce visual comparison”. In other words, we must always ask, “Compared to what?”

To “enforce comparison” be sure to: (1) select a relevant comparative dataset, (2) place the comparative data on the same graphic (or minimally on the same page), and (3) acknowledge the comparative data in the descriptive phase of any data dialogue.

The relevance of a comparative dataset can be mapped on a two-dimensional grid (crappy image drawn on a tablet PC above--will work on Photoshop soon) with relevance on the x-axis and the difficulty of acquiring the comparison on the y-axis. The ideal scenario is an extremely relevant comparison obtained without considerable effort. Highly relevant comparative data would come from schools or classrooms that are demographically similar. The most common occurrence is that we obtain comparative data that are marginally relevant but easy to obtain--for example, comparing the performance of students within a school to all other students nationally without accounting for demographic or longitudinal growth factors. Since highly relevant comparative data are often difficult to obtain and can require some statistical or technical sophistication to distill, teachers are often reduced to using the less relevant data. This is not a total loss and should not be used to undermine the value of student performance data. Instead, teachers and administrators should ask central administration and test vendors for more relevant comparative data. In addition, schools should create examples of highly relevant comparative datasets by comparing students and classrooms within the school.

Visual design is often overlooked in presenting student, school, or district achievement data. Displays are often sloppy and fail to place the comparative data necessary for analysis in proximity to the school data. It is important to place the comparative data on the same graphic, or minimally on the same page.

High-quality fact-based statements about student data of any kind reference comparative datasets. Avoid simply saying, “3rd grade students scored 65% proficient or above on the CSAP in 2006.” Even saying, “3rd grade students scored 65% proficient or above on the CSAP in 2006, which is a 4% improvement from 2005” is unsatisfactory because the 2005 data are not necessarily a relevant comparative dataset (different students in a different year). On the other hand, saying “3rd grade students at Betcher Elementary School scored 25 scale score points higher on vocabulary than demographically similar students from across the district” makes a statement of performance relative to a relevant comparative dataset.
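The Betcher-style statement can be sketched in code. This minimal Python example (every school name, similarity grouping, and score below is invented; "Betcher" is borrowed from the sentence above only as a label) computes a school's mean relative to demographically similar peers instead of reporting it in isolation.

```python
# Hypothetical records: (school, demographic-similarity group, vocab scale score).
students = [
    ("Betcher", "A", 520), ("Betcher", "A", 540), ("Betcher", "A", 530),
    ("Other1",  "A", 500), ("Other1",  "A", 505),
    ("Other2",  "A", 495), ("Other2",  "A", 510),
]

def mean(xs):
    return sum(xs) / len(xs)

# Step 1: select a relevant comparative dataset -- same similarity group,
# excluding the school itself.
school_scores = [s for sch, _grp, s in students if sch == "Betcher"]
peer_scores = [s for sch, grp, s in students
               if sch != "Betcher" and grp == "A"]

# Step 3: state the performance relative to the comparison, never alone.
diff = mean(school_scores) - mean(peer_scores)
print(f"Betcher mean: {mean(school_scores):.1f}")
print(f"Demographically similar peers: {mean(peer_scores):.1f}")
print(f"Difference: {diff:+.1f} scale score points")
```

Step 2 of the list above (placing both datasets on the same graphic) is a charting decision rather than an arithmetic one, so it is not shown here.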

While Tufte is focused on the effective use of visual displays to convey a message (make a causal statement), his principle of Enforce Comparison is an extremely relevant starting point for educators engaged in data-driven decision making.