Thursday, August 02, 2007
In our development of the BSC, which is as much a framework for our strategy as it is a reporting tool, we first developed the strategy map. The strategy map frames the interrelationships between perspectives and objectives. In addition, the strategy map communicates what is important to the district. What becomes problematic is that not everything that is important is easily measurable. For example, one of the things that we have identified as important is to "improve student products." We think this is important because in the 21st Century it is not enough for students to complete the requirements of the traditional school system (e.g. worksheets). Students need to demonstrate 21st Century skills like presenting, analyzing, and communicating to and with the wider global community. That said, we haven't the slightest idea how we are going to measure the improvement in student products. What is key is that we still put the objective on the BSC. Even if we haven't figured out the measure, we send the message that this is or will be important to the district achieving its mission.
Wednesday, June 20, 2007
Tuesday, March 20, 2007
In a previous post I argued that we should be working to "measure what matters". In other words, if we want to know whether a teacher is effective we need to develop an example of "effective" and tools for determining whether the teacher is achieving that example. Teachers and leaders need to know where teachers fall on a scale (or rubric), and they need to know what to do next to improve. Years of service can apparently differentiate a 1st year teacher from a 5th year teacher, but after that there is little difference between a 5th year teacher and a 15th year teacher. Instead of assuming our longest serving teachers are the most skilled, we should create measures that identify our most skilled teachers as exactly that.
Monday, March 19, 2007
Many Eyes is user-friendly and addictive. I kept loading new forms of data to try different views. I loaded school district demographics over time, speeches made by Margaret Spellings, and rap lyrics from Ice Cube and TuPac.
Imagine the potential in a K-12 setting where students could be challenged to collect data (primary or secondary collection) and then share these data across the world. Their classmates could comment, but so too could an expert in the field they are studying. This tool not only creates opportunities to see the world in a new way (literally), but also to collaborate and understand the world more deeply (or see it in a new way metaphorically).
Check out these examples I created:
Example #1: District demographics over time. Click the image here and drill-down (using the plus signs) to the demographics by school, ethnicity, and gender.
Example #2: A tree map of two years of demographic data reveals change in color. Check out the hover over Ajax features.
Example #3: Tag Cloud of recent testimony from Secretary of Education Margaret Spellings. Any idea what Spellings intends to focus on? Could you imagine students using this to compare the language in two poems, rap lyrics, books on the same subject in two different decades, speeches...or anything else? Imagine how engaged students would be to see the text they are analyzing come alive.
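Under the hood, a tag cloud is just a word-frequency count. Here is a minimal sketch in Python; the speech text and the stopword list are invented stand-ins, not the Many Eyes implementation:

```python
from collections import Counter
import re

# Hypothetical stopword list; a real tag cloud would use a much longer one.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "that", "we", "is"}

def tag_cloud_counts(text, top_n=10):
    """Count word frequencies, ignoring case and common stopwords."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

speech = ("Accountability matters. We must hold schools accountable "
          "and measure what matters for every student.")
print(tag_cloud_counts(speech, top_n=3))
```

The font sizes in the cloud then scale with these counts, which is why the dominant themes of a speech (or a poem, or rap lyrics) jump out at a glance.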
Wednesday, March 07, 2007
First, technology serves humans. If the technology fails or the user cannot figure out how to get value from it, then the design is the problem. This is a simple concept that seems to be overlooked when considering how to display data to teachers. Second, design is not art. Art is to be enjoyed; design facilitates use. Third, designers do not create experiences, they create artifacts to experience. This seems to be akin to Kathy Sierra's argument that serving our customers means that in the end it is about them kicking ass. Fourth, great design is invisible because it solves a problem and works well. We take it for granted. Fifth, simplicity is the ultimate sophistication. Distill the design to only what is needed to solve the problem at hand, nothing more.
This all applies directly to design of data systems for school districts to use. If the ultimate goal is for teachers to use the data system to analyze data and track student progress, then the design of the system must be teacher-friendly, focused on their experience, and simple enough that the user does not have to read the freakin’ manual.
Monday, February 12, 2007
Above was my picture for the first scene in the storyboard. Here is my storyboard for the first two images. The metaphor of a storyboard is great because it makes you think through the emotion and imagery that you want to create, just like a great director does with a movie. The storyboard version of talking to parents about students could result in better metaphors and descriptions of the student's actual performance. It is a deliberate method for creating a vision for the parents.
Tuesday, February 06, 2007
No matter what assessment or performance related material we are sending home to parents we should be sure that we are clear about the following things: (1) What the test is, when your son/daughter took the test, and why the results are important. (2) How your student scored, with some comparison data (e.g. how the rest of the state, district, or school scored). (3) How you can support your student to improve or maintain high performance. (4) When the student will test next. (5) The best way to contact their teacher to get more information.
How often are we successful at all of these steps?
Monday, February 05, 2007
The DOI is a more thorough measure of broadband penetration and a better indicator of the divide. As FCC member Michael Copps wrote in a November op-ed, the reason for our lagging status is the ridiculous rates that high-speed providers are able to charge in a non-competitive market. Unfortunately, there is a divide even within the United States between those that have and those that cannot afford the rates. I just looked up the cost of cable internet in my neighborhood and found that it was $59/month plus $49 for installation. Nearly $800 per year for high-speed internet service! So, in those areas of America where $800 a year is a stress on the pocketbook there is going to be even less broadband penetration. Those areas tend to be rural and urban poor.
It is alarming that the richest country in the world is only 21st on the DOI, but what is more staggering is the complete absence of outrage over the expanding digital divide within America. If access to obtain and create new information is going to be the difference maker in the future for the students of today, we have a moral obligation to help our less fortunate students cross the divide.
Wednesday, January 31, 2007
That said, the presentation I attended today was uninspired and lacked a discernible focus. It started with the typical Friedman approach of pointing out that the world is shrinking and there are more gifted students in China than there are children in America. This is a tried and true mechanism for grabbing our isolationist/nationalist attention and making us pay attention. He dropped one other shocker on us: the electronic whiteboards that we all love have been proven to have a negative effect on education by dumbing down the curriculum. I know that got the attention of the attendees because I later heard our COO relaying that fact to some community colleagues who are leading the design of our new high school/community college campus. Alan November wanted to get our attention with the whiteboard research and simultaneously point out that teaching with technology is not about technology, it is about good teaching. That makes great sense, right?
Here's the rub: (1) the attention getter wasn't exactly what it appeared to be. (2) After arguing that teaching is the key he went on to demonstrate his superior web searching ability.
First, the BBC news report on the whiteboard evaluation was vague. It did not provide a link to the actual study, did not reveal what metrics were used, and ignored the relationship between professional development and classroom use of this (powerful) tool. What is really compelling is that the BBC site where this article was published has a "comment" component. The comments are thorough and point out the flaws of the study. If Alan November had been the least bit thoughtful he would not have used this study as an attention grabber, but would have pointed out that the read/write web is amazing at giving both sides of a biased story. When readers comment they point out the shortcomings of the research or the journalism (or both). Alan November sadly missed an opportunity to make a point and instead went for the cheap crowd pleaser. Now people all over Colorado will be saying, "whiteboards have a negative effect on education, they dumb down the curriculum, Alan November says so." This is an example of Web 1.0, where information flows in one direction and the consumer accepts it.
Second, Alan November spent the three and a half hours I attended his session showing us what a great internet searcher he is. Did you know that if you use "host: uk" you get only sites from the United Kingdom? Okay, we get it, now move on. Nope. "host: tr", "host: za", "host: ma" and on and on. We did a Skype call to New Orleans to an employee of November Learning...hey, that was cool in 2005. Wikipedia...neat. He even cited research, which he attributed to the NY Times, that Wikipedia has an average of 4 errors per article while Britannica has 3. No real difference, right? However, I think the research was done by Nature, and Britannica disputes the results.
The world is different. Neat. Unfortunately, the technologies that were focused on (Wikipedia, searching, podcasting, Skype, wikis, blogs, RSS) are all pretty old at this point. Frankly, I expected an impassioned presentation on the vision for a new world with real practical strategies for us to achieve this vision.
Monday, January 29, 2007
Marzano's classroom strategies that work.
Then, we assess the students and wait. Here are the possible outcomes:
1. Student achievement improves
2. Student achievement declines
3. Student achievement shows no discernible growth
If student achievement improves, we pat ourselves on the back and continue doing what we did before. If it declines we immediately try something new. If there is no discernible growth or loss we can either stick with what we already started or try something new.
The problem is that we never took the time to figure out if the changes we expected in the classroom ever occurred. That's right, we never measured what mattered...whether the teaching practice actually changed. I know people in education think that student achievement is an appropriate measure to determine the effectiveness of professional development, but it isn't. If the point of professional development is to change the input (teaching practice), then that is what needs to be measured.
The challenge that I am motivated to accept is to develop a method to rapidly, repeatedly, and continuously measure the quality of the input (teaching practice), so that my district can evaluate the change following professional development, pinpoint areas of concern, and differentiate professional development for our 1000 employees.
Wednesday, January 24, 2007
I read this morning on Profy.com that the BBC is working with an affiliate to release a virtual world for children (ages 7-12). It is described as having an emphasis on safety and responsibility. This seems to be an emerging market of sorts. What is the potential to jump the curve and begin developing simulations for children that deepen student thinking?
Monday, January 22, 2007
A team of district and school employees came together to try to address this issue for our district in the short-term, and hopefully for the long-term. We debated from our various positions for a while and then settled in to stake out some common ground. In the absence of effective teacher supervision of students, with no accountability for students signing the acceptable use policy, and with little discussion of digital discipline in or out of school, we resolved to do the following: (1) adopt conservative parameters for filtering, (2) design a lesson for teachers and students on the acceptable use of the internet, (3) begin enforcing the policy with consequences, and (4) raise the urgency to implement the K-12 I-Safe curriculum.
We are not naive enough to believe that we will prevent students from accessing objectionable material on campus. By Wednesday of last week (one day after the new filter was installed) students had a reliable method for circumventing the filter. However, we limit the accessibility of pornography and other blocked sites for most students and send the message that they are violating some rule (even if we have never fairly explained it to them or followed through on effective enforcement).
We believe that to be effective the district has to couple the clarification and enforcement of the acceptable use policy with the teaching of pro-social use of the internet and other modern media. Improving the effectiveness of the AUP and teaching pro-social web skills is going to be a work in progress.
Saturday, January 20, 2007
When I was stuck in traffic in downtown Denver Friday night I heard a story on NPR about Club Penguin, an online social networking site for tweens. That's right, students/children from the age of say 8 to 12, 13, or older might participate in this virtual world. What was really compelling about the story and the site is the lengths they go to to ensure that it is a safe place for children to "play" with their "friends". In fact, Club Penguin says it is proud to be one of the few sites that has ever qualified for the Better Business Bureau - Kid’s Privacy Seal of Approval. They monitor the discussion online, they screen for key words, prevent children from entering personal information, and use paid memberships to identify all community members.
The Club Penguin model made me think of the presentations I have seen (online) by Dick Hardt who speaks of Identity 2.0. You can see Dick's presentation here (even if you don't care about identity, Dick Hardt is probably one of the best presenters so watch it). Dick Hardt is leading the technology community to think and work on the next frontier for web identity...verifying that you are who you are and that you deserve to be treated special. Dick Hardt is leading the effort to ensure that identity is portable and can be authenticated.
Now back to the original discussion piece...internet filtering and ensuring student safety. First, we have an obligation to filter the internet for inappropriate material as required under the Children's Internet Protection Act (CIPA). Second, we have an obligation to teach students, families, and the community to be disciplined in the digital world. There are threats and we cannot "block" them from our children. We need to encourage responsibility. Finally, the model put forth by Club Penguin and the work of Dick Hardt and others encourage me that the world of Web 2.0 is evolving and could become increasingly safe.
Thursday, January 18, 2007
No doubt this would be a popular statement from anyone that has thought about the authenticity of our assessments. In fact, challenging students to apply their knowledge, perform thought experiments, or synthesize is what we expect in the 21st Century and cannot be measured by instruments like CSAP.
However, I take issue with the assertion that we need fewer assessments. In fact, I would submit that we may not assess students enough. If we are to adopt a more efficient management model for our public schools, then we must agree to measure our success continuously. That means that while the more authentic assessments that Tucker described are desirable, they are lag indicators. The results of those tests arrive after the fact. The equivalent would be a CEO arguing that the measure of actual profit at the end of the year is flawed and needs to be fixed, and that the weekly, monthly, or quarterly measures really need to be retired.
Businesses depend on "lead" indicators. Lead indicators are those measures that point to the health of the company or the likely success of the financial objectives. Lead indicators are necessary to determine when to change course, re-double efforts, or to eliminate unprofitable ventures. Education needs lead indicators too (in the form of assessments) to make decisions regarding student interventions and organizational management. Education would be remiss to replace high quality lead indicators in favor of a more authentic summative assessment.
Wednesday, January 17, 2007
I have two issues with the Tough Choices presentation. First, the authors used a number of scary statistics and scenarios (e.g. all our jobs will be in India before long). My philosophy TA in college called that "argument by scary pictures." The argument made was that more students must achieve higher education for the US to remain competitive. That assumes that US colleges are adequately preparing students to be competitive in the new creative fields (that require analytic thinking).
Second, the authors used a graph to make the argument that while spending has increased over the past 30 or so years, student achievement has not. I did not get a copy of the graph and did not have time to jot all the numbers down, but I did get the first number and the last. In the first year they showed a value of $3,400 and in the last year $8,977. On the other hand, student performance rose only a few points, from 208 to 217. However, the authors did not take into account the change in real dollars. Unfortunately, I did not get the beginning year or the end year jotted down, but I think it began around 1970 and ended in 2005. See the graph here.
So, they argued that we have spent more dollars and that "clearly" hasn't worked. If the initial year was 1970 and the initial amount was $3,400, then in relative dollars the per-pupil expenditure in 1970 was $17,096 according to this calculator. It is discouraging that there was not an opportunity to ask questions and that the presenter did not address whether the figures were real or relative dollars. The presentation felt slightly fraudulent, as if they wanted to pull one over on us. As if a bunch of educators would look at a graph, be shocked, and beg for change.
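The real-versus-nominal check that the presenters skipped takes only a few lines. The CPI values below are approximate annual averages that I am assuming for illustration, not numbers from the talk:

```python
# Approximate average Consumer Price Index values (assumed, rounded).
CPI_1970 = 38.8
CPI_2005 = 195.3

nominal_1970 = 3400   # per-pupil spending shown for the first year
nominal_2005 = 8977   # per-pupil spending shown for the last year

# Express the 1970 figure in 2005 dollars.
real_1970_in_2005_dollars = nominal_1970 * (CPI_2005 / CPI_1970)
print(f"${real_1970_in_2005_dollars:,.0f}")
```

Whatever index is used, the adjusted figure lands in the same neighborhood as the $17,096 the online calculator produced, which is exactly why the real-versus-relative question matters for the story the graph appears to tell.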
I do not know where the reforms are headed, but when a presenter overlooks a key piece of information (like real versus relative dollars) and appears to be trying to "lie" with data and visuals, his or her credibility is severely damaged.
Tuesday, January 16, 2007
Today was one of those days when it all went bad. Our district installed a new firewall and with it new web filtering. The web filtering expands on what was previously used (Websense) and immediately caused me a headache. I start my day by spending 30-60 minutes reading my email, catching up on my RSS aggregator, and making a to-do list. http://www.bloglines.com/ was blocked. Add to that Flickr, Technorati, any Blogspot site, Google and Yahoo images, iTunes, Podcast Alley, del.icio.us, StumbleUpon, Digg, Slashdot, Wikispaces, PBwiki, and the high school student council page.
I would have laughed if I had not been so annoyed. The Educational Technology advisory committee was not alerted prior to this change taking place, and when we complained we were told to provide a list of sites to unblock. What we and every other school district in the country are facing is the constant push and pull of open authentic environments versus closed artificial environments. We have a moral, ethical, and legal obligation to monitor and sanitize some Internet traffic. With that said, we also have a professional obligation to work with our colleagues to create reasonable parameters for this cleansing. For the most part my district has been open (with the exception of blocking MySpace and Facebook), but they suddenly and detrimentally decided to turn closed without the slightest consideration for what that means for users (including students) in the schools.
Several education bloggers have previously examined this subject in depth and have touched on a variety of the issues. Blue Skunk and Moving at the Speed of Creativity have good discussions of the issues.
If you are within my district firewall you cannot read this blog or any of the sites listed above.
Monday, January 15, 2007
I have been thinking of the role of data in the ski lesson industry. I wonder if instructors are creating data or looking at data before starting a lesson. Would it benefit an instructor to know before they start that a child that is entering their class took a lesson one month ago and never mastered turns? Would it benefit an instructor to know more about that child's turning ability, like whether the child is crossing skis, catching an edge, or just not attempting turns yet? Would it benefit the instructor to know if the child has taken the Highpoint lift twice this month (and presumably skied down)? I am unfamiliar with the business of winter resorts and particularly unfamiliar with the practice of teaching children and adults lessons (I am a below average snowboarder that hasn't taken a lesson in years), but I am curious about their use of data. I am also curious about the relationship between a high quality learning experience and the likelihood of returning to the resort (not sure if the instructors collect student satisfaction data).
Copper Mountain should be collecting and using two types of data. First, they should collect and distribute high-quality learner data to their instructors prior to every lesson. These data would include previous lesson report cards and information regarding the number of visits to Copper and lifts used (this is clearly available in their system). Second, they should analyze the relationship between student learning and visit behavior. In other words, what is the relationship between visits to the mountain and the experience in the lesson? They should be interested in both how much the student learned and how satisfied the customer reported being.
In the K-12 education arena we expect teachers to respond appropriately to variance in their classroom with a differentiated approach. We do not want teachers to march through a lesson as if completing the material in a timely manner was the most important goal of the class period. We understand that the most important aspect of the class is that students master the expectations and those students that achieve mastery quickly are given ample opportunity to take on new challenges. Teaching is not the most important aspect of a lesson, but instead learning is the key.
Just like in the world of K-12 education, an instructor becomes empowered when they have access to data. We know Copper Mountain collects customer data, but do they collect and use data on teaching and learning? In a business as competitive as the ski industry, if Copper Mountain were to become known as the top instructor team in America it could truly set them apart from all the other choices we have in Colorado. This would be a sophisticated way to jump the curve and take an existing product and make it better.
Thursday, January 11, 2007
This morning was one of those days when I was invited to join the principals and it was inspiring and exciting. Three different principals presented on how they are using data to make change in their schools. Hollene Davis, the principal at Central Elementary School, gave an insightful and thought-provoking presentation on the use of multiple achievement measures to group and re-group students. Hollene walked us through a process she worked through with her teachers. She presented us with a class of students sans names and their scores on CSAP, NWEA MAP, DIBELS, CELA (English Language Acquisition Test), and an additional comprehension measure. It was a real group of students selected for intensive intervention. The scores were contradictory and mixed. Many students did not seem to be struggling in reading based on several measures. When the teachers evaluated the data they came to the same conclusions we did, and they began to ask for more data or to dig into the data a little more. The teachers were compelled by the data to get answers because there was no emotional attachment to the information (there were no names, no student faces, no biases in the data). Teachers were using data and asking for more.
In the past student selection for intervention was entirely based on teacher recommendation and had only loose connection to data. After examining the data sans names the teachers began to realize the compelling nature of multiple measures and to accept responsibility for using these data to change the trajectory of individual students.
The work that is going on at this school under Hollene's leadership will undoubtedly lead to change for the school and for each and every student that attends.
Monday, January 08, 2007
Here is a sample of a simple Excel graph that takes into account the fourth grand principle. It displays the scores in a simple way with the data table below. It completely integrates the visual and the data. The visual leaves something to be desired, so I have two more graphics to discuss.
The next two graphics include a good example of the fourth principle and a poor example of the fourth principle. Both examples were the result of projects where I designed the visuals. The first example comes from a custom data analysis portal designed by Adams County School District 14 staff to analyze NWEA MAP data and the results of the Colorado Student Assessment Program (CSAP). The portal has been a smashing success with our targeted end users (teachers). We hear frequently how access to data in a convenient and user-friendly format has enabled teachers to make decisions informed by student data. However, one complaint we have received is that where teachers can drill down to look at sub-scores for a test period, the goal areas are not defined (see graphic below). It simply says, "goal 1". Users have to open a PDF document that translates the goal to language such as "number sense". Tufte articulates that these words should be completely integrated and that teachers should not be forced to open a new document and toggle between the two.
The graphic below displays the results for an individual student on the Colorado English Language Acquisition assessment (CELA). The test returns seven total scores and many sub-scores. A reader of this report knows instantly how a student performed overall and in each of the sub-areas. There is little question and absolutely no necessity to toggle to a key. The words, colors, and symbols clearly direct the reader to a usable description of student performance. See more examples of CELA reports here.
The CELA graphic more clearly allows the user to kick butt at what they do best, teaching. The portal display example encourages users to kick butt at toggling between documents or memorizing the goal area descriptions. Which would you rather have happen in your schools?
Wednesday, January 03, 2007
Teachers and administrators should accept the challenge to use multiple variables in their visuals during the exploration of achievement. This principle is espoused by many leaders in education (Schmoker, Love, Reeves, Stiggins, DuFour, and others) with Victoria Bernhardt being the most recognized champion for multiple measures. Bernhardt advocates for the exploration of the relationship between two or more of four dimensions that are important to school reform (student achievement, school processes, demographics, and perceptions of students, staff and parents). Bernhardt advocates for exploring the intersection of these dimensions to get at the root of a problem.
We are challenged to ask two-dimensional questions like, "What is the relationship between state assessment scores (student achievement) and grades (school processes)?" "Do students with positive attitudes toward school (student perceptions) perform better on the state assessment (student achievement)?" Three-dimensional questions might include, "Do grades (school processes) have any relationship to state assessment performance (student achievement) for language learners compared to non-language learners (demographics)?" What is key is that as teachers and administrators review these data they have access to appropriate visual displays of these data or know how to create usable charts. In addition, teachers and administrators must be compelled to ask the next question and manipulate the data.
Let's just take the first question, "What is the relationship between state assessment scores (student achievement) and grades (school processes)?" This crudely drawn diagram is similar to the relationship we found in our district for high school students. The relationship (shown in a scatter plot) is essentially random, with no discernible slope. By adding a third variable/dimension the graphic takes on a dramatically different look. We can now see that students who are not language learners show a positive relationship between grades and state assessment performance. In other words, for non-language learners grades appear to measure something similar to what is being measured on the state assessment.
Although the visual display clearly reveals a pattern (one that is made-up), it does not tell me what causes this. However, it sparks a far deeper and more engaging discussion regarding grading and student performance when placed in three dimensions.
For our purposes we need to contrast visuals used for argument and visual displays used for exploratory purposes. In data analysis for data-driven decision making the analyst is not “showing causality”, but instead wants a visual display of data that allows the data to talk back. Student achievement data (when arrayed with a comparative dataset) do not include causality, but are useful in describing reality. In analyzing student achievement data we are not juxtaposing data that result in cause, but rather looking for patterns. For example, a teacher might notice that all her students performed lower than average on vocabulary on the recent district benchmark assessment when compared to the rest of the students in the school. Nothing about that visual shows causality, but it is an elegant and appropriate display to begin exploring the reality of student performance. Thus, for exploratory data analysis “show causality” is not a necessary condition for a high-quality visual display.
When teachers and administrators begin attributing cause for an observed pattern, Tufte’s principle of “showing cause” is a necessary condition for a high-quality visual.
Again, excuse my scribbles. They lack the design and attractiveness that I would prefer, but illustrate the point nonetheless.
Tuesday, January 02, 2007
Seven years ago I attended a seminar by Edward Tufte that described how to effectively display quantitative data. I recently ran across the notes from this seminar and was reminded of a few key points that are relevant to data in the K-12 setting. One of Tufte’s Grand Principles was that any data display must “enforce visual comparison”. In other words, we must always ask the question, “compared to what?”.
To “enforce comparison” be sure to: (1) select a relevant comparative dataset, (2) place the comparative data on the same graphic (or minimally on the same page), and (3) acknowledge the comparative data in the descriptive phase of any data dialogue.
The relevance of the comparative dataset can be measured on a two-dimensional grid (crappy image drawn on tablet PC above--will work on Photoshop soon) with relevance on the x-axis and the difficulty of acquiring the comparison on the y-axis. The ideal scenario is to obtain an extremely relevant comparison without considerable effort. Highly relevant comparative data would come from schools or classrooms that are demographically similar. The most common occurrence is that we obtain comparative data that are marginally relevant but easy to obtain, for example, when we compare the performance of students within a school to all other students nationally without taking into account demographic or longitudinal growth factors. Since highly relevant comparative data are often difficult to obtain and can require some statistical or technical sophistication to distill, teachers are often reduced to using the less relevant data. This is not a total loss and should not be used to undermine the value of student performance data. Instead, teachers and administrators should ask for more relevant comparative data from central administration and test vendors. In addition, schools should work to create examples of highly relevant comparative datasets by comparing students and classrooms within the school.
Visual design is often overlooked in presenting student, school, or district achievement data. Data displays are often sloppy and fail to place the comparative data that are necessary for analysis in proximity with the school data. It is important to place the comparative data on the same graphic or, minimally, on the same page.
High quality fact-based statements about student data of any kind reference comparative datasets. Avoid simply saying, “3rd grade students scored 65% proficient or above on the CSAP in 2006.” Even saying, “3rd grade students scored 65% proficient or above on the CSAP in 2006, which is a 4% improvement from 2005” is unsatisfactory because the 2005 data are not necessarily a relevant comparative dataset (different students in a different year). On the other hand, saying “3rd grade students at Betcher Elementary school score 25 scale score points higher on vocabulary than demographically similar students from across the district” makes a statement of performance relative to a relevant comparative dataset.
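The arithmetic behind such a statement is trivial; the discipline is in choosing the comparison group. A sketch with invented scores (the school, the scores, and the comparison group are all hypothetical):

```python
# Invented vocabulary scale scores for a school's 3rd graders and for
# demographically similar students from across the district.
school_scores = [612, 640, 598, 625, 631]
comparison_scores = [590, 602, 585, 610, 596, 588]

school_mean = sum(school_scores) / len(school_scores)
comparison_mean = sum(comparison_scores) / len(comparison_scores)
diff = school_mean - comparison_mean

print(f"3rd grade students scored {diff:+.0f} scale score points "
      f"relative to demographically similar students.")
```

Swap in an irrelevant comparison group (say, all students nationally) and the same arithmetic produces a number that tells you far less.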
While Tufte is focused on effective use of visual display to convey a message (make a causal statement), his principle of Enforce Comparison is extremely relevant as a basic starting point for educators engaged in data-driven decision making.