The summer break is over, or nearly so, and suntan lotion will soon be pushed to the back of the medicine chest. As I reflect on the many on-site meetings I led with superintendents and department heads to review survey results, one question came up again and again: "Compared to what?"
This question is particularly challenging when a district is in its first year with K12 Insight's systematic solution. Clearly, the ideal reference point for perceptions of any aspect of the district's performance is a measurement of the exact same question a year earlier (seasonality has a surprising impact on stakeholder data). When that data is not available, however, there are a few strategies we employ to enhance our understanding of survey results.
Since our organization works with many districts, we have a good sense of what to expect and can put the data in that context. Another technique is to look at the other questions a respondent is asked on the same topic. If we ask five questions about parent engagement and one item's rating stands apart (high or low) from the others in the cluster, it is fair to draw inferences about that "outlier." When items in the same cluster show similar results, we can disaggregate the degrees of intensity that we often combine (Satisfied and Very Satisfied, for example) to differentiate one item's feedback from another's.
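The two techniques above can be sketched in a few lines of code. This is a minimal illustration with entirely hypothetical item names and percentages, not real district data: first flag an item whose favorable rating stands apart from its cluster, then split a combined "favorable" figure back into its intensity levels.

```python
from statistics import mean

# Hypothetical cluster: percent favorable (Satisfied + Very Satisfied)
# for five parent-engagement items. All names and numbers are invented.
favorable = {
    "timely_communication": 82,
    "welcoming_front_office": 85,
    "volunteer_opportunities": 58,  # stands apart from the rest
    "parent_teacher_events": 80,
    "input_on_decisions": 79,
}

# Flag any item more than 10 points from the cluster average --
# a simple, arbitrary threshold chosen for illustration.
cluster_avg = mean(favorable.values())
outliers = {item: pct for item, pct in favorable.items()
            if abs(pct - cluster_avg) > 10}
print(outliers)

# Two items can look identical in the combined view yet differ once
# the intensity levels are disaggregated (hypothetical counts):
intensity = {
    "timely_communication": {"very_satisfied": 50, "satisfied": 32},
    "parent_teacher_events": {"very_satisfied": 22, "satisfied": 58},
}
for item, counts in intensity.items():
    print(item, counts["very_satisfied"], counts["satisfied"])
```

Here both items sit near 80 percent favorable, but one draws far more "Very Satisfied" responses, which is exactly the kind of distinction the combined figure hides.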
Years of baseline data are a wonderful resource, but a thoughtful consideration of data in a district’s first year implementing the K12 Insight solution can yield meaningful information. While the best answers to the “Compared to what?” question may take time to uncover, during a district’s first year with us we can answer an equally powerful question: “Where are we now?”