The US Department of Education just released the results from their survey of the uses of educational data in schools:
The report indicated that schools are storing more and more data, yet still have not been able to use that data to improve educational methods and outcomes.
The survey covered 427 districts with different demographics.
Virtually 100% of districts report that they have student information systems (providing data at least on attendance and enrollment) with over 70% reporting that they have had an SIS for at least six years.
Almost 80% have systems that report on benchmark assessments (generally state assessments). Of course, these results are usually released too late to do anything for the students who took the test. It helps a district to learn that, say, 30% of its students did not meet 3rd-grade reading standards six months ago, but by then it's too late to bring those students up to standard.
About three-quarters of districts have data warehouses that allow reporting on student data, and about two out of three have curriculum management systems. The growth of these systems has been driven primarily by the requirements of No Child Left Behind.
While this is a laudable result of pressure by the federal government, almost two-thirds of districts reported that they did not get adequate actionable data from these systems. A common reason cited was that the data remained isolated in each system.
Districts looking to improve their use of data to inform education practices require the following:
- reporting that consolidates and provides access to the data from the different systems,
- tools to analyze the data to determine which combinations of factors contributed to success or failure of students,
- professional development, support, coaching, dedicated time during the working day, and regular data review meetings, so that teachers and administrators learn how to use data to improve instructional outcomes,
- action notices to the people who can actually make changes, and
- task management and follow-up reporting for administrators and educators.
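To make the first item concrete, here is a minimal sketch of what "consolidating the data of the different systems" looks like in practice: joining records from a student information system and a benchmark-assessment system on a shared student ID. The field names and sample records are hypothetical, purely for illustration; a real district system would pull these from separate databases.

```python
# Sketch: consolidating records from two isolated systems by student ID.
# All field names and values below are hypothetical examples.

sis_records = [  # from the student information system (attendance data)
    {"student_id": 1, "attendance_rate": 0.95},
    {"student_id": 2, "attendance_rate": 0.80},
    {"student_id": 3, "attendance_rate": 0.88},
]

assessment_records = [  # from the benchmark-assessment system
    {"student_id": 1, "reading_score": 78},
    {"student_id": 2, "reading_score": 61},
    {"student_id": 3, "reading_score": 70},
]

def consolidate(sis, assessments):
    """Join the two systems' records on student_id into one row per student."""
    by_id = {r["student_id"]: r for r in assessments}
    merged = []
    for r in sis:
        # Students missing from the assessment system still get an SIS row.
        merged.append({**r, **by_id.get(r["student_id"], {})})
    return merged

for row in consolidate(sis_records, assessment_records):
    print(row)
```

The point of the sketch is only that once records share a key, cross-system reporting (attendance vs. assessment results, for instance) becomes a simple join rather than two disconnected reports.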
Examples of the types of analysis that should be available (because the data is already captured somewhere) are:
- How does student performance correlate to student participation in various instructional programs?
- How does student performance correlate to various teacher characteristics, like teacher education level, teacher expertise, length of service, and so on?
- Of the different methods used to teach the same content, which ones worked better on what types of students?
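Analyses like the first question above reduce to simple correlations once the data is consolidated. As a sketch (pure Python, with hypothetical data), here is how one might correlate student performance with participation in an instructional program; the participation flags and scores are invented for illustration.

```python
import math

# Hypothetical consolidated data: whether each student participated in a
# tutoring program (1 = yes, 0 = no) and their benchmark reading score.
participation = [1, 1, 0, 0, 1, 0, 1, 0]
scores = [82, 75, 60, 65, 90, 58, 70, 62]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(participation, scores)
print(f"correlation between program participation and score: {r:.2f}")
```

A positive coefficient would suggest the program is associated with higher scores; a real analysis would of course control for other factors (prior achievement, attendance, and so on) rather than rely on a single correlation.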
The report finds that the greatest perceived areas of need are learning how to examine student data to identify effective practices, how to adapt instructional activities based on the needs the data reveals, and how to use curriculum-embedded formative assessments to guide instruction. In fact, because of gaps across the five items listed above, teachers' use of data to improve their instructional practices was relatively rare.
The report makes clear that mandating schools and states to give tests and store data is only one element of successfully using data and information systems to improve educational outcomes. Without a means to access the information stored in the various systems, without a means to analyze the data, without a means to turn the data into action plans, without dedicated time plus coaching and support, and without follow-up, the data and the expense just aren't going to have an impact.
We will be at TCEA in Austin next week. Hope to see you there!
One more thing, a lot of people have asked to see pictures of our (my wife and me) trip to New Zealand. Here are pictures of the first 7 days, including our 3-day Routeburn hiking trek. And here are pictures of the next 9 days, including bicycling around wine country in Marlborough and our bike trip in the Nelson Lakes (60 miles a day). New Zealand is drop dead gorgeous.