With one computer keystroke, officials in North Carolina’s Guilford County school district can see which students are at risk of dropping out years before it might happen. They can check class attendance, current grades, and past years’ grades. They can see whether a student is older than his or her classmates, and whether that student lags behind in credits earned.
The district’s advanced data system analyzes those factors and pinpoints which students may not make it to graduation. Principals and teachers then design customized learning plans to help students earn a diploma. The 72,300-student district’s dropout rate is 3 percent—one of the lowest among large districts in the state.
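The article does not describe the mechanics of Guilford County's system, but an early-warning check of this kind can be sketched as a simple set of rules over the same indicators. The field names and thresholds below are hypothetical, chosen only to illustrate the idea in Python:

# A minimal sketch of a rule-based early-warning flag. The fields and
# cutoffs are invented; the article does not describe the district's
# actual model or thresholds.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    attendance_rate: float       # fraction of days attended this year
    current_gpa: float           # 4.0 scale
    prior_year_gpa: float
    age: int
    typical_age_for_grade: int
    credits_earned: float
    credits_expected: float

def dropout_risk_flags(s: StudentRecord) -> list[str]:
    """Return the indicators that mark a student as at risk (illustrative rules only)."""
    flags = []
    if s.attendance_rate < 0.90:
        flags.append("low attendance")
    if s.current_gpa < 2.0 or s.prior_year_gpa < 2.0:
        flags.append("weak grades")
    if s.age > s.typical_age_for_grade:
        flags.append("over-age for grade")
    if s.credits_earned < s.credits_expected:
        flags.append("behind on credits")
    return flags

# Example: a student who is over-age and short on credits would be flagged
# for a customized learning plan.
student = StudentRecord(0.87, 1.8, 2.1, 17, 16, 10.0, 14.0)
print(dropout_risk_flags(student))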
“If we didn’t have the capacity to target and identify these students, we would not be able to provide them with resources and customized scheduling,” says Terrence Young, the district’s chief information officer. “Data is certainly a driving factor behind our low dropout rate.”
The federal No Child Left Behind Act requires districts to track student achievement and break the data down by subgroups of students. Since the measure was signed into law six years ago, school officials have paid growing attention to data systems and the information those systems can provide to help with everything from streamlining district operations to raising test scores. States, too, have begun building their own education data systems, and experts see a culture shift in a field that once put analytic data systems low on its list of priorities.
“Unless you have the capacity to analyze this data,” Young says, “you’re data-rich, but information-poor.”
But the push to put high-quality data systems in place has been slow, says Aimee R. Guidera, the director of the Austin, Texas-based Data Quality Campaign. Her organization was established two years ago to call more attention to the issue of data collection by districts and states and to improve the quality of the data.
“As more and more conversations took place about alignment for state standards and high school exit exams and college-entry standards and high school rigor, … all the discussions couldn’t be informed because we didn’t have access to that information,” Guidera says. “Data is the least sexy of all issues, and nobody wanted to talk about it.”
The Data Quality Campaign focuses mostly on state data systems, and a progress report the organization released in November found that states are moving forward. The report found that 47 states have data systems in place that include five or more of the essential elements for success identified by the organization, such as the ability to track student test scores from year to year, a unique student-identification system, and the ability to match students with their teachers.
1: Make sure your data-warehousing system can integrate data from the different systems within the district that collect various types of information, and make sure it is flexible enough to work with different types of software.
2: Emphasize accuracy. If your system is not accurate, teachers and administrators won’t have confidence in it and consequently will not use it.
3: Decide whether you have the capacity to develop your own data system or want to hire an outside vendor to provide one. When choosing a vendor, make sure you are clear about what kinds of information you want the data system to provide.
4: Provide training to staff members, including teachers and administrators, who are going to use the system so they understand how to tap into the information and how to interpret data.
5: Talk to people in the district, including staff members and parents, to learn what questions they want answered through data analysis.
6: Talk to other districts and organizations that have effective data systems to get information about how to build or buy one that will work for you.
But it’s difficult to determine how districts are faring in their efforts. The U.S. Department of Education is in the process of interviewing officials in 500 districts to evaluate the capabilities of their data systems, says Timothy J. Magner, the director of the office of educational technology at the department.
“We’re really seeing a culture change,” he says. “There’s been a seat-of-the-pants decisionmaking process until recently because getting access to data and analyzing it has been so difficult. Now we have the tools and the technology that allow us to get at data in a more structured, timely, organized fashion, creating a culture of continuous improvement.”
Several types of data can be collected, often with different systems, Magner says. One type involves the instructional side of the district. Once such data are collected, systems have the ability to drill down to the student level and analyze factors that may affect individual performance. “You can look at trends among individual students, buildings, and programs,” Magner says.
His surveys so far have found progress in some districts when it comes to analyzing student data, he says, as well as the automation of time-consuming processes, allowing districts to use personnel more efficiently.
Automated data systems that collect attendance records, for example, can free up employees to do jobs that focus more on students instead of on data entry, Magner says. And data systems that collect a variety of information on students—from the length of their bus rides to the books they check out of the library to the meals they eat—can help educators draw connections between those factors and student achievement.
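As a rough illustration of the kind of cross-system analysis Magner describes, the sketch below joins transportation, library, and assessment tables on a student identifier and looks for correlations with achievement. The column names and figures are made up for the example; a district would pull these tables from separate operational systems.

# A sketch of connecting operational data to achievement, using pandas
# and invented data.
import pandas as pd

transport = pd.DataFrame({"student_id": [1, 2, 3, 4],
                          "bus_ride_minutes": [55, 10, 40, 25]})
library = pd.DataFrame({"student_id": [1, 2, 3, 4],
                        "books_checked_out": [2, 14, 5, 9]})
assessment = pd.DataFrame({"student_id": [1, 2, 3, 4],
                           "reading_score": [61, 88, 70, 79]})

# Join everything on the student identifier, then look for relationships
# between operational factors and achievement.
merged = transport.merge(library, on="student_id").merge(assessment, on="student_id")
print(merged[["bus_ride_minutes", "books_checked_out", "reading_score"]].corr())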
“Schools can understand earlier in the process how a student is faring and why,” Magner says. “The results can come back quickly, and educators can intervene.”
In Minnesota’s 40,000-student Anoka-Hennepin district just north of Minneapolis, officials are using their data system in a variety of ways. At the start of the 2007-08 school year, every teacher used the system to look at student demographics, enrollment, testing, academic history, grades, and attendance.
“We actually have a complete student-summary report which includes 90 percent of the data that used to be in the paper folder,” says Georgia Kedrowski, the district’s assistant director of technology. “A teacher can drill down and find a group of students who need intervention programs.
“We’ve eliminated a lot of the barriers to using data, because it’s fast and it’s easy. Teachers are loving it,” Kedrowski says.
ACT: www.act.org
The Administrative Assistants Ltd.: www.aalsolutions.com
Century Consultants Ltd.: www.centuryltd.com
Claraview Inc.: www.claraview.com
Cognos: www.cognos.com
Confluent Technologies: www.confluentasp.com
CTB/McGraw-Hill: www.CTB.com
Datawise Inc.: www.datawise-ed.com
eDistrict: www.edistrict.net
Edustructures: www.edustructures.com
eScholar: www.escholar.com
ESP Solutions Group: www.espsg.com
IBM: www.IBM.com
Infinite Campus Inc.: www.infinitecampus.com
OS4Ed: www.os4ed.com
Otis Educational Systems Inc.: www.otised.com
Pearson: www.pearsonschool.com
Public Consulting Group: www.pcgus.com
Q3 Solutions: www.q3solutions.com
SAS: www.sas.com
School Information Systems Inc.: www.sisk12.com
SchoolNet Inc.: www.schoolnet.com
STI Education Data Management Solutions: www.sti-k12.com
SunGuard Pentamation: www.pentamation.com
TetraData Corp.: www.tetradata.com
Third Day Solutions: www.thirddaysolutions.com
Triand Inc.: www.triand.com
Wireless Generation: www.wirelessgeneration.com
Anoka-Hennepin is using an outside vendor, the Central Minnesota Educational Research and Development Council, or cmERDC, a nonprofit organization based in St. Cloud, Minn., to provide its data system, called Viewpoint. Many districts around the country hire outside companies to put data-warehousing systems in place, though some build their own. Still others are using data systems provided by the state education agency.
Anoka-Hennepin went with an outside provider because the district had few resources to devote to the effort, including staff and money, and didn’t have the technical capacity to develop a homegrown system, Kedrowski says.
But Kedrowski says an earlier version of the system had kinks that had to be worked out. “It was not as technically sound or easy to use,” she says. “The first and most difficult thing for any district is to be able to supply good data to the data structure.”
Though it sounds simple, one of the most challenging problems for districts is ensuring that each student has a unique identification number that sticks with him or her through an entire school career, Kedrowski says. In Minnesota, for example, students have a state identification number and a local identification number. Sometimes a student’s name may be entered as “Mike” and at other times as “Michael,” resulting in several different entries for the same student. Transfers between schools or districts also can sometimes confuse the system, Kedrowski says.
“Every time you have a piece of data, you must make sure you’re using the same unique ID,” she says. “If you don’t do that and people look at the data and it doesn’t match up with the student, you lose credibility so fast. If that happens, you spent an awful lot of time and money, and people won’t use the system because they don’t trust it.”
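Kedrowski's point can be seen in a few lines of code. In the hypothetical records below, keying on a student's name splits one child into two, while keying on a stable identifier keeps the records together; the names and IDs are invented:

# A minimal illustration of the identifier problem: matching on names
# splits one student into several records, while keying every record to a
# single stable ID keeps them together. All data here is made up.
records = [
    {"local_id": "GC-1042", "state_id": "MN0007731", "name": "Mike Olson",    "score": 71},
    {"local_id": "GC-1042", "state_id": "MN0007731", "name": "Michael Olson", "score": 74},
    {"local_id": "GC-2210", "state_id": "MN0009912", "name": "Ana Diaz",      "score": 88},
]

# Grouping by name produces three "students" -- a data-quality failure.
by_name = {}
for r in records:
    by_name.setdefault(r["name"], []).append(r["score"])
print(len(by_name), "students when keyed by name")    # 3

# Grouping by the stable state identifier produces the correct two.
by_id = {}
for r in records:
    by_id.setdefault(r["state_id"], []).append(r["score"])
print(len(by_id), "students when keyed by state ID")  # 2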
A study of data use in the 11,500-student Natrona County school district in Casper, Wyo., released in November by the University of Texas at Austin, found a series of problems, including a lack of integration among data systems and issues with accuracy. “We heard many instances of groups that did not trust student demographic data provided by the district, sometimes to the point of maintaining their own databases,” the report found.
Shawn T. Bay, the founder and chief executive officer of White Plains, N.Y.-based eScholar, which provides educational-data-warehousing systems to districts and state education agencies, calls schools “the most difficult data environment I’ve ever worked in.”
Accuracy, he says, is crucial. “You think everybody needs accuracy, but in some industries if you’re plus or minus a couple of units, it’s OK,” Bay says. “But in school districts, you’re trying to help individual children. If you get their test results attached to someone else, you’ve made a big mistake.”
Data systems also must be able to talk to each other. In a school district, there are likely to be different systems for different purposes, such as collecting attendance, tracking special education students, and gathering test scores. A system that isn’t able to tap into all that information simultaneously and compare and analyze it won’t provide the answers teachers and administrators are seeking, Bay says.
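One way to picture the kind of integration Bay describes is a single query spanning tables that, in many districts, live in separate systems. The sketch below uses an in-memory SQLite database and an invented schema; only the shared unique student ID makes the combined question answerable.

# A sketch of a cross-system query: attendance, special-education status,
# and test scores sit in separate tables but share a unique student ID,
# so one join can answer a combined question. The schema and data are
# invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE attendance (student_id TEXT, days_absent INTEGER);
CREATE TABLE special_ed (student_id TEXT, has_iep INTEGER);
CREATE TABLE scores     (student_id TEXT, math_score INTEGER);
INSERT INTO attendance VALUES ('S1', 12), ('S2', 2), ('S3', 25);
INSERT INTO special_ed VALUES ('S1', 1), ('S2', 0), ('S3', 0);
INSERT INTO scores     VALUES ('S1', 64), ('S2', 91), ('S3', 58);
""")

# Which students with heavy absences also have low math scores, and are
# they already receiving special-education services?
rows = con.execute("""
    SELECT a.student_id, a.days_absent, s.math_score, e.has_iep
    FROM attendance a
    JOIN scores s     ON s.student_id = a.student_id
    JOIN special_ed e ON e.student_id = a.student_id
    WHERE a.days_absent > 10 AND s.math_score < 70
""").fetchall()
print(rows)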
“It needs to be that as soon as you can think about a question, you can get answers, and you don’t get sidetracked when you go to get the data, so that you never get to the conclusion and aren’t able to take action,” he says. “This way, you can ask some really interesting, deep, probing questions.”
Districts also have to be sure that teachers and administrators get the training they need not only to use the system, but also to interpret the resulting trove of data.
In North Carolina’s Guilford County district, Young says, the data system initially delivered information to staff members in Excel spreadsheets. “We quickly realized who could use Excel and who couldn’t,” he says. The district later instituted Excel training to close that skills gap.
Despite the new depth of understanding that data systems can provide to school officials, Magner of the federal Education Department cautions that such systems are just tools to be used in thinking about how to help a student achieve. Students are not merely test scores on paper or demographic statistics.
“It’s important to recognize that we’re talking about kids here, and we shouldn’t reduce them to data,” says Magner. “But this information gives teachers, policymakers, and education professionals information they can use to reflect on the experiences they’re seeing with an individual.”