Early in 2018, ed-tech company Clever launched a new product aimed at helping tens of thousands of schools tackle a vexing challenge: determining whether students are actually using the software and learning apps on which the K-12 sector spends billions of dollars each year.
Less than 18 months later, though, it pulled the plug. Clever announced it was “sunsetting” the effort and moving in a new direction.
What happened? Why did the effort flop? And what does Clever’s surprise pivot say about how serious schools are about keeping track of all the technology that has flooded public school classrooms?
“What we found once we were in market was that it proved to be a nice-to-have more than a must-have for a lot of district leaders,” Dan Carroll, the company’s co-founder and chief product officer, said in a July interview. “They were excited to try it out, but [tracking ed-tech usage] wasn’t one of their top three or five priorities for the year.”
A Multi-Pronged Problem
That analysis will likely come as a disappointment to those K-12 CTOs and CIOs who describe the opacity around ed-tech usage as a significant—and multi-pronged—problem.
If schools don’t know what digital and online learning tools teachers and students actually use, their thinking goes, how can they decide what to purchase and renew? How are they supposed to better focus scarce professional-development resources? How can they determine what ed-tech strategies will actually impact student achievement?
A number of products on the market aim to help. Companies such as LearnPlatform and BrightBytes, for example, offer analytics and dashboards that present information such as how many students log in to various software programs, how much time they spend using them, and what kind of progress they are making.
BrightBytes has tried to highlight the problem of scattershot usage. In 2018, for example, the company released an analysis based on how nearly 400,000 students in 48 districts used 177 different learning apps. It found that the typical district didn’t use 30 percent of the ed-tech licenses it purchased.
Given such realities, it was significant news when San Francisco-based Clever announced it was launching a new product called Goals. The company had already won the trust of roughly 60,000 schools by offering easy-to-use solutions to mundane problems, such as loading class rosters into learning software and allowing students to access dozens of online tools with a single username and password.
The company’s new idea was in the same spirit. Goals gave educators and administrators a single place to compare two simple metrics across multiple learning tools: how much time students spent in each program, and how much progress they made.
Clever officials were so convinced that districts would find such information valuable that they made Goals the company’s first service schools would have to pay for directly.
But a year after it was launched, the product had just 50 or so paying customers.
Worse, maintaining the service at even that limited scale proved to be a major time-suck for Clever’s engineers.
“We thought it would be very hard and expensive, which certainly proved true,” Carroll said. “We just couldn’t justify it financially.”
Unanticipated Complications
One of the school systems that tried Clever Goals, but ended up dropping it, was School District U-46 in Illinois.
It’s an ongoing challenge to make usage data relevant to both the specific tool being used and the specific way it’s being used in a given school or classroom, said Matt Raimondi, the district’s assessment and accountability coordinator, in an email.
For example, for one service, it might make sense to report how many licenses are actually used. For another, it might be more valuable to report how much time students spend on task using the product. For a third, the most useful information might be the number of lessons or modules each student completes.
Further complicating matters, one school or district might use a tool for core instruction for all students, while another might use the same tool as remediation for a small subset of students.
“Usage data can end up being pretty noisy or misleading in that regard,” Raimondi said. “A usage report might show 20 percent of students using software X, and that could actually be 100 percent of the students that are supposed to be using it.”
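Raimondi’s 20-percent example is easy to reproduce. Below is a minimal sketch, with hypothetical numbers, of how the same raw login count tells two very different stories depending on the denominator a report uses:

```python
# Illustrative only: hypothetical numbers showing how the same raw count
# reads very differently depending on the denominator used.

total_enrollment = 1000  # all students in the district
intended_users = 200     # students actually assigned to "software X"
active_users = 200       # students who logged in this month

# A naive rate against the whole district looks like weak adoption...
naive_rate = active_users / total_enrollment
print(f"Share of all students using software X: {naive_rate:.0%}")   # 20%

# ...but against the intended roster, adoption is actually complete.
true_rate = active_users / intended_users
print(f"Share of intended users using software X: {true_rate:.0%}")  # 100%
```

Without knowing which students were supposed to be using a tool, in other words, a dashboard can make full adoption look like failure.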
Clever’s Carroll said that was just one of several issues the company faced with Goals.
Even a single usage metric, such as time on task, proved nearly impossible to standardize, he said. How do you capture it for learning tools that require offline work, for example?
And even when different companies were focused on the same data, he said, they often presented the information in different formats. That meant linking each software program into the Goals dashboard required a custom integration, which proved costly and time-consuming.
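To make that integration burden concrete, here is a minimal sketch of the problem under stated assumptions: the two vendor payload shapes below are invented for illustration, not real APIs. Because each vendor reports the same underlying fact, minutes of use, in a different shape, each one needs its own adapter before the data can land in a shared dashboard schema.

```python
# Hypothetical example of why per-vendor adapters were needed: two
# invented payload formats mapped into one common usage record.

from dataclasses import dataclass

@dataclass
class UsageRecord:
    student_id: str
    app: str
    minutes: float

def from_vendor_a(payload: dict) -> UsageRecord:
    # Vendor A (hypothetical) reports seconds under "session_length".
    return UsageRecord(
        student_id=payload["sid"],
        app=payload["product"],
        minutes=payload["session_length"] / 60,
    )

def from_vendor_b(payload: dict) -> UsageRecord:
    # Vendor B (hypothetical) reports minutes, keyed differently.
    return UsageRecord(
        student_id=payload["student"]["id"],
        app=payload["app_name"],
        minutes=payload["engaged_minutes"],
    )

records = [
    from_vendor_a({"sid": "s1", "product": "MathApp", "session_length": 1800}),
    from_vendor_b({"student": {"id": "s1"}, "app_name": "ReadApp",
                   "engaged_minutes": 25}),
]
for r in records:
    print(r)
```

Writing and maintaining one such adapter per vendor, for dozens of vendors that can change their formats at any time, is where the cost Carroll describes comes from.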
But the biggest hurdle by far, Carroll said, wasn’t technical at all. It was finding ways to help educators and administrators move beyond just looking at usage data to actually doing something useful with the information.
“That kind of change management is really a Herculean effort,” he said. “You need cultural buy-in, great training, and time allocated for people to look at the data and talk about it. We kind of underestimated the challenge there.”
‘Priorities of District Leaders’
Paige Kowalski isn’t one to say “I told you so.”
But back when Clever Goals was launched, the executive vice president of the nonprofit Data Quality Campaign warned that real data-driven change only comes when educators and administrators get structured opportunities to turn information into action.
“There’s this belief out there that if we just get schools one more data point, it will really tell them something,” Kowalski said in a July interview. “But it doesn’t. It just begs more questions. I think that’s what Clever ran into.”
While the company has changed direction and pulled Goals off the market, it isn’t getting out of tracking ed-tech usage altogether.
Clever is now working on a free analytics tool that reports student and teacher visits to online resources, including how much time they spend there. The new product is being tested in 300 schools, and the company is eyeing early 2020 to make it more widely available.
Once the tool is up and running, Clever users won’t be charged extra to use it, Carroll said. The company still isn’t prepared to offer teachers the kind of time- and labor-intensive coaching needed to make usage data valuable for them, he said. So the new product will focus on helping school and district leaders track aggregate usage and make system-level decisions.
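In practice, that system-level focus means reporting rollups rather than per-student detail. A minimal sketch of the kind of aggregation involved, using a hypothetical event format rather than Clever’s actual data model:

```python
# Illustrative only: rolling individual visit events up to per-school,
# per-app totals, the aggregate view district leaders would act on.

from collections import defaultdict

# Hypothetical usage events; the field names are invented.
events = [
    {"school": "Lincoln Elementary", "app": "MathApp", "minutes": 30},
    {"school": "Lincoln Elementary", "app": "MathApp", "minutes": 15},
    {"school": "Washington Middle", "app": "MathApp", "minutes": 45},
]

totals: dict[tuple[str, str], float] = defaultdict(float)
for e in events:
    totals[(e["school"], e["app"])] += e["minutes"]

for (school, app), minutes in sorted(totals.items()):
    print(f"{school} / {app}: {minutes:.0f} minutes this period")
```

A view like this can tell a district which buildings are actually using a license it bought, without asking any individual teacher to interpret student-level data.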