Bullying that streams in over social media on a device many kids keep right by their bedsides. “Challenges” that encourage the destruction of school property. Threats to carry out a massacre at an elementary school.
Social media has transformed childhood, and given the adults who work with or care for kids a litany of concerns that previous generations of educators and parents could never have imagined.
Now, many educators want to know how Meta, the company behind some of the oldest and most popular social media platforms, including Facebook, Instagram, and WhatsApp, plans to help schools handle those challenges, or whether it will even acknowledge its own role in creating them.
Many don’t feel that they’ve gotten a clear answer yet.
The issue has become particularly heated since documents released last year by a whistleblower revealed that Meta had conducted extensive research on its platforms’ negative effects on children’s mental health and their role in spreading false information, but failed to act on those findings.
“I do think that they owe an explanation. I think they owe it not just to the parents and the educators but to the world,” said Bill Bass, the innovation coordinator for the Parkway School District in Missouri, in an interview after Meta’s head of global safety, Antigone Davis, spoke to a room full of educators at the International Society for Technology in Education’s annual conference last month in New Orleans. “I think there is a lack of trust that is inherent now” between educators and social media companies, even those that are working to secure student data and think about student mental health.
‘A whole slew of bullying that we couldn’t look at’
During the ISTE panel, Davis outlined her company’s lesson plans for parents and teachers, parental management tools, and Meta’s efforts to “build up social learning tools within that digital literacy.”
And the company has tools on its platforms to identify “potentially bullying content and to remove it if it violates our policies,” Davis said. But, she added, “there’s a whole slew of bullying that we couldn’t look at and tell what is happening.”
She pointed, for instance, to classmates making fun of a female student’s skirt, with one writing “nice skirt” in a comment on a picture. “There’s zero way for us to know that is bullying without additional context,” Davis said. “Sometimes, we can get that additional context but generally we’re not going to.”
Meta also has tools that allow teens, and other users, to flag words that might be used to bully them or that they don’t want to see in their feeds.
Matthew Winter, an instructor for the Utah Education Network, which works with districts throughout the Beehive State on their technology needs, was pleased to hear that Meta offers teacher resources. But he wishes every prominent social media company would give educators a tutorial on its many features so they can help kids and parents.
He wants companies to spell it out: “This is what happens on Snapchat when a kid logs in and this is what happens when they get into Instagram. This is what Instagram Live is. This is what TikTok is.”
Right now, “we have to go out and explore it. We have to figure it out first.” That can be time-consuming for teachers who already have a lot on their plates, he pointed out, especially those who are not tech-savvy.
‘A Band-Aid that still keeps kids in that ecosystem’
Educators at ISTE pressed Davis and Jacqueline Beauchere, the global head of platform safety for Snap Inc., the company behind Snapchat, on how the companies aim to ensure the safety of kids under the age of 13, who aren’t legally allowed to use their flagship platforms but often sign up anyway.
The educators noticed a stark difference in the two companies’ answers.
Beauchere said Snapchat just isn’t for younger users.
“We are not designed for children under the age limit,” Beauchere said. “I can’t emphasize that enough. Snap is 14 plus. Those rules are there for a reason, and they really need to be abided by.”
But Davis suggested her company could find a way to safely offer younger kids access to social media platforms.
“Opening the door for the ability to have some degree of much more monitored technology for younger people may be part of what we need to do,” Davis said. Meta, she said, already has “Messenger Kids,” a platform with what she described as stringent parental controls.
Bass called that answer “shortsighted.” “I don’t think it’s a solution,” he said. “It’s a Band-Aid that still keeps kids in that ecosystem.”
One thing Davis did not mention: Instagram for Kids. Meta had been planning to develop a version of the social media platform for younger children, but paused the project after pushback from critics who, like Bass, saw it as just another way for the company to hook kids early.
Maureen, a former teacher from Western Canada who now works for an education nonprofit, asked that her last name not be used because speaking to reporters is not part of her job. She didn’t “hear a corporate responsibility to do something other than a business model of trying to get kids into the apps,” she said in a brief interview after the panel ended. The platforms are “doing a lot of damage to kids’ mental health. There are great things about social media, there are places to use it, but these corporations need to step up.”
Winter, from the Utah Education Network, agreed. “I think there needs to be a little bit more foresight [from companies] about what they put out in the future. What is it going to do to our kids?”