16+ Ways EdTech Harms Children
If EdTech doesn't educate, what does it do?
The original version of this essay was posted on my website on February 11, 2025.
Just as there is no such thing as a safe cigarette, there is no such thing as safe EdTech.
At the end of the day, the business model underlying Big Tech and its sweater-vested cousin, EdTech, is extractive, manipulative, and harmful to children and development.
Period.
It isn’t (just) that too much time on school iPads is bad for learning, vision, and focus (all true), that “learning” apps are gamified to keep children on a platform longer (see here), or that kids can access ChatGPT and porn on their school-issued devices (they can). It is that the only way these products will be profitable for their makers is if they collect their users’ data; in how they extract user data for profit, Big Tech and EdTech are identical. These companies deliberately choose profits over people, and they will continue to do so until they are forced to do otherwise.
When it comes to EdTech, however, it is easy to fall into the trap of believing that if a tech tool is for “educational purposes,” it must be safe and good, right?
Unfortunately, no, not at all, or at least, not in the way that EdTech tools are currently being deployed. (And just a reminder— EdTech is *not* TechEd.)
I’m tired of hearing that children need these products to “be successful” in education. As my brilliant friend and colleague, Dr. Jared Cooney Horvath quips, “Teach kids to think and I’ll guarantee they’ll be able to use tech; teach kids to tech, and there’s no guarantee they’ll ever be able to think.”
We need to raise children who can think. Our democracy depends on a thinking citizenry.
On that note…
How Does EdTech Harm Children?*
EdTech invades privacy. Everything a child does on a school-issued device is subject to surveillance by private companies with a vested interest in their data.
Surveillance erodes trust, especially for children who are in a vulnerable developmental stage where they are learning to discern what and who are safe from what and who are not safe.
EdTech is Big Brother. EdTech isn’t just operating at school; EdTech platforms can monitor student devices even while they are at home.
EdTech shares children’s data. EdTech companies partner with local law enforcement, providing behavioral records to help “predict” future criminal activity.
EdTech is highly commercialized. Children, even within “educational” apps, are exposed to advertisements and marketing strategies to increase engagement.
EdTech apps, games, and curricula are embedded with manipulative design techniques (sometimes called “persuasive design”) whose sole goal is to keep users on the screen longer to serve them more ads. This is fundamentally at odds with what children need for healthy learning and development.
EdTech increases screen use. Too much screentime is harmful to children’s health, whether physical, mental, or cognitive. We know that outside of school hours, children ages 8-18 already average 7.5 hours per day on screens.
EdTech exposes children to harmful content. Children are viewing inappropriate content at school: porn, violence, self-harm videos, and extremist content. They are also engaging with ChatGPT and other AI chatbots, which have driven at least one child to suicide. (AI “tutors” are simply AI chatbots.)
EdTech isn’t monitorable. Even the most competent IT department cannot monitor every app, student device, or website. New sites pop up daily, children always find workarounds, and as long as devices are internet-connected, there will be problems.
EdTech blames parents. Tech companies in general, but EdTech companies in particular, love to place the burden of “safe technology use” on parents (or teachers), rather than on the leaders who offered these harmful tools or the technology companies that intentionally built them this way in the first place. Children as young as five years old are asked to sign “Responsible Use Agreements” when they are issued school tech (and parents are told they are responsible when their child misuses it). In the current landscape of “compulsory” EdTech in schools, informed consent by parents is dubious at best, and a new amicus brief filed by the FTC calls into question whether schools can rely on the “school official” exemption for consent.
EdTech requires engagement to thrive. Children require human relationships to thrive. It is in the context of relationships that learning occurs, not in the “ability” of a manipulative app to engage a child. Parents and schools may desire reduced “screentime,” but the goal of EdTech platforms is to increase engagement. These two goals are diametrically opposed.
EdTech feeds on student data. Students’ data is highly valuable and highly vulnerable, and EdTech platforms are ripe for leaks and hacks. In Vermont, the new Attorney General announced that in the first 30 days of 2025 there had been 31 data breaches in that state alone. Other data breaches have occurred elsewhere, capturing not only private student data, but teacher records as well.
EdTech platforms can be discriminatory, whether intentionally or not. Bias is baked into AI datasets, for example, and student demographic information shared with recruiters and colleges can allow filtering by characteristics such as race or family socioeconomic status. Despite industry claims that EdTech can close achievement gaps, the opposite is occurring.
There are thousands of EdTech tools on the market. Most parents have no idea how many unique platforms their child is using for school. Internet Safety Labs found that, on average, schools use 125 distinct EdTech platforms. Some districts’ lists number in the thousands, and these are only the tools considered “approved” or “vetted”; they do not include apps or websites that individual teachers may use. At a bare minimum, parents should be provided a complete list of all district-approved apps, products, and software.
FERPA, the federal law that gives parents the right to access their child’s education records, is not enforced. Even when parents attempt to obtain the data that EdTech companies have collected about their child, they are met with silence, resistance, minimal support, or confusion, despite the fact that the law requires schools to provide this information.
EdTech is all about the business model: profit over people. It is one thing for a school to collect information about its students; it’s entirely another for a corporation to take that same information and sell it for profit, sometimes even back to the school. As long as EdTech tools must connect to the internet to function (so data can easily be collected), children’s data is not protected (and the tool isn’t safe for children).
Off the top of my head since I first drafted this essay earlier this year, I would also add:
EdTech displaces teachers. If the business model of tech is to “scale,” then “scaling” in education means doing more with less. In this case, teachers are “less”— no wonder GenAI tools are flooding classrooms. That’s the next step— AI tutors instead of teachers.
EdTech displaces creative pursuits. Can a child learn something via an EdTech platform? Probably. But the question isn’t so much what it “can” do as what it displaces in the process. Too often, tech displaces the stuff that actually matters, like critical thinking or creativity.
EdTech erodes democracy. It sounds dramatic, but it’s a very real risk. Those who control the children, control the future. (And yes, that is a paraphrase of a quote attributed to H*tler, not someone I’m prone to citing.)
Do you have any other harms to add to this list? Add them in the comments. There is much to be concerned about.
As a reminder, Tech Ed is not EdTech. We do not need internet-connected devices to teach children to read; 1:1 devices to learn about mis- or dis-information; or online curricula to teach math or writing.
Any conversation about EdTech must begin with understanding that the business model itself is the underlying problem, and that until these companies are forced— or elect (ha)— to change, EdTech products as they currently exist will never be safe for children.
*Adapted from this resource written by my colleagues at EdTech Law Center.


