Big Tech Firms Can’t Be Trusted to Decide Children’s Futures

The rise of online learning during the pandemic has accelerated the already rapid growth of the educational technology sector. But the Silicon Valley firms taking over the classroom remain far too unaccountable — and they’re massively extending the automated processes that unfairly decide children’s futures.

Advanced EdTech systems automate services ranging from curriculum design and admission decisions to course scheduling, assessments, and career tracking. (Robo Wunderkind / Unsplash)

The pandemic has only hastened the transformation of the education sector, with students increasingly forced to rely on screens and platforms. But is it being transformed in ways that serve and promote children’s best interests — or the commercial interests of the private firms who increasingly dominate the educational landscape?

As reliance on educational technologies (EdTech) rises — with the global market growing thirty-two-fold since 2010, reaching $16 billion last year — so, too, does the influence of their providers. Indeed, these providers are actively and strategically shaping the direction and character of education systems in ways that lawmakers and educators can barely understand and are not well placed to monitor. Today, there is little oversight or dedicated regulation to protect children’s data — a gap that carries significant long-term risks for society at large.

Little-understood digital education platforms and ever-evolving digital services — including data analytics and sophisticated tools for prediction and precision — are today being adopted by governments and supplied by private businesses. But there is little consideration of the long-term impact on children, education, and wider society. Such platform systems, many with artificial intelligence, machine learning, and automated decision-making built in, depend on harvesting vast quantities of granular data generated and collected every day.

This enables constant “dataveillance,” behavioral control, and, ultimately, a total loss of privacy and personal agency. Advanced EdTech systems automate services ranging from curriculum design and admission decisions to course scheduling, assessments, and career tracking. The impact of this will extend beyond education into other sectors, including the employment market, health, and social services. If left unchecked, these systems could lead to automated discrimination and human rights abuses from the cradle to the grave. Yes — it could be that bad.

Children’s “Right to Future Tense”

Within this new educational ecosystem, the majority of end users cannot consent. They are children. Nor can their parents, because they are not consulted or properly informed. And yet little regard has been paid to the governance and regulation of the EdTech industry in key recent pieces of European legislation. This needs to change swiftly if children’s fundamental rights and their futures are to receive sufficient protection.

The education sector is becoming increasingly dependent on advanced digital products and services, most of which are privately developed, managed, and owned. Yet, educational institutions and students — the majority of EdTech end users — neither understand how these products and services work, nor have clarity about businesses’ ambitions for the sector.

The pandemic response has accelerated and exemplified this dependency — with education all around the world going online overnight. We see the dangers of such dependencies everywhere — from social networks harming democracy to dating websites sending users on “junk” dates.

Digital systems are being pushed in, institutionalized, and legitimized as frameworks for everyday decision-making across the spectrum of human activity. Who gets the job, who can take out a loan, who falls under what kind of pension scheme may all be decided by an algorithm drawing on data that reach back to your childhood — data over which you have little control. These systems may enable inclusivity and automate various arduous processes, but the reliance of education on digital services will, if left unchecked, lead to “automating inequality.”

Poorer members of society are particularly likely to be subjected to automated decision-making by algorithmic systems embedded in educational processes. Assessment, attainment profiling, and accountability rely on products (often developed by private companies); on data and statistical understanding (which often falls to teachers); and on algorithms that we have already seen fail students.

Some students have suffered from algorithmic bias in scholarship and college admission decisions, and during the pandemic students all across Europe have experienced anomalous grading. Last year, the Scottish government had to apologize for the poor performance of the Scottish Qualifications Authority (SQA), after one hundred twenty-five thousand student assessments were downgraded. Students from poorer backgrounds are also more likely to be steered toward whatever futures the “algorithm reckons” are right for them. The tracking of students through their schooling lifecycle can diminish their rights and choices, and automate their future paths.

Holistic Cultural Change

The EU Digital Services Act (DSA), currently being advanced through the legislative process, can be seen as the General Data Protection Regulation (GDPR) for speech. However, it omits any mention of children or education. This may or may not be deliberate. But if Europe is to maintain its global leadership in regulation, even as it trails China and America in innovation, then its forthcoming laws also need to prioritize human choice, for instance the option for a child to opt out of surveillance-enabled applications and still be able to take an exam in class.

Nine-year-old student attends class online during the pandemic, 2020. (Ethan Miller / Getty Images)

The danger of colonizing societies and controlling human behavior through data analytics is not posed by social media platforms alone. It also stems from the structural transformation caused by “sector-agnostic” digital systems that capture entire societal infrastructures. Simply put, a child’s schooltime doesn’t end once she leaves the classroom; it extends beyond school walls, because the networked device with its multitude of applications captures information about her wherever she goes.

The EU’s Artificial Intelligence Act (AIA) is a key opportunity to improve the governance of the education sector. The AIA, for the first time, sets red lines on the use of AI in key areas and explicitly names EdTech as a high-risk application of AI. Crucially, the AIA’s existing provisions focus primarily on transparency requirements for high-risk AI, and even these are far more limited than those included in a leaked draft from January 2021. Why have they been watered down?

The AIA as it stands does not ensure sufficient external audits by independent third parties. Yet we have seen time and again how Big Tech manipulates its transparency data to produce the results it wants. Moreover, transparency reports are mostly voluntary, with no consistency in, or rationale for, what data is shared and why.

The reports are usually not communicated in ways the general public can understand, and there is no way to verify their accuracy. Experience with social media platforms shows the inadequacy of self-regulation and the failure of transparency reporting as a practice. External oversight and greater accountability for EdTech are urgently needed.

Defending the Citizen

EU legislation for digital services should deliver accountable EdTech and an “ethical race to the top.” Unfortunately, legislative development today relies heavily on lobbying, favoring powerful companies typically based in Silicon Valley and undermining the chances of smaller competitors (SMEs). Considering that SMEs make up 99 percent of EU businesses, ethical tech SMEs should at least be given fair opportunities in the market. The development of legislation should be characterized by openness and inclusivity, ensuring that SMEs are fairly represented and can participate from the early stages of the process.

The responsibility for regulating EdTech does not lie solely with the EU. National governments also have a major role to play. In particular, considerable improvements are urgently needed in the way public authorities procure EdTech. An interesting example here is the procurement rules in Scotland, where data controllers and other local authorities, not the central government, set the rules. Crucially, they do not allow student assessment data to be used to create league tables or to compare schools, purposes different from those for which the data were originally collected.

Making EdTech Accountable

Meaningfully accountable EdTech will enable society to understand the values and norms driving research and development; it will help safeguard children’s best interests and demonstrate real opt-out options and choices for young people. But for that, EdTech must be properly held to account.

Appointing a national commissioner, an EU High Level Group, or another similar institution to ensure that children’s rights are systematically considered in all legislative measures would be a significant step in the right direction. Responsibility for children’s fundamental rights cannot be siloed into and fragmented across individual legislative packages. A holistic approach to children’s rights in EdTech at both EU and national levels is needed, with the key challenges addressed in each legislative package, whether it is the DSA, the AIA, or public procurement.

In educational settings, children’s rights must be front and center. With children as the main end users of EdTech, it is imperative to produce regulatory frameworks that protect and empower them against the backdrop of a rapid, black-boxed transformation. We must take care that education does not become corporatized and that children’s data does not become a tradable commodity for EdTech giants. Ultimately, if proper attention isn’t paid today, the burden will fall on wider society.

Contributors

Velislava Hillman is a researcher in education technologies, children, and youth. She is founder and partner of education data | digital sovereignty (EDDS), a pan-European consortium that aims to empower and protect children’s data.

Ioanna Noula is a childhood and education expert. She is cofounder and partner of education data | digital sovereignty (EDDS), and cofounder and director of research at the Internet Commission, a non-profit organization focusing on advancing corporate digital responsibility.

Ben Wagner is an assistant professor at the Delft University of Technology, and a visiting researcher at the Human Centered Computing Group, University of Oxford.