When one does not see what one does not see, one does not even see that one is blind. — Paul Veyne

Mapping the Invisible

The ethical concerns surrounding AI in education resist capture within a single view. This is an interactive map — a way to see how forces at one scale cascade into conditions at others.

I
Planetary Forces & Economic Systems

Global

The broadest forces creating the material and economic conditions within which all educational decisions about AI occur. Carbon emissions, resource extraction, labor exploitation, and digital divides — often invisible to educators making classroom-level decisions, yet fundamentally shaping what tools are available, at what cost, and with what hidden consequences.

A. Environmental Injustice

The physical and ecological costs of AI infrastructure, systematically externalized and rendered invisible to end users in educational settings.

Climate Impact & Resource Extraction

Massive computational requirements generate significant carbon emissions and consume enormous energy and water. Data centers disproportionately burden marginalized communities. Critical minerals such as cobalt and lithium are extracted under exploitative conditions in the Global South, perpetuating colonial patterns where wealthy nations consume technology while poor nations supply raw materials at tremendous human and ecological cost.

Digital Inequality & Access [standardization]

AI tools require substantial computational resources, reliable internet, and current hardware — widening educational gaps between well-resourced and under-resourced communities. The educational ecosystem increasingly assumes AI literacy and access, creating new forms of exclusion that operate between Global North and South, between urban and rural areas, and between well-funded and under-funded schools.

Temporal Lock-in

Once educational systems reorganize around AI capabilities, reversal becomes extremely difficult — creating path dependencies that constrain future educational possibilities. Current decisions are made without input from future students who will inherit these dependencies, raising questions about intergenerational consent and our responsibility to preserve alternatives.

B. Economic Extraction

How AI commodifies human creativity, labor, and learning while perpetuating global economic inequalities.

Labor Exploitation & Displacement [labor]

AI development relies on poorly compensated workers performing psychologically damaging content moderation — work that is essential to making systems appear safe for education, yet remains invisible and unacknowledged. Within education, AI threatens to displace human educators while educators themselves inadvertently generate data that may automate their own work.

Intellectual Property & Cultural Theft

AI models are trained on copyrighted works and intellectual property without compensation or consent, transforming human cultural production into corporate assets. This extends to Indigenous knowledge systems and non-Western epistemologies consumed and repackaged for profit, perpetuating colonial dynamics of cultural theft.

Surveillance Capitalism & Data Commodification [surveillance]

At the systemic scale, surveillance operates as global data commodification — the transformation of human experience into extractable value. Educational platforms transform intimate details of student thinking into profitable data commodities, extending surveillance capitalism into the most private spaces of human development.

II
Organizations & Governance

Institutional

Where global forces become local realities. Schools, districts, and universities negotiate with vendors, establish policies, and allocate resources — mediating between planetary pressures and classroom experience. This is where surveillance capitalism materializes as specific contracts, where corporate consolidation becomes vendor lock-in.

A. Corporate Control

How commercial interests shape educational priorities and infrastructure for profit rather than pedagogical goals.

Market & Institutional Capture [standardization]

Corporations prioritize engagement metrics over educational outcomes. Vendor lock-in reduces institutional autonomy. Unlike previous technologies requiring deliberate adoption, generative AI embeds itself in existing infrastructure — Google Workspace, Microsoft Office, LMS platforms — making opting out nearly impossible and bypassing traditional consent mechanisms.

Surveillance & Algorithmic Harm [surveillance]

At the institutional scale, surveillance materializes through collection infrastructure and policy frameworks. Platforms collect vast data about student thinking and learning patterns, creating permanent digital artifacts. AI systems perpetuate biases with compounding effects across intersecting identities — students facing multiple forms of marginalization encounter disproportionate algorithmic discrimination.

B. Failures of Governance

The governance challenges and professional impacts that institutions and educators face when adopting AI.

Governance Gaps & Policy Vacuum

Deployment has outpaced regulation, creating legal ambiguity around data ownership, accountability, and rights. Technology companies shape policy through regulatory capture. Crisis-driven adoptions establish precedents that persist long after emergencies end, embedding ethically problematic technologies without adequate review.

Erosion of Professional Integrity [labor]

Institutions adopt AI without adequate frameworks, placing educators in ethically complex situations without support. Algorithmic supervision — tracking lesson plans, analyzing interactions, measuring engagement — undermines professional autonomy and trust, contributing to burnout and alienation from the creative, relational aspects of teaching.

III
Knowledge & Pedagogy

Epistemic

How AI reshapes the fundamental processes of learning, knowing, and teaching. Where institutional decisions materialize as changes in pedagogical practice — where "personalization" meets standardized output, where efficiency gains come at the cost of pedagogical intentionality, where assessment systems break down.

A. Threats to Knowledge Integrity

Epistemological challenges when AI-mediated systems reshape how knowledge is produced, validated, and circulated.

Crisis of Truth & Verification

AI generates confident but incorrect information, challenging students' epistemological development. Synthetic content floods information ecosystems, complicating authenticity and credibility assessment. Convincing false narratives raise concerns about epistemic foundations for democratic participation.

Academic Integrity & Research Automation

The difficulty of tracing AI-generated content to its original sources complicates citation and attribution. AI-generated research may be used to justify further AI adoption, creating circular validation. Automation raises fundamental questions about what counts as scholarship and whether AI-assisted knowledge carries the same epistemic weight.

Hidden Cognitive Curricula [surveillance, standardization]

AI encodes particular thinking styles — linear, productivity-oriented — while suppressing divergent, contemplative, or non-Western approaches. This hidden curriculum shapes not just what students learn but how they learn to think, narrowing what counts as legitimate thought and marginalizing ways of knowing that don't align with computational logic.

B. Pedagogical Disruption

How AI disrupts actual learning processes with implications for human development and educational purpose.

Learning Process Disruption

Eliminating productive struggle impedes critical thinking and metacognitive development. AI compresses learning time, collapsing processes that once required hours of contemplation into seconds of generation — erasing the pauses, confusion, and consolidation that allow knowledge to become truly owned rather than merely accessed.

Pedagogical Standardization & Flattening [standardization]

AI generates standardized responses that fail to account for infinite contextual variability — unique cultural backgrounds, learning differences, situational complexities. AI-generated materials appear sophisticated but lack the depth and intentional scaffolding expert educators bring, producing curriculum-shaped objects that mimic the form of good materials without their substance.

Erosion of Educational Purpose [labor]

Teaching involves ethical responsibilities beyond content delivery — modeling critical thinking, fostering moral development, nurturing intellectual courage. AI integration increasingly frames education through efficiency metrics rather than humanistic growth, reducing learning to optimization problems and students to data points.

IV
Development & Agency

Individual

Where all upstream forces ultimately arrive. Students' psychological development, relationships, sense of agency, and formation of identity. A student experiencing learned helplessness or struggling to develop an authentic intellectual voice is experiencing the downstream effects of decisions made at every other scale.

A. Relational Harm

How AI affects students' capacity for human connection, social-emotional growth, and meaningful relationships.

Artificial Relationships & Social Isolation

Students develop parasocial relationships with AI — one-sided attachments lacking reciprocity, growth, and authenticity. These may interfere with real social skills, emotional intelligence, and the capacity for genuine intimacy that emerges only through actual human interaction with its inherent messiness and demands for mutual understanding.

Emotional Dependencies

Students may develop dependencies on AI's consistent, non-judgmental, always-available responses — without the complications of human relationships. Teachers serve as crucial emotional anchors providing empathy and validation from genuine understanding; technological substitutes create affective mismatches that fail to meet actual emotional and developmental needs.

Surveillance & Loss of Trust [surveillance]

At the individual scale, surveillance transforms pedagogical relationships themselves. When AI monitors student behavior, relationships shift from trust-based connection to algorithmic discipline. Students become less likely to take intellectual risks, admit confusion, or engage authentically when they know their every action feeds systems that evaluate and judge them.

B. Loss of Agency & Identity

How AI shapes students' sense of self, capacity for independent thought, and psychological formation during critical developmental periods.

Threats to Identity & Authenticity

"Who am I as a thinker?" becomes confused when expressed thought originates from AI rather than personal struggle. Because AI interactions are private, students' intellectual development occurs invisibly to the educators and mentors who might otherwise provide guidance and the human witnessing that helps young people develop authentic self-knowledge.

Agency & Learned Helplessness [labor]

Over-reliance creates dependency patterns that undermine confidence and the willingness to engage with challenging tasks independently. This fosters learned helplessness — a belief that one's own thinking is inadequate. Yet students also actively negotiate, resist, and creatively reappropriate these tools, finding agency in unexpected places.

Developmental Disruption [standardization]

Instant AI responses reinforce expectations for immediate solutions, undermining patience, persistence, and tolerance for ambiguity essential for deep learning. The compression of developmental time prevents gradual maturation — the rhythms that allow confusion to resolve into clarity, struggle to transform into competence, persistence to build into mastery.

Seeing the Whole Board

This framework does not prescribe what educators, policymakers, or institutions should do about AI in education. It offers something more preliminary but perhaps equally necessary: a shared vocabulary for recognizing how ethical concerns manifest differently depending on the level at which we observe them.

A policymaker worried about governance and an educator worried about student wellbeing are not discussing unrelated topics — they are discussing different points along the same causal chain, each requiring different responses. Use the thread controls to trace how a single concern — surveillance, standardization, labor — transforms as it moves through scales.

A cylinder projects as a circle from one angle, a rectangle from another — neither false, but each incomplete. Only by holding multiple shadows in mind can we apprehend the full geometry.