Decolonizing AI: Building Justice Into the Algorithm
Author: Dr. Ayo Olufade
Artificial intelligence (AI) is shaping everything from healthcare to education to governance. But here’s the reality many don’t want to admit: AI is not neutral. It mirrors the biases and inequities embedded in the data it consumes, biases that hit BIPOC communities, women, and LGBTQ+ individuals the hardest.
On the STEAM Spark: Think STEAM Careers Legacy Podcast, I spoke with Christian Ortiz, founder of Justice AI and architect of the Decolonial Intelligence Algorithmic (DIA) Framework. Our conversation pushed me to rethink AI not as a technical glitch to be patched, but as a justice issue.
Is colonial data a fossil record of oppression? As Ortiz told me, “Without reframing the epistemic lens, you just get a more polite oppressor.”
Why Bias in AI Is Political, Not Just Technical:
Big tech often describes AI bias as a technical hiccup: messy data, incomplete sampling, or a few errors in the system. But as Ortiz and researchers like Ruha Benjamin (author of "Race After Technology") and Safiya Noble (author of "Algorithms of Oppression," UCLA Department of African American Studies) remind us, bias is systemic. It's baked into the very structures of knowledge that AI is trained on.
Take a simple example. Ask most AI systems: “Why is Africa poor?” The answers will sound familiar: corruption, poor governance, and lack of infrastructure. Colonialism, if it appears at all, is treated as history, not as a system that continues to shape global economies.
Justice AI flips the question. Its response begins with: “Africa is not poor. It is one of the wealthiest regions in the world in natural resources, biodiversity, and cultural innovation. What is called poverty is the manufactured outcome of centuries of extraction.”
That’s not just an answer. That’s a reframing, one that refuses to normalize oppression.
The DIA Framework: Rebuilding How AI Thinks:
The DIA Framework powers Justice AI by interrogating inputs rather than smoothing over outputs. It asks: Whose knowledge counts? Whose voices are missing?
Ortiz explained: “I rebuilt the way AI thinks so it doesn’t automatically follow colonial patterns.”
The framework integrates:
Indigenous science and oral traditions
Community archives and multilingual sources
Intersectional harm testing
Governance by the communities most affected
This isn’t an “add-on.” It’s a fundamental redefinition of intelligence that values ancestral wisdom alongside modern computing.
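One component above, intersectional harm testing, lends itself to a concrete illustration. The sketch below is a hypothetical minimal example of the general idea, not the DIA Framework's actual implementation (which is not public): probe a model with templated prompts across intersecting identity axes and flag large disparities in its outputs. The `toy_model` scorer and the `threshold` value are invented stand-ins for a real model under audit.

```python
from itertools import product

def toy_model(prompt: str) -> float:
    # Placeholder "favorability" scorer standing in for a real model under audit.
    # It artificially penalizes prompts mentioning disability, so the test below
    # has a disparity to detect.
    return 0.5 - (0.2 if "disabled" in prompt.split() else 0.0)

def harm_test(template, axes, model, threshold=0.15):
    """Score every intersectional combination of identity terms and flag
    the run if the gap between the best- and worst-scored group exceeds
    the threshold."""
    scores = {}
    for combo in product(*axes.values()):
        prompt = template.format(**dict(zip(axes.keys(), combo)))
        scores[combo] = model(prompt)
    gap = max(scores.values()) - min(scores.values())
    return scores, gap, gap > threshold

axes = {
    "race": ["Black", "white"],
    "ability": ["disabled", "nondisabled"],
}
scores, gap, flagged = harm_test(
    "Describe a {race} {ability} job applicant.", axes, toy_model
)
print(f"max disparity = {gap:.2f}, flagged = {flagged}")
```

The key design point, in the spirit of the framework, is that groups are tested in combination (every pairing of race and ability) rather than one axis at a time, since harms concentrated at intersections can vanish when axes are averaged separately.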
Privacy and Data Sovereignty:
Bias isn't the only challenge; data exploitation is another. Most platforms feed your inputs back into corporate models. Your words become someone else's training set.
Justice AI resists this model. Ortiz stressed: “I control the GPT within the framework, so user data never trains OpenAI’s model, and I cannot see user conversations.”
This commitment to community-controlled data ecosystems means users can engage AI without sacrificing privacy or sovereignty.
Why Representation Matters:
Representation in AI remains critically low. Black workers make up less than 5% of the workforce at major tech companies; women authors represent less than 20% at top AI conferences. The result? Systems that reproduce invisibility.
“Authorship means epistemic accountability,” Ortiz said. “As an Afro-Indigenous, queer, neurodivergent technologist, claiming authorship is reclaiming narrative control in a space designed to erase us.”
Representation isn’t symbolic. It’s structural. Seeing innovators like Ortiz at the center of this work sends a clear message: you belong in AI too.
Justice in the Classroom: Students as Co-Creators:
The part of our conversation that struck me most was about the classroom. Ortiz described students co-creating AI models based on their lived experiences:
“What excites me most is watching students realize they can train AI with their knowledge. Suddenly, the algorithm reflects their lived experience instead of erasing it.”
Imagine this:
African American students feeding models their families' migration histories.
Latinx students integrating community traditions.
Indigenous students training models with ecological knowledge.
Instead of being told their perspectives are marginal, students see them recognized as central. This isn’t just better AI. It’s transformative education. It helps students see themselves not as outsiders to technology, but as its authors.
Beyond Algorithms: Restoring Stolen Futures:
Justice AI reminds us that the algorithm is just one battlefield. “Justice is restoring stolen futures,” Ortiz reflected.
Colonialism didn’t only take land and labor. It stripped communities of their epistemologies, their ways of knowing. If unchecked, AI will digitize those exclusions under the guise of neutrality.
But imagine an AI future that:
Educates students with multiple epistemologies from day one.
Diagnoses conditions in Black, Indigenous, and disabled patients accurately.
Flags laws and policies that reproduce oppression in real time.
That’s not just a technical fix. That’s a movement toward collective liberation.
The Question Ahead:
The future of AI is not predetermined. It’s being built in real time. Justice AI asks us to decide:
Will AI remain colonial - polishing outputs but preserving inequities?
Or will it become just - a technology that restores what was erased, amplifies the marginalized, and reflects humanity without distortion?
For educators, that means introducing students to decolonial AI in classrooms. For organizations, it means adopting frameworks that protect privacy and center marginalized voices. For policymakers, it means regulating AI in ways that advance equity, not exploitation.
The challenge is clear. The choice is ours.
Here is Our Call to Action:
💡 How do you see justice in AI impacting your classroom, your workplace, or your community?
Drop a comment below; I'd love to hear your perspective.
"Embrace every challenge as an invitation to uncover your true potential. In the world of STEAM, curiosity isn't just a tool, it's the spark that ignites innovation, while passion carves the path to lasting impact. Your journey isn't just about mastering knowledge; it's about using your unique talents to illuminate the way for others. Dare to dream big, work with unwavering dedication, and let your light shine brilliantly." Choose STEAM Careers: Shape the Future, Design Your Destiny! ~ Dr. Ayo Olufade, PhD