Protecting our indigenous culture by staying away from AI? Or dooming it to be forgotten?
- Pamela Minnoch
- Jun 2
- 2 min read
Reflections from a TechWeek kōrero
At a recent TechWeek event, I found myself in one of those networking circles where the conversation goes deeper than expected. We started with casual intros and small talk, but soon we were discussing data sovereignty, specifically for Māori and Pasifika communities, and how it intersects with the rise of AI.
It wasn't a debate so much as a thoughtful exchange, but there was a shared tension. On one side, the very real and justified hesitations around AI and data use. On the other, a sense of urgency not to be left out of the systems shaping our digital future.
The hesitation is not just understandable. It's necessary
When people talk about data sovereignty in indigenous contexts, they're not being overly cautious. They're remembering a long history of extraction: of information, of resources, of stories and knowledge taken without permission and often without benefit to the communities they came from.
So it makes total sense that Māori and Pasifika people would approach AI with a degree of scepticism. Who's building it? Who's profiting? And what happens to our knowledge, our language, our identity when it's scraped into a machine?
But what happens if we say no?
Here's the flip side of the conversation. If we keep our data out of AI models entirely, we might protect it from misuse, but we also risk erasing ourselves from the digital world that's rapidly forming around us.
AI is becoming the lens through which much of the world sees and understands itself. If our stories, our language, our ways of thinking aren't in that lens, they won't be reflected back. Not accurately. Not at all.
Someone asked, "If we don't show up in the training data, will we show up in the future?" The answer may be yes, but we won't be represented accurately.
Reframing the role we play
What if, instead of viewing AI as a threat, we saw it as a new platform for storytelling? What if we had the power, not just to give or withhold data, but to shape how it's used? To bring richness, accuracy, and respect to how our cultures are represented?
Of course, that means careful thought and clear boundaries. It means building models with indigenous oversight, applying ethical frameworks with care, and making sure communities have both control and benefit. But it also means not sitting back. Not letting others define the narrative.
Choosing visibility, on our own terms
The conversation at TechWeek didn't land on a single answer, but it raised the stakes in a way I hadn't felt before. Yes, caution is wise. But absence can be just as risky as misuse. If we want the future of AI to reflect our world, our voices need to be in the room, and in the data.
The question is not just whether AI should use indigenous data. It's who gets to decide how, and why. And maybe that's a space where real empowerment can begin.
This post isn't about choosing sides. It's about starting a deeper conversation. Whose data is it? And who gets to decide how it's used?
I'd love to hear your thoughts!