Power, Agency and the Speed of AI
- Pamela Minnoch

Is AI moving faster than humans can keep up?
There's a particular kind of unease that comes with watching a technology evolve faster than our ability to make sense of it. Artificial intelligence has pushed us into that space. Every month, sometimes every week, we see new systems emerging: more capable, more autonomous, more deeply woven into the decisions that shape our lives.
For many of us, there's a growing sense that society is being swept along by a current we didn't choose, at a speed we didn't consent to. And at the heart of this acceleration sits one question that will define the next decade:
How do we protect human agency when intelligence itself is being reshaped?
This is a question about power, not technology.
The shrinking window where humans still steer the direction
The truth is that AI is no longer a collection of niche tools sitting quietly in the background. It has become an infrastructure layer, like electricity or the internet, except this time, the infrastructure is capable of making decisions. And because these decisions increasingly influence how we work, how we access services, and how we participate in society, the people who build these systems now hold extraordinary power.
What makes this moment so fragile is not just the speed of progress, but the concentration of influence. A small number of companies, teams, and leaders are setting the rules, sometimes intentionally, often by accident, for how intelligence evolves. Their choices ripple out across industries, governments, and communities that never had a chance to participate in the conversation.
When the pace of technological change outruns the pace of democratic oversight, agency shrinks. The decisions we make as leaders become reactions, not deliberate choices. We're left trying to make sense of tools that evolve faster than our policies, our ethics frameworks, and in some cases, our imagination.
This is a reminder that power doesn't disappear; it shifts.
Why the speed of AI is different from every other technological shift
History is full of technological transitions. But AI is different because it isn't just replacing physical labour or automating routine processes. It's beginning to change how decisions are made. And once decision making shifts, everything else follows.
We're stepping into a world where:
- risk models are written by algorithms,
- customer journeys are shaped by predictive logic,
- governments experiment with AI-assisted governance, and
- everyday people increasingly rely on models to think, create, and act.
This is not the slow, generational transition of the industrial revolution. This is transformation in compressed time. And when systems move faster than the humans who depend on them, our ability to respond thoughtfully narrows.
We lose room for nuance.
We lose room for public dialogue.
We lose room to question whether we want to follow the path being laid out.
The pace forces us into acceptance before we've even had the chance to understand what we're accepting.
The ethical heart of the matter: stewardship, not control
AI ethics isn't about slowing down progress or resisting innovation. It's about ensuring the direction of progress reflects the values of the communities who will live with its consequences.
Stewardship begins with acknowledging that:
- technology should serve people, not replace their sense of agency,
- humans, not algorithms, must remain accountable for decisions,
- power structures must be visible and contestable, and
- speed cannot become an excuse for abandoning responsibility.
Ethical leadership in the age of AI asks us to hold two truths at once: We cannot stop the pace of advancement. But we absolutely can shape the conditions under which it unfolds.
The future doesn't have to be something that happens to us. We can choose to make it something shaped with us, and ultimately for us.
A final thought: this is the moment that matters
There's a window right now, small but still open, where humanity holds the pen, where we can decide how AI integrates into society rather than waking up in a world where those decisions were made without us.
Leaders who step into this space with clarity, humility, and courage will help shape a future in which intelligence expands human potential rather than shrinking it.
Because the defining question of this era is no longer "What can AI do?" It's "How do we remain authors of our own story as intelligence evolves?"
This is a crossroads, and our leadership matters more than we realise.


