We the People, Not We the Algorithm: How the Government Failed to Ask for Our Consent
Part 2 of a 3-Part Series on Reclaiming the Right to Be Human
By Graham Skidmore
I Didn’t Consent to Be Represented This Way
I was raised to believe that government was designed to represent the people. That our elected officials were meant to listen, not dictate. That leadership required consent, not assumption. That liberty included the right to participate in decisions that shape our lives and our future.
But when it comes to artificial intelligence, its development, its deployment, and its growing influence, one thing has become painfully clear:
The people have been left out.
There have been no meaningful invitations for public input. No national dialogue. No space to question how this technology will shape our lives. Only top-down decisions made behind closed doors.
Instead of protecting the public interest, our government is too often partnering with corporations to define what “progress” looks like, without asking who it should truly serve.
And let’s not overlook the irony here: The very technologies being praised as “smarter than us” are trained on the data we create.
Our words.
Our clicks.
Our choices.
Our humanity.
Our lived experiences.
We are the input, but not the decision-makers. We are the source, but not the beneficiaries.
This isn’t about rejecting technology. It’s about reclaiming representation.
Because if democracy means anything, it must mean this: that the will of the people, not the will of an algorithm, guides the future we build together.
And yet:
No one asked us.
We didn’t get a vote.
We didn’t get a town hall.
We didn’t get time to participate in what this means for our children, our work, our values, or our lives.
Instead, we were told it’s coming. Told it’s smarter. Told it’s inevitable.
But the question remains: Who gave permission to redefine the human experience without human input?
Whose Will Is Being Represented?
Let’s return to the foundational words: “We the People.”
Those three words open the Preamble to the United States Constitution, a living document affirming that power flows from the people, not the other way around.
And yet today, government leaders are aligning with tech giants to determine how AI will be used in healthcare, education, the military, and the economy, without genuine public dialogue. Policy decisions are being made behind closed doors, shaped by CEOs, not citizens, and influenced by industry, not the everyday person.
This isn’t representation. It’s imposition.
When our data becomes a currency of trade without our awareness or consent, it’s not just a policy oversight; it’s a violation of consent itself.
We are not property. Our bodies, minds, and data are not commodities to be passed from government to business under the guise of innovation.
Consent, autonomy, and personal sovereignty must be seen as sacred. It’s time we expand our expectations, because our expectations shape the systems we live within.
Policy Without Permission
The speed at which AI is being adopted across federal agencies and public infrastructure is staggering, and largely invisible to the people it affects.
Facial recognition software is being used without clear consent. Predictive policing algorithms disproportionately target marginalized communities. AI is screening resumes, approving loans, and denying benefits, often with no transparency and no accountability.
This isn’t about innovation. It’s about involuntary participation in systems we never agreed to.
Governments justify these decisions with terms like “efficiency” and “progress,” but we must ask: Progress for whom? Efficiency at what cost?
The truth is, policy is how a government enforces its agenda. But policy can also reflect the values of its people, if government accepts its responsibility and listens.
Where Consent Breaks Down
There’s a vast difference between being governed and being ruled.
Representative government is rooted in dialogue, consent, and the protection of individual rights. Authoritarian systems impose decisions from the top, insisting they know what’s best.
So let’s ask plainly:
Did you choose to let an algorithm decide what job you’re qualified for?
Did you consent to having your data fed into systems you can’t access, audit, or question?
Did you agree that your children should grow up in a world where machines predict their worth and shape their choices?
Did you approve of prioritizing billions for AI development while human needs, like mental health, education, and emotional intelligence, remain underfunded and overlooked?
Did you ever say yes to a future where artificial intelligence is fast-tracked, while the evolution of authentic human intelligence is sidelined?
If the answer is no, then something fundamental has been violated, not just trust, but the very principles of liberty and personhood.
Our digital identities deserve the same constitutional consideration as our physical ones. The government has a responsibility to protect our human dignity, not hand it over to private interests.
The Cost of Low Expectations
Much of what we’re facing today is the result of collectively lowered expectations. Our systems weren’t built to support thriving. They were built to maintain order, and now, those limits are costing us dearly.
Let me offer a few examples of low expectations that stifle both individual empowerment and economic growth:
When people capable of living independently are forced into systems of dependency
When people who want to be healthy can’t access the care they need
When people who want to work can’t access fair opportunities or earn a living with dignity
When people who want to learn can’t afford education or personal development
These are not personal failures. They are system design failures.
We must raise our expectations, not just for how we are governed, but for what government expects of itself. Not just to protect, but to empower. Not just to regulate, but to raise the standard of living and quality of life for all its people.
What Real Leadership Would Look Like
Ethical leadership doesn’t hide behind buzzwords like “efficiency” and “innovation.” It listens. It protects. It co-creates. It asks not just, “What can technology do?” but “What kind of life do we want for our people?”
This moment, this intersection between AI, policy, and power, is not just a crossroads.
It’s a mirror. It’s an opportunity for the government to step back and reassess:
What expectations do we hold for the quality of life in this country? And what expectations should the government be holding for itself on behalf of its citizens?
If it truly serves us, then the government must act as a steward of both innovation and well-being. It must ensure that the tools of progress serve not only profits, but also people, communities, and the planet.
Here’s what that kind of leadership could look like:
National Listening Tours to gather authentic public input on AI, automation, and how people want to live
Transparent AI Policy Councils that include citizens, spiritual leaders, and community elders, not just corporate lobbyists
Consent-Based Tech Infrastructure where public services are opt-in by default, and dignity is built into every digital interaction
Elevated Civic Expectations where health, education, safety, and purpose are treated not as privileges, but as baseline rights
Cross-Sector Collaboration where business innovation aligns with public ethics, not just shareholder returns
Public Investment in Human Potential, not just artificial intelligence, so our spiritual, emotional, and creative intelligences evolve alongside technology
This vision is not utopian. It’s already beginning.
In 2025, Hawaii became the first U.S. state to pass a climate change tourist tax, a consumption-based policy that acknowledges a deeper truth: That our collective future requires shared accountability.
It’s a bold example of government choosing stewardship over short-term gain. It’s a signal that policy can align with planetary ethics. And it reminds us that taxes are not just economic instruments; they are moral declarations.
We need more of that, policy that protects our resources, invests in our people, and serves the long-term well-being of all life.
We need a government willing to say: “Yes, we will support innovation, but we will invest in humanity first.” Because a thriving society isn’t built by optimizing systems. It’s built by honoring the people those systems were meant to serve.
This Isn’t About Being Anti-Tech
This isn’t about resisting technology. It’s about challenging our government to reimagine what progress really means.
Because while billions are being poured into artificial intelligence, we have yet to see a national strategy for advancing the quality of human life. We have yet to see policies that prioritize the evolution of human intelligence, in its emotional, creative, and communal dimensions, alongside machine learning.
We have yet to hear our leaders ask: What kind of world are we building with all this power?
Technology was meant to relieve us of repetitive tasks and repetitive thought processes, not to replace our agency, creativity, or sense of purpose. When wielded ethically, AI can support human evolution. But it is up to the government to lay the foundation for that alignment.
That means using policy, taxation, and investment not just to stimulate markets, but to elevate lives. It means expanding what we fund, protect, and incentivize in service of a healthier, more resilient society.
Here are just a few sectors where the government could lead with vision:
Healthcare: Prioritize individualized care models that reduce long-term costs and increase well-being
Education: Fund personalized, curiosity-driven learning to unleash human potential and inspire new industries
Employment: Support businesses that value people and purpose, not just productivity
Justice Reform: Incentivize systems that expand access, healing, and trust in public institutions
Mental and Emotional Development: Create tax credits and infrastructure for the growth of authentic intelligence—empathy, creativity, critical thinking, and inner stability
Taking care of people is not a soft expense. It’s the most strategic investment a society can make.
If the government can offer tax breaks for automation, it can offer even more for elevation.
That’s not anti-tech.
That’s pro-human.
Government Serves at the Consent of the Governed
Let me say this clearly: We do not exist to serve systems. Systems exist to serve us.
And any government that forgets that, any institution that chooses expediency over dialogue, is not just neglecting its duty. It’s abandoning the soul of democracy.
Right now, we’re watching a dangerous trend take shape.
Businesses are touting record profits from automation and workforce reductions, celebrating “efficiency” while human potential sits idle and under-supported. And the government, instead of intervening, is offering subsidies and tax breaks that reward this extraction.
Where is the public policy that asks: How can we redeploy this displaced human capital into areas that expand national well-being?
We need a governing vision that sees people not as labor to be cut, but as capital to be cultivated. People who are ready to heal, ready to learn, ready to build, teach, care, create, and serve, if only the systems made space for them to do so.
We the People did not consent to become programmable inputs.
We are not waste to be optimized away.
We are the authors of this society.
And it’s time we demanded systems that reflect our worth.
Rise in Voice and Expectation
If the government held a town hall tomorrow and asked, “How would you like AI to be used in your life?” what would you say?
That question still matters. Your voice still matters. And your expectations are not naïve; they are the blueprint for a more just, innovative, and empowered society.
Drop your thoughts in the comments. Share this with someone who needs to remember their worth.
Let’s raise our collective expectations.
Let’s remind our leaders who they serve.
Let’s build systems that honor both humanity and progress, together.
Graham Skidmore
President, EnGen | Ethical Technology Advocate | Systems Rebuilder


