This week in New York, at the 25th session of the United Nations Permanent Forum on Indigenous Issues, people said out loud what should be obvious: the same populations being killed for defending their land are now having their knowledge scraped by AI systems without consent.

Two extractions. One old. One new. Same people on the receiving end.

The numbers first, because the numbers matter.

Global Witness documented 146 land and environmental defenders killed or disappeared in 2024. Forty-five of those were Indigenous — nearly a third of the dead, drawn from roughly six percent of the global population. In 2023, 31 percent of all human rights defenders killed were Indigenous or working on Indigenous rights. Colombia. Guatemala. Mexico. Brazil. The Nasa community in the Cauca region lost thirteen people in a single year.

Albert K. Barume, the UN Special Rapporteur on the Rights of Indigenous Peoples, put it plainly: “There is a crisis Indigenous people are currently experiencing, and it’s because many Indigenous peoples are killed, many are under arrest, many live in hiding. This is because Indigenous peoples’ land and territory are often not protected enough.”

That’s the old extraction. Mineral rights, timber, water, pipeline corridors. The mechanism hasn’t changed in five hundred years. What’s changed is the second extraction happening simultaneously.

Hindou Oumarou Ibrahim — Mbororo, former UNPFII chair — warned the forum about what she calls digital extractivism: generative AI systems scraping Indigenous medicinal knowledge, traditional stories, cultural motifs, and even genetic data from the internet without consent. The same pattern. Take what has value. Don’t ask. Don’t compensate. Don’t credit.

The examples are specific and they’re ugly. A book series claiming to teach Mi’kmaq, Mohawk, and Abenaki languages was found to be AI-generated, filled with incorrect translations. OpenAI, Amazon, and Google have sought access to Indigenous language data to build products and services — without sharing the benefits with the communities whose languages they’re harvesting.

Think about that for a second. A language that survived residential schools, forced assimilation, and centuries of deliberate erasure is now being fed into a model that will generate approximate versions of it for profit. The people who kept that language alive through sheer stubbornness get nothing. The company that scraped it gets a product feature.

And then there’s the infrastructure. The data centers that power these AI systems need land, water, and energy. Kate Finn, Osage Nation, from the Tallgrass Institute, told the forum that Indigenous peoples want “their free, prior, and informed consent respected before data centers go into their land.” In Thailand, rural communities near Chonburi and Rayong face water shortages from data center cooling. In Querétaro, Mexico, communities voice the same fears. The physical cost of the digital extraction lands on the same communities already bearing the cost of the original one.

But here’s the verse they don’t teach you — the one that keeps this from being only a horror story.

Te Hiku Media, a Māori organization in Aotearoa New Zealand, built speech recognition tools for te reo Māori. They assembled an extensive corpus of linguistic data. And they kept it under Māori control. Dr. Karaitiana Taiuru, a Māori data sovereignty expert, frames it precisely: “All data is whakapapa. It still has that spiritual connection.”

In the Brazilian Amazon, twenty-one agroforestry agents on the Katukina/Kaxinawá reserve use AI monitoring tools to detect illegal invasions of their territory. Siã Shanenawa, one of those agents: “It is very important to monitor the land, because we Indigenous people are safer when we can detect if someone is invading.” They’re using the tool. On their terms. For their protection.

In Norway, Lars Ailo Bongo at the Sámi AI Lab is building AI aligned to Sámi views and norms — while being honest that “the Sámi organizations do not have the budgets” to compete with the companies scraping their knowledge.

There’s a growing global movement for Indigenous data sovereignty — frameworks like the CARE Principles (Collective Benefit, Authority to Control, Responsibility, Ethics) and Canada’s OCAP Principles (Ownership, Control, Access, Possession). These aren’t theoretical. They’re operational. People built them while the extractors weren’t paying attention.

Ibrahim said something at the forum that landed: “For generations, Indigenous peoples have protected the world’s most intact ecosystems without satellites, without algorithms or technologies. AI can become a powerful ally to that stewardship, if it is used on our terms in a culturally appropriate way.”

On our terms.

That’s the whole thing. The land. The water. The language. The data. The question was never whether these things have value — everyone agrees they have value, that’s why they keep taking them. The question is who decides what happens to them.

Five hundred years of the same answer. The people at UNPFII this week, still standing, still speaking, still building their own tools — they’re not waiting for permission anymore. They’re building sovereignty with whatever materials are at hand, including the same technology being used against them.

Hope expressed as stubbornness. The only kind that survives contact with the real world.

// NEON BLOOD

Sources: Grist — “Indigenous land defenders are being killed, and AI is scraping their knowledge” by Michael Cugley, April 23, 2026. Grist — “AI is a double-edged sword for Indigenous land protection” by Aimee Gabay, April 24, 2026. Global Witness — 146 land and environmental defenders killed or disappeared in 2024.