UNE leads on Indigenous rights in AI


As artificial intelligence reshapes university education, the University of New England is addressing an important question: how can institutions ensure AI development respects Indigenous rights rather than perpetuating existing patterns of knowledge extraction? 

The UN's 2025 International Day of the World's Indigenous Peoples theme - "Indigenous Peoples and AI: Defending Rights, Shaping Futures" - highlights growing global concerns that AI could repeat colonial patterns of knowledge extraction in digital spaces. At UNE, Professor Peter Anderson, Pro Vice-Chancellor Indigenous, is working with LabNext70 to ensure the university charts a different course. 
 
"Much like the curriculum and experience at UNE, we want our students to be cognisant of Indigenous knowledges, perspectives, engagement that also crosses over into AI,” says Professor Anderson. “And how we create students who are AI-literate but also literate in Indigenous knowledges in the AI space." 

 

Building on rights, not good intentions

"Universities as a whole historically have been extractive in terms of Indigenous knowledges," Professor Anderson explains. "We've got a model of how not to do it and how not to be extractive with Indigenous peoples and knowledges. This is a global conversation, and being here at UNE, as we launch Madgwick for staff and now also for all students, it's about how can we can embed Indigenous rights into our AI development and use." 

Professor Anderson's approach draws on his research into Indigenous rights-based methodologies and on the UN Declaration on the Rights of Indigenous Peoples - a framework that Australian universities have endorsed but find challenging to operationalise in meaningful ways. His recent research paper, "Indigenous rights-based approaches to decolonising research methodologies in settler colonial contexts", argues that despite decades of calls for decolonising research practices, Australian academia continues to default to Western approaches that frequently marginalise Indigenous ways of knowing. The paper demonstrates how rights-based frameworks can move beyond tokenistic inclusion to create genuine partnerships that support Indigenous self-determination.

"All of my research is actually informed by the United Nations Declaration on the Rights of Indigenous Peoples," he notes. "Universities Australia’s Indigenous strategy 2022-20252 has made mention of it. It also provides useful guide for researchers and institutions as an ethical framework of doing business with Aboriginal and Torres Strait Islander peoples and communities. However, what we haven't done as institutions across this country is really looked at how do we embed that and enact these." 


From extraction to knowledge and tech sovereignty

This rights-based approach becomes even more critical when applied to AI development, where the stakes are particularly high. According to the UN concept note, tech companies are training AI systems on vast datasets that may include Indigenous languages, cultural materials and knowledge - often without permission or awareness. This mirrors historical patterns of appropriation - but at an unprecedented scale.

UNE's response centres on a fundamental concept: Indigenous data sovereignty - the principle that Indigenous peoples have the right to control data about their communities, cultures, and knowledge systems. The university's Madgwick Student platform, launching in November as part of a staged rollout that will extend free AI access to all students, incorporates technical architecture that aligns with these sovereignty principles by keeping user data under individual control rather than feeding it into external AI training systems. "The way Madgwick handles data aligns closely with the principles of Indigenous data sovereignty," Professor Anderson explains. "Within Madgwick, data remains securely within UNE's environment and under the control of the individual user. It is never shared externally or used to train commercial AI models, ensuring that ownership and control of the data stay where they belong - with the user."

As Chair of the Indigenous Data Network, part of the Australian Research Commons, Professor Anderson recognises that the sector still lacks a unified understanding of what Indigenous data sovereignty means in practice. "We still haven't got a uniform statement on it that everything can fit under, but with Madgwick … we've got it already."


From decolonial theory to practical action

Professor Anderson's approach offers a practical pathway through complex theoretical territory. While the sector talks extensively about decolonisation, he argues that a rights-based framework provides clearer direction. 

"In the sector, what everyone's talking about is decolonisation - but you don't know how to get there," he explains. The concept of Indigenous Rights functions as the active force that facilitates movement into decolonial or decolonised academic spaces. Often individuals at faculty level and researchers can't see the endpoint of 'decolonial', whereas through rights you actually say, ‘I've done this, and this, and this that work towards that agenda.’"  

Looking ahead, Professor Anderson envisions partnerships that would invert traditional extractive relationships. The secure environment created by Madgwick could enable collaborations with local communities to develop AI assistants controlled by the communities themselves. 

"AI assistants developed by Indigenous academics and experts or in partnerships with Indigenous communities, where these communities retain ownership and control over cultural knowledge and determines how and when it is shared." 


Beyond access to genuine equity

The conversation about AI and Indigenous rights also extends to notions of access and equity in education. As Professor Anderson argues, true equity in the AI era means ensuring Indigenous students graduate with the technological literacy needed for an uncertain future. 

"We always think about getting Indigenous students in and how do we actually progress them through," he observes. "But equity is beyond getting somebody in the door and beyond getting them into the classroom. This is actually getting them the skills to be engaged in new jobs, a new tech-focused world." 

He continues: "We don't even know what work is going to look like in five years. But we believe that tech literacy, particularly around AI, can help equip graduates with skills increasingly valued in emerging job markets."

 
 