This guide has already mentioned some of the risks and possibilities of using AI in your research.
So far, the guide has presented these risks in relation to your research project and its lifecycle:
- that is, research conception, funding, data collection, analysis, publication/dissemination, and re-use.
But the use of AI in research without critical and expert human oversight has broader social, cultural, and planetary implications.
Technological Colonialism & Related Prejudices
- AI tools reproduce and amplify historically entrenched knowledge hierarchies established by colonial projects. Programmed with Western logics and worldviews, they are incapable of representing the epistemic diversity of our cultural worlds. This makes them instrumental in furthering the marginalisation, and the misrepresentation, of non-Western and non-Anglo-American knowledges and perspectives.
For example, GenAI tools, which cannot represent living cultures and knowledge communities with the complexity or nuance required, have been found to flatten and misrepresent Aboriginal and Torres Strait Islander cultural production and heritage.
- Relatedly, prejudiced and stereotyped narratives are often baked into the way AI tools gather, organise, and distribute knowledge about diverse identities and subjects. They circulate discriminatory representations and understandings of gender, sex, and cultural diversity, as well as extending ableist epistemes.
Worsening Social Inequalities
- AI tools' inputs and outputs are products of the digital divide; their 'learning' is done to the exclusion of people around the world who have limited or intermittent access to technology and GenAI. While these perspectives remain excluded, AI tools reproduce and contribute to global tech inequality.
- GenAI depends on metadata. The data tagging involved is often carried out under exploitative labour conditions that fall on poorer communities and disenfranchised people around the world.
Planetary Destruction
- GenAI uses colossal amounts of electrical power and freshwater.
- Overall, the creation and use of GenAI (especially image-generating AI) is not yet on a sustainable path.
It is important to approach AI use with open eyes.
Resources:
Indigenous Protocol and AI Position Paper: "[...] is a starting place for those who want to design and create AI from an ethical position that centers Indigenous concerns".
Williams, A., Miceli, M. and Gebru, T. (2022) The Exploited Labor Behind Artificial Intelligence, Noema (October).
Pogrebna, G. (2024) AI is a multi-billion dollar industry. It's underpinned by an invisible and exploited workforce, The Conversation.
Stasiuk, G. and McMullan, J. (2025) How AI images are 'flattening' Indigenous cultures – creating a new form of tech colonialism, The Conversation.
UN Environment Programme (2024) AI has an environmental problem. Here's what the world can do about that.