AI 'Aboriginal Steve Irwin' Sparks Outcry Over 'Digital Blackface'

A wildly popular social media wildlife personality, hailed by followers as an 'Aboriginal Steve Irwin', has been exposed as a complete fabrication – an artificial intelligence-generated avatar that has sparked fierce ethical debate and accusations of 'AI blackface'.

The Bush Legend: A Digital Fiction

Known online as the Bush Legend, the character, named Jarren, appears in videos set in the Australian outback, handling snakes and searching for eagles to the sound of didgeridoos. With more than 180,000 combined followers on Meta's Instagram and Facebook platforms, his enthusiastic, 'mate'-filled delivery captivated an audience who believed they were watching a real Indigenous conservationist.

However, the man, the wildlife encounters, and the narration are all products of artificial intelligence. Meta's data indicates the account, created in October 2025, is operated from New Zealand. The person believed to be behind it, a South African national living in New Zealand, did not respond to requests for comment.

Experts Decry 'Cultural Flattening' and Theft

The deliberate creation of an Indigenous-presenting avatar has raised serious ethical alarms. Dr Terri Janke, a leading expert in Indigenous cultural and intellectual property, described the initial realism as 'remarkable' but said the act was offensive and risked 'cultural flattening'.

"Whose personal image did they use to make this person?" Dr Janke asked. "I feel a bit misled by it all." She warned that such AI content poses a particular threat to marginalised communities, constituting a form of insidious theft that can steal opportunities from authentic Aboriginal rangers and educators.

Tamika Worrell, a Kamilaroi woman and senior lecturer in critical Indigenous studies at Macquarie University, was more blunt, labelling the avatar a form of cultural appropriation and 'digital blackface'.

"It's AI blackface – people can just generate artworks, generate people, [but] they are not actually engaging with Indigenous people," Worrell stated. She highlighted the dual harm: such accounts often share only 'palatable' aspects of culture while also providing a new vector for racist commentary, which she observed in the account's follower comments.

AI's Built-in Biases and an Implausible Defence

Professor Toby Walsh, a leading AI researcher at UNSW, explained that AI systems inherently carry the biases of their training data, potentially perpetuating stereotypes. He also warned that the digital 'tells' of AI are vanishing, making it 'next to impossible' for the public to discern real from fake content.

Facing criticism, the Bush Legend account issued a defence through its own AI avatar, claiming it does not seek to 'represent any culture or group' and is simply about animal stories. It told critics to 'scroll on' if they disliked it, despite earlier prompts asking followers for a $2.99 monthly subscription.

The controversy underscores the urgent ethical and legislative vacuum surrounding AI-generated content and its capacity to appropriate identity, mislead the public, and cause profound cultural harm.