Management at Australian Community Media (ACM) recently announced to staff that “AI experiments and testing” were under way in its newsrooms. Rather than reassure employees, the announcement has unsettled them. Made on October 3, it revealed that ACM is exploring the use of artificial intelligence in three primary areas: story editing and coaching, headline writing, and generating story ideas. Staff responded with apprehension, worried about the potential impact on jobs and journalistic integrity.
ACM publishes dozens of mastheads across a broad swath of eastern Australia, and it has been actively testing AI tools for more than a year. As the ABC recently revealed, the company has built a generative AI model that can help journalists write articles. Even with that progress, many affected employees worry the technology will be used to justify future layoffs. As we reported last year, ACM cut 35 jobs, blaming the losses largely on decreased funding from Meta, Facebook’s parent company.
Employee Concerns About AI and Job Security
A central concern among ACM staff is that AI will replace human jobs within the organization. Sam, an ACM employee, worried that the new technology would simply be used to rationalize further job cuts.
“Some people will lose jobs and the ones who are left behind will be left picking up the pieces,” – Sam.
Sam explained that the current workforce already feels stretched thin by existing demands. With workloads rising, many staff are on the edge of burnout, and they fear the new AI technologies will put their jobs at even greater risk.
Another staffer, Tim, questioned the trustworthiness of AI tools. He recalled one incident in particular: being assigned to write a court report, which he produced using ACM’s generative AI model. The experience, he said, underscored the dangers of relying on the technology for high-stakes reporting.
“With my knowledge about the story, I knew that could have potentially defamed someone who could have been wrongly identified from what was generated,” – Tim.
Tim’s frustrations underscore a dilemma confronting journalists at every level: the unverified, unvetted nature of AI-generated output and what it means for accountability in news reporting.
AI Testing and Ethical Implications
ACM leadership has repeatedly sought to reassure staff about the ethical use of AI in its journalism. A spokesperson stated that while the company is exploring AI tools to enhance its work, “humans make the decisions on every word we publish.” The spokesperson stressed that integrity and accuracy remain the foundation of its journalism.
“AI is not a replacement for journalists, editors or lawyers,” – ACM spokesperson.
The spokesperson further clarified ACM’s approach by stating: “We do not use Gemini to write stories or rely on it for legal advice.” The cautionary language reflects a deliberate approach to AI adoption, one that keeps human judgment at the center of editorial processes.
Terri, a documentary filmmaker and editor, raised doubts about turning to AI for legal advice before publishing. She compared it to seeking an online diagnosis rather than consulting an expert.
“It’s sort of analogous to Googling your symptoms instead of going to a doctor,” – Terri.
Terri’s comparison highlights a deeper risk: deploying AI in situations that demand specialist knowledge and complex judgment.
The Broader Industry Landscape
Experimentation with AI is not exclusive to ACM; other media organizations are pursuing similar projects. News Corp is currently recruiting AI engineers to build out its editorial AI operations in Australia and has released its own internal AI tool, NewsGPT. The ABC, meanwhile, has developed ABC Assist, an internal generative AI tool that helps journalists find archival information, summarize legacy content, and draft interview questions.
The adoption of these technologies raises critical questions about the role of human journalists. ACM presents its AI experiments as a way to expand the journalistic toolkit, but employees remain worried about the long-term impact on their careers.
Despite assurances from management regarding responsible innovation, many staff members fear that advancements in AI may ultimately lead to decreased employment opportunities within the industry.
“AI won’t completely fill the hole that’s been left behind by the people who have left,” – Sam.
As media organizations such as ACM adjust to these industry shifts, honest communication with staff should remain a priority, alongside the institutional ethics that underpin independent, responsible journalism.