As Artificial Intelligence is mainstreamed into our media, classrooms and workplaces, journalists are the last group to swallow the metaphorical AI pill. Because journalism is a career built on human connection, keeping up with modern technology while upholding journalistic integrity and ethical responsibility poses a unique challenge.
“It’s really hard to say we don’t use AI because AI is in everything,” journalism adviser Michelle Balmeo said. “When you click ‘remove background’ from a Photoshopped image, that’s AI. It’s a really big bucket that we think means ChatGPT, which is a large language model. But when you shop on Amazon, that’s AI. Like, when you get your Spotify Wrapped, that’s AI. It’s everywhere.”
The integration of artificial intelligence into journalism truly is a double-edged sword. While AI tools can revolutionize the industry by enabling faster fact-checking and enhanced data analysis, their use comes with caveats. For example, automated reporting systems could quickly churn out sports recaps, freeing up human journalists to tackle more nuanced stories. However, the speed and efficiency of AI must be balanced against the core principles of journalism: accuracy, accountability and trust.
“I think AI in general is really good at sifting through a lot of data really fast,” Balmeo said. “And pulling out patterns. Pattern recognition, insights, highlights. I think that’s a really good use of AI because that stuff takes a long time if you do it by hand.”
Most journalists and journalism educators agree on one thing: AI should not be used for content creation. That includes generating stories or images for publication, which diminishes the human connection between writer and reader. Publications that allow AI to replace human effort risk losing authenticity and alienating their audiences.
“If we say, ‘We’re going to let people post AI images and we’re just going to credit them and say, this was made by AI,’” journalism adviser Bradley Wilson said. “The problem is that it’s a journalistic website. Somebody seeing that and seeing, ‘Oh, this is an AI-created image.’ You can’t prevent that person from then associating other content on the site with AI. So now every other piece of art, every photograph, every journalistic story, every quote. It almost taints the journalistic work on that site because you’ve opened up the possibility that you’re an organization that allows the use of AI for content creation.”
Beyond general concerns about the ethics of AI, a more pressing issue is that the technology can be weaponized to spread fake news and deepfakes, further blurring the line between what is real and what is fabricated. When speed takes priority over thoroughness, the industry risks undermining its own credibility and eroding audience trust.
“We don’t want to lose the trust,” Wilson said. “If you look at the Gallup poll on trust in the media, our trust is not at an all-time high by any stretch of the imagination. So, the last thing we want to do is anything that will cause us to lose more trust.”
Journalism thrives on human connection: interviews, lived experiences and empathy are essential to creating stories that resonate with audiences. While AI can mimic certain aspects of human writing, it cannot, and may never, replicate the depth and authenticity of human interaction.
“The thing that journalism does is it talks to people,” Wilson said. “A good journalist is going out and talking to real people, and then synthesizing and extracting from multiple conversations with real people, the most meaningful or significant pieces to make sense of something complex. AI, at least right now where we’re at, can’t do that.”