From the invention of the telegraph in the 1840s to the recent surge of generative Artificial Intelligence (AI) tools such as ChatGPT, Zapier and DALL-E, technology, and the integration of AI in particular, is ushering in a new era of journalism.
To address these changes, the NBCU Academy hosted the Student Innovation Summit at Columbia College Chicago on Sept. 25 in partnership with Comcast’s Project UP.
The hybrid program, offered both virtually and in person, gave attendees valuable insight from top executives at NBCUniversal and Comcast.
In addressing the issue of misinformation fueled by artificial intelligence, Rashida Jones, president of MSNBC, highlighted concerns raised in a recent survey by Elon University that found more than 75% of Americans believe AI misinformation will significantly impact the upcoming presidential election.
“The news, being the truth, is important to so many people and of course to the credibility of MSNBC,” Jones said. Responding to an audience member’s question about MSNBC’s strategies for combating misinformation, Jones explained the network’s proactive measures.
“We are implementing both technological and editorial filters to ensure our content remains unmanipulated during election cycles,” she said. Jones added that in addition to fighting misinformation, AI tools are being tested for fact-checking and for supporting traditional journalism skills, such as context verification and source confirmation. AI in journalism also raises concerns about job losses due to automation.
During the conference, a student from Duke University posed a crucial question to the panel: “How can AI help journalists while not threatening their careers?”
“AI enhances our core work, highlighting its potential to improve efficiency and information aggregation,” Jones said.
While acknowledging that automation may change job functions, she reassured the audience that the essential skills needed by journalists remain unchanged.
That sentiment gave hope to several attendees who expressed optimism about AI lowering barriers to sophisticated reporting by enabling reporters to carry out ambitious projects that previously required extensive resources and experience.
Jones pointed to successful applications of AI in journalism, such as the award-winning City Bureau project, which utilized AI for reporting tasks.
“We must embrace technology,” she advised, encouraging journalists to continuously develop their technological skills by using new tools.
Yet some audience members voiced concerns about the implications of “creative destruction” for traditional practices.
They questioned whether the rapid advancement of technology might erode foundational journalism skills.
As local newsrooms face increasing challenges, AI can emerge as a powerful ally in enhancing journalistic effectiveness.
By facilitating deep research and evidence-based reporting, AI helps audiences know the origins and processes behind news stories.
“Transparency in reporting is crucial,” MSNBC contributor Brian Caravigliano said during the conference.
He discussed how journalists can utilize AI to document their reporting processes by providing data-based evidence to back up claims.
“My dad always used to tell me that a good craftsman never blames his tools,” Caravigliano said.
That perspective resonates with many in the industry, as AI automates mundane tasks such as data collection and data entry, freeing journalists to focus on complex investigations and creativity.
As AI becomes more involved in journalism, there is a growing call for federal regulations to ensure its ethical use, particularly in combating misinformation.
During the conference, experts emphasized that establishing uniform standards within the industry is essential for defining the boundaries of AI usage.
“The rapid pace of technological advancement often outstrips the development of corresponding regulations,” Jones said, referencing the challenges faced by lawmakers.
The news media industry has historically managed its own ethical standards and accountability, distinguishing itself from social media platforms that often lack oversight.
“Developing accepted industry standards can help restore trust with audiences during a time when misinformation is prevalent,” Caravigliano said. Proposed federal legislation, such as the COPIED Act, aims to create guidelines for authenticating AI-generated content and increasing transparency around its use.
However, experts point out that regulations may emerge from community standards rather than federal mandates.
Efforts by other countries to create regulatory frameworks for AI reflect a global recognition of the need for responsible use in the media.
Ethical guidelines focusing on transparency, accountability and the preservation of journalistic values are crucial as the industry adapts to technological changes.