The Associated Press (AP) has recently issued guidelines on the use of artificial intelligence (AI) in journalism. The guidelines explicitly state that AI should not be employed to generate content or images for publication by the news agency. At the same time, they emphasize the importance of staff familiarizing themselves with this rapidly advancing technology.
AP joins a small group of news agencies that are establishing protocols for incorporating cutting-edge tools, such as ChatGPT, into their operations. In a move to further educate journalists, AP will be adding a chapter to its widely used Stylebook this Thursday. The chapter will offer guidance on reporting AI-related stories and will feature a glossary of related terms.
Amanda Barrett, AP’s vice president of news standards and inclusion, commented on the initiative, stating, “Our objective is to strike a balance between innovation and safety.”
The Poynter Institute, a respected journalism think tank, labeled this period as a “transformative moment.” Earlier this spring, they encouraged media outlets to formulate and share their AI usage policies with their audience.
Generative AI, which can produce text, images, audio, and video on demand, still struggles to distinguish fact from fiction. Consequently, AP advises that any content produced by AI undergo the same rigorous scrutiny as material from any other source. The agency further cautions against publishing AI-generated multimedia unless AI itself is the focal point of the story.
This sentiment resonates with Wired magazine’s policy of not publishing AI-created stories unless the AI origin is the central theme. Nicholas Carlson, Insider’s editor-in-chief, emphasized the importance of human touch in journalism, stating, “Every word in your stories should be a reflection of your diligence and integrity.”
The emergence of AI-generated misinformation or “hallucinations” underscores the need for stringent standards to ensure content authenticity and credibility, as highlighted by Poynter.
While the direct publication of AI-generated content remains controversial, its potential utility in the newsroom is undeniable. For instance, AP editors can leverage AI to compile story summaries for subscribers. Wired suggests AI can assist in crafting headlines or brainstorming story concepts. Additionally, AI can offer editing suggestions to enhance readability or propose interview questions.
AP’s decade-long experimentation with basic forms of AI, such as generating brief reports from sports scores or financial data, has been invaluable. However, Barrett emphasizes a cautious approach to ensure the integrity of the agency's journalism remains uncompromised.
Last month, a partnership was announced between OpenAI and AP, allowing the AI company to access AP’s vast news archive for training purposes.
The unauthorized use of journalistic content by AI firms has raised concerns among news agencies. The News Media Alliance, representing numerous publishers, has articulated principles to safeguard their intellectual property rights.
The prospect of AI replacing human roles in journalism has sparked debate, notably in discussions between AP and the News Media Guild. Vin Cherwoo, the union’s president, expressed a mix of optimism and caution regarding the guidelines.
Barrett believes that AP journalists should not only be well-versed in AI but also equipped to report on its implications going forward. The upcoming chapter in AP’s Stylebook will delve into the multifaceted impact of AI, spanning politics, entertainment, education, and more.
The chapter will also introduce terms like machine learning, training data, face recognition, and algorithmic bias. However, Barrett acknowledges the dynamic nature of AI and anticipates regular updates to the guidelines, stating, “The AI landscape is ever-evolving, and our guidance will adapt accordingly.”