… State press agency investigates possible abuse of AI
The Namibia Press Agency (Nampa) recently found itself in uncharted waters when it allegedly used artificial intelligence (AI) to produce a publication paying tribute to former president Hage Geingob without disclosing this to the public.
The agency’s AI expert, Angie September, however, denies this.
Sources say Nampa’s subeditors at the time of production flagged the potential use of AI in the publication, resulting in the agency’s management intervening.
“There has been some discussion around the use of AI in our newsroom, but it has not been formally incorporated into our production processes yet,” September says.
She says as much as AI is supposed to make lives easier, it also poses challenges.
“We cannot shy away from the use of modern technology, but we have to find solutions and ways to master it.
“Newsrooms are not spared from these challenges, and they must find a way to overcome this, while cautiously embracing AI, ChatGPT and the like, where possible,” September says.
She says reporters and other newsroom staff members remain crucial as they bring critical thinking, ethics, and the ability to understand context to the production process, which are all essential for accurate and nuanced reporting.
EXPLORATION
Editors’ Forum of Namibia (EFN) chairperson Frank Steffen says many media houses have started exploring the use of AI in their reporting.
“There is a lot of use of AI. The quality of articles has improved in terms of vocabulary and grammar. It is not wrong to use it, as long as it enhances the work,” he says.
It would, however, be risky to use AI to curate reports, because AI relies on available information on the internet, which may not be accurate, Steffen says.
PROOFREADING STILL NEEDED
“Once you have used any of the AI tools in your reporting, it is essential to proofread reports to ensure they are accurate.
“AI has its dangers, but yes, it is being used in Namibian newsrooms, although not during the optimisation process,” he says.
Steffen says AI would not replace media staff, but could increase reporters’ efficiency in terms of data interpretation, fact checking and processing content.
No laws or regulations governing the use of AI currently exist in Namibia.
“By the time we pass these laws, AI has moved to another stage,” he says.
ADVISING ON DEFAMATION
The Namibian’s managing editor, Tangeni Amupadhi, says the paper has been exploring the use of AI, which is still in the early stages.
“This is why we have engaged with local tech companies and international media development organisations, such as DW Akademie, to facilitate internal engagements on the use of AI in the newsroom,” he says.
Thus far the paper has been using AI to sift through readers’ text messages, flagging potential defamation and false information, Amupadhi says.
Other possible areas of experimentation include ideation, story pitching, transcription and data interpretation, he says.
“We are in the process of setting basic guidelines on the use of AI while we work on a detailed policy that enables the entire organisation to learn and to safely use AI,” Amupadhi says.
He says the news organisation is open to constructive dialogue on the integration of AI in the newsroom.
BEWARE ‘HALLUCINATIONS’
The programme director at DW Akademie Namibia and Southern Africa, Peter Deselaers, says it’s important that the journalism industry explores various technologies to determine their potential impact.
He says experienced editors can quickly pick up whether reporters have used AI.
“One of the biggest risks when using AI is a lack of relevant context, bias, or even so-called hallucinations, where the text generator makes up facts.
“In journalism there should always be a human in the loop, who makes sure reporting is accurate, has relevant context, and is factually true,” he says.
Deselaers says there are no specific regulations on using AI in Namibian newsrooms, but he is aware that some newsrooms are working on internal guidelines.
‘DON’T USE IT’
New Era managing editor Jonathan Beukes says the paper has not officially begun using AI.
“We are not using the latest generation of AI, because there are no rules and regulations in place. What is currently available is a verbal agreement that says don’t use it, and when reporters use it, they should disclose it to their supervisor directly.
“Tools such as Grammarly are used, but the institution has had a dialogue about their effective use. It’s very easy to pick up the use of artificial intelligence based on wording, grammar and story flow,” he says.
“We see many individual journalists using AI tools in their reporting. There is lively debate in the newsroom on how to use these tools and which problems may arise.
“As AI and machine learning play an increasing role in social media, in businesses and in many technologies people use, it is important for journalism to also explore these technologies and to be able to explain and analyse their impact.
“I think in the long run it is crucial to include the topic in the EFN’s code of ethics,” Beukes says.
The Paris Charter on AI and Journalism is a global framework the journalism community has developed under the leadership of Reporters Without Borders.