Talk of automation is making the rounds again, and unless newsrooms retool and media organizations train journalists to adapt, more staffers risk becoming redundant.
The Online News Association’s (ONA) annual conference in Denver, Colorado this month featured speakers like futurist Amy Webb who reminded us that “bots” are here to stay.
In her “Tech Trends for Journalists” presentation, Webb urged journalists to look ahead and follow emerging trends.
“The future of news depends on your ability to see what’s happening at the fringe and follow it to the mainstream,” she said.
Webb’s “2016 Trend Report” zeroed in on “bots.” It said news organizations would soon use them to sort and tag articles in real time.
“We’ll see advanced bots manipulating social media and stocks simultaneously,” it added.
So get used to it.
For the uninitiated, “bots” are robots, and robo-journalism, or automated journalism, has become more than a trend.
The Associated Press (AP), Thomson Reuters, Bloomberg News, The New York Times and Los Angeles Times have used them in recent years to crunch numbers in formulaic writing of repetitive business reports, weather forecasts and sports news.
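At its simplest, this kind of automated business reporting amounts to filling a sentence template from structured data. Here is a minimal sketch of the idea; the company name, figures and wording are invented for illustration and do not reflect any news organization's actual system:

```python
# Minimal sketch of template-based "robo-journalism": turning one row of
# structured earnings data into a formulaic sentence. All names and
# numbers below are hypothetical.

def earnings_blurb(company, quarter, revenue, prior_revenue):
    """Render a one-sentence earnings report from structured data.

    Revenue figures are in millions of dollars.
    """
    change = (revenue - prior_revenue) / prior_revenue * 100
    direction = "rose" if change >= 0 else "fell"
    return (f"{company} reported {quarter} revenue of ${revenue:,.0f} million, "
            f"which {direction} {abs(change):.1f}% from a year earlier.")

print(earnings_blurb("Acme Corp", "third-quarter", 1250, 1100))
```

Given accurate input data, a script like this can churn out thousands of such sentences per minute, which is the appeal; given bad data, it churns out errors at the same speed, which is the risk the article turns to next.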
To help others understand the trend, journalism educator, digital media scholar and online news pioneer Alfred Hermida tweeted a pointer to a useful guide.
The guide he mentioned is an invaluable resource by Andreas Graefe, a research fellow at Columbia University’s Tow Center for Digital Journalism, who studied how automated news generated from accurate, structured data has unsettled the journalism profession.
The key word is “accurate.” After all, somebody has to generate that information and verify it, automation, speed, and scale notwithstanding.
Last month, Facebook tried to automate its “Trending” news section but the initiative bombed.
Facebook “promoted a false story by a dubious right-wing propaganda site. The story, which claimed that Fox News had fired anchor Megyn Kelly for being a ‘traitor,’ racked up thousands of Facebook shares and was likely viewed by millions before Facebook removed it for inaccuracy,” Slate reported.
According to the Florida-based Poynter Institute, British organization Full Fact published a road map for fully automated fact-checking, noting that such an endeavor isn’t a fantasy but an attainable goal.
Graefe, meanwhile, thoughtfully analyzed how algorithms – those problem-solving procedures named after the 9th century Persian mathematician al-Khwarizmi – can generate news faster than humans, make fewer errors, and potentially generate news on demand.
But, he cautioned, these algorithms have limitations:
Algorithms rely on data and assumptions, both of which are subject to biases and errors. As a result, algorithms could produce outcomes that were unexpected, unintended, and contain errors.
Algorithms cannot ask questions, explain new phenomena, or establish causality and are thus limited in their ability to observe society and to fulfill journalistic tasks, such as orientation and public opinion formation.
The writing quality of automated news is inferior to human writing but likely to improve, especially as natural language generation technology advances.
Graefe advised journalists to develop skills that algorithms can’t perform such as in-depth analysis, interviewing and investigative reporting.
His caveat for news organizations:
Since algorithms cannot be held accountable for errors, liability for automated content will rest with a natural person (e.g., the journalist or the publisher).
Algorithmic transparency and accountability will become critical when errors occur, in particular when covering controversial topics and/or personalizing news.
Apart from basic guidelines that news organizations should follow when automatically generating news, little is known about which information should be made transparent regarding how the algorithms work.
Tom Kent, former standards editor at AP and a contributor to the guide, produced “An ethical checklist for robot journalism” last year that was updated in March 2016.
In it he poses key questions editors should ask:

- How accurate is the underlying data?
- Do you have the right to the data?
- Is the subject matter appropriate for automation?
- How does your automation organize the data?
- Will you disclose what you’re doing?
- Does the style of your automated reports match your style?
- Can you defend how the story was “written”?
- Who’s watching the machine?
- What about long-term maintenance?
- Are you considering automation that creates multimedia presentations?
- Are you using software that reduces long articles to bullet points?
- Are you ready for the next frontier?
In April, The Guardian reported that artificial intelligence (AI) was already making inroads but asked if it could win journalism’s coveted Pulitzer Prize.
This month an unsigned blogpost rightly asked: “Is automated journalism taking away the art of writing?”
Probably not, but journalists take heed.