When People Don’t Trust Robot Journalism

It is easy to fear robots. They are an easy way to tap into an instinctive fear of sentience without human qualities such as empathy, compassion, creativity, or mercy. Robot journalism is often viewed with the same trepidation, with many seeing it as a threat to the livelihoods of working reporters and editors.

But while many fear the future, there is a measure of reassurance in looking at current, real-world applications of Natural Language Generation (NLG). Robot journalism is not a thing of the future; it is already here.

NLG products are currently used by organisations around the world to produce texts centred on financial reporting, sports, and natural phenomena. At Retresco, our products are already being used for financial reporting, in addition to e-commerce texts and house-listings websites. When these texts are presented to the general public, people are often impressed with the quality of the work, as can be seen in this video from Deutsche Welle.

That’s why we were particularly interested when Deutsche Presse-Agentur (DPA) last Friday unveiled research undertaken with Statista ahead of scoopcamp 2018. In it, DPA asked a number of questions about issues facing the journalism industry. The research was commissioned by nextMedia.Hamburg in advance of the conference. What caught our attention was the focus on robot journalism, its capacity to be verified, and its applications.

The numbers were pretty stark, although they need to be looked at in the proper context. 45 per cent of respondents thought that robot journalism would not have a big impact, 49 per cent were unsure about the technology, and 28 per cent thought it would never work. Only three per cent had some faith in its potential.

When asked about its trustworthiness, respondents were deeply divided, although the general view leaned a little more (but not much) towards distrust. 43 per cent did not trust automated content, 39 per cent were not sure, and just 18 per cent thought it could be trusted.

Those statistics are not surprising, and we have come across similar ones before. But this is a fluid situation, and I suspect that the coming years will see a shift in perceptions of automated content, robot journalism, or whatever title it is promoted under. The current perception among reporters and editors is that embracing robot content is akin to turkeys voting for Christmas. But their reaction is that of the horse-and-carriage driver protesting against the invention of the car. Automated content is here to stay, and it will continue to evolve, improve, and become more integrated into news organisations.

That’s the context. Now let’s look at the numbers individually.

According to the research, 45 per cent, 49 per cent, and 28 per cent have no faith in automated content, are sceptical about it, or believe it will never work. Okay, we can live with that, but we think those numbers will change as technology like ours becomes more commonplace in the newsroom. Right now, NLG is still at a nascent stage. The next couple of years, however, should see it come on in leaps and bounds.

What needs to be understood is that, as we have said before, NLG technology should be seen as a tool to help reporters and editors, not to replace them. The journalism industry has been in a parlous state for over a decade. It won’t be NLG that kills reporter roles, but the people who decide to use it that way. The smart organisation, however, sees NLG as an investment in its reporters, not as a replacement for them.

NLG technology can spot breaking trends, alerting reporters immediately while providing instant coverage. As this happens, human reporters can use skills beyond computers to add depth and quality to the initial stories.

As for the trust issue, 82 per cent did not know, or did not believe, that NLG was trustworthy. We understand that fear, too, but NLG content is only as reliable as its underlying data. And good data — data that has been analysed and assessed by experts, and that is monitored for mistakes — is more accurate than human recollection. Humans lie, or evade, or try to paint a flattering picture when none exists. And while humans have opinions, data does not.

The final statistic says that 91 per cent favour labelling robot journalism stories as such.

We agree.

We’ll even make some suggestions. If the story is part-human and part-NLG, then this can be added at the bottom: “This story was created in part using Retresco’s NLG software.” If the story is entirely NLG, how about this: “This story was created using Retresco’s NLG software.”

We believe in transparency in our products, especially when they’re being used in the newsroom. Hopefully, people will see this and trust will build in the next couple of years. We know our technology is trustworthy, and we’d like others to know that, too.

For more information, please contact:

Pete Carvill
presse@retresco.de

About Retresco

Founded in Berlin in 2008, Retresco has become one of the leading companies in the field of natural language processing (NLP) and machine learning. Retresco develops semantic applications in the areas of content classification and recommendation, as well as highly innovative technology for natural language generation (NLG). Drawing on nearly a decade of deep industry experience, Retresco helps its clients accelerate digital transformation, increase operational efficiency, and enhance customer engagement.
