22 Mar 2024

AI Shaping Public Opinion in News Media – What the Public Wants to Hear vs What the Public Needs to Hear

A lot has been said about artificial intelligence (AI). It is reshaping global business and everyday life, and it amounts to nothing less than a new industrial revolution. Mankind's striving for an easier life has led to the creation of a fast-learning, seemingly self-aware creature that will irreversibly become an unexpected (or perhaps unwanted) member of society.

We have witnessed the universality of AI's applications. Here, we focus on the use of AI in the news media business. AI has been widely employed in gathering news information. It can also filter out false data and fake news, contributing to the accuracy and truthfulness of published information. In this way, AI supports one of the main principles of journalism – giving the public comprehensive information on specific events or persons, based on credible sources. On the business side of the news media, AI has been used to track audience preferences, boosting the income of the media outlet.

It is obvious that AI is reaching ever deeper into the media. But one must not forget the core of true journalism – making a professional, fairly balanced editorial decision between giving the public what it wants to hear or see and what it needs to hear or see. By the power of their pens, journalists steer public opinion. As professionals, they have to diligently monitor public developments, process information, and project what the public must be aware of, so that it can actively participate in shaping society and holding politicians to account. Will we let a robot shape our opinions on matters of public interest?

This question is not only an ethical one but also a social one. News media are specific: a mix of a non-business function – informing the public on matters of public interest – and a business one – selling media services. The business side must not jeopardize the non-business one, yet the fine line between the two is frequently hard to see. Media owners are predominantly business-driven, and as AI contributes significantly to the media business, the question is – will media outlets also let AI decide what the public needs to hear?

Is there enough ethics in a robot to appreciate the public's duty to make sound social and political decisions? Perhaps this is a line that should not be crossed, at least until AI learns how to become an artificial homo politicus. Until then, discussions should be held on the extent to which AI can be used in creating and implementing the editorial line.

Although editorial lines are internal matters of publishing companies, there should be a balanced regulatory response to the use of AI in news journalism. This would not be a threat to journalism and its freedoms, but rather its protection from potential business-driven misuse of AI. For now, AI should remain a tool, not a journalist.