AI and the Media, Misinformation and Narratives.

    Rendition of Walter Cronkite.

News was once trusted more; the people presenting it were themselves trusted to give the public the facts. There were narratives even then, but there was a balance because of the integrity of the people involved.

Nowadays, that seems to have changed amid institutional distrust, political sectarianism, and the battle between partisan and ideological identities and anti-establishment orientations.

In short, things are wonky.

Now the world’s first news network entirely generated by artificial intelligence is set to launch next year.1 This seems a bit odd given that Dictionary.com’s word of the year is ‘hallucinate’, thanks to artificial intelligence, as I’ve written about before.

What could possibly go wrong with a news source that is completely powered by artificial intelligence?

Misinformation. Oddly enough, Dr. Daniel Williams wrote an interesting article on misinformation, pointing out that misinformation could be a symptom rather than the actual problem. He makes some good points, though it does seem a chicken-and-egg issue at this point. Which came first? I don’t think anyone can know the answer to that, and if they did, they probably wouldn’t be trusted because things have gotten that bad.

At the same time, I look through my Facebook memories just about every day and notice that more and more content I had shared is… gone. Deleted. No reason is given, and when I do find out that something I shared has been deleted, it’s as informative as a random nun wandering around with a ruler, rapping people’s knuckles and not telling them why she’s doing it.

Algorithms. I don’t know that it’s censorship, but they sure do weed out a lot of content, and that makes me wonder how much content gets weeded elsewhere. I’m not particularly careless with my Facebook account or any other account. Like everyone else, I have shared things I thought to be true that turned out not to be, but I don’t do that very often because I’m skeptical.

We would like to believe integrity is inherent in journalism, but the waters got muddied somewhere along the way when news narratives and editorials became more viewed than the actual facts. With the facts, it’s easy to build one’s own narrative, though not easy enough when people are too busy making a living to do so. Further, we have a tendency to view what fits our own world view, the ‘echo chambers’ that pop up now and then, such as echoed extremism. To expand beyond our echo chambers, we need to find the time and be willing to have our own world views challenged.

Instead, most people are off chasing the red dots, mistaking being busy for being productive. At a cellular level, we’re all very busy, but that doesn’t mean we’re productive, that we’re adding value to the world around us somehow. There is something to Dr. Daniel Williams’ points on societal malaise.

A news network run completely by artificial intelligence, mixed with the world as we have it now, doesn’t seem ideal, yet the idea has its selling points. Media itself isn’t trusted, largely because media is built around business, business is built around advertising, and advertising in turn is a game of numbers: to get the numbers, you have to get eyeballs on the content. Thus, propping up people’s world views matters more when the costs of doing all of that are higher. Is it possible that decreasing the costs would decrease the need to prop up world views for advertising?

We’ll be finding out.

  1. 2024
