Commentary
This week I learned about a new journalism app: Artifact. Opening it for the first time, you might find the app mundane. However, the questions it asks about the future of journalism are profound.
How will we perceive bias in journalism if artificial intelligence increasingly serves us the news we read?
Artifact’s pitch lured me in. First off, I don’t have a news app that I like. Artifact also aims to encourage users to read journalism from a wide variety of sources, something I aspire to, as I bop from app to app.
The model sounds similar to other apps that aggregate the news, whether it be Twitter, Apple News+ or Flipboard.
The difference on Artifact is this: Machine learning tailors its recommendations for further reading. The app aims to satisfy your interests based on your reading habits. Instagram's co-founders initially launched the app to a limited number of users; on Wednesday, it became available to anyone.
Its co-founder, Kevin Systrom, explained the app this way on the Hard Fork podcast: “Artifact is a hyper-personalized newsfeed driven by the latest in machine learning. It’s like TikTok for text. There’s a lot of text out there on the web, most of it news, blogs, articles. And we take all of that, we understand it, using machine learning. And then we say, ‘Hey, user, you signed up for Artifact. What are you into?’ And then we start matchmaking, and we present you a feed of stuff.”
This week, the app smartly scraped national news stories that I have read (mostly) without running into paywalls or hiccups. One huge drawback is the lack of local news. You cannot find content from any Kansas newspaper or news outlet, let alone your beloved Kansas Reflector.
Having read 50 articles, I am intrigued but not converted. The founders promise that “machine learning,” already 2023’s phrase of the year, is the sparkle dust that will transform the experience.
“And at first you’re just, like, OK, it’s a newsreader,” Systrom continues. “But if you use it for — I don’t know — a week or two, or depending on how heavily you use it, it becomes really different. And that’s because it starts to understand what you like and you don’t like.”
Despite what Systrom says, the app has so far insisted on limiting my reading to a few dozen interest areas. Upon logging in for the first time, I searched for a "photography" button to signal my interest in cameras, photographers and new photo books.
But there was no button for that. Instead, I could select interests like “Kitchen Products” (Who needs a news app to find blender reviews?) and “Royals” (as in princes and queens, not the baseball boys in blue in Kansas City).
The artificial intelligence of Artifact also aims to amplify the most expert and trustworthy writers on a news story.
“I want a great post from an epidemiologist on Substack who nobody knows to become the de facto thing that everyone reads if it’s great,” Systrom told Hard Fork. “And then I want people to be able to discuss it and post about it and say what they think about it and debate and argue or whatever.”
The vision is an enlightened and thronging town square, but one officiated by artificial intelligence.
The disconnect is obvious: Artificial intelligence would moderate not just the journalism we consume, but also the discussions we have about it. Perhaps one of our most human pursuits — explaining the world to one another — erodes into a predictive computer model anticipating what we will read and how we will react.
Another goal for Artifact? To use the app’s artificial intelligence to generate journalism based on other stories available in the news media.
“We do not generate text right now, though I will say I’m very excited about generative text in the future for us,” Systrom told Hard Fork. “Less about replacing writing and less about replacing authors or publishers, way more about synthesizing events and being able to point you to the right sources on the events.”
Paired with a deluge of other news about artificial intelligence, this app made me wonder. We often label a newspaper's or website's editorial judgment: "The New York Times is a cesspool of progressive drivel." "The New York Post is owned by corporations that guide its conservative politics."
Will our trust in journalism stabilize once machine learning is the only hand steering us to one story or another? The introduction of apps like Artifact suggests that their developers believe so.
Of course, our reading habits online are already being steered by algorithms and the suggestions of advertisers, whether on Facebook or Twitter. Indeed, you likely reached this point in this article based on some machine learning, unless you are one of the family members whom I force to read my columns each week. (Sorry, Polly.)
However, I predict that we will see the same bias in the news when it's artificial intelligence that curates it for us. The algorithms of one app might serve us news we find distasteful ("Too Democratic!"), while another might cater to our sensibilities ("Finally, an app that gets it right!").
After all, tech reporters said this was Elon Musk’s essential motivation for buying and breaking Twitter. In the machine of Twitter’s recommended tweets, Musk saw a ghost — one that was blocking conservative voices and boosting liberal ones. He set out to find that perceived bias and eliminate it.
If we, like Musk, obsess about interrogating our news aggregation apps, our attention drifts far from the journalists actually doing the work. We aren’t critiquing their reporting methods. We aren’t analyzing the writer’s agenda. We aren’t investigating the publication’s financial entanglements. Instead, we are grinding our teeth at an app — an app that selected a headline from a publication that employs a writer who filed a story.
It’s a meta-argument, fitting for this moment in the metaverse. Facts and stories are lost.
Conversely, the phrase “artificial intelligence” will signal empirical authority to some people. RobotNews™ will be trusted news. Many will perceive apolitical computer code as serving up facts and headlines without preference for red or blue politics.
That perception would ignore the mistakes that artificial intelligence makes, especially at this moment. Kevin Roose has reported how, in addition to producing glaring factual errors, some machine learning can be steered toward hallucinations.
Aside from Hunter S. Thompson, we don’t enlist psychedelic journalists. It seems a dangerous way to get the news.
“If you have people who are asking a chatbot, ‘What’s the Deep State?’ or it’s having conversations about lizard people, or if it’s having conversations about vaccines and other sensitive topics, you can easily see how people might actually take that thing seriously,” Roose said on The Daily.
No doubt, American journalism has always been political. Even during colonial times, the British monarchy mandated that publishers print sympathetic news. More than 100 years ago, major cities had many newspapers to give voice to various partisan points of view, including voices that campaigned for racial equality, powerful unions and immigrant rights. Other voices sought to limit those rights.
The politics of those bygone publications could be traced through their mastheads: a list of names. Often we knew those people.
In this dawning age, to understand why we are reading the news we are reading, we might need to be familiar with a line of journalistic computer code, rather than an old-fashioned human being.
Eric Thomas directs the Kansas Scholastic Press Association and teaches visual journalism and photojournalism at the University of Kansas. Through its opinion section, Kansas Reflector works to amplify the voices of people who are affected by public policies or excluded from public debate. Find information, including how to submit your own commentary, here.
Our stories may be republished online or in print under Creative Commons license CC BY-NC-ND 4.0. We ask that you edit only for style or to shorten, provide proper attribution and link to our web site. Please see our republishing guidelines for use of photos and graphics.