Michigan Radio project points to future of AI use in newsrooms


(Illustration: Mike Janssen, using DALL-E 3)

Over the past year or so, as Google, Facebook and OpenAI have unveiled ever-more amazing artificial intelligence models and I’ve felt ever-higher levels of anxiety over the future of our industry, a simple exercise has given me hope.

Whatever the new model happens to be, I start with a simple question: “Tell me what happened at the last Grand Rapids City Commission meeting.”

Grand Rapids is the city where I live, but you can try this for any city and I’ll bet you get the same result — a bad one. 

Sometimes the AI model just links to the city’s website and tells me how to figure it out on my own. Sometimes it finds a news article from an old commission meeting and summarizes that. Sometimes it finds a YouTube video of a meeting and tries to summarize the meeting but misses the most important stuff or just plain makes it up.

And I breathe a sigh of relief. Because it means AI still can’t do even the most basic job in local news — covering a public meeting.

It’s more than just relief, though. Watching the AI fail at a simple task, and understanding why it fails, points to an opportunity going forward. 

It’s an opportunity we’ve been studying for a while at Michigan Radio through a project we call Minutes. The idea behind Minutes is to automate the process of creating transcripts from local government meetings and store those transcripts where journalists can skim and search to find story ideas.
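
For the technically curious, the heart of that automation can be sketched in a few lines of Python. This version uses the open-source Whisper model for speech-to-text; treat it as an illustration of the approach, not the exact code behind Minutes.

import whisper  # the open-source openai-whisper package

def transcribe_meeting(audio_path):
    # Smaller models are faster; larger ones are more accurate.
    model = whisper.load_model("base")
    result = model.transcribe(audio_path)
    # Keep the per-segment timestamps. They matter later, when we want
    # to link a summary line back to a moment in the recording.
    return [
        {"start": seg["start"], "end": seg["end"], "text": seg["text"]}
        for seg in result["segments"]
    ]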

We came up with the tool after struggling to keep track of all the meetings happening in the dozens of communities within our listening area. When I first started at Michigan Radio, we could rely on small community newspapers to cover the meetings and let our reporters know which ones were worth a follow-up. But many of those community newspapers no longer exist. Those that do are getting by with a fraction of the staff they once had. That’s why we need a tool like Minutes. 

We recently wrapped up a partnership with the AP as the only public media outlet among five local news organizations chosen for the AP’s AI in Local News program. 

With the AP’s help, we connected with the Knight Lab at Northwestern University to revamp our transcription pipeline, dramatically improving the quality of the transcripts and streamlining the approach so we can cover even more communities. We’ve also added an email alert system so that reporters can track specific keywords and get notifications whenever the keywords pop up in new transcripts. 
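
The alert check itself is conceptually simple. Here’s a hypothetical sketch, with made-up function names and a placeholder mail setup, of the kind of check that could run each time a new transcript lands in the database:

import smtplib
from email.message import EmailMessage

def check_alerts(transcript_text, meeting_name, alerts):
    # alerts is a list like:
    # [{"keyword": "zoning", "email": "reporter@example.org"}, ...]
    lowered = transcript_text.lower()
    for alert in alerts:
        if alert["keyword"].lower() in lowered:
            msg = EmailMessage()
            msg["Subject"] = '"{}" mentioned in {}'.format(
                alert["keyword"], meeting_name)
            msg["From"] = "alerts@example.org"  # placeholder address
            msg["To"] = alert["email"]
            msg.set_content(
                'The keyword "{}" appears in the newest transcript '
                "for {}.".format(alert["keyword"], meeting_name))
            # Assumes a mail server is reachable on localhost.
            with smtplib.SMTP("localhost") as smtp:
                smtp.send_message(msg)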

We now have close to 2,000 transcripts in our database, with new ones coming in each day from more than 100 public bodies across seven states.

Here is where the real opportunity with AI begins for newsrooms like ours. Consider again the simple question that AI models struggle to answer about city council meetings. 

In theory, answering questions about local government meetings shouldn’t be a problem. The city of Grand Rapids posts vast amounts of information about every meeting on its website. Agenda packets for each meeting regularly stretch into the hundreds of pages. And every minute of every meeting is broadcast live on multiple platforms, including Facebook and YouTube — services owned by two of the biggest AI companies in the world.

The problem for AI, as for most people, is that it doesn’t know how to get the information. There’s no publicly searchable database of every local government meeting. Instead, each city, county and school board maintains its own independent archive, some of which are more accessible than others. And, in most cases, the information is tucked inside a PDF file or a video.

You can’t ask ChatGPT to summarize a meeting it can’t access. That’s why AI struggles with my simple request. 

But with a tool like Minutes, we can feed transcripts directly into the AI model, even chunking the text into smaller pieces so the model doesn’t lose track of long passages.
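
Here’s a simplified illustration of that chunking step. The chunk and overlap sizes are arbitrary placeholders; the overlap is there so a discussion doesn’t get cut off mid-thought at a chunk boundary:

def chunk_transcript(text, chunk_words=2000, overlap_words=200):
    # Split on whitespace; word counts are a rough stand-in for tokens.
    words = text.split()
    step = chunk_words - overlap_words
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_words]))
        if start + chunk_words >= len(words):
            break
    return chunks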

I’ve been experimenting with just this approach over the past few weeks, using OpenAI’s API to feed in transcripts and generate bullet-point summaries with timestamped links, making it even easier for our reporters to follow the vast expanse of local meetings in our state.
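
The core of that experiment can be sketched with OpenAI’s Python client. The model name and prompt below are placeholders, and the sketch assumes each chunk carries inline timestamps, like the segments from the transcription sketch earlier:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_chunk(chunk, video_url):
    # Ask for bullets keyed to the [HH:MM:SS] timestamps already in the
    # chunk, so each item can link back to a moment in the recording.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any capable chat model works
        messages=[
            {
                "role": "system",
                "content": (
                    "You summarize local government meeting transcripts "
                    "as concise bullet points. Start each bullet with the "
                    "[HH:MM:SS] timestamp where the discussion begins, so "
                    "a reporter can jump to that moment in the recording "
                    "at " + video_url + "."
                ),
            },
            {"role": "user", "content": chunk},
        ],
    )
    return response.choices[0].message.content

Summarizing chunk by chunk and stitching the bullets together keeps each request small, which gives the model less room to drift.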

The results so far are, admittedly, inconsistent. At times the summaries are detailed and accurate. But just as often, the language model seems to get “lazy” in the middle of the task: the summary turns vague, or the timestamps get left out entirely. Even so, I’ve managed to get one story out of the summaries that I never would have found without them, so I’m optimistic. And we’re still working on it.

This whole exercise suggests a promising and durable role for journalism in the age of AI. 

As good as AI has gotten, there are still many things it can’t do, many questions it will struggle to answer simply because it doesn’t have access to the underlying data. I’m convinced that collecting and organizing datasets will be a key role for many newsrooms in the coming years — especially local newsrooms. Our jobs will become less focused on analyzing data to reveal the stories reporters think are important and more focused on structuring datasets so that anyone can use the latest AI tools to find answers to their own questions.

Giving people information about their community is at the heart of what we do in public media. AI has transformed how people get information, but it hasn’t changed our role.

By focusing on what AI can’t do (or at least can’t do yet), we see more clearly what we should do to continue serving our communities. 

Dustin Dwyer is a reporter for Michigan Radio and co-creator of Minutes. He was a 2018 fellow at the Nieman Foundation for Journalism at Harvard. He lives in Grand Rapids. You can reach him at [email protected].
