AI in Local Government: Emerging Applications, Risks, and Challenges

by Olivia Barrow, Writer, Curate, Part of FiscalNote

Applications of AI in local governments and their implications for government affairs and advocacy organizations, concerns that early adopters are discovering, and what to expect next.

The emergence of generative AI tools is opening up new possibilities for public communication and service delivery for local governments, many of which are facing staff vacancies and looming funding shortfalls. The adoption of AI-powered tools could redefine the interaction between local governments and their constituents, enhancing service accessibility and policy participation. Yet, if not managed carefully, it could also erode public trust and reinforce societal inequalities.

Read on to learn more about some of the most promising applications of AI in local governments and their implications for government affairs and advocacy organizations, some unexpected concerns that early adopters are discovering, and expert opinions about what to expect next.

Local Governments are Approaching AI with Caution

In the absence of federal regulations on AI, state and local governments are developing their own policies and guidelines for internal use, and their approaches vary widely. Maine, for instance, has imposed a six-month moratorium on generative AI use within state agencies. In contrast, San Jose, California, has developed a 27-page policy requiring municipal employees to report every use of tools like ChatGPT. Kendall County, Illinois, released a succinct two-page guideline in June, requiring employees to clear each intended use of such tools with their executive.

Boston is taking a more permissive approach. Santiago Garces, chief innovation officer for the city of Boston, focuses on supporting public servants in their use of AI tools and fostering a community of collective experimentation.

Boston’s 10-page policy offers examples of suggested uses and lays out three main principles:

  1. Verify the output

  2. Never submit proprietary/confidential information

  3. Disclose the use of AI to create images and content

“We minimize the risk by adding clarity and support rather than trying to deprive the users of something they would find interesting or useful,” Garces says.

Promising Applications of Generative AI for Local Governments and Advocacy

1. Enhancing Public Communication

Saiph Savage, assistant professor at the Northeastern University Khoury College of Computer Sciences and director of the Civic AI Lab, envisions local governments using AI tools to improve their communications strategies. By sharing more content, they can keep the public informed about progress on new initiatives.

Savage also notes that tools like ChatGPT can help tailor messages to different audiences to increase understanding of new policies. For example, an advocacy group concerned about equitable public transportation services could use a solution with built-in AI capabilities, like VoterVoice, to tailor a message about upcoming transit changes to reach older residents and help them make their voices heard. VoterVoice makes real-time suggestions on subject line and message length; phrases that trigger spam filters; and the number, type, and placement of links to maximize campaign success.
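For teams experimenting outside of purpose-built platforms, the same idea can be sketched with a general-purpose LLM API. The snippet below is illustrative only and is not VoterVoice’s interface; the model name, prompt, and sample message are assumptions.

```python
# Illustrative sketch: rewriting a transit update for a specific audience with a
# general-purpose LLM API. Not VoterVoice; model, prompt, and message are assumed.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

base_message = (
    "The city is proposing changes to bus routes 12 and 47 starting in March. "
    "A public comment period is open until February 15."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{
        "role": "user",
        "content": (
            "Rewrite this transit update for older residents. Use plain language "
            "and explain how to submit a comment by phone or mail as well as online.\n\n"
            + base_message
        ),
    }],
)
print(response.choices[0].message.content)
```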

2. Facilitating Data Analysis

Many local governments across the country have portals that provide free access to vast amounts of data about the delivery of government services, demographic info, crime trends, and much more. This data is made available to empower the public to make data-driven decisions, participate in policymaking, and hold the government accountable.

While the portals have been available for several years, they’ve largely gone untapped by both city staff and government affairs teams because manipulating the data required technical skills. But with generative AI, individuals without coding skills can now perform statistical analyses to draw meaning from these data sets. For instance, in Boston, Garces' team used generative AI to analyze the city's 311 non-emergency hotline response times across neighborhoods, uncovering previously unnoticed disparities.

With the help of GenAI tools that can perform statistical analysis, government affairs teams could leverage these resources to enhance their arguments for or against proposed ordinances with compelling data.
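As a rough illustration, the sketch below shows the kind of script a generative AI assistant might produce for an analysis like Boston’s 311 comparison. The file name and column names (neighborhood, open_dt, closed_dt) are assumptions about a typical 311 export, not a reference to Boston’s actual schema.

```python
# Sketch: compare 311 response times across neighborhoods with pandas.
# File name and column names are assumptions for illustration.
import pandas as pd

df = pd.read_csv("311_service_requests.csv", parse_dates=["open_dt", "closed_dt"])

# Hours from request to resolution, for closed cases only
closed = df.dropna(subset=["closed_dt"]).copy()
closed["response_hours"] = (
    closed["closed_dt"] - closed["open_dt"]
).dt.total_seconds() / 3600

# Median response time by neighborhood, slowest first
by_neighborhood = (
    closed.groupby("neighborhood")["response_hours"]
    .median()
    .sort_values(ascending=False)
)
print(by_neighborhood.head(10))
```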

3. Sensemaking

Government affairs teams often spend a lot of time demystifying government processes to help their stakeholders understand what’s happening. Generative AI tools can help speed up this work by analyzing, interpreting, and summarizing local government documents. For example, you could feed the minutes of an entire city council meeting into ChatGPT and ask it to identify and summarize any relevant action items.
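A minimal sketch of that workflow, assuming the minutes are available as a local text file and using the OpenAI Python SDK, might look like this (the model name and prompt are illustrative):

```python
# Sketch: send meeting minutes to an LLM and ask for action items.
# "minutes.txt", the model name, and the prompt are assumptions.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

with open("minutes.txt") as f:
    minutes = f.read()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You summarize city council meeting minutes."},
        {"role": "user", "content": "Identify and summarize any action items in these minutes:\n\n" + minutes},
    ],
)
print(response.choices[0].message.content)
```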

Garces says he sees potential for this application of AI to help equalize access to government and bureaucratic procedures for newer government affairs professionals or advocacy organizations. “Success in government often requires esoteric knowledge of long and obscure processes,” he says. “I think these tools can help level the playing field.”

AI Experimentation: Unexpected Risks, Challenges, and Ethical Considerations

There are many well-documented risks associated with relying on AI for decision-making, including the potential for amplification of biases and discrimination if the tools are trained on biased data. But groups that are actively experimenting with generative AI are also discovering new concerns.

In Boston, for example, city staffers were surprised to find that automatic transcription tools like Otter violated both their AI guidelines and a state law. Because the tools automatically add themselves to every meeting without notifying participants, they ran afoul of Massachusetts’s two-party consent law for recording conversations. And whenever the conversation being recorded touched on sensitive topics, using Otter meant confidential information could be shared with a third party, violating the second principle of the city’s AI policy.

Garces is also concerned about the downstream impact on creative jobs as generative AI applications become more reliable and enter mainstream use. “Artists add a lot of value to people and cities,” he says. “From a policy standpoint, we should be thinking more about how to protect them, both through how we use the tools internally, but also broadly as a society.”

Lastly, the use of AI poses a risk of loss of credibility if it’s used inappropriately. Many civic technology companies are adding generative AI features to existing software products. However, the public has not reacted well to some examples of organizations using generative AI to draft sensitive public communication. When Vanderbilt University used ChatGPT to help draft an email about the importance of community in the wake of a shooting at another university in February 2023, students were outraged.

The Future of AI in Local Government

The integration of generative AI in local government offers the potential to enhance efficiency and public engagement, but also poses significant risks that must be carefully managed.

As AI use in local government evolves, we can expect to see more guidelines and revisions. Garces says he is already working on the next version of Boston’s guidelines based on what he and his team have learned from their initial experimentation.

Software providers who are racing to incorporate generative AI into their tools should pay close attention to the guidelines local governments are adopting and consider making their AI-powered features optional to avoid getting blacklisted by cautious local governments.

Savage notes that political partisanship could be a barrier to the development of useful AI guidelines. If one party champions the use of the tools and invests in research and experimentation on it, but then that party loses power, opposing parties may abandon the effort in favor of creating something new that they can take sole credit for. “It’s important to consider those dynamics when you’re thinking about technology adoption and innovation in government,” Savage says.

Organizations that want to influence the way local governments use AI should consider building relationships across the aisle and finding champions for their policy within both parties to help avoid this risk.

Make Your Voice Heard Regarding Local Government Use of AI

Government affairs teams and advocacy organizations have a role to play in holding governments accountable for developing these policies with care, considering all angles.

Local government monitoring software like Curate can help you stay abreast of the latest developments around the use of AI-powered tools at the local government level. With customized notifications, you can find out whenever a locality introduces or revises an AI policy, proposes a moratorium on certain kinds of tools, or holds a public hearing related to AI. By proactively engaging with local leaders on both sides of the aisle, you can help make sure that political instability doesn’t get in the way of progress, that the risks are properly mitigated, and that regulation doesn’t stifle innovation.
