Jan-Feb 2024 issue

Exploring AI Applications in City Government: The Promise and the Risks

By Joshua Pine and Lena Geraghty, in collaboration with Kate Stoll and Danielle Grey-Stewart

Artificial intelligence (AI) is changing the landscape of work, and local governments are by no means exempt. Cities, towns, and villages can harness the power of AI to draft resolutions, create social media content, summarize information for constituents, improve data-driven decision-making, and more. However, not all municipal applications are suitable for AI, and not all AI technologies are appropriate for government use.

To discuss the opportunities and risks of AI for local leaders, the National League of Cities (NLC) and the American Association for the Advancement of Science (AAAS) Center for Scientific Evidence in Public Issues co-hosted a virtual event in August 2023, titled “Introduction to AI in Municipal Government.” Although the conversation was just the tip of the iceberg when it comes to AI applications and policies in municipal governments, several key takeaways can be gleaned from the event:

Artificial intelligence is not a magical solution

Hoda Heidari, co-leader of the Responsible AI Initiative at Carnegie Mellon University, explained that AI uses statistical processes designed to find patterns in data. This can be a valuable tool for local leaders. However, like human decision-making, AI systems have the potential for bias and discrimination depending on their training, the data they use, and their ultimate application. Extra care should be exercised when AI is used to inform consequential decisions or to provide risk assessments that impact people’s lives.

There are many types of AI

Generative AI, like ChatGPT or Google Bard, is just one type of AI technology. Generative AI for text works by predicting the next word in a sentence, the next sentence in a paragraph, and so on, based upon what is most likely to follow according to text pulled from many sources, including the internet. Generative AI does not generally test for accuracy, so humans should fact-check its outputs.
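The next-word idea can be illustrated with a deliberately tiny sketch. The toy model below simply counts which word most often follows each word in a small sample text and then "predicts" the most frequent follower. Real generative AI models are vastly larger and more sophisticated, but this hypothetical example shows the core point: the model tracks what is likely, not what is true.

```python
from collections import Counter, defaultdict

# A tiny sample text; real models train on billions of words.
corpus = (
    "the city council approved the city budget . "
    "the city manager presented the report ."
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    # Return the most frequent follower. Notice there is no check for
    # whether the resulting sentence is accurate, only for what is common.
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # prints "city" -- the word that most often follows "the" here
```

Because the prediction reflects only frequency in the training text, a fluent-sounding output can still be wrong, which is why the fact-checking guidance above matters.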

Experiment but verify

The City of Boston’s Interim Guidelines for Using Generative AI highlight how AI tools have significant potential to benefit the work of city employees. This potential does not, however, remove staff members’ responsibility for the output, which is why verification and accountability are essential. Boston’s Chief Information Officer Santiago Garces emphasized three guiding principles:

1) Fact check and review all generative AI outputs. Humans are ultimately responsible for whatever products or outcomes they publish, regardless of whether AI is used.

2) Disclose when generative AI has been used. Constituents expect transparency from their local government.

3) Do not share sensitive or private information. The information input into generative AI prompts is not inherently private and therefore could be vulnerable to security threats or could be taken by the company providing the model to further train its technology.

AI is not just for big cities

Whereas other emerging technologies often require significant technical training and staff resources, generative AI tools based on natural-language inputs allow cities of all sizes to benefit. The City of Wentzville, Missouri, is piloting generative AI tools to automate certain aspects of city communication, which creates more time and space for creative and strategic thinking by city staff. Wentzville offers in-person training and requires virtual training for staff on the use of generative AI.

The August event merely scratched the surface of the many questions raised about the promise and peril of AI applications in local government. What questions do you have about using AI in your city? What creative applications of AI can propel your city into the digital future? What guardrails should be put in place to ensure the safe use of AI in your city?

To learn more about AI, visit the AAAS EPI Center resources at bit.ly/AAAS-epi-center.

This article was originally published by the NLC and reprinted with permission.

Joshua Pine is the program manager of urban innovation at the NLC. Lena Geraghty is the director of urban innovation and sustainability in the Center for Municipal Practice at the NLC. Kate Stoll and Danielle Grey-Stewart are with the AAAS Center for Scientific Evidence in Public Issues.