AI tools may help governments understand the needs and desires of residents. The community is “already inputting a lot of its data” through community meetings, public surveys, 311 tickets, and other channels, Williams says. Boston, for example, recorded nearly 300,000 311 requests in 2024 (most were complaints related to parking). New York City recorded 35 million 311 contacts in 2023. It can be difficult for government workers to spot trends in all that noise. “Now they have a more structured way to analyze that data that didn’t really exist before,” she says.
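The kind of structured analysis Williams describes can start with something as simple as tallying ticket categories over time. Here is a minimal sketch, assuming a hypothetical CSV export of 311 tickets with `created_date` and `complaint_type` columns (the file name and column names are assumptions for illustration, not any real city’s schema):

```python
import pandas as pd

# Hypothetical 311 export; column names are assumed for illustration.
tickets = pd.read_csv("311_requests_2024.csv", parse_dates=["created_date"])

# Count tickets per complaint type to surface the dominant categories
# (like Boston's parking complaints) that are hard to see ticket by ticket.
top_complaints = (
    tickets.groupby("complaint_type")
    .size()
    .sort_values(ascending=False)
    .head(10)
)
print(top_complaints)

# Month-over-month volume for each top category, to spot trends in the noise.
monthly = (
    tickets[tickets["complaint_type"].isin(top_complaints.index)]
    .assign(month=lambda d: d["created_date"].dt.to_period("M"))
    .groupby(["month", "complaint_type"])
    .size()
    .unstack(fill_value=0)
)
print(monthly)
```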
AI can help paint a clearer picture of how these kinds of resident complaints are distributed geographically. At a community meeting in Boston last year, city staff used generative AI to instantly produce a map of pothole complaints from the previous month.
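A map like the one Boston staff generated can be approximated with off-the-shelf tooling. A sketch, assuming the same hypothetical ticket export also carries `latitude` and `longitude` columns, using the folium mapping library (the “Pothole” category label is an assumption):

```python
from datetime import datetime, timedelta

import folium
import pandas as pd

tickets = pd.read_csv("311_requests_2024.csv", parse_dates=["created_date"])

# Keep only pothole complaints from roughly the previous month.
cutoff = datetime.now() - timedelta(days=30)
potholes = tickets[
    (tickets["complaint_type"] == "Pothole")  # category label is assumed
    & (tickets["created_date"] >= cutoff)
]

# One dot per complaint, centered on Boston.
m = folium.Map(location=[42.36, -71.06], zoom_start=12)
for _, row in potholes.iterrows():
    folium.CircleMarker(
        location=[row["latitude"], row["longitude"]],
        radius=3,
        fill=True,
    ).add_to(m)
m.save("pothole_map.html")
```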
AI also has the potential to illuminate more abstract data on residents’ desires. One mechanism Williams cites in her research is Polis, an open-source polling platform used by several national governments around the world and a handful of cities and media companies in the US. A recent update allows poll hosts to categorize and summarize responses using AI. It’s something of an experiment in how AI can help facilitate direct democracy, an issue that tool creator Colin Megill has worked on with both OpenAI and Anthropic.
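Polis’s actual implementation isn’t detailed here, but the general pattern of handing a model one discrete labeling task, the kind Megill describes below, might look like this sketch using the OpenAI Python client (the prompt, model choice, and category list are all assumptions):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CATEGORIES = ["housing", "transit", "public safety", "other"]  # assumed labels

def categorize(response_text: str) -> str:
    """Ask the model to assign one category to a free-text poll response."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": (
                    "Label the poll response with exactly one of: "
                    + ", ".join(CATEGORIES)
                    + ". Reply with the label only."
                ),
            },
            {"role": "user", "content": response_text},
        ],
    )
    return completion.choices[0].message.content.strip()

print(categorize("The buses near me never run on time."))  # e.g. "transit"
```

Note that the model only labels responses written by people; it doesn’t generate opinions of its own, which is the “specific and discrete tasks” constraint Megill argues for.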
But even as Megill explores these frontiers, he’s proceeding cautiously. The goal is to “enhance human agency,” he says, and to avoid “manipulation” at all costs: “You want to give the model very specific and discrete tasks that augment human authors but don’t replace them.”
Misinformation is another concern as local governments figure out how best to work with AI. Though they’re increasingly common, 311 chatbots have a mixed record on this front. New York City’s chatbot made headlines last year for providing inaccurate and, at times, bizarre information. When an Associated Press reporter asked if it was legal for a restaurant to serve cheese that had been nibbled on by a rat, the chatbot responded, “Yes, you can still serve the cheese to customers if it has rat bites.” (The New York chatbot appears to have improved since then. When asked by this reporter, it responded firmly in the negative to the nibbling-rat question.)