Where people once gathered freely to speak, debate, and protest, a new kind of listener has arrived — one that doesn’t blink, doesn’t sleep, and never looks away. Artificial Intelligence is now part of the way our cities watch us. Town squares, high streets and shopping centres have always been important democratic spaces. They are where people meet, organise, and make themselves visible. These rights — to speak and to assemble — are protected in UK law by the Human Rights Act 1998, which incorporates Articles 10 and 11 of the European Convention on Human Rights. But in the last few years those same spaces have started to feel different. They are still public, but they are also increasingly digital: mapped, recorded, and sometimes predicted. Participation can start to feel watched rather than welcomed.
AI-driven surveillance systems are one of the clearest examples of this shift. Cameras no longer just record for security; they analyse faces, match them to lists, and make decisions in real time. That changes how people experience public space. When you know you are being scanned, you behave differently. You might still walk through the town centre, but would you stop to sign a petition? Would you join a protest? Would you go to a rally if you thought your face might be captured and stored?
Croydon in 2025 shows how this works in practice. The Metropolitan Police installed permanent live facial recognition cameras in the town centre, saying it was part of a wider effort to tackle urban crime. The technology scanned everyone passing through and compared their faces to police watchlists. Within weeks, 17 people were arrested for offences such as stalking and domestic abuse, and shoplifting reportedly fell by almost a third. On paper, that looks like success. Some shopkeepers even said they felt safer and that the town centre was calmer.
But that was only half the story. Other residents said the atmosphere had changed — not because of crime, but because of the feeling of constant observation. One local business owner told the BBC that customers felt anxious, “like they’ve done something wrong” just by being there. Civil liberties groups such as Big Brother Watch and Liberty warned that the technology could have a “chilling” effect: no one tells people not to protest, but they may choose not to, just in case. The police said non-matching data was deleted, but because there is no strong, dedicated law on facial recognition beyond the Data Protection Act 2018, many people didn’t fully trust those assurances. When trust is thin, participation is thin.
And this is the democratic problem. AI can make places safer, but it can also quietly reshape how people take part in public life. If people believe they are always being scanned, they may stay away from demonstrations, avoid sensitive community events, or stop speaking to journalists in public. The result isn’t an open, confident community — it’s a tidy, quiet one. That is not what Sustainable Development Goal 16 (Peace, Justice and Strong Institutions) is aiming for. SDG 16 depends on transparency, accountability and citizens who feel free to act. Safety that comes at the cost of voice is not real safety.
The Croydon example shows that digital tools can protect and constrain at the same time. They can reduce theft and catch dangerous offenders, but they can also narrow the space for spontaneity, dissent and everyday participation. If public space becomes a place where you are always a data point, not just a citizen, then the character of democracy changes. People start to perform good behaviour instead of practising citizenship.
This doesn’t mean AI has no place in public life. It means it has to be used ethically, with clear rules, independent oversight, and honest communication with the public. Technology should help people feel both safe and free — not force them to choose one or the other. If we want lively high streets, visible activism and communities that trust the institutions around them, then the digital layer that now covers our public spaces must be designed to empower citizens, not to silence them. Otherwise, we risk building very secure cities where nobody really speaks.
Laiyba Rashid