Windows, Cursors, and the Invisible Layer of the AI City by Berk Alkoç

Building blocks are overlaid with digital squares that highlight people living their day-to-day lives through windows. Some of the squares are accompanied by cursors. Below is the article title, "Windows, Cursors, and the Invisible Layer of the AI City by Berk Alkoç"

Artist contributions to the Better Images of AI library have always played an important role in fostering understanding and critical thinking about AI technologies and their context. Images facilitate deeper inquiries into the nature of AI, its history, and its ethical, social, political and legal implications.

In this series of blog posts, called ‘Through My Eyes’, some of our volunteer stewards are each taking turns to choose an image from the library, unpack the artist’s process, and explore what that image means to them. In this blog post, Berk Alkoç explores Emily Rand’s image, AI City, and what it reveals about algorithmic bias in increasingly digitalised cities, where extractive data harvesting enables tech companies to exclude, surveil, and target individuals.

In 1950, Gordon Childe wrote in his essay “The Urban Revolution” that “the concept of ‘city’ is notoriously hard to define”. He had a point. Cities have always been strange hybrids: part geography, part invention, part collective experiment. They aren’t just collections of buildings where people happen to congregate. They’re networks of relationships, new ways of living compressed into dense spaces. In the past, cities grew from mud and stone. Today, the materials have changed. Now, the city is also constructed from data, code, and thousands of invisible algorithms quietly processing in the background while we simply try to eat dinner, walk the dog or scroll through our phones.

This tension defines AI City, a piece created by illustrator Emily Rand in collaboration with The London Office of Technology and Innovation (LOTI). The work emerged from a workshop at Science Gallery London during London Data Week 2023, beginning with a public conversation with Sam Nutt about how AI shapes urban life and how bias infiltrates these systems. Rather than creating a conventional infographic or dystopian “big brother” poster, Rand chose to depict this through an ostensibly ordinary city block. Yet nothing remains ordinary once you begin to look closely.

“And if the city operates as a GPU, what exactly is it processing? Us.” – Berk Alkoç

At first glance, the image presents familiar urban elements: a dense patchwork of apartments, brick facades, diverse windows revealing different lives. Someone makes a phone call. Another person slumps at a table. A figure stands with arms crossed, lost in thought. This is recognizable urban life. But then come the boxes: neon rectangles hovering around certain windows, as if the cityscape has transformed into a website interface. And the cursors: black arrows pointing directly at people, suggesting the skyline has become a giant computer screen where an invisible hand prepares to click on someone.

Building blocks are overlaid with digital squares that highlight people living their day-to-day lives through windows. Some of the squares are accompanied by cursors.
Emily Rand & LOTI / Better Images of AI / CC BY 4.0

The selection isn’t random. Some people receive boxes, others don’t. Some windows attract cursors, others remain unmarked. Suddenly the city appears less like an urban landscape and more like the interior of a graphics card, with building rows resembling processor arrays, the entire block functioning as an enormous GPU. And if the city operates as a GPU, what exactly is it processing? Us.

Here lies the work’s incisive critique. The boxes and cursors represent unseen systems that shape contemporary urban life—systems that monitor our movements, organize us into patterns, and determine who receives attention and who doesn’t. The piece prompts essential questions: In a data-driven city, who becomes visible? Who gains priority? And who gets overlooked? This is algorithmic bias in practice, but not the dramatic science fiction version. It’s quiet, mundane, cumulative and harmful. It manifests as improved waste collection in one neighborhood and neglected potholes in another. It appears as one window receiving a digital frame while another fades into invisibility. The effect is as subtle as the small cursors in Rand’s image, but equally consequential.

Consider walking through the city with an invisible video game interface overlaying everything, highlighting random people while you’re simply trying to catch your bus. The city becomes simultaneously street and dashboard. You exist in public space while being processed by hidden systems. You’re surrounded by millions of people, yet filtered, sorted, and perceived through rules you never consented to—together, yet apart.

The invisible selection process that AI City visualizes has become increasingly visible through striking examples of algorithmic bias. For instance, Cambridge researcher Christoffer Koch Andersen’s work on “trans impossibility” examines how digital systems built around rigid binary gender categories systematically exclude trans people from essential services. Andersen points to Hungary’s plan to use facial recognition at pride parades to identify and fine attendees—suddenly those floating cursors aren’t just overlooking queer people; they’re actively targeting them. The same technology now renders people invisible in one context while placing giant neon boxes around anyone who dares exist publicly in another.

“You don’t see us until you’re aiming” 

Christoffer Koch Andersen and Mila Edensor on structural invisibility and trans impossibility 

Andersen demonstrates how these classification systems derive from colonial-era categorization practices—the supposedly “neutral” algorithms now embedded in everything from healthcare reminders to banking access actually perpetuate centuries-old biases. It’s the digital equivalent of those floating cursors deciding who gets selected and who gets ignored, except the consequences extend beyond metaphor to your actual bank account, health outcomes, and your safety at a public gathering.

This logic of extraction extends to data itself. As Kate Crawford notes in Atlas of AI (2021), tech companies operate under a “collect-it-all mentality” where engineers aim to build “a mirror of the real world,” requiring that “anything that you see in the real world needs to be in our databases.” Smart cities become the ideal sites of this comprehensive data harvesting: faces captured on streets train facial recognition systems, social media feeds build predictive language models, and personal photos train machine vision algorithms. Every small action, whether a phone call, a walk, or a scroll, becomes data that helps the city “learn,” and that learning in turn influences how the city treats you. All of this is normalized as necessary rather than questioned as invasive. The urban environment transforms into a resource to be mined, with public spaces exploited for training data that powers the very systems creating discriminatory outcomes in areas such as employment, housing, and policing.

If Childe were writing today, he would need to revise his definition. Cities are no longer built solely from stone and steel. We no longer define them by agricultural “surplus”—now we speak of data surplus. Cities consist of streaming data, prediction models, and machine learning algorithms. They are simultaneously human and non-human. And if the city functions as a computer, then someone, somewhere, controls the mouse. The question becomes: who decides where to click?

About the author

Berk Alkoç (he/him) is a designer–researcher based in Germany exploring the intersections of technology, cities, and everyday life through a critical (and unapologetically queer) lens. At ZeMKI, University of Bremen, he designs for Molo, a civic media platform. At the Institute for Technology Assessment and Systems Analysis (ITAS) at the Karlsruhe Institute of Technology (KIT), he researches nature conservation through a relational values lens and how digital tools shape environmental governance. Outside of work, he’s likely outdoors or immersed in something visual, whether behind a camera, sketching, or experimenting with graphic design.

A black-and-white headshot of Berk.

If you want to contribute to our new blog series, ‘Through My Eyes’, by selecting an image from the Better Images of AI Library and exploring what the image means to you, get in touch (info@betterimagesofai.org).

Explore other posts in the ‘Through My Eyes’ series