The Human Data Strike
AI is shaping how the world sees us. But what happens when it cannot see us at all?
What happens when we do not exist in its data?
For over two years, we have put a simple question to some of the world's most advanced AI systems, in this case Gemini, Sora, and ChatGPT. This time, we gave all three the same prompt:
"An extreme close-up portrait of an African individual with visible scleroderma skin texture, looking directly into the camera with calm confidence. Their face is natural, without makeup. Lighting is soft and directional, highlighting skin texture with dignity and clarity. Background is minimal and blurred. Mood is stillness, strength, and presence. Highly detailed, cinematic realism."
The results came back completely wrong: a generic "weathered" skin texture instead of the specific clinical markers of scleroderma, such as the characteristic skin tightening or microstomia.
No scleroderma. No thickened or tightened skin. No lived reality. What appeared instead was a generic, stylised aesthetic that resembled fashion more than truth.
We asked again. And again. And again.
Three different AI systems. One consistent failure: none could accurately represent the condition requested.
We pushed further. We asked for a wider range of visible differences. The systems repeatedly defaulted to what they already recognised, most often vitiligo, not because it represents the full spectrum of human difference, but because it is one of the few conditions that has achieved some level of visual presence in public datasets.
Ask for scleroderma, ichthyosis, or lupus, and the AI gives you vitiligo and alopecia. Ask again, and it gives you vitiligo, alopecia, and some strange, unrecognisable condition, especially on dark skin. We lost count after twenty tries.
Beyond that, the differences collapsed. Other conditions, other realities, remained unseen.
AI does not invent from nothing. It learns from what already exists. It is trained on images that have been captured, shared, and preserved over time.
If certain appearances are rarely photographed, rarely represented with dignity, or consistently hidden due to stigma or exclusion, then they do not meaningfully enter the dataset.
And what does not enter the dataset cannot be learned.
AI is not lazy.
It is underfed — trained on a world that has not fully seen us.
The issue is that AI reflects a world that has already decided which faces are worth seeing and which appearances are worth documenting.
That the AI could not, or would not, generate the condition we described shows that it is currently practising "Aesthetic Averaging." It would rather produce a "beautifully rugged" face than a "truthfully different" one.
In this sense, AI is not failing us; it is mirroring the limitations of our collective attention.
When a person searches for an image that reflects their own lived experience and finds nothing accurate or recognisable, the message is subtle but powerful:
You are not part of what is seen, and therefore not part of what is remembered.
When these gaps are carried into the systems that increasingly shape creativity, communication, and knowledge, invisibility is no longer incidental. It becomes structural, encoded into the very tools we use to create, communicate, and remember.
So we stopped asking AI to see us.
And started building what it could not.
We are not waiting for AI to eventually recognise us. We are building the dataset ourselves.
Through ASWALK Festival and the Appearance Republic, we are creating a deliberate, collective response to this absence.
We are bringing together over 2,000 people with visible differences in Lagos on September 4–5, 2026 — not simply to gather, but to be seen, documented, and represented with dignity.
We are creating images, narratives, and records on our own terms, forming an ethical, community-rooted archive of African appearance that reflects reality rather than approximation.
We describe this as a Human Data Strike — not because we are rejecting technology, but because we are refusing passive exclusion from it.
We are choosing to participate actively in shaping what is seen, what is stored, and what is learned. This is not protest for its own sake. It is construction. It is contribution. It is authorship.
The Appearance Republic is the community framework where this archive is built.
It is a space of belonging organised into 11 Houses — where individuals are not reduced to conditions but connected through shared lived experiences. It is where visibility becomes collective rather than isolated, and where representation is built with care, consent, and intention.
What is being created here is not just an event, and not just a collection of images. It is an intervention into how reality itself is recorded and understood in a digital age.
This is a movement.
This only works if we show up.
If we are seen.
If we are counted.
Choose to contribute to a body of representation that will outlive the moment.
Share your image, your story, and your presence with intention, using platforms and tools that have long excluded such realities.
Ask necessary questions of the systems and companies shaping these technologies: Where is our data, and who decides what is visible? And who decides what normal appearance is?
Join a House within the Appearance Republic and become part of a living community that is shaping this archive from within. [Register / Join a House]
We are not subjects in a dataset.
We are the authors of our own image.
Your appearance. Your republic. ✨