This year at SIGGRAPH, Theodore Kim (a former Pixar senior scientist, now a professor at Yale University) discussed racial bias in computer graphics research.
As he explains, even if one might be surprised at the idea of such a bias (after all, mathematics and physics are supposed to be exact sciences, free from any human bias), “many of the basic research problems we take for granted in computer graphics contain insidious assumptions about race”.
For example, Theodore Kim shows that when it comes to skin rendering, SIGGRAPH papers overwhelmingly focus on lighter skin tones, and most of them forget that black skin even exists. Similarly, technical papers about hair simulation mostly focus on straight hair, to such a degree that when the word “hair” is used, it is usually implied that it means “straight hair”. When researchers publish technical papers about curly hair, they tend to explicitly write “curly hair”, not just “hair”. So-called Type 4 hair (curly/kinky hair) is hardly ever mentioned, even though about a billion people do have this kind of hair.
Theodore Kim explains that focusing on supposedly universal approaches to these topics is actually counter-productive, including from a scientific point of view. Indeed, kinky hair does not move the way straight hair does, and black skin does not interact with light the way lighter skin does: when trying to create realistic dark skin, the specular component will typically matter more than subsurface scattering.
Due to both these differences and the bias in graphics research, using so-called universal equations to simulate dark skin can create unconvincing results. Theodore Kim highlighted this issue using MetaHuman Creator (a photorealistic digital human generator by Epic Games): the SSS/specular ratio seems to be off when creating characters with black skin.
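To get an intuition for why a fixed subsurface/specular balance fails, here is a deliberately simplified sketch (not from the talk; the function, weights, and scalar intensities are illustrative assumptions, not a real shading model): the subsurface term scales with the skin's albedo, while the specular reflection off the skin's surface largely does not, so for low-albedo skin the specular term dominates the visible response.

```python
def shade(albedo, subsurface_weight=0.7, specular_weight=0.3, light=1.0):
    """Toy skin response: mix a subsurface term with a specular term.

    All quantities are scalar intensities for illustration only; a real
    renderer works per-wavelength and per-direction, with far richer
    BSSRDF and microfacet models.
    """
    subsurface = subsurface_weight * albedo * light  # scales with albedo
    specular = specular_weight * light               # mostly albedo-independent
    return subsurface, specular

# Hypothetical albedo values, chosen only to show the trend.
sss_dark, spec_dark = shade(albedo=0.1)
sss_light, spec_light = shade(albedo=0.6)

# Specular share of the total response is much higher for the darker skin,
# so tuning only the subsurface term (as a "universal" model might) misses
# most of what the eye actually sees.
print(spec_dark / (sss_dark + spec_dark))    # ~0.81
print(spec_light / (sss_light + spec_light)) # ~0.42
```

The point of the sketch is only the asymmetry: with identical weights, the same equations produce a very different specular/subsurface balance across skin tones, so a model validated only on light skin can look wrong on dark skin.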
The presentation reminds us that such biases are not new. During the analog era, human faces and color charts were used to adjust the way film was developed and processed. These references, called “leader ladies” (for movies) and “Shirley Cards” (for photography), were almost always created using the face of white women (although Shirley Cards became more diverse in the 90s).
As a consequence, dark skin was quite often rendered poorly, as Theodore Kim demonstrates with a few movie clips.
Theodore Kim explains that the goal of his talk is not to judge the past, but rather to become aware of the situation as an industry. Instead of perpetuating the prejudices of previous eras, he argues, we may “engage in anti-racist research that works to dismantle it”. He suggests, for example, creating an equivalent of the Shirley Cards for dark skin rendering, which would help further research in this area.
In the end, this kind of work will help improve the way we render digital humans as a whole, regardless of hair type or skin color.
Here is the full talk:
It should be noted that a Birds of a Feather session (Countering Racial Bias in Computer Graphics Requires Structural Change) further explored this subject, with contributions from Theodore Kim, Holly Rushmeier (Yale University), Raqi Syed (Victoria University of Wellington), Wojciech Jarosz (Dartmouth College), and A.M. Darke (UCSC). The event was not recorded, but a signup form is still available if you wish to be contacted about participating in further efforts for SIGGRAPH 2022.
For more information about this topic, you may also be interested in three of the books referenced in the talk:
- Algorithms of Oppression: How Search Engines Reinforce Racism – Safiya Umoja Noble
- Race After Technology – Ruha Benjamin
- Girl Head: Feminism and Film Materiality – Genevieve Yue
You may also want to read our interview with Raqi Syed and Areito Echevarria, who created the VR experience Minimum Mass: at the end of page 2, Raqi Syed told us about her work and the idea of “decolonizing disciplines in VFX”. She highlighted that “many people ask what we do with digital humans, what they are for, but the question we don’t ask is who gets to be a digital human”.
Last but not least, a publication related to this talk is available on Theodore Kim’s website: Countering Racial Bias in Computer Graphics Research (T. Kim, Holly Rushmeier, Julie Dorsey, Derek Nowrouzezahrai, Raqi Syed, Wojciech Jarosz, and A.M. Darke – arXiv 2021). The publication suggests anti-racist practices and future research directions.