We are committed to making our products more inclusive in a variety of ways. One of the biggest challenges we’ve faced in doing so is finding and using representative data. We want to reflect the experiences and needs of all people who use Google products, particularly people from historically marginalized backgrounds.
When products are not built using diverse and representative data, they can end up being less useful for everyone. So we’ve been retraining some of our earlier machine learning models with more inclusive datasets, the collections of data we use to build our hardware and software products.
This is especially important for products that rely on cameras, like photography and face unlock on your phone. More inclusive datasets enabled us to create Real Tone on Google Pixel, which represents skin tones authentically and beautifully for all users.
Over the last two years, our team partnered with colleagues on our Responsible Innovation team to work with the stock photography company TONL, whose name is a nod to the importance of capturing all skin tones accurately and beautifully. They worked with us to source thousands of images of people from historically marginalized backgrounds. We aimed to include photography of models across the gender spectrum, models with darker skin tones, and models with disabilities, as well as people who represent the intersections of these identities. The project has now expanded to include work with Chronicon and RAMPD to source custom images featuring and centering individuals with chronic conditions and disabilities.