Google recently unveiled the Pixel 8 and Pixel 8 Pro, and among its updated roster of features, the company kept circling back to its Real Tone capabilities – but it’s not a feature exclusive to the most recent models.
Instead, Google Real Tone has featured on many recent Pixel phones, from the budget-focused Pixel 7a all the way up to the Google Pixel Fold – though performance isn’t consistent across the range, with the latest Pixel 8 series delivering the most accurate skin tones yet.
With that in mind, here’s everything you need to know about Google Real Tone, including what it is, how it works and where you can find it…
What is Google Real Tone?
Real Tone is a Pixel camera feature designed to improve accuracy when reproducing darker skin tones where previous Google phones have fallen short.
Google has worked with more than 60 photographers of different ethnicities to train the Pixel camera to capture darker skin tones more accurately, preventing them from appearing darker or brighter than they do in real life.
Real Tone is part of Google’s ongoing effort to address skin tone bias in AI and camera technology, and the feature has continued to improve since its original iteration on the Pixel 6.
“This racial bias in camera technology has its roots in the way it was developed and tested”, explains Google on its website.
“Camera sensors, processing algorithms, and editing software were all trained largely on data sets that had light skin as the baseline, with a limited scale of skin tones taken into account. This bias has lingered for decades, because these technologies were never adjusted for those with darker skin”.
Because Real Tone is baked into the Pixel’s camera, it is also supported on many third-party apps, like Snapchat.
How does it work?
Rather than being based on one single app or technology, Real Tone follows a framework that addresses six core areas in Google’s imaging technology.
First of all, the Pixel camera is trained to detect a diverse range of faces to ensure the camera can get an in-focus image in a variety of lighting conditions.
Next comes white balance. The auto-white balance is designed to better reflect a variety of skin tones.
The automatic exposure is responsible for how bright photos snapped on the Pixel look, so Google has improved this as well.
Furthermore, the company developed a new algorithm to reduce the impact of stray light on an image, ensuring darker skin tones don’t appear washed out when framed by a sunlit window, for example.
Avoiding blur is also central to getting a good photo, so Google has used the AI in its custom-made Tensor chip to keep images sharp in low-light conditions.
Finally, the auto-enhancement feature in Google Photos also works on photos taken with non-Pixel smartphones, meaning other Android users can optimise the colour and lighting in their images after capture.
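To give a rough sense of what an auto-white balance step does under the hood, here’s a minimal Python sketch of the classic “grey world” approach – to be clear, this is a textbook illustration, not Google’s actual algorithm. It assumes the scene averages out to a neutral grey, which is exactly the kind of simplifying assumption that can break down on faces and helps explain why Real Tone’s tuning on diverse portraits matters.

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Scale each colour channel so its mean matches the overall mean.

    `image` is an HxWx3 float array with values in [0, 1].
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means  # per-channel correction
    return np.clip(image * gains, 0.0, 1.0)

# A warm-tinted test "photo": the red channel is twice as strong as blue.
tinted = np.stack([
    np.full((4, 4), 0.60),  # R
    np.full((4, 4), 0.45),  # G
    np.full((4, 4), 0.30),  # B
], axis=-1)

balanced = gray_world_white_balance(tinted)
```

After correction, all three channel means land on the same value, neutralising the warm cast – but on a portrait filling the frame, this naive rule would also “neutralise” the subject’s skin tone, which is the failure mode Real Tone is designed to avoid.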
All of the above is applied to the Pixel’s camera, which professionals then test to expand Google’s image datasets.
“We continue to work with image experts – like photographers, cinematographers, and colorists – who are celebrated for their beautiful and accurate imagery of communities of color”, said Google’s Image Equity Lead, Florian Koenigsberger.
“We ask them to test our cameras in a wide range of tough lighting conditions. In the process, they take thousands of portraits that make our image datasets 25 times more diverse – to look more like the world around us”.
More recently, Google has open-sourced the Monk Skin Tone (MST) scale, developed by Harvard professor Dr Ellis Monk, which is more inclusive and covers a broader spectrum of skin tones than the previous tech industry standard.
Which Pixels support Real Tone?
Real Tone launched on the Pixel 6 in 2021 and can be found on the following phones: