Recently, Google has faced accusations of racial bias in its camera software. Its Pixel phones have been accused of unnecessarily lightening darker skin tones, which often produces unflattering results. Following this, Google decided to change its algorithms and improve their color accuracy.
"For people of color, photography has not always seen us as we want to be seen, even on some of our own Google products. To make smartphone photography truly for everyone, we've been working with a group of industry experts to build a more accurate and inclusive camera," said Sameer Samat, Vice President of Android and Google Play.
Google is working on its auto white balance and auto exposure algorithms to make them more inclusive of darker skin tones. Photographer and director Micaiah Carter says the goal is to create "almost like a guidebook to capture skin tone." In its Google I/O presentation, Google explained that it has worked with "expert image makers who have taken thousands of images to diversify" its datasets.
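Google has not published how its updated white balance works, so purely as a rough illustration of what an auto white balance step does, here is a minimal sketch using the classic gray-world heuristic (the function name and the heuristic itself are illustrative assumptions, not Google's actual method):

```python
import numpy as np

def gray_world_white_balance(img):
    """Gray-world auto white balance: scale each color channel so its
    mean matches the overall mean, neutralizing a global color cast."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)  # mean of R, G, B
    gray = channel_means.mean()                      # target neutral level
    gains = gray / channel_means                     # per-channel correction
    balanced = np.clip(img * gains, 0, 255)
    return balanced.astype(np.uint8)
```

A real camera pipeline is far more sophisticated (scene classification, face-aware metering, learned models), but the basic idea is the same: estimate the illuminant and apply per-channel gains.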
The photo shown above is from Google's presentation, and the image on the left certainly looks edited; no Pixel smartphone captures a shot that strangely and incorrectly exposed. The after image looks more saturated and contrasty, but the skin tones on the right are far better.
Google is also working toward better portrait photography. It is improving its AI algorithms to capture curly and wavy hair more accurately by separating a person's hair from the background in a portrait photo, a process commonly referred to as edge detection.
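Google's actual hair segmentation uses learned models it has not detailed, but to give a flavor of what classic edge detection computes, here is a minimal Sobel gradient sketch (the function name and the choice of Sobel kernels are my own illustrative assumptions):

```python
import numpy as np

def sobel_edges(gray):
    """Compute the Sobel gradient magnitude of a grayscale image.
    Strong responses mark boundaries, e.g. where hair meets background."""
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=np.float64)  # horizontal gradient
    ky = kx.T                                       # vertical gradient
    h, w = gray.shape
    padded = np.pad(gray.astype(np.float64), 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)  # gradient magnitude per pixel
```

Fine, wispy hair is exactly where simple gradient-based methods struggle, which is why modern portrait modes rely on trained segmentation networks instead.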
With these before-and-after images, Google is arguably insulting its own previous Pixel smartphones. Google announced that these changes, along with others not revealed at Google I/O, will arrive in the upcoming Pixel 6 series.