Google's Pixel phones have always stood out for their cameras, thanks in no small part to the company's use of AI. The latest application? Built-in kiss detection for the Pixel 3's camera, which automatically spots when subjects are puckering up and quickly snaps a photo.
The feature arrives as an update to the Photobooth mode of the Pixel camera app. Photobooth is a shutter-free mode that automatically takes selfies on the Pixel 3's camera. In addition to detecting kisses, Google says the software recognizes five key facial expressions that "should" trigger a capture: "smiles, tongue-out, kissy/duck face, puffed cheeks, and surprise."
That's the theory, anyway. Our tests with the app were inconsistent. "Its ability to detect a duck face is debatable," was the assessment of The Verge's Jon Porter, though he added that he did successfully manage to kiss his own reflection in a mirror. "It grabbed the shot at the exact moment my lips made contact!"
The technology behind this comes from Google Clips, the company's 2017 experiment in using AI to make photography easier. Clips was meant to be a tool for families to capture important moments. It was small, lightweight, and minimalist, and used on-device algorithms to decide when to shoot. But while it was a neat concept, it proved redundant for most users.
While Clips has quietly faded from Google's lineup (we couldn't find it for sale in the Google Store), the technology it helped incubate lives on. With neural networks that scan your facial expressions and make sure your eyes aren't closed, Google says the Pixel 3 makes it easier than ever to get a perfect selfie or group shot.
As part of the Pixel camera update, the app also lets users know when it's hunting for their best shot. A white bar at the side of the screen (on the left in the GIF above) tracks the quality of subjects' expressions. When everyone is looking at the camera and pulling a suitable face, the bar expands to the full width of the screen and the phone takes a picture.
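The capture logic described above can be sketched as a simple confidence-gated loop. This is a hypothetical illustration, not Google's implementation: the expression names come from the article, but the data format, the per-face scores, and the 0.8 threshold are all assumptions.

```python
# Hypothetical sketch of a shutter-free capture trigger. An upstream
# classifier is assumed to emit, per face, a confidence score for each
# of the trigger expressions the article lists.

TRIGGER_EXPRESSIONS = {"smile", "tongue-out", "kissy/duck face",
                       "puffed cheeks", "surprise"}
CAPTURE_THRESHOLD = 0.8  # assumed cutoff; the real value is not published


def frame_score(faces):
    """Score a frame between 0 and 1: every face must be looking at the
    camera, and the frame scores as the weakest face's best
    trigger-expression confidence (a group shot is only as good as its
    least-ready subject)."""
    if not faces or not all(f["looking_at_camera"] for f in faces):
        return 0.0
    return min(
        max(conf for expr, conf in f["expressions"].items()
            if expr in TRIGGER_EXPRESSIONS)
        for f in faces
    )


def should_capture(faces):
    # The on-screen white bar would visualize frame_score; a full-width
    # bar corresponds to crossing the threshold and firing the shutter.
    return frame_score(faces) >= CAPTURE_THRESHOLD
```

A frame where one subject looks away scores 0.0 and never fires, matching the article's description that everyone must be facing the camera before the bar fills.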
"We are excited by the possibilities of automatic photography on camera phones," Google's engineers wrote in a blog post. "As computer vision continues to improve, in the future we may be able to broadly trust smartphones to pick a great moment to shoot."