Google is testing a new technology that separates the foreground from the background in videos in real time, making it easy to change the background to something more interesting or fun.
YouTube's Real-Time Mobile Video Segmentation produces results similar to chroma keying (also known as green-screening), but doesn't require a uniformly colored background.
The technology is designed specifically for videos of people, and uses a neural network that's been trained on tens of thousands of sample images of people in a wide variety of poses, set against different backgrounds. For each image, Google's researchers annotated foreground elements such as hair and facial features as precisely as possible to help the network learn to distinguish foreground from background.
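Google hasn't published the app's code, but once a network produces a per-pixel foreground mask, swapping the background is a simple compositing step. Here's a minimal sketch in NumPy; the function name, array shapes and the toy mask are illustrative assumptions, not Google's implementation:

```python
import numpy as np

def replace_background(frame, mask, background):
    """Composite a video frame over a new background using a
    per-pixel foreground mask (1.0 = person, 0.0 = background).

    frame, background: float arrays of shape (H, W, 3)
    mask: float array of shape (H, W), as a segmentation
    network might output (values may be soft, between 0 and 1).
    """
    alpha = mask[..., None]  # broadcast the mask across RGB channels
    return alpha * frame + (1.0 - alpha) * background

# Toy 2x2 frame: left column is "person", right column is "background".
frame = np.full((2, 2, 3), 0.9)        # bright foreground pixels
background = np.zeros((2, 2, 3))       # new all-black background
mask = np.array([[1.0, 0.0],
                 [1.0, 0.0]])

out = replace_background(frame, mask, background)
# Left pixels keep the frame's values; right pixels take the new background.
```

Because the mask can hold fractional values, soft edges (for example around hair) blend smoothly rather than producing the hard cut-out edges of a badly lit green screen.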
The effect is less accurate than true chroma-keying, but it’s effective, convenient and ideal for livening up short video clips.
The segmentation technology is part of YouTube Stories, which is currently in limited beta. Stories are 15-second videos that are only accessible via the YouTube mobile apps, and are designed to work much like Instagram Stories.
Once you’ve recorded a video for a Story (or selected one from your device), you’re given a small selection of editing tools, including the ability to liven things up with stickers, text, filters and background music. Real-Time Mobile Video Segmentation will appear in its own tab, ready to switch out boring backgrounds.
Google killed off YouTube Video Editor last year, so it’s interesting to see it developing new editing tools specifically for mobile. Real-time background replacement is particularly unusual, and suggests Google is hoping to get the upper hand on Snapchat and Instagram before releasing its own take on the popular Stories format.