Alright, so by now, you have definitely seen Snapchat’s facial filters. It seems like every day Snapchat is adding a new animal or face distortion for you to morph your face into. The filters are a funny way to spice up your snaps, and they seem to be Snapchat’s most appealing feature. From the infamous puppy ears to the legendary face swap, we are always eager to see which filter Snapchat has added that day. But just where did these filters come from, and how do they work?
1. How did we get these filters?
Snapchat’s “lenses”, as they are officially called, were not originally created by Snapchat. A Ukrainian app called “Looksery” developed the structure behind the lenses we all know and love (or know and tolerate). In September 2015, Snapchat saw the creative lenses and wanted them. Bad. They bought the company for a pretty penny: something to the tune of $150 million, making it the largest tech acquisition in Ukrainian history. Who knew your snaps had such a colorful backstory?
2. How on earth do they work?
Remember the apps that would make it look like you had gained a hundred pounds or aged a hundred years? They used the same technology that Snapchat has now come to thrive on. Looksery’s technology falls under the category of “Computer Vision”, a rapidly growing field. Computer Vision takes the pixels from a camera (like the one on your smartphone, tablet, or webcam) and uses them to recognize the objects it is being shown, such as your face.
A computer is given a set of values, with each value matching a color. Computer Vision then uses this data to find areas of extreme contrast, like the dramatic difference between the center of the forehead and the sides of it. When Computer Vision “sees” enough areas of contrast in the right places, it then deems the object a face. But in order for Snapchat’s lenses, which are oh-so-specific to certain facial features, to function properly, Computer Vision had to go one step further than just recognizing that a face is, in fact, a face.
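To make the contrast idea concrete, here is a toy sketch of how a program might compare the brightness of two neighboring regions of a grayscale image. This is an illustration of the general technique, not Snapchat’s or Looksery’s actual code; the function names, the tiny hand-made “image”, and the region layout are all made up for the example.

```python
# Toy sketch of contrast-based detection: compare the summed brightness of
# two adjacent rectangular regions. A large difference suggests an edge,
# such as the boundary between a bright forehead and darker hair.
# All names and values here are illustrative, not real Snapchat code.

def region_sum(image, top, left, height, width):
    """Sum of pixel brightness values in a rectangular region."""
    return sum(
        image[row][col]
        for row in range(top, top + height)
        for col in range(left, left + width)
    )

def contrast_feature(image, top, left, height, width):
    """Brightness difference between a region and the region to its right."""
    left_sum = region_sum(image, top, left, height, width)
    right_sum = region_sum(image, top, left + width, height, width)
    return abs(left_sum - right_sum)

# A tiny fake grayscale "image": a bright block next to a dark one,
# like skin next to hair. Values are 0 (black) to 255 (white).
image = [
    [200, 200, 30, 30],
    [200, 200, 30, 30],
    [200, 200, 30, 30],
]

# High contrast across the bright/dark boundary...
print(contrast_feature(image, 0, 0, 3, 2))  # 1020
# ...and no contrast inside the uniform dark area.
print(contrast_feature(image, 0, 2, 3, 1))  # 0
```

A real detector evaluates thousands of such features at many positions and scales, and only calls the object a face when enough of them line up in the right places.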
Here’s where it gets interesting:
The good news is, the human brain is still smarter than the computer (yay!), so Computer Vision had to be trained to recognize certain parts of the face, like the eyebrows, lips, and ears. This took a lot of manual data entry. When we say a lot, we mean hundreds upon hundreds of photos had to be manually marked for facial features. The app then uses this sample of “the common face” and fits it onto your face, setting it up to apply your lens.
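The “common face” step above can be sketched in a few lines: if each training photo has been hand-marked with (x, y) positions for landmarks like the eyes and mouth, averaging those positions across all photos gives a mean template that can then be fitted to a newly detected face. The data and names below are illustrative stand-ins, not a real training set.

```python
# Toy sketch of building a "common face" template from hand-marked photos.
# Each photo is a list of (x, y) landmark positions in a fixed order,
# e.g. [left eye, right eye, mouth]. Averaging across photos yields the
# mean position of each landmark. All data here is made up for illustration.

def mean_face(annotated_photos):
    """Average each landmark's position across all annotated photos."""
    n_photos = len(annotated_photos)
    n_landmarks = len(annotated_photos[0])
    template = []
    for i in range(n_landmarks):
        mean_x = sum(photo[i][0] for photo in annotated_photos) / n_photos
        mean_y = sum(photo[i][1] for photo in annotated_photos) / n_photos
        template.append((mean_x, mean_y))
    return template

# Three hand-marked photos: [left eye, right eye, mouth] coordinates.
photos = [
    [(30, 40), (70, 40), (50, 80)],
    [(32, 42), (68, 42), (50, 78)],
    [(28, 38), (72, 38), (50, 82)],
]

print(mean_face(photos))  # [(30.0, 40.0), (70.0, 40.0), (50.0, 80.0)]
```

In a real system the template also encodes how much each landmark tends to vary, so it can stretch and rotate to match your particular face before the lens graphics are drawn on top.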
From there, you know what to do. Snapchat allows you to choose from so many different lenses and introduces more almost every day. The more lenses they experiment with, the sharper the Computer Vision becomes, allowing Snapchat to create more and more complex lenses.
3. That sounds like a lot of work. Just what is Snapchat getting out of this?
As mentioned before, Snapchat made a rather pricey investment in Looksery’s product. And Snapchat recently acquired ANOTHER computer vision company, called Seene.
So just how are they making these investments worthwhile?
In September 2015, immediately following the company’s acquisition of Looksery, Snapchat opened up its lens store. The lens store sold a variety of lenses, each for $0.99. Snapchat’s strategy was to put a lens it knew would be popular (such as the barfing-rainbow lens) on the app for a few days. Then, suddenly, you’d check your Snapchat and your favorite lens would be gone. That’s where the lens store came in: you could purchase the lens and it would stay on your app forever. However, after just two months, Snapchat shut the lens store down and began to move toward themed lens advertising.
Movies and other products can create a themed lens related to their product and feature it on Snapchat. Themed lens advertisements just might be how the company is planning to make that money back. Advertisements on Snapchat do not come cheap, as Snapchat has an audience of over 100 million daily users.
4. So, what can we expect up next?
Snapchat recently added a lens promoting the new Ghostbusters movie. The lens displays a quick callout instructing the user to use the back camera rather than the forward-facing “selfie” camera. Using the back camera is a feature that hasn’t been encouraged in Snapchat lenses before.
Using this “Ghostbusters” lens on the front-facing camera surrounds the user with ghosts. Using the back camera, however, activates a totally different lens feature: this time, the lens lets the user dump slime on any subject.
Snapchat is quick to update its features and seems pretty attuned to what users want. It added a “friends” feature in which another person featured in your snap also receives the lens. After face swap became wildly popular, it introduced a lens that lets the user swap faces with any photo in their camera roll. We highly anticipate that more lenses will make use of both the front-facing and back-facing cameras.
Snapchat also recently featured a bee lens that, you guessed it, turned the user’s face into that of a bee. It also altered the user’s voice when a video was being taken. We definitely expect to see more filters with voice distortion, as that just adds one more layer to the lens experience.
By Courtney Echerd