Inside Facebook’s Quest to Be Your Bestie—By Owning Your Memories
Since its launch one year ago, Facebook’s On This Day feature has become popular in that world-conquering way only Facebook products become popular. Sixty million people visit the On This Day page each day, and 155 million subscribe to the push notification telling them there’s a photo from March 24, 2008, they really ought to see. It’s a fun feature steeped in nostalgia, and people love it for the same reason they love Timehop (which Facebook still swears it didn’t copy).
On This Day was far from Facebook’s first attempt at surfacing your old stuff. Three years ago it released Timeline Movie Maker, which turns your feed into a sizzling highlight reel without you even asking. The same feature returned in 2014 as A Look Back. Each December, you can see (and share) a Year In Review video, which collects your 10 best moments from the previous year into a jaunty slideshow.
There’s no question that people love sharing and engaging with memories. One 2013 study found that viewing old photos was among the most popular things to do on Facebook—even more popular than sending messages or playing games. At around the same time, the company started seeing an uptick in “like bombing,” in which you dive deeply into someone’s timeline and randomly like something, bringing it to everyone’s attention again (LOL omg remember your hair in college?). Facebook users love to reminisce.
On one hand, then, On This Day is just Facebook making it easier for people to do what they already do. On the other, it’s dramatically more invasive than someone you vaguely remember from college surfing through old photos. It’s a notification buzzing your phone, or a colorful box atop your News Feed. And some of the stuff in your past can be painful. There’s no knowing human eye sorting through the pictures to show only happy times; Facebook’s algorithms are doing the choosing. There’s a very good chance that On This Day is going to dredge up a painful memory, or at least something you’d rather not see again.
You can always opt out of On This Day, or delete the photos you never want anyone, including yourself, to see again. But Facebook’s desperate to keep you around, and to incentivize you to share more, and more personally, on its services. Lurking beneath those fun pictures of fading memories lies something far bigger and more important to Facebook’s business.
In its neverending quest to dominate all your time and attention, Facebook has spent years looking for ways to repeatedly mine (and monetize!) all that information, to give 1.6 billion users more opportunities to engage with the life they’ve lived and shared on the social network. If Facebook could learn to tap your memories, the team figured, you’d be attached to Facebook unlike anything else. They also knew that if they toyed with your past or showed you something you didn’t want to see, they’d have blown it in a huge and irreparable way. “Autobiographical memory is of fundamental significance for the self,” a landmark 2000 study began; it matters not just to the life we remember but to our conceptions of ourselves. For Facebook, as in life, when you play with emotions, you play with fire.
So the team started to ask key questions: What kinds of memories do people want to see? Should Facebook only show happy memories? Can a computer possibly know what a happy memory even is? They found a lot of answers, but one in particular: this stuff is really, really complicated. And you can’t solve memories with algorithms.
The Emotion Engine
Artie Konrad, a soft-spoken researcher who finished his PhD in cognitive psychology the day before we met, and who joins Facebook full-time next month after a few summer internships, led much of this research. He’s spent years asking users about the types of memories they like and dislike, how they categorize them (“baby” versus “achievement” versus “vacation”), and how they want Facebook to handle them. “They thought our role,” Konrad says, “was to provide occasional reminders of fun, interesting, and important life moments that one might not take the time to revisit.”
Identifying and understanding those “life moments” became the driving purpose behind On This Day, and the center of Konrad’s research. However, the task of turning that research into algorithms falls on engineers like Omid Aziz, On This Day’s engineering manager. Aziz is energetic and fast-talking, and careful to immediately set expectations when we meet. “We understand,” he says, “that we obviously can’t build a perfect algorithm.” Then he starts drawing on a whiteboard, telling me all about how he’s trying to build a perfect algorithm.
Long-term, the team wants to explore the difference between happy memories and important ones worth showing even if they don’t make you smile. But for now, the team is mostly going for smiles. So step one is what Aziz calls “filtering,” in which the computer throws out all the things you’re obviously not going to want to see. Exes, people you’ve blocked, things like that. Then you get to step two: ranking. This is where the tech gets tougher.
Facebook’s machine-learning software can figure out the memories you’re going to want to see based on how you use the product. If you consistently share memories with certain friends, you’ll see more of those friends. If you dismiss every single memory of that jackass Tony, eventually Facebook’s going to stop showing you Tony. There are so many signals: Facebook has discovered, for instance, that posts with a lot of likes and few comments are nearly always positive, while those with more comments and fewer likes tend to carry more complicated emotions.
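To make the two-step pipeline concrete, here’s a toy sketch of filtering and ranking. This is not Facebook’s actual system; the function names, the data shape, and the scoring weights are all invented for illustration. It only captures the heuristic described above: likes push a memory up, comments pull it down, and blocked people are filtered out before ranking begins.

```python
def filter_memories(memories, blocked_people):
    """Step one, 'filtering': drop anything involving blocked people."""
    return [m for m in memories
            if not blocked_people.intersection(m["people"])]

def score_memory(m):
    """Step two, 'ranking': a crude positivity score.

    Mirrors the heuristic in the article: many likes with few comments
    tends to be positive; many comments with few likes tends to signal
    more complicated emotions. The 0.5 weight is arbitrary.
    """
    return m["likes"] - 0.5 * m["comments"]

def rank_memories(memories, blocked_people):
    candidates = filter_memories(memories, blocked_people)
    return sorted(candidates, key=score_memory, reverse=True)

memories = [
    {"id": 1, "people": {"ana"},  "likes": 40, "comments": 3},
    {"id": 2, "people": {"tony"}, "likes": 80, "comments": 1},
    {"id": 3, "people": {"ben"},  "likes": 5,  "comments": 30},
]
ranked = rank_memories(memories, blocked_people={"tony"})
# Tony never makes it past filtering, no matter how many likes he has.
```

A real system would learn those weights from behavior (shares, dismissals, reactions) rather than hard-coding them, which is exactly the data-collection loop the rest of this story describes.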
Aziz and his team are also tapping Facebook’s computer-vision work to recognize things in your photos and sort them by their nostalgic appeal. Show it a picture of your corgi, and it sees “corgi.” Food, clothes, shoes, tacos, people—Facebook can see them all. If Facebook can figure out what (and, someday, who) is in your photo, then mine the comments and likes to figure out its emotional tenor, it can work with confidence. For a while, anyway, because it’ll inevitably get weird.
Babies, Konrad says, are memory gold. People love sharing pictures of their babies. But what if you lost your baby this year? Facebook probably doesn’t know that, but showing you that memory would be a disaster. Or even something less dramatic: What if you had a fight with your best friend yesterday and you can’t bear to even think about him? If you lost someone five years ago, would seeing her picture now make you smile or cry? “What could be considered a pleasurable initial experience,” Konrad says, “could become unpleasurable over time.” In the research world that’s called a “contamination sequence.” You loved your boyfriend, but then you broke up. There’s also a “redemption sequence,” which is the flip side: you get back together, you overcome an illness. Memories are always powerful but never stable.
Sharing Your Feelings
When it first launched, On This Day was hit with a wave of feedback from upset users who weren’t prepared for their recently deceased children, pets, and a litany of other losses to show up atop their Facebook feeds without warning. One user, Rachel Jennings, wrote an open letter after seeing photos of the house she was forced to sell and the kitten who’d recently died. “I know, Facebook, that it will only be a matter of time before you hit me with an image of my dear father, who passed away last year,” she wrote. “Perhaps you will show me the photo I posted in tribute the night of his stroke.”
Last October, seven months after On This Day’s debut, Facebook finally rolled out basic preferences for the feature. Now you can easily pick people or dates to block from ever appearing in On This Day, preventing at least the very worst. They also listened to users who complained about the 2014 Year In Review videos, which prompted people to share with the line “It’s been a great year! Thanks for being part of it.” For plenty of Facebook’s billion-plus users, 2014 sucked, and the video’s chipper tone only made things worse. On This Day is thus comparatively neutral, its picture frames communicating reminiscence without a whiff of emotion.
Emotion, though, is the very center of On This Day. It’s increasingly the center of Facebook. Facebook doesn’t want to be a platform for you and your friends; it wants to be your friend. The company is invested in shaping and controlling the emotional experience of using its product. (Remember when it toyed with the emotional content of the News Feed to see how people would react?) Tools like On This Day are starting to take advantage of the fact that Facebook is already becoming a place for much more nuanced expression. The recent launch of Facebook reactions, with their more granular tone for sharing emotions, speaks to this.
“One of the big areas that we can build out,” says Tony Liu, the product manager for On This Day and other reminiscence projects at the company, “is more nuanced ways for people to express themselves.”
Improvement is slow going, though, in part because the On This Day team has to wait a year to see the results of any platform changes. Reactions launched in February, and as people use the new emoji to comment on photos across Facebook, the On This Day team will gain more data about which timeline items people wouldn’t mind seeing again. That’s data they can use to turn this year’s updates into next year’s memories.
While there may be no perfect algorithm, Aziz and his team are continually trying to get a little closer. They know that memories are powerful, for better and for worse. They know that the reward for getting it right is dwarfed only by the penalty for getting it wrong. “We have so much work to do,” Aziz says. “It’s literally like one percent finished.” Check in a year from now, he says, and we’ll see where we are.