"Mom, what was your first date like with dad?"
Mom: "Let me show you."
That's how Lee Hoffman, the cofounder of an upcoming app called Memoir, sees the future of memory. "Everything your brain does with memory, what your brain does when you walk by a building and you have a flashback of another time you were there," he says. "We can basically do that [in the app], assuming we have enough data."
There's about to be a lot more data. As video-equipped wearable computers such as Google Glass enter the market, we can, in theory, record almost everything we experience. Apps like Memoir, Hoffman envisions, could one day play it back to us on command, leaving us only to fill in the feelings.
Memoir, which at launch will compile your photos, social media posts, and geolocation coordinates into a search engine for your life, comes from just one of several companies preparing to augment memory. Another startup, Memoto, plans to ship a wearable camera this summer that archives about 2,880 automatically captured photos of your life each day. Everyday.Me helps create a life timeline that users can browse later. Even large companies once focused on instant communication have turned their attention to the past: Facebook launched Timeline, Twitter opened up access to its Tweet archive, and Foursquare began using historical check-in data to help inform current decisions.
Before we all strap on recording devices and prepare ourselves for near-perfect, machine-augmented memory, however, it is worth considering that by augmenting human memory, we will also, inevitably, change it.
Humans have always outsourced their memories to some extent. Instead of keeping tabs on every detail of our workplace, for instance, we're more likely to memorize which of our coworkers holds any particular piece of information. Psychologists call this kind of collective record "transactive memory," and research suggests we've invited computers into the process.
One 2011 study, for instance, asked participants to type trivia into a computer. Half of the group was told their entries would be saved; the other half was told they would be erased. Those who believed they could look the trivia up again later were significantly less likely to remember it. In another experiment by the same researchers, participants were asked to remember both a piece of information and where on the computer it was stored. They were better able to recall the location of the information than the information itself.
Personal information is no exception to this kind of outsourcing. In 2007, the neuroscientist Ian Robertson polled 3,000 people about their own lives. While 87 percent of respondents over age 50 could recite a relative's birthday, just 40 percent of those under 30 could do the same. A third of the younger group had to check their phones to recall their own phone numbers.
Six years later, it's more than phone numbers that we're storing on our digital devices. Many people could already cobble together a timeline of their lives from leftover social media posts, photos, calendar entries, emails, and geocoordinates. Add wearable technology to the equation, and perfecting the "timeline of you" seems within reach.
Creating a searchable database for our lives, if the way we've responded to other searchable databases is any indication, could leave us with fewer personal details on the tips of our tongues. At the same time, it could allow us to retrieve almost every single detail, to show our children our first date with their father instead of telling them about it. "Every single thing in your life, you could have a perfect memory," Hoffman says. Well, almost perfect. Cameras capture events, not feelings, and your perceptions of an event may differ the second or third time around. Still, there are obvious upsides to perfect recall. Michael Anderson, a memory researcher at the University of Oregon in Eugene, had his students keep "forgetting diaries" to estimate how much time they wasted looking for car keys and making up for other lapses in memory. On average, he estimated, forgetting takes up about a month of our time each year.
But forgetting has its own advantages. It helps us remember what is important and create a past we can live with. "The process of forgiveness, for example, can involve forgetting or a memory fading with time," psychologist Samantha Smithstein writes on Psychology Today's website. "Conversely, there are times when we remember being quite moved about something or experiencing delight. It is possible that having an available record or photograph might diminish the memory when we revisit that event, making it feel more trite."
Machine-assisted perfect recall is still an extreme, pursued by dedicated lifeloggers like Hoffman, who manually photographs every meal, romantic breakup, and meeting in his life (before our interview, he snapped a photo of me). Until this movement seeps into the mainstream through apps like Memoir, aided by always-on devices like Glass that make recording seamless, only a few people understand its implications. One of them, a woman with near-perfect recall known to psychologists as AJ, talked with National Geographic in 2007 about her unusual ability. "I remember good, which is very comforting. But I also remember bad, and every bad choice," she told the magazine. "And I really don't give myself a break. There are all these forks in the road, moments you have to make a choice, and then it's ten years later, and I'm still beating myself up over them. I don't forgive myself for a lot of things. Your memory is the way it is to protect you. I feel like it just hasn't protected me. I would love just for five minutes to be a simple person and not have all this stuff in my head. Most people have called what I have a gift, but I call it a burden."
That sounds terrible. But then again, so did the written word when Socrates imagined that memory-outsourcing technology entering the mainstream. As Nicholas Carr points out in The Atlantic, the philosopher feared that writing would cause people to "cease to exercise their memory and become forgetful."
"Socrates wasn't wrong–the new technology did often have the effects he feared–but he was shortsighted," Carr concludes. "He couldn't foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom)."
Perhaps we'll eventually feel the same way about outsourcing our personal histories to the Internet.
After all, "Human memory," reads one of the Memoir's taglines, "is so 2012."
[Image: Flickr user Chris Marchant]