Day 2 of #SMX East began with an insightful presentation of, and follow-up discussion about, what many people covet as the holy grail of search: Personalization. Users crave an intuitive search experience, Google seeks relevancy, and marketers want to be found by the right users. But is online personalization the harmony-seeking matchmaker it appears to be?
Keynote speaker and author Eli Pariser, wisely, remains wary. In a world where algorithms, code and robots curate search engine result pages, among other facets of the Internet, how can we be sure we’re seeing the whole picture? Moderators Danny Sullivan and Chris Sherman joined him on stage, and the trio tackled this sensitive subject before a room of intrigued attendees: some wide-eyed, some skeptically squinting, some still asleep.
AIMCLEAR live-tweeted this powerful keynote via @beebow. Read on for thought-provoking takeaways.
Topic of Discussion: The moral consequences of living a life that is mediated by search algorithms.
Who knew Mark Zuckerberg was such a metaphorical ballerina? Eli cited Zuck’s rationale for creating the News Feed, as captured in David Kirkpatrick’s book, The Facebook Effect, which goes a little something like this:
“A squirrel dying in front of your house may be more important to your interests right now than people dying in Africa.”
Hurmph. Whether or not you agree with that statement, the underlying sentiment is worth considering: human beings are, by nature, often more responsive to and engaged by content that relates directly to their own lives, to the here and now.
One day, Eli (a self-professed lean-towards-the-left dude) noticed that his conservative Facebook friends had all but vanished from his user experience. Why? Facebook’s algorithm had noticed that, behaviorally, Eli tended to interact more with liberal or non-conservative content. So… Facebook tailored Eli’s user experience accordingly.
This sense of personalization isn’t service-specific, either. Where you sit, where your computer is being used, which browser you use – all of these factors can impact what you see on the web.
The Great Google SERP Experiment
As an experiment, Eli asked some of his friends to search for “Egypt” on Google, then send him screen-capped SERPs. Sure enough, the results were rather different. One friend got results about the (then) ongoing protests and crisis in Egypt. The other friend got links to daily news, travel tips, airfare, and the like.
Lesson?
“The web is increasingly showing us what it thinks we want to see, not the world as it is.” -Eli Pariser
Filter Bubbles, Ice Cream, & Broccoli
Online, we are increasingly surrounded by a membrane of personalized filters, what Eli refers to as filter bubbles. You can’t choose what gets in through your filter bubble, and moreover, you don’t know (and cannot change) what gets edited out.
Generally, it seems, the web tends to serve up “information junk food” or “information dessert” – mindless, entertaining content that satiates users’ hunger for instant gratification. (Eli uses an ice cream cone to represent this content. A very tasty-looking ice cream cone, I might add.) But sometimes, what we’re really in the mood for are information vegetables (represented by broccoli and a book) – more meaningful content we can learn from in a deeper way.
Great media feeds us a balanced diet of information junk food and information vegetables. Unfortunately, filter bubbles can often weed out the veggies.
Who, or What, Is Guarding the Gate?
Once a manually engineered frontier, the web is now constructed, curated and guarded by a new set of gatekeepers: code. Unfortunately for us, code has no civic sense of what actually matters to the human user.
If these algorithms are really going to mediate how we understand the world, we need to make sure they don’t just focus on a narrow idea of relevance, on what we click first, on the information ice cream cones.
We really need the Internet to be that thing that connects us to new ideas, new people, new ways of thinking. We need Internet broccoli.
“This balance cannot happen if we’re all stuck in a personalized bubble of one.”
With that, Eli returned to his seat and the discussion among him, Chris, and Danny commenced.
Bite-Sized Discussion Takeaways from Eli
- There are two modes to how people use Google – information retrieval & information mapping. Google is focused on the former, not the latter.
- If you want to find the URL of the official White House website, no problem. But if you’re looking for blogs with opinions about Obama’s presidency, that gets a little trickier.
- For some searches, personalization can be subtle. For others, it’s not. Google undersells how significant it is.
- Google most likely doesn’t have malicious intentions with personalization. Google is trying to give you a better user experience.
- Google sees personalization as making it harder to game SERPs. If everyone has different results, it’s harder to know where you stand, and therefore harder to know where manipulation opportunities (or needs) exist.
- Speaking of which, Google probably gets perverse pleasure when site owners watch rankings go up and up with no mind to personalization. Before you get excited about your search rank, log out, switch computers, and make sure it’s not just you.
- Yahoo News does some personalization on news headlines based on your Yahoo profile. Sneaky monkey! This point illustrates that you never really know if you’ve busted out of your personalization bubble.
- Ultimately, Google’s goal is to make the web experience a little more passive, a little more like home delivery than going out searching.
- It would be easier if we (users) could understand the filters and turn them on or off, rather than having them imposed on us.
- It would be good if people knew what transaction they were making with Google: “You take this data, okay!” That way, folks could decide which data they want Google to use, when it’s on, and when it’s off.
- Originally, Google was going to add some sort of visual indicator to identify when results are personalized. Most likely, that won’t happen any time soon. Not only can it feel creepier to know exactly what has been adjusted for you versus what hasn’t; apparently Google’s engineers felt that not everyone understands the concept, and the confusion could make things more difficult.
- Google thinks about user data from a more ethical standpoint than some other companies. (Well, that’s comforting.)
- If you’re able to target by inferred data, you should be able to see what data is being inferred about you. I.e., Google should say, “Here is what we literally know about you, and here’s what we can target based on inferences.”
- Most people don’t think about the fact that we’re walking around the Internet with a big price tag on our shirts. We are.
- Still, Google doesn’t really know who you are. It knows who it thinks you are based on content you consume, among other things.
- Google: “We’re not getting complaints about our personalization!” Universe: “Yeah, but they don’t know it exists.” How can people make complaints about something they don’t know exists?
- Eli believes Google could do a lot more to explain its philosophy about personalization without making it super easy to boost your results.
- The balance comes when people are able to use these tools the way that they want to use them, to decide for themselves.
“I just want info junk food right now, kthnx. Tomorrow, I will want information that provokes me. I don’t want to be force-fed.”
And, truly, with that, Eli dashed off stage to catch a plane. The keynote wrapped up and attendees were let loose in the convention center to hit up their next session. Stay tuned to the AIMCLEAR blog for more coverage comin’ atcha from #SMX East 2011 🙂