
Are dissenting voices being filtered out of your Facebook feed and Google searches?

You may have noticed the same people’s posts always showing up in your Facebook News Feed lately, but you may not have realized that even your Google searches are tailored to what a not-so-thoughtful algorithm thinks you want to see. One of the most disturbing and pervasive trends happening on the interwebs is described brilliantly by Eli Pariser (board president of MoveOn.org) in this 9-minute TED Talk.

As Pariser notes in his talk, there has been a gradual shift in the way information is presented online, and it’s an invisible shift that we aren’t allowed to control. We aren’t even told it’s happening. Facebook lets us hide people from our News Feed, but there is no option to “Show All.” Personally, I’ve been irritated that people who have already “liked” NCDD’s Facebook page may no longer see the page’s posts on their wall unless they have recently thought to visit the page, or have found the hidden “Show in News Feed” option that appears when they hover over the “Liked” button.

The same thing is happening with all of your friends’ posts: if you haven’t interacted with them recently, the Facebook robots decide you’re no longer interested and stop showing you their new posts.

Google’s search results are similarly tailored to what the internet thinks you are most interested in. Two people running the very same Google search will get very different results. YouTube is doing the same thing. Yahoo News is doing it. And the list goes on. Sometimes we get to choose the kinds of posts we see (as with my husband Andy’s favorite app, Flipboard), but often it happens without our consent or conscious choice.
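For the technically curious, here is one way to picture the mechanism. The sketch below (in Python, with invented names and made-up data; it is emphatically not Facebook’s or Google’s actual code) shows how even a very simple engagement-based ranking rule produces exactly the narrowing Pariser describes:

    from collections import Counter

    # Hypothetical interaction log: whose posts you've clicked or "liked" lately.
    interactions = ["alice", "alice", "alice", "bob"]

    # A toy feed of candidate posts.
    posts = [
        {"author": "alice", "text": "More of what you already like"},
        {"author": "bob",   "text": "Something you half-follow"},
        {"author": "dave",  "text": "A dissenting view you never clicked on"},
        {"author": "erin",  "text": "News from outside your circle"},
    ]

    def rank_feed(posts, interactions, feed_size=2):
        """Score each post by how often you've engaged with its author,
        then show only the top scorers. Authors you ignore simply vanish."""
        engagement = Counter(interactions)
        ranked = sorted(posts, key=lambda p: engagement[p["author"]], reverse=True)
        return ranked[:feed_size]

    for post in rank_feed(posts, interactions):
        print(post["author"], "-", post["text"])
    # Prints only alice and bob. Dave and erin get zero visibility because they
    # got zero past engagement, and zero visibility guarantees zero future
    # engagement: the filter bubble closing on itself.

The feedback loop is the point: a filter like this only ever reinforces what it has already seen you do, so the feed narrows on its own, without anyone deciding to exclude anything.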

What does that mean for us in the dialogue and deliberation community? Well, it means that people are less likely to be exposed to viewpoints they don’t already agree with. It means people who don’t tend to be curious about certain things (particular social issues, other cultures, foreign countries, world news, and so on) will be less likely than even a year or two ago to stumble by chance onto things they haven’t already sought out. And it means people may have a distorted view of how much the world agrees with them (in an extreme example, imagine one person searching for “fracking” and finding only posts about environmental risks, while another person does the same search and finds only Exxon Mobil ads).

Watch the video, and please share what you think. How do you think people should respond to this mind-closing trend that has silently gained traction on the most potentially mind-opening tool humans have ever developed: the internet?

Sandy Heierbacher
Sandy Heierbacher co-founded the National Coalition for Dialogue & Deliberation (NCDD) with Andy Fluke in 2002, alongside the 60 volunteers and 50 organizations who worked together to plan NCDD’s first national conference. She served as NCDD’s Executive Director from 2002 to 2018. Click here for a list of articles and resources authored by Sandy.


Join In!

We always encourage a lively exchange of ideas, whether online or off. Questions? Please feel free to contact us directly.

  1. Here’s the way another group has put it:
    As web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: We get trapped in a “filter bubble” and don’t get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy. Read our community Q&A with Eli (featuring 10 ways to turn off the filter bubble): http://on.ted.com/PariserQA
