'The promise - and peril - of personalisation'

One day, Eli Pariser, an online organiser, logged onto Facebook to find out what people with less liberal opinions than his own were talking about. He couldn't find them. Based on his past search and click behaviour, Facebook had simply edited them out.

Since then, Pariser has gone on to write The Filter Bubble: What the Internet Is Hiding from You. Speaking recently at the RSA in London, he set out his concerns about the filters and algorithms that shape the way the internet is presented to us. The internet, it seems, is not as 'connective' as he once thought it could be.

Companies recognised that there was money to be made in helping people sort through enormous torrents of data. This led to a focus on 'relevance', as manifested in, for example, Amazon's 'if you liked this, you might like that' recommendations. But these filter algorithms do more than that. They can make inferences from seemingly unrelated data, and they are responsible for creating a 'web of one' in which results are no longer 'universal' but are based on our own search history. This 'filter bubble' feeds our human confirmation bias by presenting the world to us as we already see it.

The problem is that inside our personal bubbles, we don't know what we are missing. It is relatively easy to discern the editorial or political slant of a newspaper, but not the unseen filters of social media. And this matters when social media is driving approximately 50% of the traffic to online news sites. It is easy for challenging stories to be lost from view amongst the stream of 'likes'.

We need to move on from narrow relevance and be challenged in our world view. Achieving this is not easy, but the first stage is to be aware, and to make others aware, that this filter bubble exists.
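To make the 'web of one' idea concrete, here is a deliberately naive sketch of relevance filtering: stories that match a user's past clicks rise to the top, and everything else sinks out of view. This is an illustration only, not Facebook's or Amazon's actual method, and all names and data in it are hypothetical.

```python
from collections import Counter

def personalised_ranking(stories, click_history):
    """Rank stories by overlap with topics the user has clicked before.

    A toy model of a 'filter bubble': past behaviour determines
    what is shown, so unfamiliar viewpoints drift to the bottom.
    """
    # Count how often each topic appears in the user's click history.
    interests = Counter(
        topic for story in click_history for topic in story["topics"]
    )

    def score(story):
        # A story's relevance is the sum of interest counts for its topics.
        return sum(interests[t] for t in story["topics"])

    return sorted(stories, key=score, reverse=True)

# Hypothetical user who has only ever clicked on technology stories.
history = [{"topics": ["tech"]}, {"topics": ["tech", "gadgets"]}]
stories = [
    {"title": "Election coverage", "topics": ["politics"]},
    {"title": "New phone released", "topics": ["tech"]},
]
ranked = personalised_ranking(stories, history)
# The politics story drops below the tech story in the feed.
```

Even this trivial scorer never surfaces a story whose topics the user has not already clicked on, which is the confirmation-bias feedback loop Pariser describes.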