
The news that Facebook has patented an algorithm that determines its users’ social class should dispel any notion that Facebook is a neutral space to hang out with friends – or that its main purpose, according to founder Mark Zuckerberg’s revised mission statement, is “to bring the world closer”. Bringing advertisers closer to your wallet, definitely.

Facebook’s ‘socioeconomic group classification based on user features’ patent describes a mathematical model that classifies users into a social class. However, Facebook goes way beyond the traditional class indicator, income. It uses all sorts of data to determine a person’s social category – information we knowingly, or incidentally, provide via our digital ‘engagements’. What sort of information? Things like home ownership, level of education, how many devices you own and how you use the internet, plus other data “aggregated from market research questionnaires”. Like what? Which restaurants you favour, which brand of shoe you buy, your preferred dating site?
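To make the mechanics concrete, here is a minimal sketch of how a feature-based classifier of this kind could work. It is not Facebook’s actual model – the features, weights and thresholds below are invented purely for illustration, based on the kinds of signals the patent mentions.

```python
# Hypothetical sketch of feature-based socioeconomic scoring.
# Not Facebook's model: the features, weights and thresholds are invented
# to illustrate how non-income signals could map onto a class label.

def classify_socioeconomic_group(user: dict) -> str:
    """Combine non-income signals into a coarse class label."""
    score = 0
    score += 2 if user.get("owns_home") else 0
    score += {"none": 0, "secondary": 1, "tertiary": 2}.get(
        user.get("education", "none"), 0)
    score += min(user.get("device_count", 0), 4)            # cap the device signal
    score += 1 if user.get("international_travel") else 0   # e.g. from survey data

    if score >= 7:
        return "upper"
    if score >= 4:
        return "middle"
    return "working"


# A renter with two devices and secondary education lands in 'working'.
print(classify_socioeconomic_group(
    {"owns_home": False, "education": "secondary", "device_count": 2}))
```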

It’s not surprising Facebook would engage in this type of classification. Facebook is in the business of classifying and characterising its users because that’s what it sells to its advertisers.

If Facebook does roll out this rather crude approach to social class, it will have material consequences for the kind of information that is put in front of different people. So if you live in a less salubrious town, own a mere two devices and dine at what Facebook deems average restaurants, you will be offered information deemed appropriate to your ‘lower’ social class – perhaps shown fewer, or less well paid, jobs. We can’t know precisely, because the ‘calculation’ will be done by an algorithm that determines whether you are the type of user most likely to click on an ad for a predatory loan, or whether it is more profitable to show you only ‘working class’ dating sites.

Facebook is not doing anything wrong in a business sense – in fact it is doing precisely what it is set up to do: the algorithm has to show ads to the people who are most likely to click on them. Clicks are money for Facebook.
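As a simplified illustration of that logic (a generic sketch of click-through economics, not Facebook’s actual ad auction), ranking ads by expected revenue – predicted click probability times the advertiser’s bid – is enough to make a low-value predatory ad win whenever a user segment is judged likely to click on it.

```python
# Generic sketch of expected-revenue ad ranking; not Facebook's auction.
# An ad wins when predicted click probability x bid is highest for this user.

def rank_ads(candidate_ads, predicted_ctr):
    """Sort (ad_id, bid_per_click) pairs by expected revenue for one user.

    predicted_ctr maps ad_id -> estimated click probability, which in the
    scenario above would be conditioned on the user's inferred social class.
    """
    return sorted(
        candidate_ads,
        key=lambda ad: predicted_ctr.get(ad[0], 0.0) * ad[1],
        reverse=True,
    )


ads = [("premium_credit_card", 4.00), ("payday_loan", 1.50)]
ctr_for_user = {"premium_credit_card": 0.002, "payday_loan": 0.03}
# Expected revenue: payday_loan 0.045 vs premium_credit_card 0.008,
# so the lower-bidding predatory ad is shown first.
print(rank_ads(ads, ctr_for_user))
```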

Over the long term this ‘targeting’ (aka discrimination) will not only reinforce people’s social situation, it will create the very reality that it purports to measure.

This social class patent story has emerged at a time when there is a plethora of concerning stories about Facebook and its opaque business operations, which have resulted in practices Facebook did not anticipate or intend. We have distilled these into four big conversations around the issues facing not just Facebook but social media platforms in general:

  • Exploitation of people within specific social networks, facilitating discriminatory advertising
  • Usage addiction: platforms are built to be ‘sticky’ to keep users engaged
  • Social isolation and fragmentation: the development of the ‘echo chamber’
  • Breaches of privacy caused by algorithmic exploitation of personal connections across social media, where the source of the breach is unknowable

To listen to Sandra and Kai’s discussion, tune in to The Future This Week podcast.

While Facebook did not anticipate the socially undesirable uses made of its platform – think fake news, Russian meddling in US elections – its hands-off approach to content is built into its business model because that is what maximises its profits. Facebook describes itself as merely a platform for distributing content, but it now has to come to grips with the fallout from wanting to be a journalistic service without actually hiring any journalists.

The above issues have emerged because of the very ‘success’ of this model – namely, its algorithms can optimise segmentation and dissemination at scale. And any attempts by Facebook to intervene in this data flow – such as hiring actual people to weed out the most obvious problems with content – can only ever amount to tinkering at the edges.

The tension between the desires of the two groups served by Facebook – users and advertisers – is built into Facebook’s business model. Last week we discussed how this disconnect undermines Facebook’s credibility with users – a serious business problem as the company seeks to expand its services.

Not all social networks work this way: WeChat, owned by the Chinese multinational Tencent, does not rely heavily on advertising. Its model is not solely dependent on selling users to companies and maximising click-through rates.

Facebook built its reputation on a narrative of bringing people together, but it is risking both its good name and perhaps its very existence, because stories about Facebook are increasingly about how it is a large-scale discrimination machine that sells users to the highest bidders. After all, social class is not a way of connecting people but of pulling them apart.

To hear Sandra and Kai discussing this and other stories, tune in to The Future This Week podcast.

Dr Sandra Peter is the Director of Sydney Executive Plus and Associate Professor at the University of Sydney Business School. Her research and practice focuses on engaging with the future in productive ways, and the impact of emerging technologies on business and society.

Kai Riemer is Professor of Information Technology and Organisation, and Director of Sydney Executive Plus at the University of Sydney Business School. Kai's research interests are in disruptive technologies, enterprise social media, virtual work, collaborative technologies and the philosophy of technology.
