Summary | “Facebook and Your Data” Community Conversation

On Saturday 21st April, the University of Edinburgh, in partnership with People Know How, ran an event on the topic of Facebook data. This event was part of the University’s ‘Community Conversations’ series, which had previously included events on Brexit. This post is a summary of the event by Sarah Anderson, one of the organisers.


Overview

“Facebook and Your Data: How Worried Should You Be?” was run in partnership with People Know How, a charitable organisation based in north-east Edinburgh that is committed to bringing about positive social change through the ideas of those affected by challenges. The purpose of the event was:

  • To respond rapidly to a matter of public concern and interest
  • To share high-quality information and insights from academic research – especially when this cuts through confusion created by politics or the media
  • To create opportunities for members of the public to ask the questions they need or want answered
  • For researchers to hear the perspectives of non-specialists on their research

Experts in IT law at the University of Edinburgh, Judith Rauhofer and Burkhard Schafer, and health data expert Mhairi Aitken were the speakers invited to give their perspectives on the recent Facebook controversy.


Summary of the discussion

Our three speakers had differing perspectives and our audience, while small, was also diverse in terms of its interests and concerns. This made for an interesting discussion!

[Photo: Mhairi Aitken, Judith Rauhofer and Burkhard Schafer]

Burkhard Schafer

  • It’s unlikely that someone would steal your personal data to cause you serious personal harm, such as framing you for a crime. However, shared personal data can harm you in subtler ways, such as being quoted higher home insurance prices.
  • There wasn’t a Facebook data ‘breach’ – it was business as usual. Nothing that has come out about Facebook in recent months has been a surprise to people in the IT industry.
  • Facebook doesn’t just see what we’re doing when we’re logged in – it also tracks us when we’re off it, through content embedded in other websites (see the sketch after this list). This is difficult for users, who no longer know which environment they’re in – are they operating within Facebook or not?
  • Because Facebook is a network, we’re not just taking risks for ourselves – we’re exposing our friends to risk too. Every time we grant an app access to our data, we also give it access to our contact list. This means we’re hostage to the most stupid of our friends on Facebook!
  • Not using Facebook can also be harmful: border control in some countries and high-risk lenders check social media profiles not only for bad behaviour, but also for whether you’re active enough (an active profile shows you are a ‘real’ person).
  • The Cambridge Analytica controversy was two worlds – academic research and commercial data use – colliding, each with its own safeguards. As a result, people became research subjects without their consent.
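To make the off-site tracking point above concrete, below is a minimal sketch of how a third-party ‘tracking pixel’ works. The domain tracker.example and the parameter names are hypothetical placeholders, not Facebook’s real endpoints, but embedded social plugins operate on the same principle: any page that loads a resource from the tracker’s domain makes your browser request it, and the browser attaches any cookie that domain set earlier (for example, while you were logged in to the social network itself).

    // Minimal, hypothetical sketch of a third-party tracking pixel
    // (TypeScript, runs in the browser). "tracker.example" and the
    // query parameters are placeholders, not real Facebook endpoints.
    function firePixel(): void {
      const pixel = new Image(1, 1); // an invisible 1x1 image

      // The request tells the tracker which page you are on and when.
      // The browser also attaches any cookie previously set for
      // tracker.example, letting the tracker join this page visit to
      // an identity it already knows.
      pixel.src =
        "https://tracker.example/pixel?" +
        new URLSearchParams({
          page: location.href,         // the page you are reading now
          referrer: document.referrer, // the page you came from
          ts: Date.now().toString(),   // timestamp of the visit
        }).toString();
    }

    firePixel();

(Modern browsers increasingly restrict third-party cookies, which is one reason trackers are moving towards the device fingerprinting discussed below.)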

Judith Rauhofer

  • People are still leaving their Facebook profiles wide open. Judith learned more about her colleagues from searching their Facebook profiles than from working with them for 2.5 years.
  • The more people that are on Facebook, the more useful it is and so the greater the appeal to join. This is one reason why people keep using it in spite of the risks. Even if you could understand Facebook’s privacy policy, it probably wouldn’t put you off!
  • Cookies give advertisers a view of all your online activity. They link together the data you give to different websites and build up a detailed picture of you – detailed and accurate enough that advertisers can even tell if you’re depressed or have just gone through a break-up! Advertisers can now also track you via device fingerprints (see the sketch after this list).
  • Not all companies are trying to monetise cookie data for ill. Some are using it for good (e.g. the Samaritans have tried using it for suicide prevention).
  • If you don’t pay for a product, you are the product. Facebook and Google get paid by selling our data. The other option is that we pay e.g. £5 a month for the service. Would you pay £5 a month for Facebook to respect your privacy? You’d be surprised how many people wouldn’t.
  • The issue is that it’s not just companies wanting to use our data to sell us fridges – it’s organisations like Cambridge Analytica.
  • Prioritisation algorithms mean that you now mostly see on Facebook the people who are most like you. This leads to political echo chambers around things like Brexit or the Scottish Independence Referendum. These algorithms can work out exactly why you might vote for a political party, and then target advertising at you on that basis.
  • It is probably impossible to escape Facebook: even if you’ve never been on it, friends may have tagged you in posts and photos. And once enough people start giving their social media data to insurance companies, it will become effectively compulsory.
  • Treat your Facebook security settings like personal hygiene: maintaining them is an ongoing process, as the settings keep changing.
  • We probably need to start censoring ourselves online. This is sad, as it’s against the spirit in which the internet was originally created.
  • The social media business model is now one of the big questions. If everyone were careful and privacy-conscious, the world would have big economic problems!
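Judith’s point about device fingerprints can be illustrated with a short, simplified sketch. The signal list here is illustrative only – real fingerprinting scripts combine many more signals, such as installed fonts and canvas rendering quirks – but the idea is that a handful of ordinary, individually innocuous browser properties, hashed together, are often stable and distinctive enough to recognise a device across websites without storing any cookie at all.

    // Simplified, illustrative sketch of browser fingerprinting in
    // TypeScript. It combines ordinary browser properties into one
    // quasi-stable identifier; real trackers use far more signals.
    async function deviceFingerprint(): Promise<string> {
      const signals = [
        navigator.userAgent,                   // browser and OS version
        navigator.language,                    // preferred language
        String(screen.width),                  // screen dimensions
        String(screen.height),
        String(screen.colorDepth),
        Intl.DateTimeFormat().resolvedOptions().timeZone, // e.g. "Europe/London"
        String(navigator.hardwareConcurrency), // number of CPU cores
      ].join("|");

      // Hash the combined string so the identifier is compact and opaque.
      const bytes = new TextEncoder().encode(signals);
      const digest = await crypto.subtle.digest("SHA-256", bytes);
      return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

    // No cookie is set: the same device tends to produce the same hash
    // on every site that runs this code, which is what makes it usable
    // for tracking even with cookies blocked or deleted.
    deviceFingerprint().then((id) => console.log(id));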

Mhairi Aitken

  • People are often very supportive of public data being used for health research, presumably because they assume it will be used ethically, under strong governance policies. However, the Cambridge Analytica scandal could have repercussions for academia – a sector generally assumed to have such strong ethical governance.
  • People have much lower expectations of how the private sector uses data: there is an awareness that it will be used for profit. However, while we might accept that Facebook would make money from us, we may not have anticipated that it could do us harm.
  • There is cause to be optimistic: data is transforming our lives, so we need to be having conversations about it – and the Cambridge Analytica controversy has prompted a big one.
  • Academia must not shy away from public conversations about data: if people become more sceptical of sharing their data, health research will suffer and so, ultimately, will we all. This dialogue must keep going.
  • Some academic health researchers do use private sector data, but there have to be very careful agreements in place around this. Most data used for health research comes from the public sector.

Questions from the audience (only the first couple are recorded here, as the event turned into an open discussion)

Q. Can Facebook etc. see your email messages?

A. They don’t literally read your messages, but if you have a Gmail account, Google will scan them for keywords. It will do the same for emails you receive from anyone, regardless of the sender’s email provider.

Q. Can Facebook etc. see your phone calls?

A. Facebook will have a record of all the phone calls you’ve made from your smartphone. And if you don’t have Facebook on your phone but your friend does and they phone you, Facebook still has your phone number!

Conclusions

  • Some degree of sharing of personal data on social media is inescapable – even if you’re not on it, your friends are and your data will be shared that way.
  • The sharing of personal data can have positive as well as negative outcomes, so a world where we don’t share it at all is probably not desirable.
  • We all need to be aware that social media companies are businesses, and decide what business model we’re prepared to support. Would we pay for privacy? Would we accept cheaper health insurance in return for our Nectar card information? Would we pay £5 a month for Facebook?
  • Now that this conversation has exploded, the dialogue needs to keep going, and universities bear some responsibility for this. If we don’t find acceptable ways to share, for example, personal health data, we could all end up worse off.