So how does what transpired in March regarding Facebook and Cambridge Analytica (CA) change our notions of privacy and who we trust with it? In conversations with friends and family, many of whom still struggle to grasp what it is I do for a living, the prevailing sentiment is frustration: a sense that only my peers and professional network would understand even half of what has gone on, what the available Terms & Conditions actually mean, and the risk they took on by clicking on surveys and quizzes through social media applications. This, in spite of my constant negative refrain – my refusal to take part in quizzes and questionnaires, to forward memes, to change my profile picture to the latest visual border, and so on – pointing out the error of their ways ever since they joined the platform ten years ago. At that time, most Generation Xers and Boomers II were only starting to engage online, in a space where Millennials were far more advanced in their activities. But not so advanced, it seems, in their understanding of the implications of their behaviour.
Many bloggers and writers have addressed the Facebook fallout from various angles over the last few weeks. Brian Krebs produced a very insightful blog post that has helped increase understanding of why personal data should not be shared by responding thoughtlessly to quizzes and questionnaires online. It seems that social media users need to be a lot more cautious and suspicious – which is counter-intuitive to the marketing spin about how positive these platforms are for connecting people and making the world a friendlier place. Mark Zuckerberg, Facebook CEO, in his April 2018 Senate testimony, referenced Facebook’s role as “a force for good” more than once. However, with the volume of hate speech, criticism, insults and shaming experienced by many, particularly the young and vulnerable, that happy landscape must seem a million miles away from their lived experience.
Challenging Perceptions
One man’s comedy is another man’s tragedy. Much of life and how we respond to it is the art of perception.
I entered this industry over 20 years ago, with a background and qualifications in Philosophy, Psychoanalysis and Hypnotherapy, and was considered an outlier at the time. When I first began attending information security and assurance conferences, it seemed to be news that the human factor was important and that information security awareness content needed to be directed at users, employees and the wider citizen population. Spin forward twenty years and we face similar challenges regarding levels of awareness and understanding of the value of personally identifiable information (PII) and data privacy. We are the product, and the data about our data is what is being traded – not always for good, though not explicitly intended for evil. The whole subject is ripe for deep philosophical thought, as we appear to be at a societal turning point. Social media has had a profound impact on our humanity and culture. This was specifically addressed in a very interesting report from the European Data Protection Supervisor, released in March 2018, reviewing “online manipulation and personal data”. The opening line of its Executive Summary is extremely pertinent: “The digitisation of society and the economy is having a mixed impact on civic engagement in decision-making and on the barriers to public involvement in democratic processes”.
The EU General Data Protection Regulation (GDPR) is just over a month away from coming into force and, watching the live stream of the Senate hearing (10th April 2018) at which Zuckerberg was given quite a grilling, I noted the suggestion that perhaps the US needs to adopt something similar in legislative terms, rather than continue to rely on self-regulation by the various social media platforms and large data-gathering technology firms. For all the complexity, and the years of data protection legislation experience that already exist, it is possible to condense the new regulation into digestible chunks.
However, to imagine that the big tech companies have “got it right” – that they are adequately securing our data and managing it responsibly – would be, frankly, foolhardy. They are all driven by profit and the requirement to meet the needs of their shareholders. In so doing, costs have been cut continually over the last decade, with increasingly risky end results. There is widespread outsourcing and extensive use of multiple data processors and sub-processors. This brings with it a palpable lack of control, particularly given the EU GDPR’s expectation that organisations understand this complex woven web and follow the data throughout its lifecycle: usage, storage, sharing and transfer. The new requirement is to maintain a Record of Processing Activities (ROPA), in which an extensive set of data about the data held and processed must be gathered and maintained. Zuckerberg referenced the thousands of apps that needed to be reviewed – and all the deep data levels that needed to be considered across his global enterprise. Imagine the effort Facebook will have to go through to achieve this!
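To make that record-keeping burden concrete, here is a minimal sketch in Python of what a single ROPA entry might capture. The field names loosely follow the items GDPR Article 30(1) asks controllers to record; the class itself and the example values are purely illustrative, not a standard schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RopaEntry:
    """One illustrative entry in a Record of Processing Activities."""
    controller_contact: str               # name and contact details of the controller
    processing_purpose: str               # why the data is processed
    data_subject_categories: List[str]    # e.g. "customers", "employees"
    personal_data_categories: List[str]   # e.g. "contact details", "location"
    recipients: List[str] = field(default_factory=list)  # incl. processors and sub-processors
    third_country_transfers: List[str] = field(default_factory=list)
    retention_period: str = "unspecified"
    security_measures: List[str] = field(default_factory=list)

# A single record for a hypothetical quiz application:
entry = RopaEntry(
    controller_contact="DPO, Example Social Ltd, dpo@example.com",
    processing_purpose="Personality quiz scoring and ad targeting",
    data_subject_categories=["platform members"],
    personal_data_categories=["profile data", "quiz responses", "friend lists"],
    recipients=["analytics sub-processor"],
    third_country_transfers=["US"],
    retention_period="24 months",
    security_measures=["encryption at rest", "access logging"],
)
print(entry.processing_purpose)
```

Multiply one such entry by thousands of apps and every downstream sub-processor, and the scale of the review Zuckerberg described becomes apparent.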
The IT media rhetoric of the last two years regarding the impending implementation of the EU GDPR has been that it presents a significant opportunity to review data processing and re-engage with customers on a more positive, transparent and trustworthy footing. The current evidence suggests little real change is taking place.
Obviously, Facebook is not alone, by any stretch of the imagination. It is one amongst a number of companies that could be classed as operating within the industry of surveillance capitalism. Whilst many of us are clear that we, as the product of that endeavour, need to be well informed and cautious, that clarity is not evident across all users (or “members”, as Zuckerberg refers to us).
Trust but verify
So, who should we trust, and why?
Measuring perceptions of trust is not a new activity; research polls are carried out regularly. Time was when lack of trust was more evident in public bodies than in private companies or organisations – but the tables have turned, presumably because of the ever-increasing volume of data breaches splashed across the multiple media channels available to all citizens.
Recently, the UK Information Commissioner’s Office (ICO) carried out research on public perceptions of trust, which showed a significant trust deficit: UK citizens do not trust organisations with their data. Only one-fifth of the UK public have trust and confidence in companies and organisations storing their personal information – and this survey was conducted prior to the CA/Facebook revelations.
The lack of trust is matched by a lack of understanding of how personal data is used (only one in ten claiming a good understanding). Philosophically, the survey respondents are likely to have contradicted their own behaviour: they have probably shared information widely on social media platforms, yet willingly criticise those same platforms. Physician, heal thyself.
Older adults are more likely to have little trust and confidence than their younger counterparts – advancing years at least bring both wisdom and increased cynicism! Millennials and younger generations believe they have a monopoly on understanding technology. That may be so, but there is evidently a gap in their understanding of the motives of organisations, the realities of politics, and the necessity of meeting shareholders’ demands (and how that might be achieved).
Over time, it has become clear that social media organisations have not been following the rules under which more traditional media companies are regulated – and it tells you a lot about the trustworthiness of the industry that they have believed themselves beyond, or operating outside of, such strictures. Whilst originally developed as a social network platform, Facebook mutated over time and through exponential growth; in reality it is now largely a social advertising platform – and there are blips in its DNA. Researchers have found applications that could be leaking personal data, threatening privacy. Saying so creates unease, but again the real issue is the need to understand the risks of aggregating data across multiple platforms, location data and our online activities.
Just because you can, doesn’t mean you should. This should be the mantra of tech companies – and indeed of us all. The scale of data gathering has been understood to be beyond usual management techniques since the 1990s, when the term “big data” entered our IT lexicon. As a result of last month’s shenanigans, Facebook had to put on hold plans to “research” the correlation of anonymised patient data with Facebook user profiles to determine whether medical care was required. Given the volumes of available data, we can expect to hear of more such intended projects. However, handling large lakes of our personal data has not always been done with respect, nor within the legislative boundaries that exist. This is the reality of our interconnected electronic existence – we all choose which laws we will doggedly comply with and which ones we think we can stretch a little, or “wing”!
Given the extent of the use of technology in our daily lives, one place where the increased understanding required could be best embedded is in the current UK schools curriculum citizenship programmes. Again, this is not a new requirement – and yet the education system has still not adapted sufficiently to meet the needs of today’s electronically connected citizens.
“May our philosophies keep pace with our technologies. May our compassion keep pace with our powers. And may love, not fear, be the engine of change.” – Edmond Kirsch, a character in Dan Brown’s Origin, p. 413, ISBN 978-0-5930-7875-4
About the Author
Dr Andrea C Simmons, FBCS CITP, CISM, CISSP, M.Inst.ISP, MA, ISSA Senior Member, ISO27001 Lead Auditor is owner and director of www.i3grc.co.uk. Andrea has more than two decades of direct information security, assurance and governance experience, helping organisations establish appropriate controls and achieve and maintain security certifications, ensuring that information protection is adequate for their crown jewels. Her work has included the development of a trademarked and patentable enterprise governance, risk & compliance (eGRC) approach to addressing business information governance needs. Alongside spending the last eight years researching Information Assurance, Andrea has published two security management books. She can be reached at andrea.simmons@bcs.org.