Digital risks to privacy

The Auction of Oneself and its risks : Facebook’s ugly face (3/4)


Why is there so much noise about the Cambridge Analytica crisis? What is really new about people not trusting Facebook? I could argue that the use of demographic data for marketing (or political) purposes has been common practice since the days of Gallup and his predictions based on opinion polls dating back to 1940. Since then, much has evolved, and the large-scale use of data in politics and marketing is a natural consequence of this evolution. Or is it not?

Marketing is a persuasive science (or art), and it has always been about matching. Think of marketing as a dating service: matching people, with their needs and desires, to products and services. Or, in the political arena, matching voters with politicians. This is very clear from Kotler’s definition:

Marketing is the science and art of exploring, creating, and delivering value to satisfy the needs of a target market at a profit.  Marketing identifies unfulfilled needs and desires. It defines, measures and quantifies the size of the identified market and the profit potential. It pinpoints which segments the company is capable of serving best and it designs and promotes the appropriate products and services.

The use of psychological profiling and data mining in this infamous case is not very different from what database marketers promised 20 years ago, with the difference that now it is possible to deliver on these promises at a level of granularity close to 1-to-1, and in real time. Because now there is the technology, there is an abundance of data, and companies such as Facebook or Google have learned how to monetise people’s online footprints.

The economy of exhibitionism and its costs

As I mentioned in my previous post on this topic, the boundaries have blurred and we find it difficult to know where our private world ends and the public domain starts. And this happened because we volunteered information in a process that Piñuel Raigada, my PhD thesis director, called “the auction of oneself”. When users start to trade their holiday photos, their children’s photos, or their selfies for “likes”, it is clear that an economy of exhibitionism has begun, under the suggestive name of the “shared economy”. Yes, we share for the simple pleasure of sharing and enjoy the exposure of ourselves for networking purposes, but the shared economy is not free at all, as has always been suggested. In fact, the intermediaries of the shared economy have been spectacularly skilled at profiting from these footprints, and privacy pays a price for that. To a certain point, there is nothing wrong with this, and people seem not to care too much; they are definitely more relaxed about privacy – perhaps because they don’t know the risks.

So, what is the real price of the shared economy? What are the risks?

The scandal involving Facebook and Cambridge Analytica sheds new light on the networked society, and the question now has, in my opinion, less to do with privacy and more with the breach of trust and confidence in these companies in the trade of one self(ie) for one like. Other important risks are the possibilities of manipulation of personal data in a world of fake news and fake promises – a post-truth world.

It is clear that the digital giants are the new mining companies, but instead of silver and gold, these companies are data mining the lives of millions of people and companies in search of patterns that will make the marketing process even more efficient. The ideal of marketing, at last, will be fulfilled, thanks to the consumer – the king of this customer-centric, data-centric world – who will enjoy the “personalised services and products” they long for. This can be really good, and, as a marketer, I don’t see a problem here, if these companies act with responsibility – which is not what we saw in this case, hence Mark Zuckerberg’s apology.

But if they do start getting serious about this responsibility, consumers and society will probably face important collateral effects. As the reader Paulo Seth brilliantly commented on my previous post: “Honestly, if you don’t care about your own online safety you deserve all the misery you can get.”


The Myth of Transparency: Facebook’s ugly face (2/4)


A crisis about transparency (or the lack of it): that is how we could summarise the Facebook reputation nightmare. Or, as the Times magazine puts it brilliantly: “All this has prompted sharp criticism of the company, which meticulously tracks its users but failed to keep track of where information about the lives and thinking of those people went.” In this apparent paradox lies the first point I would like to highlight in this 4-part analysis: the Myth of Transparency.

If you have read books such as Jeff Jarvis’ Public Parts (2011), you know how social media successfully created a hype about the virtues of living life in the public sphere, in a continuous Self Big Brother. Although back then Jarvis agreed with some sort of protection of people’s privacy, such as the measures proposed by then European commissioner Viviane Reding, he was defending a libertarian, perhaps utopian, view of transparency that disregarded a basic impulse behind the “publification” of our lives by tech companies: data has economic value, and social media thrives on marketing data.

What this crisis has brought to the surface and to the attention of regulators is the culmination of a series of privacy issues and breaches involving the Internet and, most famously, Facebook. It is perhaps the beginning of the “end of the innocence”: the realisation by users that transparency is good when it works both ways – on the part of the producer of the data (i.e. us) and on the part of the marketer of the data (social media companies). The market has become more mature, and people are starting to realise that there has never been a truly “free service” from Google or Facebook. As Viviane Reding puts it: we pay for the service with our data.

To be fair, these companies never said that they didn’t use people’s data for different purposes, including making tons of money. However, what people are noticing now is how obscure and careless firms have been in the management of this data – and how vulnerable they are when their minds can be read by data mining companies such as Cambridge Analytica, as in the controversial and, at the same time, brilliant experiment conducted by Kosinski et al. (2015).

People are also realising how social media creates a subtle form of surveillance by letting unknown organisations access their view of the world, their relationships and their tastes. As serious decisions are affected – votes in a general election, for example, or referendums – public opinion is starting to realise the risks of manipulation in this cycle of data transparency, data mining and campaign management.
