Dirty tricks, skullduggery & data portability


This thought piece explores how business executives ought to be debating control over user data: the debate is less about where data is collected and stored, and more about how, and by whom, individual data is used and monetised.


--

Given that platform companies such as Apple, Facebook, Twitter, Google, Baidu, Amazon, Alibaba, Tencent and Xiaomi complicate, confuse and obfuscate what they are actually doing with our personal data, how can leaders position their businesses to become truly customer centric and put the customer first?


For context, economics defines utility companies (gas, electricity, water, telecoms) as having only one true differentiator: price. Given that one unit of electricity is identical wherever you buy it, the market players create bundles and offers to hide the actual price and to make comparisons between the same utility very difficult, if not impossible. But what happens when there is no “price”, as with Facebook or Google? It becomes all about your data. This non-price is translated into UI/UX, privacy, trust and convenience.

Whilst I am not suggesting that data is a utility (data is data), the focus here is on how market players with access to your data apply the same tricks, skullduggery and confusion to hide the truth about what they are doing with it.


They can maintain this smokescreen because there are two views a user can have of how such companies operate. Let’s use Facebook as the market example, but this applies equally to any platform where the provision of a “free service” is used as the financial justification for exploiting users’ personal data.


Figure 1 presents the whole picture, but let’s take our user, Lubony, and look to the left.




Looking to the left in Figure 1, our user Lubony sees Facebook as a service: this is how, as a user, you log into Facebook from the web or mobile. The user can go to https://www.facebook.com/settings?tab=privacy and control some of their preferences on sharing and privacy. There is nothing on the use of their personal data, as that is hidden away somewhere else. The preferences are not obvious to find, and the description of what each tick box does is unclear; what the default settings mean, or how your friends have chosen to set up their own preferences, is all obscured. For instance, have a go at finding where you are “active from” for login devices, or which companies you log in to using your Facebook account. The user has a perception of control, but in reality, to get the most from Facebook as a service you follow the suggested set-up, and this gives Facebook control over your content and data. This is the “morals of opt-in and the ethics of opt-out”.

The following preference is especially important to understand. Our user Lubony spends an hour finding the right setting and opts to restrict sharing to friends only. The user now perceives that their data will only be shared with their immediate friends. This is true, unless one of those friends decides to share it onward, or to cut, paste and repost it; that content then falls under the settings of whoever reshared it, because the rights you set do not copy over. Furthermore, there is nowhere in the preferences, at least not that we have found, where you are given the choice over whether your posted data, searches and all other behavioural data are shared with third-party providers.
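To make the resharing point concrete, here is a minimal sketch in Python of a hypothetical, simplified data model (not Facebook’s actual implementation, and all names are illustrative) showing why an audience restriction set by the original author does not follow the content when a friend reshares it:

# Hypothetical sketch: audience settings belong to a post, not to the content,
# so a reshared copy takes the resharer's settings rather than the author's.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    content: str
    audience: str  # e.g. "friends_only" or "public", set by whoever posts it

def reshare(original: Post, resharer: str, resharer_audience: str) -> Post:
    # The reshared copy inherits the resharer's audience setting,
    # not the original author's restriction.
    return Post(author=resharer, content=original.content, audience=resharer_audience)

lubony_post = Post("Lubony", "holiday photos", audience="friends_only")
reshared = reshare(lubony_post, "A friend", resharer_audience="public")
print(reshared.audience)  # "public" - Lubony's "friends only" choice is lost

In this simplified model, nothing enforces the original author’s choice once the content is copied, which is exactly the gap the preferences screen never explains.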

What we can learn from looking left is that the platforms do provide user preferences, but make them vague and difficult to find. Not to mention that the default settings are biased towards the platforms themselves (the service provider) - quelle surprise. Further, users tend not to realise that their belief in preferences as control is unfounded: the preferences provide hardly any control at all; and why would they?


Now it is time to turn right and look in the other direction, in favour of the provider.

When our user Lubony signed up to Facebook (our example) and had to opt for male or female (which should already be sending out a warning), it was a SIGNUP and an acceptance of Facebook’s Terms and Conditions - https://www.facebook.com/terms.php. It was not consent; it is this way or no way. By submitting the sign-up form you are agreeing to a contract and are bound by its terms. This contract, which you have no option but to accept, spans over 14,000 words and is hardly succinct, so if you’re one of the platform’s 1.4 billion daily active users you may want to think about whether you properly understood what you were signing up to; which is why we are writing this. Nevertheless, by accepting it is assumed that you have given informed and explicit consent. Facebook’s new data policy can be found here and the terms can be found here - these will probably be different from the ones you agreed to, as one term of note is that you automatically accept any new terms they give you.


The key point here is that you have accepted their terms and conditions. These terms set out that Facebook is entitled to do a great deal with your data; think of it as embracing anything that is legal, or at least not defined as illegal. A rather over-simplistic, and wrong, interpretation of these terms is that Facebook sells your data to bring in revenue to support the service. This is the Daily Mail interpretation. Facebook is touchy about this point and is clear that it does not “sell” any of your information to anyone, and never has or will. Reassuringly, it is all in the interpretation. Whilst Facebook does not sell on information such as your name and address, its goal is to compile a digital version of you, with everything from your shopping habits to your political preferences, and to aggregate that data in a way that allows advertisers to target you. What advertisers do is pay Facebook to use the data it has collated about you. So Facebook has not sold your data; it has sold access to the digital you. A fine distinction, but one that generates repeat revenue.
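To illustrate the distinction, here is a minimal, hypothetical sketch in Python (not Facebook’s actual systems or advertising API; the profiles and function names are invented) of how a platform can sell access to the “digital you” without ever handing the underlying data to the advertiser:

# Hypothetical sketch: the advertiser supplies a targeting specification and pays
# for reach; the platform matches it against profiles it holds internally and
# never releases the raw profile data.
profiles = [
    {"user": "Lubony", "interests": {"hiking", "politics"}, "age": 34},
    {"user": "Amir",   "interests": {"hiking", "cooking"},  "age": 29},
]

def sell_audience_access(targeting: dict) -> int:
    """Return only the size of the matched audience; ads would be shown to the
    matched users inside the platform, and the advertiser learns how many
    people were reached, not who they are."""
    matched = [
        p for p in profiles
        if targeting["interest"] in p["interests"]
        and targeting["min_age"] <= p["age"] <= targeting["max_age"]
    ]
    return len(matched)

print(sell_audience_access({"interest": "hiking", "min_age": 30, "max_age": 40}))  # 1

The data never changes hands; what is sold is the ability to reach the people it describes, which is why the “we don’t sell your data” statement can be literally true and still miss the point.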


As a user, I can accept that I am the product for a free service and that my data can be used to provide that free service, but what we should be entitled to know is who Facebook contracts with and what the terms of those contracts are, so as to provide some balance and transparency. We should not ask for pricing, as this is commercially sensitive, but it would be a better, more honest and more open system if both parties on either side of the barter with these platforms could “see” an equal amount about each other, if so required.


However, looking left, Facebook tells you that you have control and preferences. Looking right, there is the contract and the terms, under which you have no real consent over how your data is used.

The opening of this piece suggested that, as business executives, we ought to be thinking about permissions and consent. A key message is that taking back control over one’s data matters because it will force the market to become more transparent, and business leaders need to be ready for that change. Data portability will create a massive new opportunity for you, if you have a strategy for data.