Creating the virtuous circle

An extract from “My Digital Footprint”, Chapter 6: “A two-sided business model”


Be under no illusion that creating a virtuous circle is either simple or easy. Mark Zuckerberg, the founder of Facebook, said when commenting on Beacon: “We’ve made a lot of mistakes building this feature, but we’ve made even more with how we’ve handled them. We simply did a bad job with this release, and I apologise for it.” This was in response to the 70,000 Facebook users who protested against Facebook’s new Ads and Beacon features in December 2007. Facebook Ads followed a well-trodden path: purchase goods, redeem PIN codes, get free extras online. Fun, and not a big fuss. Beacon, however, was different. Beacon looked at what you did and as such had deep roots in behavioural marketing based on targeting (open loop); however, it took your data and told your friends what you had done, seemingly without any due care or thought. If you looked at someone’s profile, it told your friends. Rather hard to hide the fact you were looking for ‘fitties’. Buy a book or CD? Well, your friends may want it as well. In so many ways this is MY DIGITAL FOOTPRINT, but perhaps in the rush to monetise the knowledge, no rules were put in place. Ideas are cheap; implementation is hard.

At this point there is a need to bring back some themes that we skipped past in the early part of the book. These are the bonds and bridges between privacy, risk and trust. Figure 33 shows how these bonds and bridges relate to the MY DIGITAL FOOTPRINT feedback model of collection, storage, analysis and value. The bonds and bridges of risk, privacy and trust are the connecting fabric that either ensures the user continues to enjoy the experience or causes the user to stop using the service. How do our experiences reinforce the benefits so that we participate more, or damage the benefits so that we head off into the dark side, causing lockdown and disengagement? Confidence, referral, recommendation, privacy, trust and risk are all key aspects of unlocking the virtuous circle.

Within this start phase of discovery there will no doubt be areas that push boundaries. Whilst we understand legal and illegal as boundaries, there are often instances that fall into a grey area of ‘illegal but acceptable’. Some of these are ‘socially acceptable’ [allowing your underage teenager to see a film with a higher classification, giving your bank PIN to a partner who jumps out of the car in the rain to get cash from an ATM, allowing your child to try alcohol at home]; others are ‘in the public interest’, as in journalism, where the boundary is crossed but no one is too upset if the outcome is good, positive or beneficial. As the user is both the provider and consumer of data, damage will be caused when a trusted provider (brand) crosses the line from protector to exploitation agent. A lot of work and debate is needed on unacceptable and inappropriate interpretation of data and metadata. As a director, governance will be difficult: balancing these intangible issues against wealth, competition and value is where the opportunities lie, and it is something I spend a lot of time worrying about.


Figure 33
Creating the virtuous circle


To help in understanding why trust, privacy and risk are bonds and bridges, I am going to use the concept of capital (trust capital, risk capital and privacy capital), though not in the strict economic sense. Trust capital is how much, based on previous experience, you are prepared to trust a service provider you have or have not previously used. Risk capital is how much, based on previous experience, you are prepared to risk (chance) in using a service provider you have or have not previously used. Privacy capital is how much, based on previous experience, you are prepared to open up your privacy or personal data to a service provider you have or have not previously used. Capital, in each case, can be built and destroyed. In the next section we will briefly look at how capital can be built and eroded.


The premise of privacy capital is that people are born with no privacy capital, and over one’s life this capital is built or eroded by experience. Good experiences of seeing your privacy protected, or, more generally, of government protecting privacy through action, law and regulation (the nation’s privacy standard), will build privacy capital. Your social group, your family and their experiences will also build privacy capital, which is good. These good experiences provide positive feedback, as shown in Figure 34. Positive experiences mean that you will have a higher propensity to engage, and more hope that by doing so you will get better services and experiences. This lowers your fear, uncertainty and doubt (FUD). The more you see your privacy protected, the more you may wish for someone to use your data, as you do not fear them abusing it. You become more willing to share patterns, preferences, routes, routines, shopping lists, click data, location and other inputs that will improve services. The converse is also true: the more you see someone abuse your privacy, or see your friends, your social groups and the media in general invading privacy, the higher your FUD becomes, and your privacy capital is eroded.
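The feedback loop described above can be sketched as a toy model. This is entirely illustrative; the update rule, rates and variable names are my own assumptions, not anything from the book: each good experience nudges privacy capital upwards, each bad one erodes it, and FUD moves inversely with capital.

```python
# Toy model of building or eroding privacy capital (illustrative only;
# the update rule and rate are hypothetical, not from the book).

def update_capital(capital, experience, rate=0.1):
    """Nudge capital toward 1.0 on a good experience, toward 0.0 on a bad one."""
    target = 1.0 if experience == "good" else 0.0
    return capital + rate * (target - capital)

def fud(capital):
    """Fear, uncertainty and doubt falls as privacy capital grows."""
    return 1.0 - capital

capital = 0.0  # "born with no privacy capital"
for experience in ["good", "good", "good", "bad", "good"]:
    capital = update_capital(capital, experience)

print(f"capital={capital:.3f}, FUD={fud(capital):.3f}")
```

Note how a single bad experience only partially undoes the accumulated capital, while a long run of bad experiences would drive capital back toward zero and FUD toward its maximum, matching the erosion described above.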


Figure 34
Building or eroding privacy capital

It is expected that your capital will even out at some point after you are 21 and will only be moved by major events.

