Who determines whose interests are best served by a machine?



In this blog I looked at algorithms and bias and asked who is in control: http://www.mydigitalfootprint.com/algorithms-is-anyone-in-control There I asked a rhetorical question: "So the question is not if you can become an algorithm but how accurately can an algorithm model you!"

Thinking further on this topic: in what order should my interests best be served?

·        Me as an individual - I code for myself based on my data and my desired outcomes, which allows me to misrepresent myself

·        Me as part of a group of like-minded people - we test the algorithm to determine whether I (we) like or dislike the implied outcomes, and they are refined

·        An organisation acts on my behalf to determine if harm is done and sets guidelines (best working practices)

·        A government sets up a regulatory body to provide guidance and enforce law

·        A programmer who is outside of my jurisdiction does what they like

·        A company who is outside of my control who wants a desired outcome

Asking the question again: "Who determines whose interests are best served by a machine?" On whose behalf does the machine act? Where it is an organisation that can exploit your data sets, there is an interesting dynamic…

1. If it is the company itself coding and determining the algorithm, there is an incentive to get it right and focus on those who will buy its products or services. The efficiency game comes to town.

2. If it is a company where the code optimises for some measure that favours volume over profit, then there is a different bias.

3. If the code is outsourced or the algorithm is purchased - where is the alignment?

So my question is now: do the Strategy, Style, Culture and Purpose (of a company) affect the algorithm, the coding principles, the measurables and the outcomes of an organisation?

The reason to question this is that my data (my digital footprint) forms the data set that allows companies to personalise my experience. However, if my data set, when passed through the algorithm, says (for example) that I am not creditworthy, the retailer may choose to ignore me. Yet that outcome is the bias of the company (the coder / the algorithm) and not of my data.
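To make that point concrete, here is a minimal sketch (all names, weights and thresholds are hypothetical, invented for illustration): the same customer data, passed through two versions of a scoring algorithm whose parameters the coder chose, produces opposite "creditworthy" decisions. The bias sits in the code, not in the data.

```python
# Hypothetical applicant data - the "digital footprint" stays the same throughout.
applicant = {"income": 30_000, "missed_payments": 1}

def score(data, weight_income, penalty_missed):
    # The coder decides how much each field matters.
    return data["income"] * weight_income - data["missed_payments"] * penalty_missed

def creditworthy(data, weight_income, penalty_missed, threshold):
    # The coder also decides where to draw the line.
    return score(data, weight_income, penalty_missed) >= threshold

# Same data, two sets of coder-chosen parameters, two different outcomes.
lenient = creditworthy(applicant, weight_income=1.0,
                       penalty_missed=5_000, threshold=20_000)
strict = creditworthy(applicant, weight_income=1.0,
                      penalty_missed=20_000, threshold=15_000)

print(lenient)  # True  - this algorithm accepts the applicant
print(strict)   # False - this one rejects the very same applicant
```

Nothing about the applicant changed between the two calls; only the weights and threshold did - which is exactly the company's (or coder's) bias being applied to my data.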