Please see the previous post here.
Data collection resides in an ethical grey area, especially on the internet, where we accept that our behaviours are being constantly surveilled.
“It’s an amazing victory for data appropriators that acquiescence has become the standard model for obtaining ‘consent’” (Sadowski, 2016).
Sometimes it can be hard to tell whether companies are acting in the interests of their customers or for profit, and whether data breaches are accidental or not.
Despite such things, people trust certain tech companies ‘more than others according to a recent survey by Morning Consult’ (Dunn, 2017). Amazon tops the list as most trusted, with 69% of 2,200 US adults trusting the company to keep their data secure (Dunn, 2017). Google scored 65% and Apple 60%, while Facebook and Twitter fell to the bottom of the list with the least consumer trust, at 49% and 34% respectively (Dunn, 2017).
This is not a surprise when you learn that some social media platforms have looked after data poorly. WhatsApp, for example, shared people’s names and phone numbers with Facebook to allow businesses to advertise directly to the individuals affected.
Even bigger than this are:
‘the data brokers that create massive personalized profiles about each of us, which are then sold and used to circumvent consumer protections meant to limit predatory and discriminatory practices.’ (Sadowski, 2016).
Data brokers sell individuals’ sensitive personal information as a commodity to businesses, advertisers and sometimes the government. They collect this data through cookies (small pieces of data stored by the browser), web beacons (transparent images on websites or in emails used for tracking) and ETags (cache validators that are part of the HTTP protocol), or through the permissions we grant on our phones allowing apps to access our contacts and other folders (The Windows Club, n.d.). This industry generates $156 billion in yearly revenue (The Windows Club, n.d.), so there is a serious market for data.
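To make the ETag technique concrete, here is a minimal sketch of how a cache validator can double as a tracking identifier. This is an illustration in Python, not any real broker’s code: the `TrackingServer` class and its names are hypothetical, and it only simulates the HTTP exchange rather than running a real server. The key point is that a browser echoes a resource’s ETag back in the `If-None-Match` header on every revisit, so a unique ETag acts like a cookie the user never set.

```python
import uuid

class TrackingServer:
    """Hypothetical sketch: abuses HTTP caching to re-identify visitors."""

    def __init__(self):
        self.seen = {}  # ETag -> number of visits by that "browser"

    def handle(self, if_none_match=None):
        """Simulate one request; return (status_code, etag).

        A real browser sends If-None-Match with the ETag it cached earlier,
        which lets the server recognise the visitor without any cookie.
        """
        if if_none_match and if_none_match in self.seen:
            self.seen[if_none_match] += 1
            return 304, if_none_match      # "Not Modified": returning visitor recognised
        etag = uuid.uuid4().hex            # unique per-visitor ID, disguised as a cache tag
        self.seen[etag] = 1
        return 200, etag

server = TrackingServer()
status, tag = server.handle()                   # first visit: fresh ETag issued
status2, _ = server.handle(if_none_match=tag)   # revisit: visitor re-identified
```

Clearing cookies does not clear the browser cache, which is why this technique worries privacy researchers: the identifier survives unless the cache itself is emptied.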
Social media ‘have a political economy that includes user-generated content as part of an exploitative labour process’ (Reading, 2014, p. 752). We become the product being sold to other companies: our data is collected by data brokers and a profile is made.
Corporations therefore collect masses of data because of the value it might generate (Sadowski, 2016). Imagine, for instance, the ‘hyper sensitive expression-targeted advertising’ (Lomas, 2017) that facial recognition data could lead to.
Apple are said to be aware of these risks:
‘In clause 5.1.2 (iii) of the developer guidelines, Apple writes: Data gathered for the HomeKit API or from depth and/or facial mapping tools… may not be used for advertising or other use-based data mining, including by third parties.’ (Lomas, 2017).
In addition, these issues of privacy are currently being combatted by the European Parliament, the Council of the European Union and the European Commission through the creation of legislation whose purpose is to:
‘bring data protection legislation into line with new, previously unforeseen ways that data is now used.’ (Curtis, 2017)
This legislation, called the General Data Protection Regulation (GDPR), will apply from 25 May 2018. It focuses on the information companies hold, privacy notices, individuals’ rights, the lawful bases for processing personal data, consent, data breaches and data protection (ICO, 2017).
Though this is European legislation, fortunately for many consumers it has a ‘borderless scope’, ‘causing problems for both European and non-European businesses’ (Hewett, 2017), and will therefore affect Apple too. This should be reassuring for many, but while it lessens the implications for consumers directly, there are still other, more physical consequences for our rural areas.
With so much data being collected, and held, all the time, there grows the issue of where to store it. Server farms are springing up in rural areas outside towns and cities, finding space where there is plenty of open land. Industrial complexes are then built up to support the cloud infrastructure through ‘globital’ electronic memory ‘and back-up for large memory-hungry social media platforms such as Facebook, Google, YouTube and Twitter’ (Reading, 2014, p. 754).
Ensuring that these centres, filled with computers and servers, continue to function properly takes a lot of electricity. Google reports that its data centres account for 0.01% ‘of global electricity use’, about the same ‘as the entire country of Turkey’ (Burrington, 2015). The environmental implications of data collection are huge, but some companies, such as Facebook, are beginning to operate using renewable energy, partly as a marketing decision (Burrington, 2015).
In another positive, this year Apple sought permission to build a new data centre in Arizona, showing that it could provide 150 well-paid positions at the facility (Leswing, 2017).
This will be one of the centres where they will now hold facial recognition data from their iPhone X users, despite the ‘secure enclaves’ Apple states will be holding that data. As reports suggest that Apple will share biometric data from Face ID with app developers, it must be held somewhere in their cloud.
Even so, Apple’s guidelines make clear that developers must not sell biometric data, but their access to the information still concerns privacy experts (Bonnington, 2017), especially since both Uber and Path have previously violated these guidelines in this way (Bonnington, 2017). Though:
‘Other privacy experts… are confident in Apple’s abilities to… manage Face ID data. “Apple has had great success historically imposing limitations on developers in exchange for access to their lucrative iOS user base, and I don’t see that changing with Face ID,” Travis Jarae, CEO of identity and privacy research company OWI, said.’ (Bonnington, 2017).
Apple’s ability to manage their customers’ data was seen in the case between Apple and the FBI. A federal judge asked the company to unlock an iPhone belonging to a suspect responsible for the deaths of 14 people, and Apple declined to help, as this affected their customers’ privacy and they did not want to set a precedent for the FBI to abuse further (Kharpal, 2016).