Comment

Big Brother Watch is watching you

DATA DRIVEN: Big data may offer near-endless commercial opportunities. But if the technology industry is to remain credible, individual privacy must always come before profit, says Big Brother Watch director Emma Carr, with consent the basis for collecting any personal information.

Most of us who are already engaged in the world of technology understand how internet companies make their money. We know that data is what drives the online world, allowing us to experience personalised services and giving us the ability to communicate on a scale never seen before.

The best part is that, more often than not, we get all this for free. Or at least that is seemingly the case, because it is us, after all – or at least our data – that is fuelling these companies.

The public is also now starting to become aware of this. A poll conducted for Big Brother Watch this year showed that 79% of UK adults surveyed were concerned about their privacy online. People want more to be done to protect them from companies that abuse their access to customer data.

For instance, following Google’s controversial amalgamation of all of its products’ privacy policies in 2012, 72% of people said that the Information Commissioner’s Office – the UK data regulator – should be doing more to force the company to comply with data regulations. Interestingly, 58% also felt that companies should never gather people’s data unless they explain why and for what purpose.

For me as a privacy campaigner, consent should be the bedrock of any collection of personal information. Put simply, users should be told what will happen to their data, including who will be using it, so that they can then give or withhold their informed consent. Instead we too often see implied consent – a system where information is processed unless an individual expressly opts out.
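The distinction comes down to a single default. Below is a purely hypothetical sketch – the names and structure are invented for illustration, not drawn from any real system:

```python
# Hypothetical sketch contrasting the two consent models described above.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    opted_in: bool = False   # informed consent: off until the user says yes
    opted_out: bool = False  # implied consent: on until the user says no

def may_process_informed(user: User) -> bool:
    """Informed consent: process data only after an explicit opt-in."""
    return user.opted_in

def may_process_implied(user: User) -> bool:
    """Implied consent: process data unless the user has expressly opted out."""
    return not user.opted_out

alice = User("Alice")  # a new user who has said nothing either way
print(may_process_informed(alice))  # False: no consent has been given
print(may_process_implied(alice))   # True: silence is treated as consent
```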

That implied-consent approach frequently leads to vocal concerns from users, and quite often from the media, all because the company failed to communicate properly.

Facebook is a good example. When it announced last year that it had conducted an experiment to see whether reading cheerful or depressing posts affected our mood, the media had a field day about the “manipulation” that had taken place: Facebook had altered our online reality without us knowing.

In our hyper-connected world, what defines us is our personal information. Collecting a variety of unrelated pieces of information can build a very accurate and intimate picture of a person.

The American retailer Target used “predictive analytics” to monitor changes in its female customers’ shopping habits. Target sought to establish which customers were pregnant in order to target adverts more effectively. In doing so, however, Target was manufacturing personally identifiable information. Customers would have no idea that their information was being used in this manner, even if they suspected that some of it was being analysed.

This is yet another example of poor communication with customers.
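To see how little machinery this kind of profiling requires, here is a deliberately crude sketch of the sort of scoring “predictive analytics” can involve. The products, weights and threshold are invented for illustration; Target’s actual model has never been published.

```python
# Toy illustration of inferring a sensitive attribute from shopping data.
# The signal products, weights and threshold are all invented.
PREGNANCY_SIGNALS = {
    "unscented lotion": 0.3,
    "calcium supplements": 0.25,
    "cotton wool": 0.2,
    "zinc supplements": 0.25,
}

def pregnancy_score(basket: list[str]) -> float:
    """Sum the (made-up) weights of any signal products in a basket."""
    return sum(PREGNANCY_SIGNALS.get(item, 0.0) for item in basket)

basket = ["unscented lotion", "calcium supplements", "bread"]
if pregnancy_score(basket) >= 0.5:
    print("Flag this customer for baby-product adverts")
```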

Different types of information warrant different levels of concern. Take Twitter as an example. Tweets themselves are seen as being in the public domain, whilst the metadata that surrounds them, such as geo-location, raises privacy concerns.

This is a difference that is not always understood by those seeking to use that information, such as researchers or developers.

Researchers at IBM developed an algorithm that could predict the location of people based on their Tweets with almost 70% accuracy. Perhaps more disturbingly, the website Sleeping Time, which carries the tagline “Find the sleeping schedule of anyone on Twitter”, analyses the times and time zones from which Tweets were sent to predict when, and for how long, Twitter users are likely to be sleeping.
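No privileged access is needed for this kind of profiling; public timestamps alone are enough. The following is a minimal sketch of the idea, assuming tweet times have already been converted to the user’s local time zone – it is not Sleeping Time’s actual code:

```python
# Sketch: infer a likely sleeping window from tweet timestamps alone.
from collections import Counter
from datetime import datetime

tweet_times = [
    datetime(2015, 3, 1, 8, 15), datetime(2015, 3, 1, 13, 2),
    datetime(2015, 3, 1, 22, 40), datetime(2015, 3, 2, 7, 55),
    datetime(2015, 3, 2, 18, 30), datetime(2015, 3, 2, 23, 10),
]

# Count tweets per hour of day; hours with no activity at all are
# candidate sleeping hours.
activity = Counter(t.hour for t in tweet_times)
quiet_hours = [hour for hour in range(24) if activity[hour] == 0]
print("Hours with no tweets (likely asleep):", quiet_hours)
```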

We often hear from companies that this sort of analysis of our data is OK because it is anonymised. But what does that actually mean? Anonymising data is the process of turning personally identifiable information into non-identifiable data sets. There are a variety of ways this can be done, including “hashing”, which scrambles previously identifiable information into an entirely new, seemingly anonymous code.
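As an illustration, an “anonymised” data set might replace each email address with its SHA-256 digest. A minimal sketch using Python’s standard hashlib module:

```python
import hashlib

def pseudonymise(email: str) -> str:
    """Replace an identifier with a fixed-length hash of it."""
    return hashlib.sha256(email.lower().encode("utf-8")).hexdigest()

print(pseudonymise("alice@example.com"))
# The same input always yields the same hash, so records can still be
# linked together, and anyone who can guess the input can recompute it.
```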

But applying this process does not mean there aren’t privacy concerns. The main concern is that anonymisation isn’t permanent – it can be reversed or circumvented, typically by cross-referencing the data with other available datasets. With an increasing amount of personal information being gathered in a widening number of places, the potential for re-identification is growing.
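A minimal sketch of such a “linkage attack”, with records invented for illustration: an anonymised medical data set is joined to a public register on shared quasi-identifiers such as postcode, date of birth and sex.

```python
# Sketch of re-identification by cross-referencing two data sets.
# All records below are invented.
anonymised_health = [
    {"postcode": "SW1A 1AA", "dob": "1980-05-01", "sex": "F", "diagnosis": "asthma"},
]
public_register = [
    {"name": "Jane Smith", "postcode": "SW1A 1AA", "dob": "1980-05-01", "sex": "F"},
]

for record in anonymised_health:
    for person in public_register:
        # If the quasi-identifiers match, the "anonymous" record is re-identified.
        if all(record[key] == person[key] for key in ("postcode", "dob", "sex")):
            print(person["name"], "->", record["diagnosis"])
```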

Is there a solution to these privacy concerns? Big Brother Watch’s polling showed that the public think data regulators should do more to protect our privacy. But what more could be done?

In the UK the way we use data is regulated by the Data Protection Act 1998, which seeks to ensure that any information collected is for “legitimate purposes”, does not adversely affect the individuals in question, and is used with transparent aims.

Whilst laudable, the legislation has some problems. The penalties and sanctions that can be handed to those who break the law are insubstantial. Under the Act, anyone who commits an offence will not face a custodial sentence or a criminal record; instead, they can be handed a fairly trivial fine.

In one case a Barclays employee who committed 23 offences was fined £2,990 – just £130 for each time the Act was broken. For the legislation to be effective, fines need to be increased and, for serious cases, custodial sentences introduced.

Data in itself is not a negative thing. Any positive or negative outcomes from the use of data are wholly down to the intentions of the analyst. What these examples show is that the perceived negatives, more often than not, arise because companies failed to adopt a privacy-first approach and communicated poorly about what they were doing.

The internet has provided us with more choice than ever before. Rather than hiding behind how they use data, companies should be upfront about it, giving users the choice of whether to use the product at all.

A privacy-first, transparent approach may just be the unique selling point that users, and companies, have been looking for.

Emma Carr is also co-founder of TechCentral, a technology hub at the UK political party conferences. Find out more at techcentral.org.uk