Alistair Croll has written a thought-provoking article, Big data is our generation’s civil rights issue, and we don’t know it. His basic argument is that the new economics of collecting and analyzing data has led to a change in how it is used. Once it was expensive to collect, so only data needed to answer particular questions was collected. Today it is cheap to collect, so it can be collected first and then analyzed – “we collect first and ask questions later”. This means that the questions asked can be very different from the questions the data seem to be about, and in many cases they can be problematic. Race, sexual orientation, health or political views – important for civil rights – can be inferred from apparently innocuous information provided for other purposes – names, soundtracks, word usage, purchases, and search queries.
The problem, as he notes, is that handling this new situation requires linking what the data is with how it can be used. And this cannot be done by technology alone: it requires societal norms and regulations. What kinds of ethics do we need to safeguard civil rights in a world of big data?
…governments need to balance reliance on data with checks and balances about how this reliance erodes privacy and creates civil and moral issues we haven’t thought through. It’s something that most of the electorate isn’t thinking about, and yet it affects every purchase they make.
This should be fun.
The smith was working hard on a new tool. A passer-by looked at his work and remarked that it looked sharp and dangerous. The smith nodded: it needed to be very sharp to do its work. The visitor wondered why there was no cross-guard to prevent the user’s hand from sliding onto the blade, and why the design made it easy to accidentally grip the blade instead of the handle. The smith explained that the tool was intended for people who said they knew how to use it well. “But what if they were overconfident, sold it to somebody else, or had a bad day? Surely some safety measures would be useful?” “No”, said the smith, “my customers did not ask for them. I could add them with slight effort, but why bother?”
Would we say the smith was doing his job in an ethical manner?
Here are two other pieces of news: Oxford City Council has decided to make it mandatory for taxicabs in Oxford to have CCTV cameras and microphones recording the passengers’ conversations. As expected, many people are outraged. The stated reason is to improve public safety, although the data supporting this decision doesn’t seem to be available. The surveillance footage will supposedly be stored for no more than 28 days and not be made available except as evidence of crimes. Meanwhile in the US, there are hearings about the Stop Online Piracy Act (SOPA) and the PROTECT IP Act, laws intended to make it easier to block copyright infringement and counterfeiting. Besides concerns that critics and the industries most affected by the laws are not getting access to the hearings, a serious worry is that the laws would make it easy to censor websites and block business on fairly loose grounds: there are few safeguards against false accusations (something that occurs regularly), little oversight, and few remedies for the accused website, and a domestic US law would apply internationally due to the peculiarities of the Internet and US legal definitions.