Data & Ethics - Big Data Coe Barcelona

In the last few months we have seen that “Ethics” has emerged as an extremely sensitive topic for the Data and Analytics community. Most likely, one of the main drivers of this wave of concern was the Facebook scandal: Mark Zuckerberg (founder and CEO of Facebook) had to testify in front of the US Congress about how his company handles its users’ data and how this could have influenced the results of recent elections in several countries. But Facebook is not the only company whose practices are under scrutiny. Tons of questions have also been raised about how much personal data Google collects and how it is being used: according to Guillaume Chaslot (an ex-Google engineer), the YouTube algorithm “does not appear to be optimising for what is truthful, or balanced, or healthy for democracy”.

In other words, we are talking not only about privacy but also about how data could even threaten our political system. As Cathy O’Neil writes in her must-read book Weapons of Math Destruction, “the math-powered applications powering the data economy were based on choices made by fallible human beings. Some of these choices were no doubt made with the best intentions. Nevertheless, many of the models encoded human prejudice, misunderstanding and bias into the software systems that increasingly managed our lives. Like gods, these mathematical models are opaque (…) Their verdicts, even when wrong or harmful, were beyond dispute or appeal. And they tended to punish the poor and the oppressed in our society, while making the rich richer”.

As Data-Driven professionals we cannot ignore this inconvenient truth and must address it. This is one of the reasons why we at BcnAnalytics organised a session to discuss Data & Ethics. As speakers we had Carlos Castillo (Distinguished Research Professor at Universitat Pompeu Fabra) and Gemma Galdon (Founder of Eticas Research & Consulting and Researcher at Universitat de Barcelona).

Carlos focused his talk on algorithmic discrimination. He first reviewed the concept of discrimination from a philosophical perspective and then explained the concept of group discrimination, which means “disadvantageous treatment of an individual because he or she belongs to a specific socially salient group”. According to Carlos, a further step is statistical discrimination, which can be observed “when group discrimination happens because of some statistical belief, which means that someone has certain data, has looked at this data and, based on statistics extracted from this data, has decided to treat someone worse than another person”. After reviewing these concepts, Carlos raised the key issue: machine learning algorithms can discriminate.

Why is that? Machine learning systems take data, extract statistical beliefs from it, and can therefore end up discriminating against some individuals, regardless of intention or animosity. The key aspect is the consequence of such an algorithm: treating a person worse because he or she belongs to a group. Carlos emphasized that to avoid this discrimination, models need to optimize not only for accuracy but also look at “the risk of two different populations of not getting the same outcome”. Carlos also highlighted how important it is that systems are transparent: “if you get a negative outcome, you have to have a way to challenge this decision in a way that is effective… If I am denied a loan or parole, I need to have a way of effectively challenging the decision, to say the system was wrong in my case”.
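To make that intuition concrete, here is a minimal sketch in Python of the kind of check Carlos alludes to: comparing the rate of favourable outcomes between two populations rather than looking at accuracy alone. The data, group labels and function names are purely illustrative assumptions, not Carlos’s own method.

```python
# Illustrative sketch only: compare the rate of favourable outcomes
# (e.g. loan approvals) across two hypothetical groups.
def positive_rate(decisions, groups, target_group):
    """Share of members of target_group receiving a favourable decision (1)."""
    outcomes = [d for d, g in zip(decisions, groups) if g == target_group]
    return sum(outcomes) / len(outcomes)

# Hypothetical decisions produced by a model: 1 = favourable, 0 = unfavourable.
decisions = [1, 0, 1, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]

rate_a = positive_rate(decisions, groups, "A")   # 0.75
rate_b = positive_rate(decisions, groups, "B")   # 0.25

# A large gap (or a ratio far below 1) signals that one population is
# systematically less likely to get the favourable outcome, even if the
# model's overall accuracy looks acceptable.
print(f"Group A: {rate_a:.2f}, Group B: {rate_b:.2f}, ratio B/A: {rate_b / rate_a:.2f}")
```

A check like this is only one of several possible group-fairness measures; the point it illustrates is that accuracy alone does not reveal whether two populations are getting the same outcomes.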

Gemma started her talk quoting Richard Sennett’s The Fall of Public Man: “In a city full of sensors and cameras and surveillance everywhere, where would Romeo and Juliet fall in love?”. From Gemma’s perspective, technology is changing our lives and we really need to ask ourselves: Why are we investing in technology? What kind of societies are these technologies creating or promoting? Are we building the cities that we want to build? Do we want to live in a world where everything is remembered? Do we want to live in a world where we can never forget? As she mentioned: “for the first time in history, forgetting is more expensive than remembering. Everything we do is recorded by a camera or a sensor”. Gemma then reviewed real cases of unexpected outcomes of certain technologies, for instance “smart borders” based on biometrics. They were not part of the legislative debate because they were seen “as technical amendments”, but biometrics have now become our IDs, and certain individuals self-mutilate when they want to hide their identities. In other words, their bodies became their enemies.

Gemma asked herself: “How can we hide behind a technical amendment? And what about false positives? There is no redress mechanism”. According to her, the most burning issue is that we, as a society, did not think technology could fail. But it fails. And this triggers the key issue: the way we do technology is very irresponsible and no one is facing the consequences of their actions, the consequences of their false positives… which might be human rights. Gemma ended her speech highlighting the fact that we need to start thinking about how technology is impacting our civilisation: “we have the responsibility to decide how we build a socio-technical infrastructure that is responsible and desirable for our generation and the next generations”.

Data & Analytics Club

By: Manuel Bruscas
