Visiting scholar: Gemma Galdon Clavell from Eticas Research and Consulting

At the beginning of October, Gemma Galdon Clavell visited the 3TU.Centre for Ethics and Technology. Gemma is the co-founder of Eticas, a consultancy that takes a combined legal, social and ethical perspective on pressing issues in technology. In her own words, she is in the business of “privacy by disaster”. In an interview I asked her about her work, her projects and her impression of the Netherlands.

Who is Gemma Galdon Clavell?

Dr. Gemma Galdon Clavell is a policy analyst working on surveillance, social, legal and ethical impacts of technology, smart cities, privacy, security policy, resilience and policing. She is a founding partner at Eticas Research & Consulting and a researcher at the Universitat de Barcelona’s Sociology Department. She completed her PhD on surveillance, security and urban policy at the Universitat Autònoma de Barcelona, where she also received an MSc in Policy Management, and was later appointed Director of the Security Policy Programme at the Universitat Oberta de Catalunya (UOC). She is a member of the international advisory board of Privacy International and a regular analyst on TV, radio and print media. Her recent academic publications tackle issues related to the proliferation of surveillance in urban settings, urban security policy and community safety, security and mega events, the relationship between privacy and technology and smart cities.

Can you tell us about your consultancy company?

Eticas is a consultancy company based in Barcelona with a focus on the social, legal and ethical impact of data-intensive technologies. I started the company with some of my fellow researchers and PhD students: we attempted to reproduce what our ideal department would be, but in a private setting – which gives us considerably more flexibility and ownership over what we do. Our main goal as a company is to improve how we understand and research the relationship between technology and society. Eticas works with large companies, but also with governments and public departments that want to buy or implement technology and want to develop policy in a responsible way.

Over time, I expect that people will become more and more critical of technologies, and hence the need for societal assessment will grow even greater. As long as we keep learning about technology through its failures – rights being infringed upon, values being eroded – people will become more worried. Big companies have had a great ten years in which people were not very aware of such issues, but society is slowly learning that technology can be great, yet is not always so. Paying attention therefore also makes sense from a business perspective: companies are realizing that they need to address social, legal and ethical issues to maintain a good image. A product will not sell, or a company will face a reputation crisis, when these issues are ignored. It is not only about doing the right thing, but also about being wise.

Can you tell us about some concrete projects you worked on?

Two prestigious projects I particularly enjoyed working on concerned technological investments in border crossing and technology in education. In the first, Eticas researched automated border crossing and its effects on fundamental human rights. In our assessment we looked at how technological processes can impact the migration process. This ranges from the moment asylum or a visa is requested at an embassy, to the fingerprints migrants have to provide the moment they get off the plane, to the biometric information one must sometimes divulge when crossing a land border. What do these technological mediations mean for the people caught in the middle of them? How can the process be designed in a way that respects the rights and values of migrants?

A second, also highly prestigious, project that meant a lot to me concerned technology in education. What concerns me here is the constant monitoring and surveillance of minors’ learning processes through educational technology. It is highly disturbing that children can have their irises monitored while learning, so that teachers or parents know when they are actually in front of the screen and when they go to the toilet. These devices have the capability to track everything kids do in their private rooms, and this could affect their sense of autonomy, privacy and independence – and their relationship with technology itself.

Let’s move to ethics more broadly. What challenges are you facing in the field of consultancy?

It is difficult to determine where the boundaries are. Some technologies can be improved, but some cannot, and therefore you sometimes have to say no – and saying no is hard. There could be a future with “pirate ethicists” who say yes to everything and turn Eticas’ work into a checkbox exercise. As a consultant working in this field, you have to be brave enough to give up on big projects and money.

Do you have tips for being an ethics consultant?

If you want to apply existing frameworks to new technologies, you have to be brave. Academics are often not very prepared to be brave; sometimes you have to take a leap of faith, launch a hypothesis and stick to it. Sometimes it’s wrong, sometimes it’s right, but you need that confidence in your own work and your own vision.

In closing, how did you experience your stay in the Netherlands?

I am really glad I came. I am very impressed with the Centre you are running. Sometimes, in my own work, I feel isolated. Eticas’ approach and outlook are rarely shared by others – it is hard to find someone to have a conversation with who doesn’t need to start at zero. So, during my days at your Centre, I had conversations in which I learned from others, instead of merely presenting what I am doing myself. The highlights were talking to people working on EU projects centred on ethics, and meeting people who work on ethical assessment in disciplines I would not personally have considered, like robotics – and seeing that there are new possibilities for collaboration between robotics and data-intensive technologies. I find it fantastic that your Centre pays so much attention to such ethical and social questions. I expect the future will look a lot like what you are doing already.