Designing for Darkness: Urban Nighttime Lighting and Environmental Values


4TU.Ethics colleagues are cordially invited to Taylor Stone’s PhD Defence. On 21 January he will present and defend his dissertation “Designing for Darkness: Urban Nighttime Lighting and Environmental Values.” Following the defence there will be a reception and a panel discussion with some of the committee on “Re-imagining the City at Night.” Please note that both the defence and the panel are public, so feel free to pass this along to anyone who may be interested.

Monday 21 January 2019 @ 12:00 – PhD Defence of Taylor Stone (Senaatszaal, Aula, TU Delft) and @ 15:30 – Panel discussion (Commissiekamer 3, Aula, TU Delft)

PhD Defence

Designing for Darkness: Urban Nighttime Lighting and Environmental Values

Artificial illumination has had profound and far-reaching impacts on the development, use, and perceptions of urban nights, and has brought with it many benefits. In recent years, however, its adverse costs and effects – commonly referred to as light pollution – have emerged as a topic of concern. Nighttime lighting uses enormous amounts of energy, costs billions of dollars annually, can be detrimental to the health of humans and ecosystems, and cuts off access to a starry night sky. Addressing these impacts, and more fundamentally understanding the underlying values shaping contemporary discourse, is a complex and pressing challenge with moral, aesthetic, political, and technical dimensions. This dissertation takes up that challenge by offering a critical examination of the historical roots and normative presuppositions shaping the concept of light pollution. This critique leads to the proposal of an alternative normative framework: instead of focusing on reducing lighting, it argues for fostering darkness in urban nightscapes. A designing-for-darkness approach is developed on two interrelated levels. The first is conceptual, exploring the relationship between darkness, illumination, and environmental values. The second is practical, proposing first steps towards realizing darker nights via the responsible design of new and emerging technologies, namely LEDs and autonomous vehicles. Together, the chapters of this dissertation weave a critical investigation of, and a constructive contribution to, a pressing urban challenge for the 21st century.


Panel Discussion

Towards a Darker Future? Re-imagining the City at Night

Monday 21 January 2019 Commissiekamer 3, Aula, TU Delft 15:30-17:00

Nighttime lighting is foundational to the design and use of cities at night. Artificial illumination effectively creates the city at night, carving space and time out of darkness. Innovations in how we light cities can have far-reaching effects on issues such as sustainability, safety, commerce, nightlife and ‘24/7’ societies, mobility, and social justice. Yet, because of the immense scale and seeming permanence of our lighting, we take the existing infrastructure as a given. It all too easily fades into the backdrop of daily life, noticed only when it fails or during special events. But what if that weren’t so? What if we re-focused our attention on the generative force of urban lighting?

Imagine that we could flip a switch and reset our nighttime lighting. What sort of urban nightscapes would we want for our future cities, and why? With new technologies on the horizon (e.g., smart systems, autonomous robotics), it is possible to design innovative lighting strategies and novel nighttime environments. But what exactly should a “new” urban nightscape look like, and why? Should it be drastically different? What values (environmental, social, aesthetic) should inform and drive technological innovation, design and policy choices, and use patterns? And how can (and should) such future visions be enacted? By exploring these questions, we can start to envision urban nights radically different from those of the 20th century.

Confirmed Panellists:

  • Prof. Jeroen van den Hoven, TU Delft
  • Prof. Carola Hein, TU Delft
  • Prof. Nick Dunn, Lancaster University
  • Prof. Andrew Light, George Mason University/World Resources Institute

Global Terrorism and Collective Moral Responsibility: Redesigning Military, Police and Intelligence Institutions in Liberal Democracies

International terrorism, e.g. Al Qaeda and IS, is a major global security threat. Counter-terrorism is a morally complex enterprise involving police, military, intelligence agencies and non-security agencies, and should be framed as a collective moral responsibility of governments, security institutions and citizens. (1) How is international terrorism to be defined? (2) What is the required theoretical notion of collective moral responsibility? (3) What counter-terrorist strategies and tactics are effective, morally permissible and consistent with liberal democracy? Tactics include targeted killing, drone warfare, preventative detention, and bulk metadata collection (e.g. by the NSA). (4) How is this inchoate collective moral responsibility to be institutionally embedded in security agencies? (i) How are security institutions to be redesigned so that they can realise and coordinate their counter-terrorism strategies without over-reaching their various core institutional purposes, which have hitherto been disparate (e.g. law enforcement versus military combat), and without compromising human rights (e.g. the right to life of innocent civilians, the right to freedom, the right to privacy), including by means of morally unacceptable counter-terrorism tactics? (ii) How are these tactics to be integrated with a broad-based counter-terrorism strategy that includes measures such as anti-radicalisation and state-to-state engagement to address key sources of terrorism, such as the dissemination of extremist religious ideology (e.g. militant Wahhabi ideology emanating from Saudi Arabia) and the legitimate grievances of some terrorist groups (e.g. a Palestinian state)? What ought a morally permissible and efficacious (i) structure of counter-terrorist institutional arrangements, and (ii) set of counter-terrorist tactics, to consist of, for a contemporary liberal democracy collaborating with other liberal democracies facing the common problem of international terrorism?

Vacancy: 2 PhD positions at the Department of Technology, Policy and Management of Delft University of Technology [closed]

The Department of Technology, Policy and Management of Delft University of Technology is hiring two PhD candidates for a project on crowd-based innovations. For one of these positions, we are looking for a candidate with a background in applied ethics and an affinity with technology and engineering.


The “crowd” increasingly seems to be key to innovation in all kinds of sectors, e.g. crowdfunding, open platforms, citizen initiatives for energy production, or the sharing economy. Such crowd-based initiatives provide many opportunities for innovation in socio-technical systems, but also significant challenges, because they often arise in the context of traditional, well-established institutional and governance structures and practices. In this research project we will study how to deal with the conflicts that can arise where crowd-based innovations meet existing structures. The goal is to design governance arrangements so that the power of mobilising people and organisations can be combined with legitimacy and responsible innovation. We are looking for two PhD researchers, one applying a normative perspective and one applying an empirical perspective. The candidates will collaborate closely to combine both perspectives.

Please find the full vacancy text at:

4TU.Ethics @ CPDP2017 – Videos online

The video recording of the 4TU.Ethics panel at the “Computers, Privacy and Data Protection” (CPDP) conference is online:

The panel of the EDPS’s Ethics Advisory Group on “Ethics in the Digital Era” (featuring Jeroen van den Hoven) is also online:

Finally, check out the recording of the “Surveillance and Privacy in Smart Cities” panel, which includes a talk and discussion with Michael Nagenborg:

4TU.Ethics @ CPDP 2017

4TU.Ethics is proud to once again sponsor the Computers, Privacy and Data Protection (CPDP) conference. The event takes place from January 25 until January 27, 2017, in Brussels. The 4TU.Ethics panel is dedicated to the topic “AI, Privacy, and Ethics” and will feature our members Nolen Gertz, Luisa Marin, and Iris Huis in ‘t Veld (ETICAS). They will be joined by Pete Fussey, Pierre Nicolas Schwab, Susanne Dehmel (bitkom), and Sartor Giovanni. Join us on Thursday at 14:15! And don’t miss the talks by other members of our center: Michael Nagenborg will speak at a session on “Surveillance and Smart Cities” (Wednesday, 10:30), and Jeroen van den Hoven will be one of the speakers at the EDPS’ Ethics Advisory Group’s panel on “Ethics in the Digital Era.”


Fully-funded 4-year Ph.D. position on “Fearful Technologies: Historical and ethical perspectives on the role of fear in pro-technology discourses”

Eindhoven University of Technology

Technology, Innovation and Society Group

Starting at the latest in June 2017

The PhD position is interdisciplinary, with a shared advisory team from the Technology, Innovation and Society Group (dr. Kalmbach, prof.dr. v.d. Vleuten) and the Philosophy & Ethics Group (dr. Spahn) of the Department of Industrial Engineering and Innovation Sciences at Eindhoven University of Technology. It focuses on historical and ethical perspectives on the role of fear in pro-technology discourses. We invite applications for a 4-year fully funded PhD position starting at the latest in June 2017.


Research Context:

Fears of technologies are a prominent research topic in Science and Technology Studies and Risk Research. Industry, academia, and policy makers have been eager to develop strategies to overcome such fears. An area that has so far been neglected, however, is the question of how the promotion of new technologies has itself appealed to fears. To name only a few examples: the fear that ‘the lights could go out’ has been addressed continuously in the promotion of nuclear energy; the fear that one could miss out on the meaning of life is stressed in the promotion of in-vitro fertilization; and the fear that the world’s knowledge could be lost is emphasized in calls for digital archives.

Our project will therefore investigate the argumentative role of fear in pro-technology discourses and the effects of these appeals to fear. It will bring together research from the history and ethics of technology. The main research question will be: What role did and do emotions, and especially fear, play (historically) in the pro-technology discourse of emerging technologies, and how should these appeals to fear be evaluated from an ethical perspective? We will investigate this question on the basis of empirical case studies and in relation to the ethical debate on the role of emotions and fear in technology acceptance.


Your Profile:

The candidate holds a Master’s degree or equivalent degree in one of the following fields: History (of Technology / Sciences), Philosophy, Sciences and Technology Studies, Innovation Studies, or related fields.

The candidate has excellent writing and qualitative analytic skills, fluency in English (the working language of the research group), and a strong interest in conducting research in an interdisciplinary and international team. The candidate should be able to take up the position by June 1 at the latest.

The TU/e is an equal opportunity employer. Members of underrepresented groups are strongly encouraged to apply.


Our Offer:

We offer a four-year PhD position in a vibrant research environment shaped by an interdisciplinary and international team of researchers and its tight integration into (inter)national research networks. The position comes with a gross salary of €2,191 per month in the first year, increasing to €2,801 per month in the fourth year. Additionally, Eindhoven University of Technology provides excellent facilities for professional and personal development, a holiday allowance (8% of the annual salary), an end-of-year bonus (8.3%), and a number of additional benefits. The candidate will also be given the opportunity to gain teaching experience and to receive graduate training.

The application should contain the following documents:

–  Letter of motivation which explains your interest in the position and your qualifications for it;

–  Curriculum vitae (including, if applicable, a publication list and a teaching overview);

–  Research statement (roughly 500 words, plus (short!) bibliography) in which you elaborate on your ideas for a research focus in line with the project;

–  Names and contact information of three persons willing to provide references.


If you are interested in this position, we invite you to apply before January 1, 2017. To apply, please click HERE and use the ‘apply now’ button. Please note that a maximum of 5 documents can be uploaded, so if you have more than 5 documents you will have to combine them. Please do not send us applications by e-mail. Shortlisted candidates will be interviewed in Eindhoven (travel expenses will be covered) or via a video call at the end of January 2017; a decision about the appointment will be made shortly after the interviews. The appointment is initially for a period of one year; contingent on satisfactory performance, the position will be extended by a maximum of an additional three years, leading to the completion of a PhD thesis.


Further information:

For further information on the research context of this PhD position please contact: Karena Kalmbach ( or Andreas Spahn (

Information about the terms of employment can be obtained from Mrs. Ellen van der Mierden, HR advisor (

More information about the Department of Industrial Engineering and Innovation Sciences can be found at


Colloquium TU Delft: Experiments, Exploration and Robots

Departmental colloquium TU Delft, Ethics and Philosophy of Technology

Date: November 21, 2016; 15:30 – 17:00 hr.
Location: Boardroom (A1.370)

Speaker: Viola Schiaffonati, Department of Electronics, Information and Bioengineering, Politecnico di Milano

The debate on the experimental method, its role, its limits, and its possible applications has recently gained attention in autonomous robotics. While, on the one hand, classical experimental principles such as repeatability and reproducibility serve as an inspiration for the development of good experimental practices in this research area, on the other hand, some recent analyses have shown that rigorous experimental approaches are not yet fully part of the research habits in this community.

In this presentation, by investigating autonomous robotics, I will claim that the traditional notion of experimentation cannot always be applied as such to computer engineering, and I will propose that the notion of the explorative experiment is a good candidate to be considered in some situations. By explorative experiments I mean a form of investigation of novel and interesting ideas or techniques without the typical constraints of rigorous experimental methodologies. These are experiments driven by the desire to investigate the realm of possibilities pertaining to the functioning of an artefact and its interaction with the environment, in the absence of a proper theory or theoretical background. Moreover, while recognizing a substantial continuity of the engineering sciences with the natural ones, I will try to show why the former need not only an adaptation of the traditional frameworks already established in the philosophy of science (Staples 2015), but also a shift away from them.

In my endeavor, I plan to move along three different but interconnected directions. The first deals with the notion of the directly action-guiding experiment, which characterizes a significant part of the experimental practice in autonomous robotics, in opposition to that of the epistemic experiment. The second direction concerns the debate around engineering epistemology, and whether adapting frameworks from the traditional philosophical debate can suffice to account for the peculiarity of the discipline. Finally, the third direction acknowledges the empirical turn in recent philosophy of technology, introduces the framework of technoscience as an engineering way of being in science, and invites philosophers of science to take this notion seriously in order to shed light on a range of questions that have so far been neglected.

Viola Schiaffonati is associate professor of Logic and Philosophy of Science at the Dipartimento di Elettronica, Informazione e Bioingegneria of Politecnico di Milano. She received her Laurea degree in Philosophy from Università degli Studi di Milano in 1999 and her PhD in Philosophy of Science from Università di Genova in 2004. She was a visiting scholar at the Department of Philosophy of the University of California, Berkeley during the academic year 2000/01 and a visiting researcher at the Suppes Center for the Interdisciplinary Study of Science and Technology at Stanford University in 2005. Her main research interests include the philosophical foundations of artificial intelligence and robotics, and the philosophy of computing sciences and information, with particular attention to the philosophical issues of computational science and the epistemology of experiments.

Call for Papers: First Workshop on Ethics in Natural Language Processing

To be held at EACL 2017 in Valencia on April 3 or 4, 2017

Submission deadline: Jan 16, 2017


NLP is a rapidly maturing field. NLP technologies now play a role in business applications and decision processes that affect billions of people on a daily basis. However, increasing amounts of data and computational power also mean increased responsibility and new questions for researchers and practitioners. For example, are we inadvertently building unfair biases into our data sets and models? What information is it ethical to infer from user data? How can we prioritize accountability and transparency? What are the big-picture ethical consequences and implications of our work?

This one-day, interdisciplinary workshop will bring together researchers and practitioners in NLP with researchers in the humanities, social sciences, public policy, and law to identify and discuss some of the most pressing issues surrounding ethics in NLP. The focus will be on ethics as it relates to the practice of NLP—i.e., actual uses of NLP technologies—not on general aspects of academic ethics (e.g., conflicts of interest, double blind reviewing, etc.), unless they can be addressed with NLP technologies.

The workshop will consist of:
– invited talks
– contributed talks and posters
– panel discussions

Topics of Interest:
We invite submissions by researchers and practitioners in NLP as well as the humanities, social sciences, public policy, and law on any area of NLP related to:

· Bias in NLP models (e.g., reporting bias, implicit bias).
· Exclusion and inclusion (e.g., exclusion of certain groups or beliefs, how/when to include stakeholders and representatives for the user population to be served).
· Overgeneralization (e.g., making false classifications on tasks including authorship attribution, NER, knowledge base population).
· Exposure (e.g., underrepresentation/overrepresentation of languages or groups).
· Dual use (e.g., the positive and negative aspects of NLP applications, the close relationship between government and industry interests and NLP research).
· Privacy protection (e.g., anonymization of biomedical documents, best practices for researchers in industry to ensure the privacy of their users’ data, educating the public about how much industry and government may know about them, privacy protection for data annotated with non-linguistic features such as emotion).
· Any other topic which concerns ethical considerations in NLP.

Paper submission:
Submissions have to be made electronically via the START submission system: Submissions should be in PDF format and anonymized for review.

All submissions must be written in English and follow the EACL 2017 formatting requirements (available on the EACL 2017 website: We strongly advise the use of the LaTeX template files provided by EACL 2017:
· Each long paper submission must consist of up to eight pages of content, plus two pages for references. Accepted long papers will be given one additional page (i.e., up to nine pages) for content, with unlimited pages for references.
· Each short paper submission must consist of up to four pages of content, plus two pages for references. Accepted short papers will also be given one additional page (i.e., up to five pages) of content, with unlimited pages for references.

All submissions will be peer reviewed, but authors can opt for non-archival submission, since some journals won’t accept work that has been published previously.

Organizing committee:

Dirk Hovy, University of Copenhagen, Denmark
Margaret Mitchell, Google Research, USA
Shannon Spruit, Technical University Delft, The Netherlands
Michael Strube, Heidelberg Institute for Theoretical Studies gGmbH, Germany
Hanna Wallach, Microsoft Research, USA

Workshop at the 4TU.Center for Ethics and Technology, Delft on “Health, technology, and moralization: How are technologies influencing the moralization of health?”


Date: 2–3 December 2016, Delft

The moralization of health occurs when behaviors and decisions that were previously treated as matters of preference or luck come to be subject to moral evaluation, responsibility, and blame. Moralization can also occur when a new domain of health decision-making with significant moral dimensions emerges. Technology often plays an important role in moralization by providing patients and society with new levels of knowledge and control. For instance, new imaging technologies and genetic tests for prenatal screening supply previously unavailable information to parents, introducing new contexts of morally freighted decisions. “Lifestyle” and tracking technologies give users a wealth of data about health metrics that can transform choices about diet, exercise, sleep, etc. into moral decisions. A related phenomenon is ‘responsibilization’, i.e. assigning responsibility for health (and other outcomes) to individuals; its reverse involves reducing the attribution of responsibility for decisions and behaviors, or changing the sorts of decisions one is expected to make.

This intensive workshop will provide significant opportunities for interaction between participants. Invited speakers include Tamar Sharon (Maastricht), Kalle Grill (Umeå), Rebecca Brown (Aberdeen), Marcel Verweij (Wageningen) and Paula Boddington (Oxford).

The workshop will consider questions such as the following:

  • Which health-related behaviors and decisions are becoming moralized, and what role(s) are technologies playing in this process? In what areas of health and in which situations are technologies un-moralizing health decisions, e.g. by freeing people from the need to make decisions that were previously treated as moral, or by transforming our perception of conditions previously treated as character flaws?
  • How does moralization relate to responsibilization? Are there ways of moralizing that go beyond or move away from attributing responsibility to individuals?
  • What is the relationship between medicalization and moralization—how does seeing something primarily as a matter of health and disease invite or hinder moralization?
  • What are the advantages and disadvantages of moralization in different health domains? What effects is this likely to have on the way we conceive of individual and social responsibility, blame, autonomy, justice, and on views of the good life in both the public and the private sphere?

There are a few spots available for attendees. Please register with Dr. Saskia Nagel