Beyond Checklist Approaches to Ethics in Design
Join us at our virtual workshop at CSCW 2020!
Submit a position paper by September 20, 2020.
Recent public and academic discussions about the intersection of technology and social values call for greater consideration of ethics in technology development and deployment. The desire to see technology companies address social values has surfaced in new ways, driven in part by regulatory forces (e.g., the California Consumer Privacy Act and the EU General Data Protection Regulation), public scandals such as data breaches and systems that harm marginalized communities, and technology worker actions such as walkouts and letter writing. These efforts view practices related to technology production as sites of intervention to promote, protect, or embed ethics and social values. The social values at stake range from concerns about privacy and fairness in data collection and use, to the potential harms of algorithmic categorization and decision making, to the potential for platforms to enable action based on racial biases or to spread misinformation, to the manipulation of users’ behavior through data or designed dark patterns, to corporate contracts with militaries and governments that perpetuate harms, to the need for greater diversity and inclusion within technology companies’ workforces (e.g., [9,12,15,19,21,22]). The location of the ethics problem varies across these concerns: some focus on the technical products themselves, some on the ways in which technologies are used, and others on the people and organizations creating technical artifacts.
Interdisciplinary research under the rubric of “values in design” has long been interested in how technological artifacts and practices of production promote or embed social values, ethics, and politics, and in how technology designers’ practices and beliefs shape these artifacts [1,5,13,20]. Values in Design research in CSCW and adjacent fields has sought to analyze how social values are implicated in practices of technology design, creation, maintenance, and repair [7,8,14,19,21]. Parallel research from the human-computer interaction (HCI) and design research communities has developed a range of design approaches for creating technical systems in ways that are cognizant of values and ethics and that center their promotion as a core goal of design (e.g., [3,4,6]).
Alongside this work in HCI and CSCW, industry approaches to intervening in values and ethical issues in practice have largely adopted compliance- or checklist-oriented approaches. These include legal compliance programs for values articulated in laws and regulations (such as privacy) and checklist-based toolkits, such as the Ethical OS toolkit, Ethics & AI toolkit, IBM’s AI Fairness 360 toolkit, and the Digital Impact toolkit, among many others in circulation. These toolkits and approaches may present practitioners with guidelines to follow, checklists, or diagnostic questions to ask themselves. While useful, these approaches tend to pre-define what values or ethics mean: there is some set of exogenous values or ethics requirements that the tool helps a practitioner meet. Several also frame issues of ethics and values as risk-management problems. However, surfacing discussion and consideration of ethics and values in broader, more open-ended ways during the design process may help reveal unique needs, social corner cases, or new or different understandings of values and ethics.
Workshop Goals and Outcomes
This workshop will convene CSCW researchers and practitioners working across a wide array of domains to propose and consider new interventions and approaches to ethics in design that go beyond formal checklist- and compliance-oriented approaches. CSCW’s rich set of qualitative, quantitative, and design-based methods for investigating values and ethics provides a starting point for thinking about such approaches and interventions. These may include new design-based activities (e.g., [2,21]), games and roleplaying (e.g., [17]), critical making (e.g., [16]), considerations of and changes to organizational structure and work practice (e.g., [7,11,18]), or conducting empirical research on ethics and values. We take the conceptual multiplicity surrounding ethics as an opportunity to convene researchers and practitioners from different disciplines to map out a broader space of approaches to values and ethics in design, exploring the complementary advantages of alternative modes of intervention. Our goal is thus to explore multiple and alternative forms of values and ethics interventions, rather than to arrive at a single “correct” or “best” approach.
This workshop contributes to values and ethics in design research by (1) mapping out a space of interventions for values and ethics, (2) proposing new approaches and interventions, and (3) crafting an agenda for experimenting with and evaluating design interventions.
References
[1] Madeleine Akrich. 1992. The De-Scription of Technical Objects. In Shaping Technology / Building Society: Studies in Sociotechnical Change, Wiebe Bijker and John Law (eds.). MIT Press, 205–224.
[2] Stephanie Ballard, Karen M. Chappell, and Kristen Kennedy. 2019. Judgment Call the Game: Using value sensitive design and design fiction to surface ethical concerns related to technology. In Proceedings of the 2019 Designing Interactive Systems Conference (DIS ’19), 421–433. https://doi.org/10.1145/3322276.3323697
[3] Lynn Dombrowski, Ellie Harmon, and Sarah Fox. 2016. Social Justice-Oriented Interaction Design. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (DIS ’16), 656–671. https://doi.org/10.1145/2901790.2901861
[4] Paul Dourish, Janet Finlay, Phoebe Sengers, and Peter Wright. 2004. Reflective HCI: Towards a Critical Technical Practice. In CHI ’04 Extended Abstracts on Human Factors in Computing Systems, 1727–1728. https://doi.org/10.1145/985921.986203
[5] Mary Flanagan, Daniel C. Howe, and Helen Nissenbaum. 2008. Embodying values in technology: Theory and practice. In Information Technology and Moral Philosophy, Jeroen Van Den Hoven and John Weckert (eds.). Cambridge University Press, New York, 322–353.
[6] Batya Friedman, Peter H. Kahn, and Alan Borning. 2008. Value Sensitive Design and Information Systems. In The Handbook of Information and Computer Ethics, Kenneth Einar Himma and Herman T. Tavani (eds.). John Wiley & Sons, Inc., 69–101.
[7] Colin M. Gray and Shruthi Sai Chivukula. 2019. Ethical Mediation in UX Practice. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19), 1–11. https://doi.org/10.1145/3290605.3300408
[8] Lara Houston, Steven J. Jackson, Daniela K. Rosner, Syed Ishtiaque Ahmed, Meg Young, and Laewoo Kang. 2016. Values in Repair. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), 1403–1414. https://doi.org/10.1145/2858036.2858470
[9] Steven J. Jackson, Tarleton Gillespie, and Sandy Payette. 2014. The policy knot. In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW ’14), 588–602. https://doi.org/10.1145/2531602.2531674
[10] Nassim JafariNaimi, Lisa Nathan, and Ian Hargraves. 2015. Values as Hypotheses: Design, Inquiry, and the Service of Values. Design Issues 31, 4: 91–104. https://doi.org/10.1162/DESI_a_00354
[11] Jacob Metcalf, Emanuel Moss, and danah boyd. 2019. Owning ethics: Corporate logics, Silicon Valley, and the institutionalization of ethics. Social Research 86, 2: 449–476.
[12] Michael Muller. 2014. Whose Values? Whose Design? CSCW Workshop on Co-creating & Identity-making in CSCW. Retrieved from http://ethicsworkshopcscw2014.files.wordpress.com/2013/10/muller-whose-values.pdf
[13] Helen Nissenbaum. 2001. How computer systems embody values. Computer 34, 3: 120, 118–119. https://doi.org/10.1109/2.910905
[14] Samir Passi and Steven J. Jackson. 2018. Trust in Data Science: Collaboration, Translation, and Accountability in Corporate Data Science Projects. Proceedings of the ACM on Human-Computer Interaction 2, CSCW: 1–28. https://doi.org/10.1145/3274405
[15] Pablo Alejandro Quinones, Stephanie D. Teasley, and Steven Lonn. 2013. Appropriation by unanticipated users: Looking beyond design intent and expected use. In Proceedings of the 2013 ACM Conference on Computer Supported Cooperative Work (CSCW ’13), 1515–1526. https://doi.org/10.1145/2441776.2441949
[16] Matt Ratto. 2011. Critical making: Conceptual and material studies in technology and social life. The Information Society 27, 4: 252–260. https://doi.org/10.1080/01972243.2011.583819
[17] K. Shilton, D. Heidenblad, A. Porter, S. Winter, and M. Kendig. (In press). Role-Playing Computer Ethics: Designing and Evaluating the Privacy by Design Simulation. Science and Engineering Ethics.
[18] Katie Shilton. 2013. Values Levers: Building Ethics into Design. Science, Technology, & Human Values 38, 3: 374–397. https://doi.org/10.1177/0162243912436985
[19] Katie Shilton, Jes A. Koepfler, and Kenneth R. Fleischmann. 2014. How to see values in social computing: Methods for studying values dimensions. In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW ’14), 426–435. https://doi.org/10.1145/2531602.2531625
[20] Langdon Winner. 1980. Do Artifacts Have Politics? Daedalus 109, 1: 121–136.
[21] Richmond Y. Wong, Deirdre K. Mulligan, Ellen Van Wyk, James Pierce, and John Chuang. 2017. Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proceedings of the ACM on Human-Computer Interaction 1, CSCW. https://doi.org/10.1145/3134746
[22] Haiyi Zhu, Bowen Yu, Aaron Halfaker, and Loren Terveen. 2018. Value-Sensitive Algorithm Design: Method, Case Study, and Lessons. Proceedings of the ACM on Human-Computer Interaction 2, CSCW: 1–23. https://doi.org/10.1145/3274463