You Might Also Think This Is Unfair
I gave this talk, “You Might Also Think This Is Unfair: Operationalizing Fairness and Respect in Information Systems,” on March 24, 2022, for the University of Michigan School of Information.
Every day, information access systems mediate our experience of the world beyond our immediate senses. Google and Bing help us find what we seek, Amazon and Netflix recommend things for us to buy and watch, Apple News gives us the day’s events, and LinkedIn helps us find new jobs. These systems deliver immense value, but also have profound influence on how we experience information and the resources and perspectives we see. The influence and impacts of these systems raise a number of questions: how are the costs and benefits of search, recommendation, and other information access systems distributed? Is that distribution equitable, or does it benefit a few at the expense of many? Are they designed with respect for their users, producers, and other affected people?
In this talk, I will discuss how to locate specific questions about the equity of an information access system in a landscape of harms, present some of my own group’s work on quantifying and measuring systematic biases, and look to a future of engaged, human-centered research and development on information access systems grounded in the dignity and well-being of everyone they affect.
These papers provide more details on the research I presented. Many of them have accompanying code to reproduce the experiments and results.
- 2022. Fairness in Information Access Systems. Foundations and Trends® in Information Retrieval (to appear), 92 pp. Impact factor: 8.
- 2018. All The Cool Kids, How Do They Fit In?: Popularity and Demographic Biases in Recommender Evaluation and Effectiveness. In Proceedings of the 1st Conference on Fairness, Accountability and Transparency (FAT* 2018). PMLR, Proceedings of Machine Learning Research 81:172–186. Acceptance rate: 24%. Cited 97 times.
- 2021. Exploring Author Gender in Book Rating and Recommendation. User Modeling and User-Adapted Interaction 31(3) (February 2021), 377–420. Impact factor: 4.412. Cited 3 times.
- 2018. Exploring Author Gender in Book Rating and Recommendation. In Proceedings of the 12th ACM Conference on Recommender Systems (RecSys ’18). ACM, pp. 242–250. Acceptance rate: 17.5%. Cited 78 times.
- 2020. Evaluating Stochastic Rankings with Expected Exposure. In Proceedings of the 29th ACM International Conference on Information and Knowledge Management (CIKM ’20). ACM, pp. 275–284. Acceptance rate: 20%. Nominated for Best Long Paper. Cited 54 times.
- 2021. Pink for Princesses, Blue for Superheroes: The Need to Examine Gender Stereotypes in Kids’ Products in Search and Recommendations. In Proceedings of the 5th International and Interdisciplinary Workshop on Children & Recommender Systems (KidRec ’21), at IDC 2021. Cited 2 times.
- 2021. Evaluating Recommenders with Distributions. In Proceedings of the RecSys 2021 Workshop on Perspectives on the Evaluation of Recommender Systems (RecSys ’21).
- 2021. Estimation of Fair Ranking Metrics with Incomplete Judgments. In Proceedings of The Web Conference 2021 (TheWebConf 2021). ACM. Acceptance rate: 21%. Cited 8 times.
Funding
- NSF CAREER award
- Boise State University College of Education Civility Grant
Other Work Cited
- Franklin, Ursula M. 2004. The Real World of Technology. Revised Edition. CBC Massey Lectures (1989). Toronto, Ont.; Berkeley, CA: House of Anansi Press.
- Green, Ben, and Salomé Viljoen. 2020. “Algorithmic Realism: Expanding the Boundaries of Algorithmic Thought.” In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. FAT* ’20. doi:10.1145/3351095.3372840.
- Friedler, Sorelle A., Carlos Scheidegger, and Suresh Venkatasubramanian. 2021. “The (Im)possibility of Fairness: Different Value Systems Require Different Mechanisms for Fair Decision Making.” Communications of the ACM 64 (4): 136–43. doi:10.1145/3433949.
- Mitchell, Shira, Eric Potash, Solon Barocas, Alexander D’Amour, and Kristian Lum. 2020. “Algorithmic Fairness: Choices, Assumptions, and Definitions.” Annual Review of Statistics and Its Application 8 (November). doi:10.1146/annurev-statistics-042720-125902.
- Chouldechova, Alexandra. 2017. “Fair Prediction with Disparate Impact: A Study of Bias in Recidivism Prediction Instruments.” Big Data 5 (2): 153–63. doi:10.1089/big.2016.0047.
- Binns, Reuben. 2020. “On the Apparent Conflict between Individual and Group Fairness.” In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 514–24. FAT* ’20. doi:10.1145/3351095.3372864.
- Mehrotra, Rishabh, Ashton Anderson, Fernando Diaz, Amit Sharma, Hanna Wallach, and Emine Yilmaz. 2017. “Auditing Search Engines for Differential Satisfaction Across Demographics.” In Proceedings of the 26th International Conference on World Wide Web Companion, 626–33. doi:10.1145/3041021.3054197.
- Harambam, Jaron, Dimitrios Bountouridis, Mykola Makhortykh, and Joris van Hoboken. 2019. “Designing for the Better by Taking Users into Account: A Qualitative Evaluation of User Control Mechanisms in (news) Recommender Systems.” In Proceedings of the 13th ACM Conference on Recommender Systems, 69–77. RecSys ’19. doi:10.1145/3298689.3347014.
- Ferraro, Andres, Xavier Serra, and Christine Bauer. 2021. “What Is Fair? Exploring the Artists’ Perspective on the Fairness of Music Streaming Platforms.” In Proceedings of the 18th IFIP International Conference on Human-Computer Interaction (INTERACT 2021). http://arxiv.org/abs/2106.02415.