October 2023 talk at the University of Glasgow

Slide from Glasgow talk

Title: Search, Recommendation, and Sea Monsters

Abstract

Ensuring that information access systems are “fair”, or that their benefits are equitably experienced by everyone they affect, is a complex, multi-faceted problem. Significant progress has been made in recent years on identifying and measuring important forms of unfair recommendation and retrieval, but there are still many ways, in need of careful study, in which information systems can replicate, exacerbate, or mitigate potentially discriminatory harms. These harms can affect different stakeholders, such as the producers and consumers of information, in many different ways, including denying them access to the system’s benefits, misrepresenting them, or reinforcing unhelpful stereotypes.

In this talk, I will provide an overview of the landscape of fairness and anti-discrimination in information access systems and their underlying theories, discussing both the state of the art in measuring relatively well-understood harms and new directions and open problems in defining and measuring fairness.
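
To give a concrete flavor of the kind of measurement the talk surveys, the sketch below computes how much position-weighted exposure each provider group receives in a single ranked list, in the spirit of the exposure-based ranking metrics studied in several of the papers listed under Resources. It is only an illustration: the logarithmic attention model, the group_exposure function, and the toy data are assumptions made for this example, not the specific metrics presented in the talk.

```python
import math

def group_exposure(ranking, group_of):
    """Total position-weighted exposure each group receives in a ranking.

    ranking: list of item ids, best first.
    group_of: dict mapping item id -> group label.
    Assumes a DCG-style logarithmic discount as the attention model.
    """
    exposure = {}
    for rank, item in enumerate(ranking, start=1):
        weight = 1.0 / math.log2(rank + 1)  # rank 1 gets weight 1.0
        g = group_of[item]
        exposure[g] = exposure.get(g, 0.0) + weight
    return exposure

# Hypothetical toy example: do providers in groups "A" and "B" receive
# comparable attention from this ranking?
ranking = ["a1", "a2", "b1", "a3", "b2"]
groups = {"a1": "A", "a2": "A", "a3": "A", "b1": "B", "b2": "B"}
print(group_exposure(ranking, groups))
# roughly {'A': 2.06, 'B': 0.89}: group A captures most of the attention
```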

Recording

Also available on YouTube.

Slides

Resources

This talk draws heavily from our paper:

FnT22
2022

Michael D. Ekstrand, Anubrata Das, Robin Burke, and Fernando Diaz. 2022. Fairness in Information Access Systems. Foundations and Trends® in Information Retrieval 16(1–2) (July 11th, 2022), 1–177. DOI 10.1561/1500000079. arXiv:2105.05779 [cs.IR]. NSF PAR 10347630. Impact factor: 8. Cited 184 times.

It also discusses the following work we have published:

FAccTRec23
2023

Amifa Raj and Michael D. Ekstrand. 2023. Towards Measuring Fairness in Grid Layout in Recommender Systems. Presented at the 6th FAccTRec Workshop on Responsible Recommendation at RecSys 2023 (peer-reviewed but not archived). arXiv:2309.10271 [cs.IR]. Cited 1 time.

CHIIR23
2023

Christine Pinney, Amifa Raj, Alex Hanna, and Michael D. Ekstrand. 2023. Much Ado About Gender: Current Practices and Future Recommendations for Appropriate Gender-Aware Information Access. In Proceedings of the 2023 Conference on Human Information Interaction and Retrieval (CHIIR ’23). DOI 10.1145/3576840.3578316. arXiv:2301.04780. NSF PAR 10423693. Acceptance rate: 39.4%. Cited 19 times.

FAccTRec22
2022

Michael D. Ekstrand and Maria Soledad Pera. 2022. Matching Consumer Fairness Objectives & Strategies for RecSys. Presented at the 5th FAccTRec Workshop on Responsible Recommendation at RecSys 2022 (peer-reviewed but not archived). arXiv:2209.02662 [cs.IR]. Cited 6 times.

SIGIR22
2022

Amifa Raj and Michael D. Ekstrand. 2022. Measuring Fairness in Ranked Results: An Analytical and Empirical Comparison. In Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’22). pp. 726–736. DOI 10.1145/3477495.3532018. NSF PAR 10329880. Acceptance rate: 20%. Cited 59 times.

AIMAG22
2022

Nasim Sonboli, Robin Burke, Michael Ekstrand, and Rishabh Mehrotra. 2022. The Multisided Complexity of Fairness in Recommender Systems. AI Magazine 43(2) (June 23rd, 2022), 164–176. DOI 10.1002/aaai.12054. NSF PAR 10334796. Cited 33 times.

UMUAI21
2021

Michael D. Ekstrand and Daniel Kluver. 2021. Exploring Author Gender in Book Rating and Recommendation. User Modeling and User-Adapted Interaction 31(3) (February 4th, 2021), 377–420. DOI 10.1007/s11257-020-09284-2. arXiv:1808.07586v2. NSF PAR 10218853. Impact factor: 4.412. Cited 201 times (shared with RecSys18).

CIKM20-ee
2020

Fernando Diaz, Bhaskar Mitra, Michael D. Ekstrand, Asia J. Biega, and Ben Carterette. 2020. Evaluating Stochastic Rankings with Expected Exposure. In Proceedings of the 29th ACM International Conference on Information and Knowledge Management (CIKM ’20). ACM, pp. 275–284. DOI 10.1145/3340531.3411962. arXiv:2004.13157 [cs.IR]. NSF PAR 10199451. Acceptance rate: 20%. Nominated for Best Long Paper. Cited 187 times.

AJIM20
2020

Michael D. Ekstrand, Katherine Landau Wright, and Maria Soledad Pera. 2020. Enhancing Classroom Instruction with Online News. Aslib Journal of Information Management 72(5) (November 17th, 2020; online June 14th, 2020), 725–744. DOI 10.1108/AJIM-11-2019-0309. Impact factor: 1.903. Cited 19 times.

Works Cited

  • Beutel, Alex, Ed H. Chi, Cristos Goodrow, Jilin Chen, Tulsee Doshi, Hai Qian, Li Wei, et al. 2019. “Fairness in Recommendation Ranking through Pairwise Comparisons.” In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. doi:10.1145/3292500.3330745.
  • Biega, Asia J., Krishna P. Gummadi, and Gerhard Weikum. 2018. “Equity of Attention: Amortizing Individual Fairness in Rankings.” In The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 405–14. ACM. doi:10.1145/3209978.3210063.
  • Binns, Reuben. 2020. “On the Apparent Conflict between Individual and Group Fairness.” In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 514–24. FAT* ’20. doi:10.1145/3351095.3372864.
  • Bower, Amanda, Kristian Lum, Tomo Lazovich, Kyra Yee, and Luca Belli. 2022. “Random Isn’t Always Fair: Candidate Set Imbalance and Exposure Inequality in Recommender Systems.” Presented at FAccTRec 2022. arXiv:2209.05000.
  • Chouldechova, Alexandra. 2017. “Fair Prediction with Disparate Impact: A Study of Bias in Recidivism Prediction Instruments.” Big Data 5(2): 153–63. doi:10.1089/big.2016.0047.
  • D’Ignazio, Catherine, and Lauren F. Klein. 2020. Data Feminism. MIT Press. https://data-feminism.mitpress.mit.edu/.
  • Friedler, Sorelle A., Carlos Scheidegger, and Suresh Venkatasubramanian. 2021. “The (Im)possibility of Fairness.” Communications of the ACM 64(4): 136–43. doi:10.1145/3433949.
  • Friedman, Batya, and Helen Nissenbaum. 1996. “Bias in Computer Systems.” ACM Transactions on Information Systems 14(3): 330–47. doi:10.1145/230538.230561.
  • Gebru, Timnit, Jamie Morgenstern, Briana Vecchione, Jennifer Wortman Vaughan, Hanna Wallach, Hal Daumé III, and Kate Crawford. 2018. “Datasheets for Datasets.” arXiv:1803.09010 [cs.DB]. http://arxiv.org/abs/1803.09010.
  • Kamishima, Toshihiro, Shotaro Akaho, Hideki Asoh, and Jun Sakuma. 2018. “Recommendation Independence.” In Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 81:187–201. Proceedings of Machine Learning Research. New York, NY, USA: PMLR. http://proceedings.mlr.press/v81/kamishima18a.html.
  • Mehrotra, Rishabh, Ashton Anderson, Fernando Diaz, Amit Sharma, Hanna Wallach, and Emine Yilmaz. 2017. “Auditing Search Engines for Differential Satisfaction Across Demographics.” In Proceedings of the 26th International Conference on World Wide Web Companion, 626–33. doi:10.1145/3041021.3054197.
  • Mitchell, Shira, Eric Potash, Solon Barocas, Alexander D’Amour, and Kristian Lum. 2020. “Algorithmic Fairness: Choices, Assumptions, and Definitions.” Annual Review of Statistics and Its Application 8 (November). doi:10.1146/annurev-statistics-042720-125902.
  • Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.
  • Selbst, Andrew D., Danah Boyd, Sorelle A. Friedler, Suresh Venkatasubramanian, and Janet Vertesi. 2019. “Fairness and Abstraction in Sociotechnical Systems.” In Proceedings of the Conference on Fairness, Accountability, and Transparency, 59–68. FAT* ’19. doi:10.1145/3287560.3287598.
  • Wang, Lequn, and Thorsten Joachims. 2021. “User Fairness, Item Fairness, and Diversity for Rankings in Two-Sided Markets.” In Proceedings of the 2021 ACM SIGIR International Conference on Theory of Information Retrieval, 23–41. ICTIR ’21. doi:10.1145/3471158.3472260.