Mar. 2023 talk at UT

Slide from UT talk

I gave a seminar talk, “Search, Recommendation, and Sea Monsters”, for the UT Austin iSchool HCI group on Mar. 26, 2023.


Ensuring that information access systems are “fair”, or that their benefits are equitably experienced by everyone they affect, is a complex, multi-faceted problem. Significant progress has been made in recent years on identifying and measuring important forms of unfair recommendation and retrieval, but there are still many ways that information systems can replicate, exacerbate, or mitigate potentially discriminatory harms that need careful study. These harms can affect different stakeholders — such as the producers and consumers of information, among others — in many different ways, including denying them access to the system’s benefits, misrepresenting them, or reinforcing unhelpful stereotypes.

In this talk, I will provide an overview of the landscape of fairness and anti-discrimination in information access systems, discussing both the state of the art in measuring relatively well-understood harms and new directions and open problems in defining and measuring fairness.



This talk draws heavily from our paper:


Michael D. Ekstrand, Anubrata Das, Robin Burke, and Fernando Diaz. 2022. Fairness in Information Access Systems. Foundations and Trends® in Information Retrieval 16(1–2) (July 11th, 2022), 1–177. DOI 10.1561/1500000079. arXiv:2105.05779 [cs.IR]. NSF PAR 10347630. Impact factor: 8. Cited 90 times.

It also discusses the following work we have published:


Christine Pinney, Amifa Raj, Alex Hanna, and Michael D. Ekstrand. 2023. Much Ado About Gender: Current Practices and Future Recommendations for Appropriate Gender-Aware Information Access. In Proceedings of the 2023 Conference on Human Information Interaction and Retrieval (CHIIR ’23). DOI 10.1145/3576840.3578316. arXiv:2301.04780. NSF PAR 10423693. Acceptance rate: 39.4%. Cited 4 times.


Michael D. Ekstrand and Maria Soledad Pera. 2022. Matching Consumer Fairness Objectives & Strategies for RecSys. Presented at the 5th FAccTrec Workshop on Responsible Recommendation (peer-reviewed but not archived). arXiv:2209.02662 [cs.IR].


Amifa Raj and Michael D. Ekstrand. 2022. Measuring Fairness in Ranked Results: An Analytical and Empirical Comparison. In Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’22), 726–736. DOI 10.1145/3477495.3532018. NSF PAR 10329880. Acceptance rate: 20%. Cited 24 times.


Nasim Sonboli, Robin Burke, Michael Ekstrand, and Rishabh Mehrotra. 2022. The Multisided Complexity of Fairness in Recommender Systems. AI Magazine 43(2) (June 23rd, 2022), 164–176. DOI 10.1002/aaai.12054. NSF PAR 10334796. Cited 11 times.


Michael D. Ekstrand and Daniel Kluver. 2021. Exploring Author Gender in Book Rating and Recommendation. User Modeling and User-Adapted Interaction 31(3) (February 4th, 2021), 377–420. DOI 10.1007/s11257-020-09284-2. arXiv:1808.07586v2. NSF PAR 10218853. Impact factor: 4.412. Cited 145 times (shared with RecSys ’18 paper).


Fernando Diaz, Bhaskar Mitra, Michael D. Ekstrand, Asia J. Biega, and Ben Carterette. 2020. Evaluating Stochastic Rankings with Expected Exposure. In Proceedings of the 29th ACM International Conference on Information and Knowledge Management (CIKM ’20). ACM, 275–284. DOI 10.1145/3340531.3411962. arXiv:2004.13157 [cs.IR]. NSF PAR 10199451. Acceptance rate: 20%. Nominated for Best Long Paper. Cited 137 times.


Michael D. Ekstrand, Katherine Landau Wright, and Maria Soledad Pera. 2020. Enhancing Classroom Instruction with Online News. Aslib Journal of Information Management 72(5) (June 15th, 2020), 725–744. DOI 10.1108/AJIM-11-2019-0309. Impact factor: 1.903. Cited 12 times.

Works Cited

  • Beutel, Alex, Ed H. Chi, Cristos Goodrow, Jilin Chen, Tulsee Doshi, Hai Qian, Li Wei, et al. 2019. “Fairness in Recommendation Ranking through Pairwise Comparisons.” In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. doi:10.1145/3292500.3330745.
  • Biega, Asia J., Krishna P. Gummadi, and Gerhard Weikum. 2018. “Equity of Attention: Amortizing Individual Fairness in Rankings.” In The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 405–14. ACM. doi:10.1145/3209978.3210063.
  • Binns, Reuben. 2020. “On the Apparent Conflict between Individual and Group Fairness.” In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 514–24. FAT* ’20. doi:10.1145/3351095.3372864.
  • Chouldechova, Alexandra. 2017. “Fair Prediction with Disparate Impact: A Study of Bias in Recidivism Prediction Instruments.” Big Data 5(2): 153–63. doi:10.1089/big.2016.0047.
  • D’Ignazio, Catherine, and Lauren F. Klein. 2020. Data Feminism. MIT Press.
  • Friedler, Sorelle A., Carlos Scheidegger, and Suresh Venkatasubramanian. 2021. “The (Im)possibility of Fairness.” Communications of the ACM 64(4): 136–43. doi:10.1145/3433949.
  • Friedman, Batya, and Helen Nissenbaum. 1996. “Bias in Computer Systems.” ACM Transactions on Information Systems 14(3): 330–47. doi:10.1145/230538.230561.
  • Gebru, Timnit, Jamie Morgenstern, Briana Vecchione, Jennifer Wortman Vaughan, Hanna Wallach, Hal Daumé III, and Kate Crawford. 2018. “Datasheets for Datasets.” arXiv [cs.DB]. arXiv.
  • Kamishima, Toshihiro, Shotaro Akaho, Hideki Asoh, and Jun Sakuma. 2018. “Recommendation Independence.” In Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 81:187–201. Proceedings of Machine Learning Research. New York, NY, USA: PMLR.
  • Mehrotra, Rishabh, Ashton Anderson, Fernando Diaz, Amit Sharma, Hanna Wallach, and Emine Yilmaz. 2017. “Auditing Search Engines for Differential Satisfaction Across Demographics.” In Proceedings of the 26th International Conference on World Wide Web Companion, 626–33. doi:10.1145/3041021.3054197.
  • Mitchell, Shira, Eric Potash, Solon Barocas, Alexander D’Amour, and Kristian Lum. 2020. “Algorithmic Fairness: Choices, Assumptions, and Definitions.” Annual Review of Statistics and Its Application 8 (November). doi:10.1146/annurev-statistics-042720-125902.
  • Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.
  • Selbst, Andrew D., Danah Boyd, Sorelle A. Friedler, Suresh Venkatasubramanian, and Janet Vertesi. 2019. “Fairness and Abstraction in Sociotechnical Systems.” In Proceedings of the Conference on Fairness, Accountability, and Transparency, 59–68. FAT* ’19. doi:10.1145/3287560.3287598.
  • Wang, Lequn, and Thorsten Joachims. 2021. “User Fairness, Item Fairness, and Diversity for Rankings in Two-Sided Markets.” In Proceedings of the 2021 ACM SIGIR International Conference on Theory of Information Retrieval, 23–41. ICTIR ’21. doi:10.1145/3471158.3472260.