
I gave a seminar talk, “Search, Recommendation, and Sea Monsters”, for the UT Austin iSchool HCI group on Mar. 26, 2023.

Abstract

Ensuring that information access systems are “fair”, or that their benefits are equitably experienced by everyone they affect, is a complex, multi-faceted problem. Significant progress has been made in recent years on identifying and measuring important forms of unfair recommendation and retrieval, but there are still many ways that information systems can replicate, exacerbate, or mitigate potentially discriminatory harms, and these need careful study. These harms can affect different stakeholders (such as the producers and consumers of information, among others) in many different ways, including denying them access to the system’s benefits, misrepresenting them, or reinforcing unhelpful stereotypes. In this talk, I will provide an overview of the landscape of fairness and anti-discrimination in information access systems, discussing both the state of the art in measuring relatively well-understood harms and new directions and open problems in defining and measuring fairness problems.

Slides

Resources

Work Cited

This talk draws heavily from our paper:

2022. Fairness in Information Access Systems. Foundations and Trends® in Information Retrieval 16(1–2) (July 2022), 1–177. DOI 10.1561/1500000079. arXiv:2105.05779. Impact factor: 8. Cited 28 times.

It also discusses the following work we have published:

2023. Much Ado About Gender: Current Practices and Future Recommendations for Appropriate Gender-Aware Information Access. In ACM SIGIR Conference on Human Information Interaction and Retrieval (CHIIR ’23). DOI 10.1145/3576840.3578316. arXiv:2301.04780. Acceptance rate: 39.4%. Cited 1 time.

2022. Matching Consumer Fairness Objectives & Strategies for RecSys. Presented at the 5th FAccTRec Workshop on Responsible Recommendation (peer-reviewed but not archived). DOI 10.48550/arXiv.2209.02662. arXiv:2209.02662.

2022. Fire Dragon and Unicorn Princess: Gender Stereotypes and Children’s Products in Search Engine Responses. In SIGIR eCom ’22. DOI 10.48550/arXiv.2206.13747. arXiv:2206.13747. Cited 2 times.

2022. Measuring Fairness in Ranked Results: An Analytical and Empirical Comparison. In Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR ’22), pp. 726–736. DOI 10.1145/3477495.3532018. NSF PAR 10329880. Acceptance rate: 20%. Cited 7 times.

2022. The Multisided Complexity of Fairness in Recommender Systems. AI Magazine 43(2) (June 2022), 164–176. DOI 10.1002/aaai.12054. NSF PAR 10334796. Cited 5 times.

2021. Exploring Author Gender in Book Rating and Recommendation. User Modeling and User-Adapted Interaction 31(3) (February 2021), 377–420. DOI 10.1007/s11257-020-09284-2. NSF PAR 10218853. Impact factor: 4.412. Cited 114* times.

2020. Evaluating Stochastic Rankings with Expected Exposure. In Proceedings of the 29th ACM International Conference on Information and Knowledge Management (CIKM ’20). ACM, pp. 275–284. DOI 10.1145/3340531.3411962. arXiv:2004.13157. NSF PAR 10199451. Acceptance rate: 20%. Nominated for Best Long Paper. Cited 93 times.

2020. Enhancing Classroom Instruction with Online News. Aslib Journal of Information Management 72(5) (June 2020), 725–744. DOI 10.1108/AJIM-11-2019-0309. Impact factor: 1.903. Cited 6 times.