Recommending for People: 2017 Edition
This page collects resources referenced in my research seminar talk, Recommending for People, as I have given it in 2017. I tweak the exact talk a little for different audiences, but the core ideas are generally the same.

Abstract

Recommender systems help people find movies to watch, introduce new friends on social networks, increase sales for online retailers by connecting their customers with personally-relevant products, and direct readers to additional articles on news publishers’ partner sites. Users interact with recommenders almost everywhere they turn on the modern Internet. However, there is a great deal we do not yet know about how best to design these systems to support their users’ needs and decision-making processes, and how the recommender and its sociotechnical context support and affect each other. In this talk, I will present work on understanding the ways in which different recommender algorithms impact and respond to their users. This research applies several methodologies, including analysis of recommender algorithms on public data sets and studies of both the stated preferences and observable behaviors of the users of a recommender system. Our findings provide evidence, consistent across different experimental settings, that different users are more satisfied by different recommendation algorithms, even within the single domain of movie recommendation. I will also discuss our ongoing work on how recommender algorithms interact with biases in their underlying input data, and on deep challenges in evaluating recommender effectiveness with respect to actual user needs. These projects, along with several others, are part of our broad vision for designing recommenders and other algorithmic information systems that are responsive to the needs, desires, and well-being of the people they affect.

Bio

Dr. Michael D. Ekstrand is an assistant professor in the Department of Computer Science at Boise State University, where he studies human-computer interaction and recommender systems.
He received his Ph.D. in 2014 from the University of Minnesota, where he worked with the GroupLens research group on supporting reproducible research and examining user-relevant differences in recommender algorithms. He co-leads (with Dr. Sole Pera) the People and Information Research Team (PIReT) at Boise State; is the founder and lead developer of LensKit, an open-source software project aimed at supporting reproducible research and education in recommender systems; and co-created (with Dr. Joseph A. Konstan at the University of Minnesota) the Recommender Systems specialization on Coursera. His research interests are primarily in the ways users and intelligent information systems interact, with the goal of improving the ability of these systems to help their users and produce social benefit, and in the reproducibility of such research.

Works Cited

2011. Rethinking the Recommender Research Ecosystem: Reproducibility, Openness, and LensKit. In Proceedings of the Fifth ACM Conference on Recommender Systems (RecSys ’11). ACM, pp. 133–140. DOI 10.1145/2043932.2043958. Acceptance rate: 27% (20% for oral presentation, which this received). Cited 255 times.

2012. When Recommenders Fail: Predicting Recommender Failure for Algorithm Selection and Combination. Short paper in Proceedings of the Sixth ACM Conference on Recommender Systems (RecSys ’12). ACM, pp. 233–236. DOI 10.1145/2365952.2366002. Acceptance rate: 32%. Cited 88 times.

2014. User Perception of Differences in Recommender Algorithms. In Proceedings of the 8th ACM Conference on Recommender Systems (RecSys ’14). ACM. DOI 10.1145/2645710.2645737. Acceptance rate: 23%. Cited 283 times.

2015. Letting Users Choose Recommender Algorithms: An Experimental Study. In Proceedings of the 9th ACM Conference on Recommender Systems (RecSys ’15). ACM. DOI 10.1145/2792838.2800195. Acceptance rate: 21%. Cited 138 times.

2016. Behaviorism is Not Enough: Better Recommendations through Listening to Users. In Proceedings of the Tenth ACM Conference on Recommender Systems (RecSys ’16, Past, Present, and Future track). ACM. DOI 10.1145/2959100.2959179. Acceptance rate: 36%. Cited 140 times.

2016. Dependency Injection with Static Analysis and Context-Aware Policy. Journal of Object Technology 15(1) (February 2016), 1:1–31. DOI 10.5381/jot.2016.15.1.a1. Cited 16 times.

2017. Sturgeon and the Cool Kids: Problems with Random Decoys for Top-N Recommender Evaluation. In Proceedings of the 30th International Florida Artificial Intelligence Research Society Conference (Recommender Systems track). AAAI, pp. 639–644. No acceptance rate reported. Cited 16 times.