Social Dimensions of Recommendation Remote REU
I am offering an online Research Experience for Undergraduates in summer 2020. The selected students will work with me and a Ph.D. student remotely, communicating via Slack and Zoom. Due to logistical timelines, the application window is limited; applications are due by April 24, 2020.
In this project, we are trying to understand the social impact of information access systems, such as search engines, recommender systems, and other tools that help people find and filter information and products. We are particularly interested in how issues of bias and discrimination affect such systems, such as possible discriminatory patterns in the books or music recommended by the kinds of algorithms that drive platforms like GoodReads and Spotify¹. It is funded by NSF grant 17-51278; for more of our work in this area, see Fair Recommendation.
Work in this project will focus on the following research questions, or related topics based on student interest:
- What does it mean for a recommender to be fair, unfair, or biased?
- What potentially discriminatory biases are present in the recommender’s input data, algorithmic structure, or output?
- How do these biases change over time through the recommender-user feedback loop?
This is part of our ongoing goal to help make recommenders (and other AI systems) better for the people they affect.
This is a 10-week paid research opportunity with the People and Information Research Team. Before the REU starts, we will discuss participants' interests and specific research ideas remotely.
For further background on some of the research in this project, see:
¹ Specific platforms are mentioned only by way of concrete example, and do not imply discriminatory bias by any particular company or product.