Abstract: Do online platforms facilitate the consumption of potentially harmful content? Despite widespread concerns that YouTube's algorithms send people down "rabbit holes" with recommendations to extremist videos, little systematic evidence exists to support this conjecture. Using paired behavioral and survey data provided by participants recruited from a representative sample (n=1,181), we show that exposure to alternative and extremist channel videos on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment. These viewers typically subscribe to these channels (causing YouTube to recommend their videos more often) and often follow external links to them. Contrary to the "rabbit holes" narrative, non-subscribers are rarely recommended videos from alternative and extremist channels and seldom follow such recommendations when offered.