While conducting research on YouTube’s algorithms, three researchers discovered that YouTube’s recommendations had created a community of sexually suggestive channels. Following YouTube’s video recommendations down the “rabbit hole” led them to videos of minors as well as children. When they shared their findings with The New York Times, YouTube implemented changes, and US lawmakers demanded consequences. In this piece, they expand on their findings, show how easily accessible the channels were, reflect upon the risks of conducting online research and of media collaborations, and demand more accountability from social media platforms.

Fuelled by fears of radicalisation (O’Callaghan et al., 2015; Munn, 2019), misinformation (Briones et al., 2012), and filter bubbles (Pariser, 2011), YouTube and especially its recommendation algorithms have come under severe scrutiny in the last few years. Once famous for enabling everyone to “broadcast yourself”, YouTube is now known for the “alternative influence network” (Lewis, 2018) that the US far-right has set up on the platform. But, as we will show here, YouTube is not only plagued by political extremism and misinformation, but also by its own algorithms, which have created a filter bubble of sorts for paedophiles.

And while we started as we had done previously in our analyses of the United States (Kaiser & Rauchfleisch, 2018a, 2018b) and Germany (Kaiser & Rauchfleisch, 2017), this time we had truly fallen down YouTube’s algorithmic rabbit hole. Our starting research question changed from “How is the Brazilian YouTube-sphere structured?” to “How bad is it?”.

But let’s start from the beginning: as in our previous work, we were interested in the communities and prominent channels in the Brazilian YouTube-sphere. With over 200 million inhabitants, Brazil is the fourth-largest democracy in the world, and in the last elections YouTubers were even voted into office (Broderick, 2018). We (that is, Yasodora Córdova, Adrian Rauchfleisch, and Jonas Kaiser) set out to map the Brazilian YouTube-sphere but ended up navigating YouTube’s communities of sexually suggestive channels as well as videos of children. In this piece, we will outline our method and then highlight the issues that we encountered along the way, from both a research and a legal perspective. We posit that this specific project and the issues it touched upon are endemic to computational social science, need to be discussed, and call for guidelines to be introduced.

Stumbling over the rabbit hole: our method

For our analysis, we followed YouTube’s “related channel” function, i.e., its channel recommendation system. In a first step, we created a list with political channels, conspiracy theory channels, as well as the top 250 channels from.
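A crawl of this kind, starting from a seed list of channels and repeatedly following the platform’s channel recommendations, amounts to snowball sampling over a directed graph. The following is a minimal sketch, not the authors’ actual pipeline: `get_related` is a hypothetical stand-in for whatever function fetches a channel’s recommended channels (in practice, data from the YouTube Data API), and the toy graph replaces live API responses.

```python
from collections import deque

def snowball_channels(seeds, get_related, max_depth=2):
    """Breadth-first snowball sample over a channel-recommendation graph.

    `get_related(channel_id)` must return an iterable of recommended
    channel ids; it is a placeholder for a real API call.
    Returns the set of visited channels and the directed
    recommendation edges discovered along the way.
    """
    visited = set(seeds)
    edges = []
    queue = deque((seed, 0) for seed in seeds)
    while queue:
        channel, depth = queue.popleft()
        if depth >= max_depth:
            continue  # stop expanding beyond the crawl depth
        for related in get_related(channel):
            edges.append((channel, related))
            if related not in visited:
                visited.add(related)
                queue.append((related, depth + 1))
    return visited, edges

# Toy recommendation graph standing in for live API responses.
toy_graph = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["D"],
    "D": [],
}
nodes, edges = snowball_channels(["A"], lambda c: toy_graph.get(c, []))
# nodes: {"A", "B", "C", "D"}; edges include ("A", "B") and ("C", "D")
```

The depth cap matters: without it, a recommendation crawl on a platform the size of YouTube grows without bound, and the edge list is what later allows community detection on the resulting network.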