56 Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A., & Rieke, A. (2019). Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes. arXiv preprint arXiv:1904.02095v4 [cs]. Retrieved from https://arxiv.org/pdf/1904.02095.pdf
57 Ricci, F., Rokach, L., & Shapira, B. (2015). Recommender Systems Handbook. Springer, Cham.
58 Ibid.
59 Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973-989.
60 Diakopoulos, N. (2015). Algorithmic accountability: Journalistic investigation of computational power structures. Digital Journalism, 3(3), 398-415.
61 Kroll, J. A. (2018). The fallacy of inscrutability. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 376(2133).
62 Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1); Van Dijck, J., Poell, T., & De Waal, M. (2018). The Platform Society: Public Values in a Connective World. Oxford University Press.
63 For instance, Google provides relatively detailed guidance to the third-party reviewers who evaluate search results: https://www.google.com/search/howsearchworks/mission/users/ and https://static.googleusercontent.com/media/guidelines.raterhub.com/en//searchqualityevaluatorguidelines.pdf
64 See Part II Analysis: How are platforms and governments addressing the algorithmic amplification of hate speech and disinformation? below (under ‘Algorithmic content curation (and non-discrimination)’).
65 Cobbe, J., & Singh, J. (2019). Regulating recommending: Motivations, considerations, and principles. April 15, 2019. Available at SSRN: https://ssrn.com/abstract=3371830 or http://dx.doi.org/10.2139/ssrn.3371830; Gary, J., & Soltani, A. (2019). First Things First: Online Advertising Practices and Their Effects on Platform Speech. Knight First Amendment Institute at Columbia University. Available at: https://knightcolumbia.org/content/first-things-first-online-advertising-practices-and-their-effects-on-platform-speech; Lewis, P. (2018). “Fiction is outperforming reality”: How YouTube’s algorithm distorts truth. The Guardian, February 2, 2018. A recent study of YouTube’s algorithms disputes the radicalization claim and finds evidence that YouTube’s recommendation algorithm favors mainstream sources; see Ledwich, M., & Zaitsev, A. (2019). Algorithmic extremism: Examining YouTube’s rabbit hole of radicalization. arXiv preprint arXiv:1912.11211.
66 See Golebiewski, M., & Boyd, D. (2018). Data Voids: Where Missing Data Can Easily Be Exploited. https://datasociety.net/wp-content/uploads/2018/05/Data_Society_Data_Voids_Final_3.pdf
67 Bahara, H., Kranenberg, A., & Tokmetzis, D. (2019). Hoe YouTube rechtse radicalisering in de hand werkt [How YouTube fosters right-wing radicalization]. De Volkskrant, February 8, 2019.
68 See Golebiewski, M., & Boyd, D. (2018). Data Voids: Where Missing Data Can Easily Be Exploited. https://datasociety.net/wp-content/uploads/2018/05/Data_Society_Data_Voids_Final_3.pdf
69 Napoli, P. (2014). Digital intermediaries and the public interest standard in algorithm governance. Media Policy Blog.
70 Bennett and Livingston, 2018.
71 Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369.
72 See Helberger, N. (2019, in press). On the democratic role of news recommenders. Digital Journalism. Available at: https://www.tandfonline.com/doi/full/10.1080/21670811.2019.1623700
73 Cadwalladr, C., & Graham-Harrison, E. (2018). The Cambridge Analytica files. The Guardian, March 2018.
74 Moore, M., & Tambini, D. (2018). Digital Dominance: The Power of Google, Amazon, Facebook, and Apple. Oxford University Press.
75 Helberger et al., 2018.
76 Lewis, P., & McCormick, E. (2018). How an ex-YouTube insider investigated its secret algorithm. The Guardian, February 2, 2018.