(Image: AAP/Tracey Nearmy)

This week, a New York Times article revealed that YouTube’s video recommendation algorithm gave paedophiles access to a tailor-made stream of content containing children.

The videos were usually innocent: family movies taken by parents of their kids playing in a swimming pool, dancing, or doing gymnastics. Nor did they violate the platform's terms. But they soon racked up thousands of views, as the algorithm identified people with an interest in this content and kept them watching by directing them to similar videos across the platform.

The Times story further exposes just how deep the problems with YouTube and its algorithm run. Researchers and journalists have long been drawing attention to the way the algorithm pushes people towards more extreme, radical content. And in the same week the paedophile story broke, the platform came under attack yet again for its failure to police racist and homophobic harassment.