(Image: AAP/Tracey Nearmy)

This week, a New York Times article revealed that YouTube’s video recommendation algorithm gave paedophiles access to a tailor-made stream of content containing children.

The videos were usually innocent -- family movies taken by parents of their kids playing in a swimming pool, dancing, or doing gymnastics -- and they did not violate the platform's terms. But they soon racked up thousands of views, as the algorithm identified people with an interest in this content and kept them watching by directing them to similar videos across the platform.