YouTube Faces New Questions Over its Algorithm-Selected Content Recommendations

The online space has become increasingly dominated by algorithms: digital systems which 'learn' from your behavior, then recommend more content along similar lines to keep you engaged and on-platform.

That makes sense from the perspective of the companies which benefit from keeping you locked into their apps, but the problem with algorithms is that they don't exercise any form of judgment. They simply recommend more of what you like - so if you like racist, hate-filled conspiracy theories, guess what you see more of? And if you're a pedophile who's looking to watch videos of underage children...

That's the issue that YouTube has been battling over the last year or so, amid criticism of how its machine learning systems essentially facilitate pedophile networks within the app. Back in February, YouTuber Matt Watson revealed how YouTube's system had enabled such activity, which prompted YouTube to implement a range of new measures, including deactivating comments on "tens of millions of videos that could be subject to predatory behavior".

Evidently, however, the issue remains. According to a new report in The New York Times, a new concern has arisen: YouTube's system has been recommending home movies with images of children in the background to these same online pedophile networks.
