There is No Algorithm for Truth - with Tom Scott
The Royal Institution · 49-minute read
The lecture opens with a thought experiment: a Google machine learning system that filters search results down to scientific consensus, which could end up challenging personal beliefs on contested questions like a no-deal Brexit. It then examines how algorithms on platforms like YouTube can exhibit biases, leading to unfair demonetization and perpetuating inequalities.
Insights
- A hypothetical Google machine learning system that filters search results to scientific consensus could challenge personal beliefs, highlighting the importance of unbiased information dissemination.
- Systemic societal biases can seep into machine learning systems and skew which content gets promoted; the lecture argues that addressing these biases is essential for fairness in artificial intelligence.
Recent questions
How does Google distinguish fact from fiction?
The lecture's central claim is that it reliably cannot: there is no algorithm for truth. As a thought experiment, it imagines a machine learning system that surfaces only scientific consensus in search results and recommendations. Such a system could challenge deeply held personal beliefs on questions such as a no-deal Brexit or the dissolution of the UK into Europe.
What are the challenges of reaching diverse audiences in science communication?
The speaker, a successful science communicator, discusses the difficulty of reaching diverse audiences and how platform algorithms shape what media people consume, acknowledging the complexities this creates for modern science communication.
How do machine learning systems categorize content?
Machine learning systems categorize content based on human-curated examples and then refine themselves from feedback. These systems can inherit biases: YouTube's algorithm, for example, has associated LGBT content with explicit material, leading to unfair demonetization.
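The pattern described above, training from human-labelled examples and folding corrections back in as new training data, can be sketched in miniature. This is a hypothetical illustration, not YouTube's actual system: the labels, class names, and word-count scoring are invented for the example, and real systems use far richer features and models.

```python
from collections import Counter

class ToyContentClassifier:
    """Toy classifier trained on human-curated examples, updated via feedback."""

    def __init__(self):
        # Per-label word counts, built from human-labelled examples.
        self.word_counts = {"safe": Counter(), "restricted": Counter()}

    def train(self, text, label):
        self.word_counts[label].update(text.lower().split())

    def predict(self, text):
        # Score each label by how often its known words appear in the text.
        words = text.lower().split()
        scores = {
            label: sum(counts[w] for w in words)
            for label, counts in self.word_counts.items()
        }
        return max(scores, key=scores.get)

    def feedback(self, text, correct_label):
        # Feedback loop: a human correction becomes a new training example.
        self.train(text, correct_label)

clf = ToyContentClassifier()
clf.train("family friendly science video", "safe")
clf.train("graphic explicit material", "restricted")
print(clf.predict("new science video"))  # → safe
```

The sketch also shows where bias enters: if human labellers systematically flag certain topics as "restricted", the model inherits that association, which is the failure mode the lecture describes.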
Why is algorithmic bias a significant concern?
Algorithmic bias is a significant concern because systemic biases in society feed into machine learning systems and perpetuate inequalities. The speaker stresses how hard such biases are to eliminate from artificial intelligence and credits the experts working on the problem.
What is the goal of YouTube's recommendation engine?
YouTube's recommendation engine prioritizes increasing watch time, keeping viewers on the platform and in front of advertisements. Because the algorithm rewards whatever attracts and retains viewers, it can perpetuate biases and systemic inequalities in which content gets promoted.
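The objective described above, ranking candidates by how long they are predicted to keep someone watching, can be sketched as follows. The field names, titles, and scores are invented for illustration and are not YouTube's actual API or model.

```python
# Hypothetical watch-time ranking: engagement, not accuracy, drives promotion.
candidates = [
    {"title": "Calm explainer",      "predicted_watch_minutes": 4.2},
    {"title": "Outrage compilation", "predicted_watch_minutes": 9.7},
    {"title": "Nuanced lecture",     "predicted_watch_minutes": 6.1},
]

def rank_by_watch_time(videos):
    # The objective is retention: more minutes watched means more ads served.
    return sorted(videos, key=lambda v: v["predicted_watch_minutes"], reverse=True)

for video in rank_by_watch_time(candidates):
    print(video["title"], video["predicted_watch_minutes"])
```

Note what the objective never consults: whether a video is true or fair. Anything that reliably holds attention rises to the top, which is how such a ranking can amplify existing biases.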