YouTube's algorithm recommends right-wing, extremist videos to users, even if they haven't interacted with that content before, a recent study found.
YouTube's algorithm is on an AI slop and brainrot-only diet
If you've ever opened YouTube and seen a talking AI SpongeBob, a looping slime video, or something that feels engineered to ...
The researchers, the New York Times reports, find that the same dynamics that reward extremism also apply to sexual content on YouTube: a user who watches erotic videos might be recommended videos of ...
YouTube Shorts, the short-form video platform from Google-owned video giant YouTube, has seen massive success since its launch in September 2020. Today, an estimated 1% of all waking human hours are spent ...
"If you randomly follow the algorithm, you would probably consume less radical content than you do using YouTube as you typically do." So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
A new study conducted by the Computational Social Science Lab (CSSLab) at the University of Pennsylvania sheds light on a pressing question: Does YouTube's algorithm radicalize young Americans?
You may think you’re too smart to fall for a conspiracy theory. Your social media is dedicated to cat videos, Trader Joe’s hauls and Saturday Night Live sketches. You think you’re safe in this ...