The algorithm pulls up material it has been programmed to treat as related to things you might be interested in, but it gets more extreme each time.
"It took just a few clicks before I went from reliable COVID-19 news to a spiral of..." (sfgate.com)
Literally all YouTube has to do is be transparent and upfront, and stop programming its algorithm to push increasingly extreme, upsetting content as an easy way to keep viewers on the site. That would solve the whole "radicalization" problem. Instead they deny it and push the responsibility onto the viewer. Expecting accountability from these unaccountable social media companies is too much to ask.
It's a social experiment of sorts imo.
You have to delete your search and watch history each time before you exit YouTube, and go against what it recommends. Never click on any of it. Best to skip past the front page as quickly as possible and rely on typing in search terms.
It's like on Reddit when, instead of taking the long route of typing the sub name into the address bar, you search for the sub and see only curated results (e.g. searching for men's rights or MGTOW and seeing only anti-men's-rights and anti-MGTOW subs show up first).