An important debate has flared up in recent days about what causes extremism on YouTube, centered on whether the algorithm that recommends videos to users is a driver of radicalization.
A new article has reignited that debate. Written by Mark Ledwich and Anna Zaitsev, the article claims that YouTube's recommendations do not push users toward ever more extremist videos. The article, which has not yet been peer reviewed, argues that YouTube's algorithm in fact favors mainstream media channels over independent content creators.
Other experts working in this field have responded to the article by Ledwich and Zaitsev. They criticized its method, arguing that the algorithm is only one of several important factors and that data science alone cannot answer a question like this.
Zeynep Tüfekçi, a sociologist who did pioneering work on YouTube radicalization in the field of technology studies, made the case in the New York Times. Tüfekçi argued that YouTube's suggested videos gradually steer users toward more extreme content: videos about jogging lead to videos about ultramarathons, videos about vaccines lead to conspiracy theories, and videos about politics lead to Holocaust denial.
If left unchecked, algorithms let extremists take over the media
Guillaume Chaslot, a former YouTube employee, has discussed Tüfekçi's arguments in detail. According to Chaslot, the idea that YouTube's recommendations are deliberately biased toward conspiracy theories misses the point: the recommendations simply push whatever makes people spend more time on the site, and that is how users end up there.
Maximizing viewing time is the foundation of YouTube's algorithm. YouTube's lack of transparency as a company on this issue makes it impossible to fight radicalization on the platform. Without transparency, it is difficult to find ways to improve the situation.
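YouTube's real system is proprietary, but the watch-time objective described above can be illustrated with a minimal sketch: a recommender that ranks candidate videos purely by the watch time a model predicts for each one. All names, scores, and data below are invented for illustration.

```python
# Hypothetical sketch: rank candidates by predicted watch time only.
# Note that nothing in this objective penalizes extreme content.

def predicted_watch_minutes(video: dict, user_history: list) -> float:
    """Toy stand-in for a learned model: videos matching topics the user
    already watches, and more sensational videos, score higher."""
    topic_match = sum(1 for v in user_history if v["topic"] == video["topic"])
    return video["sensationalism"] * (1 + topic_match)

def recommend(candidates: list, user_history: list, k: int = 3) -> list:
    ranked = sorted(
        candidates,
        key=lambda v: predicted_watch_minutes(v, user_history),
        reverse=True,
    )
    return [v["title"] for v in ranked[:k]]

user_history = [{"topic": "politics"}, {"topic": "politics"}]
candidates = [
    {"title": "Mainstream news recap", "topic": "politics", "sensationalism": 1.0},
    {"title": "Shocking conspiracy expose", "topic": "politics", "sensationalism": 3.0},
    {"title": "Cooking basics", "topic": "food", "sensationalism": 0.5},
]
print(recommend(candidates, user_history))
```

Even in this toy version, the most sensational video on a familiar topic rises to the top, which is the dynamic Chaslot describes.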
YouTube is hardly alone in its lack of transparency. From companies to government agencies, institutions are not transparent about their large systems. Machine-learning algorithms decide everything from which school a child is placed in to what your credit score is, and the institutions and companies deploying these systems usually give no explanation of how those decisions are made.
How recommendation systems work needs to be explained
Machine-learning systems are often large and complex. They are typically described as black boxes: information goes in, and a decision or action comes out, with little visibility into what happens in between. Trying to understand how YouTube works without knowing exactly how its recommendation algorithm functions is therefore like trying to understand how a car works without opening the hood.
To prevent extremism, companies and government agencies could be more transparent about the algorithms they use. There are two ways of ensuring this transparency. The first is providing counterfactual explanations: the basic logic behind a decision can be explained without disclosing all of the algorithm's internals. For a bank determining your credit score, an explanation such as "you are over the age of 18 but have previously defaulted on a loan" can be simple yet effective for understanding how the system works.
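The credit-score example above can be sketched in code. This is a hypothetical illustration, not any bank's real system: the rules and thresholds are invented, and the point is only that a system can report which conditions drove an outcome, and what would have to change to reverse it, without exposing its full logic.

```python
# Hypothetical counterfactual explanation for a loan decision.
# Each rule has a human-readable name so failed rules can be reported directly.

RULES = [
    ("applicant is over 18", lambda a: a["age"] > 18),
    ("no prior loan default", lambda a: not a["has_defaulted"]),
]

def decide_with_explanation(applicant: dict):
    """Return (decision, explanation); the explanation names only the
    conditions that mattered, not the system's full internal logic."""
    failed = [name for name, check in RULES if not check(applicant)]
    if failed:
        return "rejected", "Decision would change if: " + "; ".join(failed)
    return "approved", "All conditions met: " + "; ".join(n for n, _ in RULES)

decision, explanation = decide_with_explanation({"age": 25, "has_defaulted": True})
print(decision, "-", explanation)
```

The applicant here learns exactly which factor to address ("no prior loan default") while the scoring model itself stays private.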
The second method is testing and auditing the algorithm. Constantly checking and back-testing the algorithm can ensure that harmful suggestions are caught and prevented. Auditing YouTube's algorithm might reveal, for example, which videos its recommendation feature surfaces as a priority.
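An external audit of this kind can be sketched as follows: treat the recommender as a black box, query it repeatedly, and measure how often flagged channels appear in its suggestions. The channel names, the flagged list, and the stand-in recommender below are all invented for illustration.

```python
# Hypothetical audit: sample recommendations and count how many come
# from channels an auditor has flagged (e.g. for extremist content).

FLAGGED_CHANNELS = {"conspiracy_hub", "extreme_takes"}

def audit(recommend_fn, seed_videos: list, top_k: int = 5) -> float:
    """Return the fraction of sampled recommendations from flagged channels."""
    flagged = total = 0
    for seed in seed_videos:
        for video in recommend_fn(seed)[:top_k]:
            total += 1
            if video["channel"] in FLAGGED_CHANNELS:
                flagged += 1
    return flagged / total if total else 0.0

# Stand-in recommender used only to demonstrate the audit loop.
def fake_recommender(seed):
    return [
        {"title": "Follow-up video", "channel": "mainstream_news"},
        {"title": "The REAL story", "channel": "conspiracy_hub"},
    ]

rate = audit(fake_recommender, seed_videos=["any_seed"])
print(f"{rate:.0%} of sampled recommendations came from flagged channels")
```

Running such an audit regularly, and tracking the flagged rate over time, is one concrete way an outside party could hold a recommender accountable without access to its internals.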
Whether through counterfactual explanations or algorithmic auditing, this process is difficult and costly, but it is extremely important, because the alternative may prove worse for humanity. If algorithms go unchecked and unaudited, conspiracy theorists and extremists may gradually take over the media.
Source: https://thenextweb.com/syndication/2020/01/30/scientists-are-arguing-over-youtubes-role-in-online-radicalization/