Tag Archives: youtube

YouTube gives users more control over channel suggestions

In a blog post, ubiquitous online video platform YouTube said on 26 June that users would now be able to tell it to stop suggesting videos from specific channels, putting the onus on users, rather than the company, to more proactively curate the content they see – and consume – on the platform.

YouTube said it would make three changes that would be rolled out over the weeks following the announcement. Firstly, users will be able to “more easily explore topics and related videos” on their homepage and in the “up next” section – these suggestions will be “based on your existing personalized suggestions and are meant to help you find what you’re looking for faster”.

They will include “videos related to the one you’re watching, videos published by the channel you’re watching, or other topics which may be of interest to you”, and will be available “for signed-in users in English on the YouTube app for Android and will be available on iOS, desktop and other languages soon”.

Secondly, users will gain the ability to “remove suggestions from channels you don’t want to watch”. On the menu next to a video on the homepage or “Up Next” section, users can now click “Don’t recommend channel”, after which they “should” no longer see YouTube suggest videos from that channel.

It remains to be seen how effective the function will actually be for the majority of users, and YouTube noted that “you may still be able to find them if you subscribe, search for them, or visit the channel page or Trending tab”. This new feature is already available globally on the YouTube app for Android and iOS, and will be available on desktop soon.

Lastly, YouTube now offers information about why a video may have been suggested to a specific user, a functionality that has been available in Google Ads for some time now. When the platform recommends videos “based on what other viewers with similar interests have liked and watched in the past”, users will now see a box with more information below the video.

YouTube explained that this is in an effort to “explain why these videos surface on your homepage in order to help you find videos from new channels you might like”, and said the feature is already available globally on the YouTube app for iOS, and will soon be available on Android and desktop.

YouTube has recently faced criticism from various parties – including journalists and politicians – for allowing conspiracy theories, extremist views and misinformation to thrive and spread across its platform – and frequently recommending such content to users.

Researchers and journalists have demonstrated that people who visit YouTube to watch videos on innocuous – sometimes unrelated – subjects or mainstream news sources have been served recommendations pushing them towards extreme content.

Additionally, videos ostensibly for children on YouTube have contained horrifying content like suicide tips and violence inflicted on popular cartoon characters. The company is apparently under investigation by the US Federal Trade Commission (FTC) over this issue and as the investigation progresses, YouTube is considering moving all children’s content to its separate YouTube Kids app.

YouTube uses machine learning tool to cut down long ads


Popular US-based video-sharing website YouTube has launched a new tool that uses machine learning to automatically cut longer advertisements uploaded to the site down into shorter versions lasting just six seconds.

The un-skippable “bumper” advertisement format was first introduced on YouTube back in early 2016 as part of the website’s drive to optimize for mobile viewing. In a blog post, product manager Zach Lupei said the new format would be “ideal for driving incremental reach and frequency”, particularly on mobile, where so-called “snackable videos” perform well.

YouTube’s new tool – dubbed the Bumper Machine – is currently in alpha testing (which will lead to beta testing and eventually general availability) and relies on machine learning models that are trained to “identify interesting, well-structured moments in a longer video”.

These elements include aspects such as product or brand information, human faces, motion or contrast. The Bumper Machine “organizes these moments and brings them together to generate several different six-second ad variations for you to pick from, all in a matter of minutes”. The result can be adjusted with “simple edits” before users save them.

YouTube announced the new tool at parent company Google’s Marketing Live conference for advertisers on 14 May, alongside new “Discovery ads” that allegedly combine “rich audience targeting features and visually engaging, native formats to help you better personalize your ads to inspire customer action at scale”.

They don’t require advertisers to provide a video, instead requesting that they simply provide the “best images” from existing social media campaigns – including logos and other promotional images – as well as a headline, description, business name, URL and call-to-action text. Google will then place ads with a gallery of up to eight images in search results.

YouTube claims it will then “optimize your media mix for maximum performance across Gmail, Discover and the YouTube Home feed”. This includes placing the new ads on the Google homepage via its Discover feature, a Facebook-style news feed that users can swipe through to view an algorithmically personalized set of articles, videos and other digital content.

Google executives reportedly told journalists that the new features were “a response to how users behave, not competition”. However, they come as “choppy revenue growth [prompted] questions from some Alphabet investors about whether services such as [Amazon] and [Facebook’s] Instagram are drawing online shoppers and in turn, advertisers away from Google”.


Google has been testing ads on Discover since last autumn, when it said more than 800 million people were using the feature monthly. The gallery ads are part of an effort to make search results more visual, the company reportedly said, and they are expected to garner more clicks, which could lead to them being shown in more results.

YouTube, Facebook and Twitter grilled over abuse faced by British MPs

YouTube, Facebook and Twitter executives have been grilled by members of the British Parliament at a committee hearing over how the social networks handle online abuse levelled at parliamentarians, the BBC reports.

Members of Parliament (MPs) are said to have argued that such hostility undermined democratic principles, with Twitter representative Katy Minshall admitting that it was “unacceptable” that the site had relied wholly on users to flag abuse in the past.

She insisted that the social network’s response to abuse had improved but acknowledged that there was more to be done.

Harriet Harman, chair of the Human Rights Committee, said there was “a strong view amongst MPs generally that what is happening with social media is a threat to democracy”, and SNP MP Joanna Cherry cited specific tweets containing abusive content that were not removed swiftly by Twitter, one of which was only taken down on the evening before the committee hearing.

“I think that’s absolutely an undesirable situation,” Minshall, Twitter’s head of UK government, public policy and philanthropy, said.

In response, Cherry argued it was in fact part of a pattern in which Twitter only reviewed its decisions when pressed by people in public life.

When MPs questioned how useful automated algorithms are for identifying abusive content, Facebook’s UK head of public policy, Rebecca Stimson, admitted that their application is limited, with the platform’s algorithms correctly identifying only around 15% of offensive pieces of content as being in breach of the site’s rules.

“For the rest you need a human being to have a look at it at the moment to make that judgement,” she explained.

Labour MP Karen Buck suggested that algorithms might not identify messages such as, “you’re going to get what Jo Cox got” as hostile, referring to the MP Jo Cox who was murdered in June 2016.

“The machines can’t understand what that means at the moment,” Stimson agreed.

Both Stimson and Minshall said that their respective social networks were working to gradually improve their systems, and to implement tools to proactively flag and block abusive content, even before it’s posted.

The committee said it was shocked to learn that none of the companies had policies of reporting criminal material to law enforcement, except in rare cases when there was an immediate threat.

Committee chair Yvette Cooper pressed Facebook’s public policy director, Neil Potts, on whether the company was reporting identities of those trying to upload footage of the Christchurch shooting to the authorities in New Zealand.

Potts said the decisions were made on a “case by case” basis but that Facebook does not “report all crimes to the police”, and that “these are tough decisions to make on our own . . . where government can give us more guidance and scrutiny”.

Representatives of both Twitter and Google, YouTube’s parent company, admitted neither of their companies would necessarily report to the police instances of criminal material they had taken down.