In a move to quell conspiracy theories on its video-sharing platform, YouTube CEO Susan Wojcicki has announced that the company will start marking conspiracy videos with what it refers to as “information cues” within the next couple of weeks.
Addressing an SXSW (South by Southwest) panel in Austin, Texas, on Tuesday, Wojcicki demonstrated how these information cues will accompany hoax and misinformation videos, providing accurate information on the subject with excerpts from Wikipedia articles.
She came prepared with information cues for moon landing and chemtrail videos, showing how the cues will appear as a text box directly below these hoax videos, along with a link to Wikipedia for more information.
“When there are videos that are focused around something that’s a conspiracy — and we’re using a list of well-known internet conspiracies from Wikipedia — then we will show a companion unit of information from Wikipedia showing that here is information about the event,” said the YouTube chief.
Building on what Wojcicki said at the SXSW event, a Google spokesperson said:
“We’re always exploring new ways to battle misinformation on YouTube. At SXSW, we announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, like the moon landing. These features will be rolling out in the coming months, but beyond that, we don’t have any additional information to share at this time.”
While the move itself is welcome news, it is hard to understand why YouTube chose not to inform Wikimedia about its anti-conspiracy effort, in which the web-based encyclopedia is supposed to be a major player.
Following the SXSW announcement, Wikimedia Foundation’s executive director, Katherine Maher tweeted:
“While we are thrilled to see people recognize the value of @Wikipedia’s non-commercial, volunteer model, we know the community’s work is already monetized without the commensurate or in-kind support that is critical to our sustainability.”
The Wikimedia Foundation also confirmed in a tweeted statement on Wednesday that YouTube had not entered into any sort of deal with the Foundation, whatsoever.
“We are always happy to see people, companies, and organizations recognize Wikipedia’s value as a repository of free knowledge. In this case, neither Wikipedia nor Wikimedia Foundation are part of a formal partnership with YouTube. We were not given advance notice of this announcement,” the statement read.
While the gaffe may have caused YouTube some embarrassment, the company was under no legal obligation to inform Wikipedia: Wikipedia’s content is freely licensed, and anyone can choose to reuse it, a point the foundation acknowledged in its statement.
“Today, Wikimedia is a fundamental part of the internet’s infrastructure. From our articles to our citations to our datasets, we’re a major part of the open global commons, a driver of free learning and open research, and an example of what the World Wide Web was supposed to be. Hundreds of millions of people rely on Wikipedia every day, in hundreds of languages,” said the statement.
“Wikipedia’s content is also freely licensed for reuse by anyone, and that’s part of our mission: that every single person can share in free knowledge. We want people all over the world to use, add to, and remix Wikipedia.”
One must remember that, as a non-profit, Wikipedia runs on donations, volunteers, and supporters. In fact, the foundation urges companies to use Wikipedia content and to donate freely, allowing Wikipedia to keep up the good work.
In that spirit, YouTube should have kept Wikipedia in the loop about its plan, and even discussed how it could contribute to the good work that the web-based encyclopedia has become synonymous with.
“All this is possible because of the six million people who donate to keep Wikipedia running, the hundreds of thousands of volunteer contributors, and countless others who support our work,” read the foundation’s tweet.
“At the same time, we encourage companies who use Wikimedia’s content to give back in the spirit of sustainability. In doing so, they would join the millions of individuals who chip in to keep Wikipedia strong and thriving,” added the Wikimedia statement.
Veteran Wikipedia editor Liam Wyatt told Motherboard’s Samantha Cole that he learned about the YouTube announcement through Wired.
“YouTube is outsourcing responsibility for the truth,” Wyatt told Cole in a phone interview.
“Their job is not to be responsible for truth, that’s not their mission. But they’re abrogating their responsibility to society by saying there is some organization that can answer for that, and we don’t have to deal with it,” he added.
As for Wikipedia’s readiness to handle the expected surge of trolls on its platform once YouTube implements its plans, Wyatt feels there is little cause for concern.
“If it was a controversial topic before, then it would already have people monitoring it, and editing restrictions from drive-by vandalism in place,” he said, in a Google Hangouts chat, according to Motherboard.
“If YouTube suddenly sent lots of conspiracy theory people towards one obscure article, that would be a temporary problem for us. But if they’re sending people to a variety of well-established and well-monitored articles, no problem.”