The company has since adjusted its algorithms to help filter out fake news, introduced new services that surface links from publications users pay for higher in their search results, and funded Canadian programs aimed at helping children learn to identify misinformation.
Still, experts say more can be done. Governments around the world agree.
The U.K. parliament, for example, has been eyeing new legislation that would require tech companies to take more responsibility for the safety of their users. It could include measures involving dedicated fact-checkers to “minimize the spread of misleading and harmful disinformation ... particularly during election periods.”
Singapore is meanwhile mulling a bill allowing the government to decide what constitutes fake news and demand corrections for items deemed to be false.
Canada has yet to settle on — or even float — potential solutions, but those that have been implemented or considered elsewhere have typically faced opposition from tech companies that have argued they should be trusted to self-regulate.
Democratic Institutions Minister Karina Gould has said the country is looking at regulation because leaving tech giants to police themselves “is not yielding the results that societies are expecting these companies to deliver.”
Global Affairs Minister Chrystia Freeland has also weighed in, saying the current tech era has “eerie” similarities to the rise of monopolies a century ago and could require equally drastic steps to address the situation.
“I am not worried,” Gingras told the Star about the possibility of the government regulating Google and trying to discipline it for spreading misinformation.
“We fully recognize the importance of servicing those questions, but from our perspective, we continue to do everything we can to be responsible participants in the ecosystem.”
Gingras said Google will “clearly do our best” to ensure any measures Canada enacts can be “realistically complied with.”
He couldn’t think of any specific policies Google will lobby for, but warned that such matters can be “tricky” and said “on our side, it is not a regulatory thing.”
Gingras believes the spread of misinformation is “nothing new.”
“We have seen fake news since the printing press was introduced in the U.S.,” he said. “What is different now is that it is so much easier for people to have the opportunity to drive information virally.”
He’s seen that with synthetic media — content that has been algorithmically altered, often by mixing audio and video to create fake footage depicting events that never occurred.
Such media, he said, is getting increasingly difficult to detect because of advances in technology, forcing companies to focus more on determining the sources of such clips.
Cash-strapped news publishers engaged in fact-based journalism are also facing difficulties with misinformation that seeks to undermine their credibility and make it more challenging for Canadians to identify trustworthy news, Gingras said.
Forty per cent of Canadians find it difficult to distinguish truth from misinformation in the news, according to an Earnscliffe Strategy Group poll commissioned by the CJF and released Thursday.
About 53 per cent of the poll’s respondents admitted they have come across stories featuring facts twisted to push an agenda and 74 per cent said they agreed that the average person does not know how to tell good journalism from rumours or falsehoods.
Numbers like that are why Google was keen on the CJF grant.
“It’s important we give focus to how we can give citizens the ability to filter the information they see and make sure they can find out the authoritative from that that is less so,” said Gingras.
“Frankly, our hearts and minds are behind how we do that.”
© Toronto Star