“A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. “A source that has few false facts is considered to be trustworthy,” says the team. The score they compute for each page is its Knowledge-Based Trust score.”
“The software works by tapping into the Knowledge Vault, the vast store of facts that Google has pulled off the internet. Facts the web unanimously agrees on are considered a reasonable proxy for truth. Web pages that contain contradictory information are bumped down the rankings.”
“Knowledge Vault has pulled in 1.6 billion facts to date. Of these, 271 million are rated as “confident facts”, to which Google’s model ascribes a more than 90 per cent chance of being true. It does this by cross-referencing new facts with what it already knows.”
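To make the mechanism concrete, here's a minimal sketch of how a Knowledge-Based Trust score might work: check each fact a page asserts against a store of high-confidence facts and score the page by the fraction that agree. Everything below (the `KNOWLEDGE_VAULT` dictionary, the subject-predicate-object fact representation, the sample data) is my own illustrative assumption, not Google's actual implementation; the real system is a probabilistic model operating at web scale.

```python
# Illustrative sketch only -- not Google's actual Knowledge Vault or
# KBT model. Facts are hypothetical (subject, predicate, object) triples.

# Hypothetical fact store: (subject, predicate) -> (object, confidence).
KNOWLEDGE_VAULT = {
    ("Eiffel Tower", "located_in"): ("Paris", 0.99),
    ("Mount Everest", "height_m"): ("8848", 0.95),
    ("Pluto", "classified_as"): ("dwarf planet", 0.70),
}

CONFIDENT = 0.90  # the article's "more than 90 per cent" threshold


def kbt_score(page_facts):
    """Score a page by the fraction of its checkable facts that agree
    with confident facts in the vault. Facts the vault can't confirm
    with high confidence are skipped rather than penalized."""
    checked = correct = 0
    for subject, predicate, obj in page_facts:
        known = KNOWLEDGE_VAULT.get((subject, predicate))
        if known is None or known[1] < CONFIDENT:
            continue  # nothing confident enough to compare against
        checked += 1
        if obj == known[0]:
            correct += 1
    return correct / checked if checked else None  # None: no evidence


# A page claiming the Eiffel Tower is in Rome gets bumped down:
page = [
    ("Mount Everest", "height_m", "8848"),
    ("Eiffel Tower", "located_in", "Rome"),
]
print(kbt_score(page))  # 0.5
```

Note the design choice in the sketch: unverifiable claims are ignored, so a page is only judged on facts the vault actually knows, which matches the article's framing of counting incorrect facts rather than unknown ones.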
This seems too good to be true, so I'll start by assuming it is not. But New Scientist is, in my opinion, a reliable source. And I want this to be a real thing. Imagine how disruptive something like this would be. Would you keep going back to a site with a really low Knowledge-Based Trust score? Sure, there'd be lots of kicking and screaming, but I could see this working. On lots of levels.