However, it said research suggested the “red flag” approach actually “entrenched deeply held beliefs”.
It will now display “related articles” next to disputed news stories.
“Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs – the opposite effect to what we intended,” Facebook’s Tessa Lyons wrote in a blog post.
Instead of displaying a warning icon in the news feed, it will “surface fact-checked articles” and display them next to disputed stories.
Facebook said it had tested the change and found that although it did not reduce the number of times disputed articles were clicked on, it did lead to them being shared fewer times.
People who do try to share a disputed article are shown a pop-up with links to fact-checked sources.
“Just as before, as soon as we learn that an article has been disputed by fact-checkers, we immediately send a notification to those people who previously shared it.”
Critics say social networks should face regulation if they do not tackle the spread of misinformation and propaganda.
“What Facebook is trying to do is respond to pressure that it should be treated as a publisher, rather than a platform,” said Tim Luckhurst, professor of journalism at the University of Kent.
“I think that argument is dead. They are a publisher, so it is not enough to offer people a menu of other related stories.
“We have a generation of people that are so anti-establishment and sceptical of evidence-based news, we need regulation of the type imposed on broadcasters since they first emerged.”
Prof Luckhurst said he was “appalled” by Facebook’s argument that it was different from traditional media.
“They usually raise the objection that they cannot be regulated because they’re international. Well so is the BBC, so is CNN.”