Facebook is expanding to Instagram the third-party fact-checking program it launched on its own platform in 2016, something that many who watch the space have been advocating for a while. Facebook has owned Instagram for seven years; the move expands a trial that began in May.

“The potential to prevent harm is high here, particularly with the widespread existence of health misinformation on the platform,” Facebook fact-checking partner Full Fact noted in a recent report.

To report a post, you tap the three dots at the top right and choose “Report” → “It’s Inappropriate” → “False Information.” As of Thursday morning, the option wasn’t available on my Instagram, but Facebook says it will roll out to everyone by the end of the month. Poynter notes that only U.S.-based fact-checkers will be verifying Instagram posts for now. (Poynter’s International Fact-Checking Network sets the eligibility standards for Facebook’s fact-checking partners.)

A Facebook representative cautioned that this is “an initial step as we work towards a more comprehensive approach to tackling misinformation.” When users flag posts as inappropriate, those posts won’t automatically be sent on to fact-checkers; rather, “our aim is to use this feedback, along with other signals, to determine if content should be sent to third-party fact-checkers. Other signals include, for example, how old a given post is and previous behaviors from the account that posted it.”

Adam Mosseri, who has run Instagram since last October, was previously in charge of Facebook’s News Feed, which has as much experience with fact-check integration as any part of Greater Facebook.