Facebook and other social media companies like Twitter have come under immense pressure over allegations that they were used to manipulate last year’s presidential election in the United States.
In response, Facebook has changed the way political ads appear on its platform. Mark Zuckerberg took to his Facebook wall to explain:
When someone buys political ads on TV or other media, they’re required by law to disclose who paid for them. Now we’re bringing Facebook to an even higher standard of transparency:
1. We’re making all ads more transparent, not just political ads. We’ll soon start testing a feature that lets anyone visit any page on Facebook and see what ads that page is currently running. For political advertisers, we’re working on a tool that will let you search an archive of ads they’ve run in the past. You’ll also be able to see how much an advertiser paid, the type of people who saw the ads and the number of impressions. Our goal is to fully roll this out in the US ahead of the 2018 midterm elections.
2. Political advertisers will now have to provide more information to verify their identity. Once they’ve done that, we will label their ads as political and they will have to disclose who paid for them. We’ll start testing this in US federal elections and then move to more races later.
3. We’re strengthening our systems to catch anyone trying to break these rules. We’re adding thousands of people to our review teams and will start using machine learning to identify political ads, just like we do with spam. We’re also going to work with other tech companies to share information on the threats we find.
These changes will make it easier to see what different groups are trying to communicate around elections and will make it harder for anyone to break the rules. This won’t stop all bad actors, but it’s one of many important steps forward and we’ll have more to share soon.
It will be interesting to see whether this new strategy succeeds in preventing political manipulation on Facebook.