Inauthentic pages target independent news platform – will Facebook take notice? [Part 2: the case of Mikroskop Media]

This month, a series of articles published by The Guardian revealed how leaders across the world used Facebook loopholes to harass their critics at home, and how, despite having information about these violations, the platform let such cases sit, sometimes for months on end if not longer, choosing instead to deal with more high-profile cases. “The investigation shows how Facebook has allowed major abuses of its platform in poor, small and non-western countries in order to prioritize addressing abuses that attract media attention or affect the US and other wealthy countries. The company acted quickly to address political manipulation affecting countries such as the US, Taiwan, South Korea, and Poland, while moving slowly or not at all on cases in Afghanistan, Iraq, Mongolia, Mexico and much of Latin America.”

The Guardian's investigation shows that Azerbaijan was on the list of neglected countries. Had it not been for former Facebook employee Sophie Zhang's memo, published in September of last year, the inauthentic pages that Facebook removed 14 months later (once the memo was out) would likely have remained.

But even though those pages have reportedly been removed, hundreds if not thousands more continue to target independent media in Azerbaijan. AIW covered the story of Meydan TV here, and The Guardian uncovered a similar pattern of targeting in the case of Azad Soz. AIW now presents its findings on the targeting of Mikroskop Media, a Riga-based online news platform that covers Azerbaijan.

Mikroskop Media shared with AIW a list of Facebook posts on which it received a high volume of comments. The preliminary investigation indicates that Mikroskop Media's Facebook page was also targeted by hundreds of inauthentic Facebook pages set up to look like personal accounts, flooding the posts with comments supportive of the ruling government and its decisions.

On March 24, Mikroskop Media shared the following post on its Facebook page, looking at the total number of citizens in Azerbaijan who had been vaccinated so far as well as the total number of vaccines as of March 23. The post received over 1.6k comments. AIW reviewed 550 of them, and almost all were posted by owners of pages posing as users on the platform.
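To give a sense of the kind of tally behind these figures, below is a minimal sketch in Python of how commenters on a post could be counted by whether they are pages or personal profiles. The input structure, field names, and file name are assumptions for illustration only; AIW does not describe the tooling it used.

```python
# Hypothetical sketch: tally commenters on a post by type (page vs. profile).
# The JSON structure below is an assumption for illustration -- the source does
# not describe how the comment data was collected or stored.
import json
from collections import Counter

def tally_commenter_types(path: str) -> Counter:
    """Count commenters by 'author_type' in a hypothetical exported comment file."""
    with open(path, encoding="utf-8") as f:
        # assumed format: list of {"author_name": ..., "author_type": "page" | "profile"}
        comments = json.load(f)
    return Counter(c.get("author_type", "unknown") for c in comments)

if __name__ == "__main__":
    counts = tally_commenter_types("mikroskop_comments.json")  # hypothetical file name
    total = sum(counts.values())
    for author_type, n in counts.most_common():
        print(f"{author_type}: {n} of {total} comments ({n / total:.0%})")
```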

Another post investigated by AIW, shared on March 11, reported the total number of businesses that had applied to the authorities to launch operations in Karabakh. The post received over 400 comments. Having analyzed 200 of them, AIW again discovered that all of the commenters were pages.

On April 5, Mikroskop Media shared a link to a story it published about an investigation first published by VICE on March 29, which exposed how a little-known Berlin-based television channel was part of a “lobbying strategy to polish Azerbaijan’s image in Germany,” funded by large sums of money paid in bribes to certain politicians. The story shared by Mikroskop Media on its Facebook page received almost 400 comments. AIW analyzed these comments and once again found that, with the exception of a few profiles (these too were suspicious given the lack of any recent activity), almost all of the comments were posted by inauthentic Facebook pages.

At other times, Mikroskop Media's Facebook page was targeted by troll accounts. This was especially the case in one example: on November 12, 2020, Mikroskop Media shared an infographic about the number of times Azerbaijan's national constitution has been amended. Among the 385 comments analyzed, a relatively high number were posted by Facebook profiles. A closer look at these profiles showed that while some of the owners were employees at state universities and government institutions, others were not authentic accounts at all. The majority of the comments once again favored the constitutional changes, expressed pride in the country and the president's decisions, and accused the media platform of bias and unfair reporting.

AIW would be happy to assist Facebook's threat intelligence team in investigating the “coordinated inauthentic behavior” that AIW has observed and shared in its reporting so far, but the main question still lingers: will Facebook take notice?

Facebook looks the other way when it comes to Azerbaijan and others – The Guardian investigations show

Almost a month after AIW published this story about how some 500 inauthentic Facebook pages targeted Berlin-based independent online news platform Meydan TV, little has changed. While all of the pages that targeted Meydan TV remain active, someone else has taken notice. 

On April 13, The Guardian published this story explaining how Facebook allowed state-backed harassment campaigns to target independent news outlets and opposition politicians on its platform.

The story mentions the case of Azad Soz (Free Speech) and how a post shared on March 4, about two men sentenced to eight months, received over 1.5k comments. The story analyzes the top 300 comments and finds that 294 of them were posted by inauthentic Facebook pages.

Just like in the case of Meydan TV. 

The Guardian cites Sophie Zhang's work during her time at Facebook on the team tasked with “combating fake engagement, which includes likes, shares, and comments from inauthentic accounts.” During her research, Zhang uncovered “thousands of Facebook pages – profiles for businesses, organizations, and public figures – that had been set up to look like user accounts and were being used to inundate the Pages of Azerbaijan’s few independent news outlets and opposition politicians on a strict schedule: the comments were almost exclusively made on weekdays between 9am and 6pm, with an hour break at lunch,” write The Guardian journalists Julia Carrie Wong and Luke Harding.
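The “strict schedule” Zhang describes is the kind of signal that can be surfaced from comment timestamps alone. The sketch below, which assumes ISO-8601 timestamp strings rather than any particular export format, simply measures what share of comments fall on weekdays between 9am and 6pm.

```python
# Rough sketch: measure how concentrated commenting activity is in weekday
# "office hours" (Mon-Fri, 09:00-17:59), the pattern Zhang describes.
# Timestamps are assumed to be ISO-8601 strings; this is not Facebook's export format.
from datetime import datetime

def share_in_office_hours(timestamps: list[str]) -> float:
    """Return the fraction of comments posted Monday-Friday between 09:00 and 17:59."""
    if not timestamps:
        return 0.0
    in_office = 0
    for ts in timestamps:
        dt = datetime.fromisoformat(ts)
        if dt.weekday() < 5 and 9 <= dt.hour < 18:  # weekday() is 0-4 for Mon-Fri
            in_office += 1
    return in_office / len(timestamps)

if __name__ == "__main__":
    sample = ["2021-03-24T09:15:00", "2021-03-24T14:40:00", "2021-03-27T22:05:00"]
    print(f"Share posted in weekday office hours: {share_in_office_hours(sample):.0%}")
```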

Wong and Harding also mention the platform’s response mechanism. “The company’s vast workforce includes subject matter experts who specialize in understanding the political context in nations around the world, as well as policy staff who liaise with government officials. But Azerbaijan fell into a gap: neither the eastern European nor the Middle Eastern policy teams claimed responsibility for it, and no operations staff – either full-time or contract – spoke Azerbaijani.”

But Azerbaijan is not the only country where The Guardian identified such loopholes. “The Guardian has seen extensive internal documentation showing how Facebook handled more than 30 cases across 25 countries of politically manipulative behavior that was proactively detected by company staff. The investigation shows how Facebook has allowed major abuses of its platform in poor, small, and non-western countries in order to prioritize addressing abuses that attract media attention or affect the US and other wealthy countries. The company acted quickly to address political manipulation affecting countries such as the US, Taiwan, South Korea, and Poland, while moving slowly or not at all on cases in Afghanistan, Iraq, Mongolia, Mexico, and much of Latin America.”

Honduras 

The administration in Honduras relied on astroturfing to attack government critics. Sophie Zhang discovered how Juan Orlando Hernandez – the authoritarian leader – “received hundreds of thousands of fake likes from more than a thousand inauthentic Facebook pages” set up to look like Facebook user accounts, very similar to what happened in Azerbaijan in the cases of Azad Soz and Meydan TV. And just as in the case of Azerbaijan, the platform took nearly a year to respond in Honduras.

Russia 

During the 2016 US election, Russia's Internet Research Agency set up Facebook pages pretending to be Americans in order to “manipulate individuals and influence political debates.”

Facebook's intervention was much faster in the case of Russia targeting the US elections, likely the result of “Facebook's priority system for protecting political discourse and elections,” wrote Wong in another story in The Guardian.

As a result of this kind of cherry-picking, Facebook's response mechanism worked faster in Taiwan, India, Indonesia, Ukraine, and Poland, but not in countries where similar inauthentic behavior was spotted, such as Azerbaijan, Mexico, Honduras, Paraguay, Argentina, and others. Response times ranged from as quick as one day in the case of Poland to as long as 426 days in the case of Azerbaijan.

Many other cases were not investigated at all, among them Tunisia, Mongolia, Bolivia, and Albania.

Back in Azerbaijan, at the time of writing, the pages that targeted Meydan TV remain active, and even if they are removed, nobody knows how long it will take Facebook to respond the next time such behavior is spotted.