Electronic flies: a tool in a war fueled by algorithms, smear campaigns, and fake photos

Fake accounts run by bots

Participating in social networks requires us to understand the dirty work that some governments do in their media wars, against domestic opponents or rival countries, using what has become known as electronic flies.

With the boom of social networks, some parties have begun creating large numbers of accounts on popular platforms to shape or change users' perception of a particular issue.

The primary goal of these accounts is to post and repost until their message looks like the consensus of many like-minded users, pushing people who are usually on the fence to tweet and join the ongoing conversation.

These accounts are usually run by bots: programs that perform repetitive, automated tasks, liking, retweeting, and commenting in a pattern of heavy posting that floods hashtags into global trending lists or drives the trend in targeted countries.

Elaborate forgery

One goal of this dirty war is to field an online support army, the "electronic flies", a metaphor for the intensity of the swarm descending on a given topic, which can run to hundreds of thousands of fake accounts, used to slander the other party and tarnish its reputation, or to broadcast subversive material meant to spread chaos and turn peoples against each other. Meticulous operations design "smear" campaigns built on fabricated information and images forged elaborately enough to convince recipients of their credibility.

Using fake accounts to promote a particular point of view and create the illusion of widespread popularity for it is just one propaganda tactic

The goal of these campaigns is usually purely political. The term "electronic committees" is used with the same meaning: a coalition of individuals or online organizations working to steer public opinion toward a particular idea or belief, whether or not it accords with the truth.

On December 2, Twitter announced the removal of two influence operations linked to the Chinese government: 2,048 accounts that amplified Chinese Communist Party narratives about Xinjiang and its Uyghur population, and 112 accounts attributed to Changyu Culture, a private company working for the Xinjiang regional government.

A team at the Stanford Internet Observatory analyzed the two networks and found that both echoed pro-CCP narratives about the treatment of Uyghurs in Xinjiang, posting content from Chinese state media or sharing videos of Uyghurs testifying to how happy they are in Xinjiang.

As with previous Twitter removals of pro-CCP networks, the accounts in the larger network were thinly developed. Rather than presenting account holders as plausible real people, they typically showed default or stock profile images, at most a brief bio, and no activity prior to posting content tied to the operation's subject matter.

What distinguishes these networks is their persistence and their tactical repetition. Even in the few weeks after Twitter removed the identified accounts, hundreds of accounts with similar profiles and posting patterns were observed. Other researchers have noted similar patterns across thousands of additional accounts distinct from the networks analyzed here.

Why repeat the same strategies year after year when they don't seem to work?

Propaganda campaigns

Each time an operation is wiped out, its operators must start over and build a following from scratch. If they expect their accounts to be removed quickly, they may decide that persona development is not worth the time, and that flooding a topic (or a particular hashtag) with a high volume of posts, if only as a simple distraction, is the better strategy.

A more pessimistic reading is also possible. One presumed rationale for publicly announcing campaign removals is deterrence; members of Facebook's security team, for example, wrote that "there is a reputational impact when publicly labeled and removed" for foreign or domestic influence operations.

As it stands, there is little research showing that political actors actually suffer enough reputational damage to change their behavior. That the same operators keep campaigning despite repeated takedowns may indicate that deterrence is not working, or that the offender believes the benefit is worth the cost.

Research on domestic social media propaganda in China, for example, has shown that the government uses the so-called "50 Cent Party" (online commentators paid to post at the government's direction while posing as ordinary users) to fabricate hundreds of millions of social media posts, not to argue with the party's critics but to "distract the public and change the subject."

This contrasts with researchers' observations of accounts operated by Russia's Internet Research Agency, which changed names and focus over their lifetimes, testing themes and personas in search of those that attracted the most engaged audiences. "Army of Jesus," which became one of the agency's most-followed accounts, began its life on social media as a page devoted first to Kermit the Frog and then to The Simpsons.

If engagement is not the metric that matters, we would expect propagandists to invest in quantity rather than quality: mass-producing accounts and tweets that carry the intended message rather than building followings or even actively promoting accounts within the network. Outsourced campaigns may post in bulk and reappear after their accounts are removed because the operators are paid by the post, not by the interaction.

Fake impressions

These recently attributed operations may be a drop in the ocean: this is not the first time campaigns have been outsourced, only the first time Facebook and Twitter have detected and announced it. Misinformation researchers have found a rise in government agencies outsourcing disinformation campaigns to public relations and marketing firms.

Several outsourced operations in the Middle East and North Africa showed high follower engagement.

Using fake accounts to create the illusion of widespread popularity for a particular point of view is just one propaganda tactic. The Chinese Communist Party, for example, commands a wide range of foreign-facing propaganda capabilities spanning broadcast, print, and digital platforms, built up over decades.

The party may believe that investing in its diplomats' Twitter accounts is more effective than covert campaigns, or it may mix overt and covert tactics, using fake accounts to give the impression of massive popular support.

Research from the Associated Press and the Oxford Internet Institute showed that accounts that inflate Chinese diplomats' engagement in ways that manipulate the platform (a sign of inauthenticity) are often subsequently suspended by Twitter. The party's choice to prioritize other channels, such as tapping YouTube influencers, may help explain the relatively modest reach of its Twitter personas.

If so, that is worrying: the party's vast resources and relatively cheap labor mean that incremental changes in strategy could produce major shifts in the disinformation landscape.
