Meta’s ongoing efforts regarding Russia’s invasion of Ukraine

  • We have set up a special operations center staffed with experts from across the company, including native Russian and Ukrainian speakers, who monitor the platform around the clock, allowing us to respond to issues in real time.
  • We’ve added several security features in Ukraine, including the ability for people to lock their Facebook profile, the temporary removal of the ability to view and search friend lists, and additional tools on Messenger.
  • We are taking significant steps to combat the spread of misinformation by expanding our third-party fact-checking capacity in Russian and Ukrainian. We are also providing more transparency around state-controlled media, banning ads from Russian state media, and demonetizing their accounts.

Our hearts go out to all those affected by the war in Ukraine. We are taking important steps in our apps to help keep our community safe and support people who use our services, both in Ukraine and around the world.

Here are some of the specific actions we have taken regarding Russia’s invasion of Ukraine:

Helping keep people safe in Ukraine

We have added several security features in Ukraine in response to the situation on the ground.

  • Lock your profile: This tool allows people to lock their Facebook profile in one step. When someone’s profile is locked, people who aren’t their friends can’t download, enlarge, or share their profile picture, or see posts and other photos on their profile, regardless of when they were posted. Our teams work with non-governmental organizations and civil society organizations to make sure people know these tools are available.
  • Friends lists: We’ve temporarily removed the ability to view and search friend lists on Facebook accounts in Ukraine to help protect people from being targeted.
  • Instagram privacy and security reminders: We are sending all Instagram users in Ukraine a notification at the top of their feed about account privacy and security. For public accounts, we remind people to check their settings in case they want to make their account private. When someone makes their account private, any new followers will need to be approved, and only their followers will be able to see their posts and stories. For people who already have private accounts, we share tips on securing their account with strong passwords and two-factor authentication.
  • Privacy and security in Messenger: We’ve expanded the tools available to Messenger users in Ukraine, for example by quickly rolling out screenshot notifications and disappearing messages in our end-to-end encrypted chats.
  • Secure messaging on WhatsApp: As always, your personal messages and calls are protected by end-to-end encryption by default, so they cannot be intercepted by any government. You can now use “view once” media to send photos or videos that disappear after being viewed, and turn on disappearing messages to automatically clear new chats after 24 hours to protect your information in case your phone is lost. We strongly recommend that everyone enable two-step verification to protect against hackers who may try to lock you out of your account.

Enforcing our policies

We are taking extra steps to enforce our Community Standards and Community Guidelines, not only in Ukraine and Russia but also in other countries around the world where content about the war may be shared.

  • We enforce our policies on hate speech, violence and incitement, and coordinating harm, among other things, using technology to help us find violating content quickly, often before people see it and report it to us.
  • We have established a special operations center staffed with experts from across the company, including native Russian and Ukrainian speakers, working around the clock to monitor and respond to this rapidly evolving conflict in real time. This allows us to more quickly remove content that violates our Community Standards or Community Guidelines and provides another line of defense against misinformation.
  • We have teams of native Russian and Ukrainian content reviewers to help us review potentially violating content. We’re also using technology to help us scale the work of our content review teams and prioritize what content those teams should spend their time on, so we can remove more violating content before it goes viral.
  • We receive feedback from a network of local and international partners on emerging risks and act quickly to address those risks. We recognize that local context and language-specific expertise are essential for this work, so we will remain in close communication with experts, partner institutions and non-governmental organizations.
  • As part of this effort, our security teams continue to monitor emerging threats and counter malicious activity.

Reducing the spread of misinformation

We are taking significant steps to fight the spread of misinformation on our services and continue to consult with outside experts.

  • We remove content that violates our policies and work with third-party fact checkers in the region to debunk false claims. When they rate something as false, we move that content lower in the feed so fewer people see it.
  • In response to the crisis, we have expanded our Russian- and Ukrainian-language third-party fact-checking capacity across the region and are working to provide additional financial support to Ukrainian fact-checking partners.
  • To complement our fact-checking partners’ labels, we warn people in the region when they try to share war-related images that our systems detect are more than a year old, so they have more information about potentially outdated or misleading images that may be taken out of context.
  • We’ve also made it easier for fact checkers to find and review war-related content, as we recognize that speed is especially important during breaking news events. We use keyword detection to group related content in one place, making it easier for fact checkers to find.
  • We’re also giving people more information to decide what to read, trust and share by adding warning labels on content deemed false by third-party fact checkers and applying labels to state-controlled media publishers.
  • Messenger, Instagram, and WhatsApp limit message forwarding and label forwarded messages that didn’t originate with the sender, so people know the information comes from a third party.
  • We’re notifying people who have already shared, or are trying to share, content rated by third-party fact checkers so they can decide for themselves whether they want to keep sharing it.
  • Facebook pages, groups, accounts and domains that repeatedly share false information will receive additional penalties. For example, we’ll remove them from recommendations and show all the content they post lower in the feed, so fewer people see it.
  • We show a pop-up notification when people go to follow a Facebook page, group, or Instagram account that has repeatedly shared content that fact-checkers have rated false. People can also tap to read more, including the fact that fact-checkers found false information in some posts shared by that page, group, or account, along with a link to more information about our fact-checking program.

Transparency around state-controlled media

We provide greater transparency on state-controlled media accounts, including Russia-based RT and Sputnik, because they combine the influence of a media organization with the strategic support of a state, and we think people should know if news they read comes from a publication that may be under the influence of a government.

  • We ban ads from Russian state media and demonetize their accounts.
  • We continue to apply labels to other Russian state media.
  • We have refused an order from Russian authorities to stop independent fact-checking and labeling of content posted on Facebook by four Russian state media outlets.
  • State-controlled media, like other publishers, are eligible for fact-checking, and our third-party fact-checking partners can and do rate their content.
  • State-controlled outlets must adhere to our Community Standards and Advertising Policies.
  • Ads and posts by state-controlled media on Facebook and Instagram are prominently labeled. We also apply these labels to Instagram profiles, the “About This Account” section of Instagram accounts, the Page Transparency section of Facebook Pages, and our Ads Library.
  • We developed our definition and standards for state-controlled media organizations with input from over 65 experts from around the world specializing in media, governance, human rights and development.

We remain alert to emerging trends and stand ready to take additional steps to meet the demands of this ongoing conflict.
