A Better Way to Learn About Ads on Facebook

By Satwik Shukla, Product Manager

We’re working hard to make advertising more transparent and to give people more information about the ads they see. Today, we’re rolling out an improved Ad Library that brings together all ads on our platforms. We’re also making transparency information more visible on Pages, expanding access to our API to help more people analyze political or issue ads, and exempting news publishers from labeling their ads as related to politics or issues in the US.

Introducing a New Ad Library and Report

We launched the Ad Library — previously called the Ad Archive in the US — to help people learn more about ads related to politics or issues that have run on Facebook or Instagram. The Ad Library offers information about who saw the ad, as well as its spend and impressions, and houses ads for seven years. Today, we’re updating the Ad Library to make it easier to learn about all ads and the Pages that run them. This includes:

  • Bringing in all active ads that any Page is running — not just ads related to politics or issues — creating one place to see ads on Facebook. This information was previously only visible in a Page’s Info and Ads section.
  • Adding more information about Pages in the Ad Library, including:
    • Page creation date, previous Page merges, and name changes.
    • Primary country location of the people who manage a Page, provided the Page has a large audience or runs ads related to politics or issues in select countries.
    • Advertiser spend information for ads related to politics or issues where the Ad Library Report is currently available. This includes all-time spend and spend over the last week, which was previously only available in the Ad Library Report.
  • Improved search in the Ad Library. Past user searches will be saved for people logged in to Facebook, and people can now search by Page, not just keywords.
  • The ability to report ads from within the Ad Library.

We’re also making improvements to the Ad Library Report. Starting in mid-May, we’ll update the Ad Library Report for ads related to politics or issues daily, rather than weekly. We’ll also offer weekly, monthly and quarterly reports that are downloadable for anyone.

 

More Ways to Find Page Information

Beyond the Ad Library, we’re making it easier to find helpful information within Pages too. Now the “Home” tab of every Page will include a “Page Transparency” section with the same Page information we’ve added to the Ad Library: the date the Page was created, previous Page merges, name changes, and primary country location for certain Pages. People previously had to click “Info and Ads” to find this information.

Expanding API Access

We know we can’t protect elections alone. That’s why we’re expanding access to our Ad Library API so others can analyze ads related to politics or issues. To gain access, go through the Facebook Identity Confirmation process, create a Facebook Developer account, and agree to our platform terms of service. Our identity confirmation process helps us make sure people are who they say they are, and it can take up to a few weeks to complete.
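
As a rough illustration of what API access enables, here is a minimal Python sketch of querying the Ad Library API. The endpoint, parameter and field names follow the publicly documented ads_archive edge of the Graph API as of this writing, but treat the exact version, parameters and fields as assumptions; a valid access token from a confirmed developer account is required.

```python
import requests

# Minimal sketch of an Ad Library API query (Graph API "ads_archive" edge).
# Endpoint, parameter and field names are assumptions based on the public
# documentation; they may differ in the current API version.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # from a confirmed developer account

params = {
    "access_token": ACCESS_TOKEN,
    "search_terms": "election",                # keyword to search for
    "ad_type": "POLITICAL_AND_ISSUE_ADS",      # ads related to politics or issues
    "ad_reached_countries": "['US']",          # where the ads were delivered
    "fields": "page_name,funding_entity,spend,impressions,ad_creative_body",
    "limit": 25,
}

resp = requests.get("https://graph.facebook.com/v3.2/ads_archive", params=params)
resp.raise_for_status()
for ad in resp.json().get("data", []):
    print(ad.get("page_name"), ad.get("funding_entity"), ad.get("spend"))
```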

Exemptions for News Publishers in the Ad Library in the US

As we announced in November, we will no longer require qualifying news publishers targeting people in the US to go through the ad authorization process or label their ads as related to politics or issues. All active ads for all Pages will be visible in the Ad Library, but ads from news publishers will no longer be labeled as related to politics or issues — and the news filter will no longer be included in the Ad Library. We will use membership lists from third-party industry organizations, as well as the news Page index and additional criteria, to identify news publishers for exemption.

Protecting 2019 Elections Through Greater Ads Transparency

Over the past two years, we’ve made real strides when it comes to finding and addressing threats to election integrity, including: expanding our teams focused on safety and security to more than 30,000 people globally, onboarding new fact-checking partners, and standing up elections operations centers in Dublin and Singapore. We’ve also launched ads transparency tools in the US, the UK, Brazil and India, as well as Ukraine and Israel leading up to key elections.

Now, ahead of the European Parliamentary election, we’re introducing ads transparency tools in the EU. EU advertisers will need to be authorized in their country to run ads related to the European Parliamentary election or issues of importance within the EU, as well as provide a “Paid for by” disclaimer clearly communicating who is responsible for the ad. These ads will also be archived in the new Ad Library for seven years.

Looking Ahead

We’re committed to creating a new standard of transparency and authenticity for advertising. By the end of June, we’ll roll out transparency tools for political or issue ads around the world. Shining a brighter light on advertising and Pages on Facebook holds us and advertisers more accountable, which is good for people and businesses.




Protecting Elections in the EU

By Richard Allan, VP Global Policy Solutions

In the run-up to the European Parliamentary Elections in May, we are making big changes to the way ads with political content work on Facebook.

We are introducing some new tools to help us deliver on two key goals that experts have told us are important for protecting the integrity of elections — preventing online advertising from being used for foreign interference, and increasing transparency around all forms of political and issue advertising.

Authorizing Advertisers

To help prevent abuse and interference, all EU advertisers will need to be authorized in their country to run ads related to the European Parliamentary elections. We will ask them to submit documents and use technical checks to confirm their identity and location. We will use a combination of automated systems and user reporting to enforce this policy. We recognize that some people may try to work around any system, but we are confident this will be a real barrier for anyone thinking of using our ads to interfere in an election from outside a country.

Importantly, this means that all the people who are reaching you with ads identified as related to politics or issues have been authorized as being in your country and will be required to provide accurate information about who they are. This will help relevant authorities investigate them if they have any suspicions. There are many issues that only election regulators can effectively decide, for example if rules on campaign finance have been followed, and our new tools will help them in this important work.

Labeling Political and Issue Ads

To increase transparency, all ads related to politics and issues on Facebook and Instagram in the EU must be clearly labeled — including a “Paid for by” disclosure from the advertiser at the top of the ad. This means that you can see who is paying for the ad and, for any business or organization, their contact details. When you click on the label, you’ll be able to see more information such as the campaign budget associated with an individual ad, how many people saw it and their age, location and gender. We are inviting all political campaigns to start the ads authorization process now and we will start to block political or issue ads that have not been properly registered from mid-April.

We want to make sure that we capture the broad range of ads that are intended to influence the outcome of the election. This means including not only ads that advocate voting for particular candidates or parties but also issue ads, which don’t explicitly back one candidate or political party but which focus on highly politicized topics such as immigration.

Ad Library

Many people have an interest in getting information about the ads run by political campaigns. This information is especially useful for election regulators and watchdog groups. We have built a new tool called Ad Library to make it easy for everyone to find out about political or issue ads on Facebook. Here you will see all the ads that have been classified as relating to politics or issues and we will keep them in the library for seven years. When you click “See Ad Details” for these ads, you will see information on the number of times the ad was viewed, and demographics about the audience reached including age range, location, and gender. As well as allowing anyone to browse and search in the library, we are expanding access to our API so news organizations, regulators, watchdog groups and people can hold advertisers and us more accountable.
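
As a small illustration of the kind of analysis this access enables, the sketch below aggregates the per-ad demographic breakdown the API is expected to return. The demographic_distribution field name and its percentage/age/gender shape are assumptions drawn from the public API documentation, not a guaranteed response format.

```python
from collections import defaultdict

# Hypothetical ad record; "demographic_distribution" and its entries are
# assumed from the public Ad Library API docs, not a verified schema.
ad = {
    "demographic_distribution": [
        {"percentage": "0.41", "age": "18-24", "gender": "female"},
        {"percentage": "0.33", "age": "18-24", "gender": "male"},
        {"percentage": "0.26", "age": "25-34", "gender": "female"},
    ]
}

# Sum the audience share per age bucket across genders.
share_by_age = defaultdict(float)
for bucket in ad["demographic_distribution"]:
    share_by_age[bucket["age"]] += float(bucket["percentage"])

for age, share in sorted(share_by_age.items()):
    print(f"{age}: {share:.0%}")  # e.g. "18-24: 74%"
```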

We recognize that media coverage of elections and important issues is distinct from advocacy or electoral ads, even if those news stories may refer to parties and candidates and receive paid distribution on Facebook. That is why we are working to exempt ads by news publishers from these new tools. We do not have the systems in place yet to exempt news stories in all countries but will be rolling this out across the EU as we build out the necessary infrastructure.

Committing to Transparency and Accountability 

We believe that more transparency will lead to increased accountability and responsibility over time — not just for Facebook but for advertisers as well. We’re investing heavily in more people and better technology to proactively identify abuse. But if you see an ad which you believe is related to politics or issues and isn’t labeled, please report it. Facebook will review the ad, and if it falls under our political advertising policy, we’ll take it down and add it to the Ad Library.

These changes will not prevent abuse entirely. We’re up against smart, creative and well-funded adversaries who change their tactics as we spot abuse. But we believe that they will help prevent future interference in elections on Facebook. And that is why they are so important.

For an update on our ads transparency tools around the world, see here.




Removing Coordinated Inauthentic Behavior From the Philippines

By Nathaniel Gleicher, Head of Cybersecurity Policy

Today we removed 200 Pages, Groups and accounts that engaged in coordinated inauthentic behavior on Facebook and Instagram in the Philippines, misleading others about who they were and what they were doing.

We are constantly working to detect and stop this type of activity because we don’t want our services to be used to manipulate people. We’re taking down these Pages and accounts based on their behavior, not the content they posted. In this case, the people behind this activity coordinated with one another and used fake accounts to misrepresent themselves, and that was the basis for our action.

While we are making progress rooting out this abuse, as we’ve said before, it’s an ongoing challenge and we constantly have to improve to stay ahead. That means building better technology, hiring more people and working more closely with law enforcement, security experts and other companies.

We’ll update this post with more details when we have them, or if the facts change.

What We’ve Found So Far

The individuals behind this activity used a combination of authentic and fake accounts to disseminate content across a variety of Pages and Groups. They frequently posted about local and political news, including topics like the upcoming elections, candidate updates and views, alleged misconduct of political opponents, and controversial events that were purported to occur during previous administrations. Although the people behind this activity attempted to conceal their identities, our investigation found that this activity was linked to a network organized by Nic Gabunada.

  • Presence on Facebook and Instagram: 67 Pages, 68 Facebook accounts, 40 Groups and 25 Instagram accounts.
  • Followers: About 3.6 million accounts followed one or more of these Pages, about 1.8 million accounts joined at least one of these Groups and around 5,300 accounts followed one or more of these Instagram accounts.
  • Advertising: Around $59,000 USD in spending for ads on Facebook, paid for in Philippine pesos, Saudi riyals and US dollars. The first ad ran in January 2014 and the most recent ad ran in March 2019.

We identified these accounts and Pages through an ongoing internal investigation into Philippines-linked coordinated inauthentic behavior. Our assessment benefited from open source reporting. We have shared relevant information with policymakers and industry partners.

Below is a sample of the content posted by some of these Pages:

Caption: DU30 blessed us again! MRT has new air conditioner units! It’s now gonna be cold in the Philippines!
Photo text: “Comfortable Life for all”

Caption: Husband and his wife that allegedly has 4 lovers, confront each other on Tulfo’s program. Watch the intense confrontation of the two here. Please spread.

Caption: The fountains are so beautiful

Caption: Apparently it’s possible for a man to give birth through Cesarian or C-section. How did this happen? Find out!

Caption: Since the time he was a congressman and now that he is a senator, Kuya Sonny Angara has passed 150 laws.




Four Ideas to Regulate the Internet

By Mark Zuckerberg, CEO 

Technology is a major part of our lives, and companies such as Facebook have immense responsibilities. Every day we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks. These are important for keeping our community safe. But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone.

I believe we need a more active role for governments and regulators. By updating the rules for the internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms.

From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.

First, harmful content. Facebook gives everyone a way to use their voice, and that creates real benefits — from sharing experiences to growing movements. As part of this, we have a responsibility to keep people safe on our services. That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale we’ll always make mistakes and decisions that people disagree with.

Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own. So we’re creating an independent body so people can appeal our decisions. We’re also working with governments, including French officials, on ensuring the effectiveness of content review systems.

Internet companies should be accountable for enforcing standards on harmful content. It’s impossible to remove all harmful content from the internet, but when people use dozens of different sharing services — all with their own policies and processes — we need a more standardized approach.

One idea is for third-party bodies to set standards governing the distribution of harmful content and measure companies against those standards. Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.

Facebook already publishes transparency reports on how effectively we’re removing harmful content. I believe every major internet service should do this quarterly, because it’s just as important as financial reporting. Once we understand the prevalence of harmful content, we can see which companies are improving and where we should set the baselines.

Second, legislation is important for protecting elections. Facebook has already made significant changes around political ads: Advertisers in many countries must verify their identities before purchasing political ads. We built a searchable archive that shows who pays for ads, what other ads they ran and what audiences saw the ads. However, deciding whether an ad is political isn’t always straightforward. Our systems would be more effective if regulation created common standards for verifying political actors.

Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.

Third, effective privacy and data protection needs a globally harmonized framework. People around the world have called for comprehensive privacy regulation in line with the European Union’s General Data Protection Regulation, and I agree. I believe it would be good for the internet if more countries adopted regulation such as GDPR as a common framework.

New privacy regulation in the United States and around the world should build on the protections GDPR provides. It should protect your right to choose how your information is used — while enabling companies to use information for safety purposes and to provide services. It shouldn’t require data to be stored locally, which would make it more vulnerable to unwarranted access. And it should establish a way to hold companies such as Facebook accountable by imposing sanctions when we make mistakes.

I also believe a common global framework — rather than regulation that varies significantly by country and state — will ensure that the internet does not get fractured, entrepreneurs can build products that serve everyone, and everyone gets the same protections.

As lawmakers adopt new privacy regulations, I hope they can help answer some of the questions GDPR leaves open. We need clear rules on when information can be used to serve the public interest and how it should apply to new technologies such as artificial intelligence.

Finally, regulation should guarantee the principle of data portability. If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.

This is important for the internet — and for creating services people want. It’s why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who’s responsible for protecting information when it moves between services.

This also needs common standards, which is why we support a standard data transfer format and the open source Data Transfer Project.

I believe Facebook has a responsibility to help address these issues, and I’m looking forward to discussing them with lawmakers around the world. We’ve built advanced systems for finding harmful content, stopping election interference and making ads more transparent. But people shouldn’t have to rely on individual companies addressing these issues by themselves. We should have a broader debate about what we want as a society and how regulation can help. These four areas are important, but, of course, there’s more to discuss.

The rules governing the internet allowed a generation of entrepreneurs to build services that changed the world and created a lot of value in people’s lives. It’s time to update these rules to define clear responsibilities for people, companies and governments going forward.




Why Am I Seeing This? We Have an Answer for You

By Ramya Sethuraman, Product Manager

News Feed is always personalized to you — and starting today you’ll have even more information and control over your experience.

We’re introducing “Why am I seeing this post?” to help you better understand and more easily control what you see from friends, Pages and Groups in your News Feed. This is the first time that we’ve built information on how ranking works directly into the app.

We’re also making improvements to “Why am I seeing this ad?” a tool we launched back in 2014. We’ve received valuable feedback over the years that has helped us expand the information we share with people about the ads they see.

This means you’ll be able to tap on posts and ads in News Feed, get context on why they’re appearing, and take action to further personalize what you see.

Helping You Understand and Control the Posts You See
The goal of News Feed is to show people the posts that are most relevant to them. “Why am I seeing this post?”, which can be found in the drop-down menu in the right-hand corner of a post, explains how your past interactions affect the ranking of posts in your News Feed. Specifically, you’ll be able to see:

  • Why you’re seeing a certain post in your News Feed — for example, if the post is from a friend you made, a Group you joined, or a Page you followed.
  • What information generally has the largest influence over the order of posts, including: (a) how often you interact with posts from people, Pages or Groups; (b) how often you interact with a specific type of post, for example, videos, photos or links; and (c) the popularity of the posts shared by the people, Pages and Groups you follow (a toy scoring sketch follows this list).
  • Shortcuts to controls, such as See First, Unfollow, News Feed Preferences and Privacy Shortcuts, to help you personalize your News Feed.
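
To make the three signal categories above concrete, here is a toy sketch of how such signals could be combined into a single ranking score. The weights and the function itself are illustrative assumptions, not Facebook’s actual News Feed model.

```python
# Toy illustration only — not Facebook's ranking system. It combines the three
# signal categories described above (affinity with the poster, preference for
# the post type, and overall popularity) using made-up weights.
def toy_feed_score(poster_affinity: float, type_preference: float,
                   post_popularity: float) -> float:
    """All inputs are assumed to be normalized to the 0..1 range."""
    return 0.5 * poster_affinity + 0.3 * type_preference + 0.2 * post_popularity

# Example: a very popular video from a Page you rarely interact with.
print(toy_feed_score(poster_affinity=0.1, type_preference=0.8, post_popularity=0.9))
```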

During our research on “Why am I seeing this post?”, people told us that transparency into News Feed algorithms wasn’t enough without corresponding controls. People wanted to be able to take action, so we’ve made it easy to manage what you see in News Feed right from this feature.

People’s feedback also helped us determine what specific information would be most valuable to highlight. For example, we’ve included examples of people’s interactions that contribute to each of the three signal categories we show in the feature. Over time, we’ll continue to get feedback on “Why am I seeing this post?” and make improvements.

Giving You More Context on Your Ads
We’re also making updates to “Why am I seeing this ad?”. Since we launched this feature more than four years ago, you’ve been able to see how factors like basic demographic details, interests and website visits contribute to the ads in your News Feed. Now we’ll include additional details about the ads you see when information on an advertiser’s list matches your Facebook profile.

Businesses can reach their customers by uploading information they already have, such as emails or phone numbers. We then try to match the ad to the most relevant audience without revealing any identifiable information back to the business. “Why am I seeing this ad?” will now provide details such as when the advertiser uploaded the information or if the advertiser worked with another marketing partner to run the ad.
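
One common way to match an uploaded customer list without handling raw contact details is to compare normalized, hashed identifiers. The sketch below illustrates that general approach (SHA-256 hashing is what Facebook’s public guidance for customer lists describes); it is not a description of Facebook’s internal matching pipeline.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Lowercase, trim and SHA-256 an email so only the hash needs to be shared."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# The advertiser uploads only hashes of its customer emails...
uploaded_hashes = {normalize_and_hash(e) for e in ["Alice@example.com", "bob@example.com"]}

# ...and the platform hashes its own records and checks for overlap, so matching
# happens on hashes rather than on raw, identifiable contact details.
profiles = {"carol@example.com": "user_1", "alice@example.com": "user_2"}
matched = {uid for email, uid in profiles.items()
           if normalize_and_hash(email) in uploaded_hashes}
print(matched)  # {'user_2'}
```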

An Ongoing Commitment
Both of these updates are part of our ongoing investment in giving people more context and control across Facebook. We will continue to listen to your feedback and evolve these features over time.

For more information, visit our Help Center and watch this video from Inside Feed.




More Clarity, More Control

News Feed is always personalized to you — and starting today you’ll have even more information and control over your experience.

We’re introducing “Why am I seeing this post?” to help you better understand and more easily control what you see from the friends, Pages and Groups you follow in your News Feed. This is the first time that we’ve built information on how ranking works directly into the app.

In this video, product manager Ramya Sethuraman and researcher Laura Rivera offer a preview of what you’ll see in the feature and what they learned designing it.




Removing Coordinated Inauthentic Behavior and Spam From India and Pakistan

By Nathaniel Gleicher, Head of Cybersecurity Policy

We have removed Pages, Groups and accounts for violating Facebook’s policies on coordinated inauthentic behavior or spam. Today’s action includes four separate takedowns, each distinct and unconnected.

  • We removed 103 Pages, Groups and accounts on both Facebook and Instagram for engaging in coordinated inauthentic behavior as part of a network that originated in Pakistan.
  • We removed 687 Facebook Pages and accounts — the majority of which had already been detected and suspended by our automated systems — that engaged in coordinated inauthentic behavior in India and were linked to individuals associated with an IT Cell of the Indian National Congress (INC).
  • We removed 15 Facebook Pages, Groups and accounts that engaged in coordinated inauthentic behavior in India and were linked to individuals associated with an Indian IT firm, Silver Touch.
  • We removed 321 Facebook Pages and accounts in India that have broken our rules against spam. Unlike the first three actions, this last activity does not represent a single or coordinated operation — instead, these are multiple sets of Pages and accounts that behaved similarly and violated our policies.

We have detailed each of these actions below.

Coordinated Inauthentic Behavior Enforcement

The operations we found to be engaged in coordinated inauthentic behavior were two distinct sets of activity in India and one network in Pakistan. We didn’t find any links between the campaigns we’ve removed today, but they used similar tactics by creating networks of accounts to mislead others about who they were and what they were doing.

We are constantly working to detect and stop coordinated inauthentic behavior because we don’t want our services to be used to manipulate people. We’re taking down these Pages and accounts based on their behavior, not the content they posted. In each case detailed below, the people behind this activity coordinated with one another and used fake accounts to misrepresent themselves, and that was the basis for our action.

While we are making progress rooting out this abuse, as we’ve said before, it’s an ongoing challenge and we’re committed to continuously improving to stay ahead. That means building better technology, hiring more people and working more closely with law enforcement, security experts and other companies.

In each case below, we identified violating accounts and Pages through ongoing internal investigations into coordinated inauthentic behavior in the region ahead of the upcoming elections in India. We have shared relevant information with policymakers and technology platforms.

Pakistan

Today we removed 103 Pages, Groups and accounts for engaging in coordinated inauthentic behavior on Facebook and Instagram as part of a network that originated in Pakistan. The individuals behind this activity used fake accounts to operate military fan Pages; general Pakistani interest Pages; Kashmir community Pages; and hobby and news Pages. They also frequently posted about local and political news, including topics like the Indian government, political leaders and the military. Although the people behind this activity attempted to conceal their identities, our investigation found that it was linked to employees of the ISPR (Inter-Services Public Relations) of the Pakistani military.

  • Presence on Facebook and Instagram: 24 Pages, 57 Facebook accounts, 7 Groups and 15 Instagram accounts.
  • Followers: About 2.8 million accounts followed one or more of these Pages, about 4,700 accounts joined at least one of these groups, and around 1,050 accounts followed one or more of these Instagram accounts.
  • Advertising: Around $1,100 in spending for ads on Facebook paid for in US dollars and Pakistani rupees. The first ad ran in May 2015 and the most recent ad ran in December 2018.

Below is a sample of the content posted by some of these Pages:

Image text: Today is the 6th death anniversary of M.M. Alam, who destroyed 5 Indian jets within 1 minute. Who has the courage to hamper our flight? We fly with courage, not wings

India

Today we removed two separate and unrelated networks of Pages, Groups and accounts that engaged in coordinated inauthentic behavior on Facebook and Instagram in India. The first was linked to individuals associated with the INC IT Cell, and the second was connected to individuals associated with an Indian IT firm, Silver Touch. Here are some details.

We removed 687 Facebook Pages and accounts for engaging in coordinated inauthentic behavior in India. The individuals behind this activity used fake accounts, the majority of which had already been detected and suspended by our automated systems, and joined various Groups to disseminate their content and increase engagement on their own Pages. The Page admins and account owners typically posted about local news and political issues, including topics like the upcoming elections, candidate views, the INC and criticism of political opponents including the Bharatiya Janata Party (BJP). While the people behind this activity attempted to conceal their identities, our review found that it was connected to individuals associated with an INC IT Cell.

  • Presence on Facebook: 138 Pages and 549 Facebook accounts.
  • Followers: About 206,000 accounts followed one or more of these Pages.
  • Advertising: Around $39,000 USD in spending for ads on Facebook, paid for in Indian rupees. The first ad ran in August 2014 and the most recent ad ran in March 2019.

Below is a sample of the content posted by some of these Pages:

Caption: Rahul Gandhi’s callout for “Justice” against Modi’s “Injustice.”
Image Text: If a person’s income is 5,000 rupees, then the Congress government will credit 7,000 rupees to his account.

Caption: They want us to get angry / Fight in the name of temple mosques / But we are not fighting! / That’s just their defeat…!!
Image Text: They want us to get angry / Fight in the name of temple mosques / But we are not fighting! / That’s just their defeat…!!

Caption: Local Tribal’s lands were invaded and donated amongst the 10-15 of his friends. Corruption won’t be tolerated anymore. Uproot and throw Raman Government.
Video Text: The elaborate poetry of the caption mentioned above, accusing the CM of Chhattisgarh of the crime

Caption: Make the right decision / Select the poor Congress party
Image Text: MNREGA Holding 14 million families out of poverty / Modi and BJP had opposed the MNREGA / Now Jetli, Modi and BJP are opposing the Act of providing Rs.72,000 annually to the poor / Make the right decision Select the poor Congress party

 

Separately, we removed 15 Pages, Groups and accounts for engaging in coordinated inauthentic behavior on Facebook and Instagram in India. A small number of Page admins and account owners used a combination of authentic and fake accounts to share their content across a variety of Pages. They posted about local news and political events, including topics like the Indian government, the upcoming elections, the BJP and alleged misconduct of political opponents including the INC. Although the people behind this activity attempted to conceal their identities, our investigation found that this activity was linked to individuals associated with an Indian IT firm, Silver Touch.

  • Presence on Facebook and Instagram: 1 Page, 12 Facebook accounts, 1 Group and 1 Instagram account.
  • Followers: About 2.6 million accounts followed this Page, about 15,000 accounts joined this Group, and around 30,000 accounts followed this Instagram account.
  • Advertising: Around $70,000 USD in spending for ads on Facebook, paid for in Indian rupees. The first ad ran in June 2014 and the most recent ad ran in February 2019.

Below is a sample of the content posted by some of these Pages:

Caption: If Modi is there it’s possible, UPA government did not even give the permission
Video Content: Interview of former DRDO chief VK Saraswat with the TIMES NOW news channel, explaining how the Modi government gave the approvals for Mission Shakti and how the UPA government did not give permission for this project. The whole interview is in English.

Caption: India has become super power in space also.
Image Text: India has blown up a live satellite in space.
Speech Bubble: “Hail Mother India!”

Removing Additional Pages and Accounts that Violate our Spam and Misrepresentation Policies in India

We also removed 227 Pages and 94 accounts in India for violating our policies against spam and misrepresentation. These policies, outlined in Facebook’s Community Standards, are designed to help make sure people can trust the connections they make on Facebook and are not misled about the content they are seeing. These Pages and accounts were engaging in behaviors that expressly violate our policies. This included using fake accounts or multiple accounts with the same names; impersonating someone else; posting links to malware; and posting massive amounts of content across a network of Groups and Pages in order to drive traffic to affiliated websites and make money. Unlike the takedowns for coordinated inauthentic behavior, this activity was not part of one coordinated operation.

We routinely remove accounts and Pages that engage in this type of harmful, often financially-motivated, behavior — like ads for fraudulent products or fake weight loss “remedies.” The people behind this behavior create Pages using fake accounts or multiple accounts with the same names. They post clickbait posts on these Pages to drive people to websites that are entirely separate from Facebook and seem legitimate, but are actually ad farms. The people behind the activity also post the same clickbait posts in dozens of Facebook Groups, often hundreds of times in a short period, to drum up traffic for their websites. And they often use their fake accounts to generate fake likes and shares. This artificially inflates engagement for their inauthentic Pages and the posts they share, misleading people about their popularity and improving their ranking in News Feed.
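
For illustration only, the sketch below shows one simple behavioral heuristic of the kind this description suggests: flagging an account that posts the same link to many distinct Groups within a short window. The thresholds and the function are invented for the example and are not Facebook’s actual detection system.

```python
from datetime import timedelta

# Illustrative heuristic, not Facebook's detection system: flag accounts that
# post the same URL to more than MAX_GROUPS distinct Groups within WINDOW.
MAX_GROUPS = 10
WINDOW = timedelta(hours=1)

def flag_spammy_accounts(posts):
    """posts: iterable of (account_id, group_id, url, timestamp) tuples."""
    flagged = set()
    history = {}  # (account_id, url) -> list of (timestamp, group_id)
    for account, group, url, ts in sorted(posts, key=lambda p: p[3]):
        events = history.setdefault((account, url), [])
        events.append((ts, group))
        # Count distinct Groups this account hit with the same URL recently.
        recent_groups = {g for t, g in events if ts - t <= WINDOW}
        if len(recent_groups) > MAX_GROUPS:
            flagged.add(account)
    return flagged
```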

This activity goes against what people expect on Facebook and it violates our policies. This is why we continue to invest in people and resources to improve the technology we use to detect this type of harmful behavior, and we will continue to take action on an ongoing basis to address it.





A Conversation with Mark Zuckerberg and Mathias Döpfner

Today, Mark Zuckerberg shared the second conversation of his 2019 personal challenge. He sat down with Mathias Döpfner, CEO of Axel Springer, Europe’s largest publisher, to discuss the role quality journalism plays in building informed communities and the principles Facebook should use to build a news tab that surfaces more high-quality news. They also covered the privacy-focused vision Mark laid out for the future of social networking and the four ideas for internet regulation he published over the weekend.

You can find the video, transcript and audio file below. And you can watch Mark’s conversation with Harvard Law Professor Jonathan Zittrain here.


Getting Input on an Oversight Board

By Brent Harris, Director of Governance and Oversight Board

Every day, teams at Facebook make difficult decisions about what content should stay up and what should come down based on our Community Standards. But given the size of our community — and the reach of our platform — we don’t think we should be making all of these decisions on our own. In November, Mark Zuckerberg laid out a plan for a new way for people to appeal content decisions to an independent board. And earlier this year, we released a draft charter giving more detail about its potential structure.

Today, we’re opening a public consultation process to help us answer questions around the design for this Oversight Board. This is another part of our ongoing, global engagement with a wide range of organizations, think tanks and researchers to determine how best to empower this entity to render independent judgment on some of Facebook’s most important and challenging content decisions.

Each submission is broken into two sections: a questionnaire and free-form questions. The questionnaire responses will feed into the engagement happening in round tables and workshops around the world. The free-form questions will focus on membership, case decisions and governance, and will provide invaluable design and structural options. Responses will be accepted for the next six weeks.

We’re partnering with a team from the firm Baker McKenzie, who are providing project management support for this effort and will review each submission. In June, we’ll publish a report summarizing what we’ve learned through these submissions and in the broader conversations we’re having.

We look forward to continuing outreach and engagement that will help build a board and create accountability and oversight of our content policy and enforcement decisions. We will continue to share milestones and progress along the way.

To share your opinion and perspective, go here.


