Facebook non-partisan, politically neutral: India chief Ajit Mohan

Facebook India head Ajit Mohan has defended the platform's handling of alleged hate speech by members of the ruling BJP, saying the platform has remained true to its design of being neutral and non-partisan and has acted on inputs from various teams.

In an interview with PTI, Mohan rejected charges that Facebook India’s decisions are influenced by the political leanings of individuals, saying the process followed at the platform is designed to ensure that no one person can influence outcomes, let alone take unilateral decisions.

“The content policy team that is at the centre of all the enforcement decisions (on hate speech) is separate and independent in India from the public policy team (that handles government relations),” he said. “It’s designed for independence.”

The content management team is guided only by the community standards. “And enforcement of that has to be objective, has to be non-partisan and neutral. I think that goes to the heart of how the platform has been designed from day one,” he asserted.

Individuals can have “points of view” or “leanings”, but the “system is designed to make sure no one person can influence the outcomes,” he said.

“And so the answer is yes,” he said, replying to a question on whether Facebook is a non-partisan and politically neutral entity.

The comments come amid a political storm over a report published in the Wall Street Journal last month, alleging the social media giant ignored extremist posts by ruling Bharatiya Janata Party leaders to protect its business interests in India.

According to the report, Facebook deleted anti-Muslim posts by the BJP’s Telangana MLA T Raja Singh and three other Hindu nationalists only after being questioned by the paper. Facebook’s head of public policy Ankhi Das, the report said, citing company employees, had opposed deleting the posts even though they had been flagged internally as breaching standards.

Facebook earlier this month banned the 42-year-old Telangana MLA, categorising him as a “dangerous individual”.

Mohan said the company holds free speech to a respectable standard, both within and outside the company.

While there are people of multiple political leanings and backgrounds in the company, Facebook values experience in government and in public service, he said.

“But at the same time, I think it is important to call out to you that the content policy team that is at the centre of all of these enforcement decisions is separate and independent in India from the public policy team here. It’s designed for independence. So the public policy team that engages with government, for example, central and state governments, is a part of my team. That is separate from the content policy team, which is part of the global team,” he said.

“So, I think the point is while people can have points of view, they can have leanings, the system is designed to make sure no one person can influence the outcomes, let alone have any unilateral decision-making power on this aspect. The separation in the India context tells you how the design is meant for independence,” he said.

Mohan said content moderation is largely done through automated systems and human reviewers.

In dealing with complex issues, such as designating individuals (especially elected officials) as dangerous, the content policy team comes into play.

“The content policy team does seek inputs from multiple functions, disciplines and teams, including the public policy team in India. That is not interference. That is the process, which is designed to bring in enough local context from the local team,” he said. “But finally, the decision is not taken by the public policy team.”

“So in certain cases, like designation, multiple local and international teams have the opportunity to provide a point of view. That is by design. But it is not for them to take any unilateral decision. That still goes through the content policy team,” he said.

Facebook has over 300 million users in India, while WhatsApp, which it owns, is the leader in messaging with over 400 million users.

In April this year, Facebook invested USD 5.7 billion to buy a 9.9 per cent stake in Jio Platforms, the digital arm of energy-to-telecom conglomerate Reliance Industries Ltd owned by India’s richest man Mukesh Ambani.

Mohan said Facebook has an impartial approach to dealing with content and that this is governed strongly by its community standards. These policies are enforced globally without regard to anyone’s political position, party affiliation or religious and cultural beliefs, he emphasised.

“That is the basis and enforcement of that has to be objective, has to be non-partisan and neutral. I think that goes to the heart of how the platform has been designed from day one. It goes to the heart of something all of us embrace, that we have to be neutral, we have to be non-partisan,” he added.


Detective Agencies, Film Noir and Society’s Relationship to the Elderly: Maite Alberdi on Her Doc, The Mole Agent


Responding to a help-wanted ad, 85-year-old Sergio Chamy agrees to infiltrate a Santiago nursing home as a “mole agent” to find out if a client’s mother is being abused. As a “spy” he uncovers a hidden world of frustration and loneliness. 

Maite Alberdi’s documentary borrows from film noir before evolving into an unsettling look at the lives of the elderly. It was developed with the help of the Sundance Institute Documentary Film Program and the Tribeca Film Institute. The Mole Agent screened at Sundance, and is available on demand starting September 1.

Filmmaker spoke with Alberdi from her office in Santiago.

Filmmaker: How did you start on this project?

Maite Alberdi: I wanted to make a documentary about private detectives. I’m a super fan of film noir and pulp fiction, and I realized that I never saw a documentary that centered around a detective agency. That was my starting point. I researched agencies, which is how I met Romulo, a retired police officer who had his own shop. He handled several “mole” cases. I worked with him a couple of times, and one of the cases involved the retirement home. I realized I wanted to shoot there.

Filmmaker: What did you do for Romulo?

Alberdi: I followed people. I would meet with clients, interview them, take notes. Then I had cases where parents wanted to follow their children, or I followed couples. A lot of things.

Romulo usually worked with the same mole, but that man broke his hip and had to be replaced just as we were ready to start shooting. Romulo put an ad in the paper to find and train a new mole.

Filmmaker: So in effect Romulo cast Sergio.

Alberdi: No, he wasn’t going to pick Sergio. I had to convince him. Romulo wanted someone else, someone I didn’t think was empathetic. The one Romulo liked was accompanied by his wife during the interview. And Romulo being super-machismo, I could say, “Maybe the wife will be there all the time. She could be a problem. That won’t happen with Sergio.”

Filmmaker: You were like a private eye yourself, investigating the investigators.

Alberdi: Exactly. I feel Sergio’s job is super-similar to my job as a documentary filmmaker. Because when I’m shooting, I spend a lot of time waiting, waiting, until I have the scene. Documentary filmmaking requires a lot of patience. Some days I never press “rec” because nothing interesting is happening. For Sergio it’s the same: he’s waiting, following people, waiting, waiting until he takes the pictures or gets the proof that he needs.

I’m always spying on people. They know I’m there, that’s the big difference. I observe people without participating.

Filmmaker: How did you persuade the nursing home to agree to filming?

Alberdi: We said that I wanted to make a film about old age. I had previously released a film in Chile about older people, so it wasn’t weird that I wanted to shoot there. We said we would shoot both the good things and the bad things that happen there, so if we saw something bad, we would show it. They signed an agreement to that effect. Then we said that if someone new arrived, we wanted to focus on their experiences. That they allowed too. We introduced ourselves to the staff, and we shot inside the retirement home for three weeks before Sergio arrived. When he came, we acted as if we didn’t know each other.

There was a real client, a real case that Sergio was working on. It was a family problem: someone wanted to prove to her brothers and sisters that their mom wasn’t okay there. Of course, I started to realize that the nursing home was a good place, and then I felt super-guilty about lying to them.

When we finished the film, they were the first people we showed it to. I said, “I lied to you, it was a film about a mole.” When they saw it, they loved it. They cried a lot. Now they are the best promoters of the film.

Filmmaker: One of the saddest aspects of The Mole Agent is that it shows how even with a good environment and a caring staff, the elderly have trouble dealing with isolation.

Alberdi: We always put the blame on the institution. Like with school, and my kids, it’s always the teacher’s fault. But I’m the one who’s not building a community there.

With retirement homes it’s the same. We put our old people there and forget them. We don’t work to make it a good place, a community. You can correct the problem by connecting them with families, integrating them into society. In Latin America it’s really common to isolate older people. It was the same with my previous film [The Grown-Ups, 2016], which was about people with Down Syndrome. Their parents put them in a special needs school, and fifty years later they’re still there.

Filmmaker: Your visual style is arresting. The Mole Agent settles into the rhythms of the elderly, with imagery that reflects their feelings. Can you talk about collaborating with cinematographer Pablo Valdés?

Alberdi: I have been working with Pablo for 10 years, we’ve made, I think, five films together. Here I really wanted to make a film noir, I wanted to shoot angles like a fiction film. We had some style references, but we ended up using the same techniques we always use.

We spend a lot of time with people until they get used to the camera. I would try to figure out which ones didn’t, so we wouldn’t shoot them. The people in the home have a routine that doesn’t change very much. They have lunch at the same time, for example. It’s like my life: I don’t change that much, I know my routine. So if I know the routine, I can predict how things are going to happen, and at what time and place.

We spend a lot of time planning the frame. And then we wait. That’s why I don’t use a handheld camera, for example: we could never wait that long holding a camera. I would love to make a film with a more mobile camera, but we can’t move.

Filmmaker: You said in an interview that reality is cyclical, and that you discover patterns within it.

Alberdi: I don’t make films about the past. I am shooting in the present in all of my films. When I’m shooting, I trust that if I wait, the things that I saw before will happen again. I don’t know when, but they are going to happen. So because I had seen the other mole cases, in my mind I knew what kind of things Romulo was going to ask Sergio. So I knew what I was going to shoot.

I’m going to give you an example from the first film where I learned that. It’s called A Lifeguard (El salvavidas, 2011). The main character thinks that the best lifeguard is the one who never needs to go into the water — he prevents accidents from happening. But he works at the most dangerous beach in Chile, where every summer someone drowns. My concern was: okay, I have a film about the lifeguard. He has to face whether or not to go into the water, and I need that in my narrative. But how can I shoot that if I’m shooting a second character, or running around someplace else?

Okay, I have to study the behavior at this beach. I spent a summer trying to understand the routines there. I studied the marine statistics. I learned that all of the people drowned at the same place, between five and six in the evening. I didn’t know which day it was going to happen, but I knew the time and the place. So we spent all summer in the same place at the same time, waiting. We were there when it happened, and we have it in the film.

Filmmaker: But you’re still selecting, choosing as you go along. There is a scene in The Mole Agent you couldn’t have predicted, when a frightened woman breaks down into tears in front of Sergio.

Alberdi: In some ways you can predict, because you learn the world there. There were 50 women in the home, and we chose six to follow because we knew something was going to happen to them. That woman, for example, kept saying her son didn’t come to visit. That’s something she said to other people, something she said to me. So I knew that when Sergio introduced himself, she would say something similar.

Filmmaker: That moment reaches a universal truth, the fear everyone faces about growing old. It stripped away the rest of the narrative framework for me.

Alberdi: I believe that documentary filmmaking is like being a sculptor. You have this big rock that is reality, and it is big, because that place has a lot of people. You have to chisel until a figure appears. The decision about what you are taking out is more important than what you are keeping.

Filmmaker: You had 300 hours of material. How difficult was the editing process?

Alberdi: We had a lot of versions. Take that scene you mentioned: at the time I shot it, I was living with Sergio in the home. I was living the same feelings as he did. I had the same emotional commitments. And I had to deal in the editing with how to balance the original case and my emotional experiences.

We shot the case, the client, all the details about her. In the beginning I thought I had to explain everything, and until the end of shooting, the narrative plot was the case itself. In the editing room I found my heart was not in the case. Yes, it was rational, it advanced the story. But my emotions were what was driving me forward. It was super-difficult to realize that, and to say, for example, “Okay, the client is not going to appear after all.”

It took me a year to remove the client and make the movie Sergio’s journey. Or, for example, the decision to put myself in a shot: that was an editing decision. We edited in the Netherlands and showed it to a lot of Dutch people, who kept asking, “Is this really a documentary?” I didn’t want people to get lost, so I preferred to put that in the beginning to make it easier to enter into the story.

Filmmaker: What’s your next project?

Alberdi: We are very early in shooting a film about a young couple. The man is fifty years old and has Alzheimer’s, and it’s a love story about how the couple deals with that. Covid has made it terrible for them, and for me too, because I can no longer shoot them. But she has started shooting, and that has brought a new life to the project.

It’s frustrating for everybody, not just me. It’s difficult after working on this for so many years to try to adapt to new forms of exhibition. My mind needs to be more open.


Facebook wants to know how it’s shaping the 2020 elections — researchers say it’s looking too late and in the wrong places


Facebook was first warned in late 2015 that Cambridge Analytica was misusing data illicitly harvested from millions of Americans in an attempt to sway the 2016 US elections.

It didn’t pull the plug on the firm’s access to user data until March 2018 after reporting from The Guardian turned the breach into a global scandal.

More than two years later — and barely two months before the deadline for voters to cast their ballots in the 2020 elections — Facebook has decided it wants to know more about how it impacts democracy, announcing last week that it would partner with 17 researchers to study the impact of Facebook and Instagram on voters’ attitudes and actions.

But researchers outside the project are conflicted. While they praised Facebook for promising more transparency and independence than it has offered before, they also questioned why the company waited so long and just how much the study will really bring to light.

“Isn’t this a little bit too late?” Fadi Quran, a campaign director with nonprofit research group Avaaz, told Business Insider.

“Facebook has known now for a long time that there’s election interference, that malicious actors are using the platform to influence voters,” he said. “Why is this only happening now at such a late stage?” 

Facebook said it doesn’t “expect to publish any findings until mid-2021 at the earliest.” The company did not reply to a request for comment on this story.

Since the company is leaving it to the research team to decide which questions to ask and draw their own conclusions — a good thing — we don’t yet know much about what they hope to learn. In its initial announcement, Facebook said it’s curious about: “whether social media makes us more polarized as a society, or if it largely reflects the divisions that already exist; if it helps people to become better informed about politics, or less; or if it affects people’s attitudes towards government and democracy, including whether and how they vote.”

Facebook executives have reportedly known the answer to that first question — that the company’s algorithms do help polarize and radicalize people — and knowingly shut down efforts to fix the issue or even research it further.

But even setting that aside, researchers say they’ve already identified some potential shortcomings in the study.

“A lot of the focus of this work is very much about how honest players are using these systems,” Laura Edelson, a researcher who studies political ads and misinformation at New York University, told Business Insider.

“Where I’m concerned is that they’re almost exclusively not looking at the ways that things are going wrong, and that’s where I wish this was going further,” she added.

Quran echoed that assessment, saying: “One big thing that they’re going to miss by not looking more deeply at these malicious actors, and just by the design, is the scale of content that’s been created by these actors and that’s influencing public opinion.”

A long list of research and media reports has documented Facebook’s struggles to effectively keep political misinformation off its platform — let alone misleading health claims, which, despite Facebook’s more aggressive approach, still racked up four times as many views as posts from sites pushing accurate information, according to Avaaz.

But political information is much more nuanced and constantly evolving, and even in what seem to be clear-cut cases, Facebook has, according to reports, at times incorrectly enforced its own policies or bent over backward to avoid possible political backlash.

Quran and Edelson both worried that Facebook’s election study may not capture the full impact of aspects of the platform like its algorithms, billions of fake accounts, or private groups.

“You find what you go and you look for,” Edelson said. “The great problem of elections on Facebook is not how the honest actors are working within the system.”

Quran also said, though it’s too early to say this will happen for sure, that because it’s Facebook asking users directly within its apps to join the study, sometimes in exchange for payment, it risks inadvertently screening out people who distrust the company to begin with.

“We’re already seeing posts on different groups that share disinformation telling people: ‘Don’t participate in the study, this is a Facebook conspiracy’” to spy on users or keep Republicans off the platform ahead of the election, he said. “What this could lead to, potentially, is that the people most impacted by disinformation are not even part of the study.”

In a best-case scenario, Edelson said the researchers could learn valuable information about how our existing understanding of elections maps onto the digital world. Quran said the study could even serve as an “information ecosystem impact assessment,” similar to environmental impact studies, that would help Facebook understand how changes it could make might impact the democratic process.

But both were skeptical that Facebook would make major changes based on this study or the 2020 elections more broadly. And Quran warned that, despite Facebook’s efforts to make the study independent, people shouldn’t take the study as definitive or allow it to become a “stamp of approval.”

It took Facebook nearly four years from when it learned about Cambridge Analytica to identify the tens of thousands of apps that were also misusing data. And though it just published the results of its first independent civil rights audit, the company has made few commitments to implement any of the auditors’ recommendations.

