Oversight Board asks Facebook owner Meta Platforms to rethink its policy on manipulated media

An oversight board is criticizing Facebook owner Meta's policies on manipulated media as “ineffective” and inadequate to tackle the flood of online disinformation that has begun to target elections around the world this year.

A review of an altered video of President Joe Biden that circulated on Facebook exposed gaps in the policy, the semi-independent board said Monday. The board said Meta's approach should be expanded to cover media regardless of how it is created, not just videos made with artificial intelligence. That includes fake audio recordings, which have already been used to impersonate political candidates in the US and elsewhere.

The Meta Oversight Board said the company should also clarify the harm it is trying to prevent and label images, videos and audio clips as manipulated rather than removing posts altogether.

The board's opinion reflects the intense scrutiny many tech companies are facing over their handling of election misinformation in a year when voters in more than 50 countries go to the polls. As generative artificial intelligence deepfakes and low-quality “cheap fakes” on social media threaten to mislead voters, platforms are scrambling to catch and respond to false posts while protecting users' rights to free speech.

“As it stands, the policy makes little sense,” Oversight Board co-chair Michael McConnell said of Meta's policy in a statement Monday. He said loopholes in the company's policy should be closed while ensuring that political speech remains “unwaveringly protected.”

Meta said it is reviewing the oversight board's guidelines and will publicly respond to the recommendations within 60 days.

While audio deepfakes are not specifically covered by the company's manipulated media policy, they are eligible for fact-checking and can be labeled or down-ranked if fact-checkers rate them as false or altered, spokesman Corey Chambliss said. He said the company will take action against any kind of content that violates Facebook's community standards.

Facebook, which turns 20 this week, remains the most popular social media site for Americans to get their news, according to the Pew Research Center. But other social media sites, including Meta's Instagram, WhatsApp and Threads, as well as X, YouTube and TikTok, are also potential hotbeds where deceptive media can spread and mislead voters.

Meta created its oversight board in 2020 to act as a referee for content on its platforms. Its latest recommendations come after it reviewed an altered clip of President Biden and his adult granddaughter that was misleading but did not violate the company's specific policies.

The original footage showed Biden placing an “I Voted” sticker high on his granddaughter's chest at her request, then kissing her on the cheek. The version that appeared on Facebook was altered to remove that context, making it appear that he touched her inappropriately.

Monday's board ruling upheld Meta's 2023 decision to leave the seven-second clip on Facebook because it did not violate the company's existing manipulated media policy. That policy calls for removing only videos created with artificial intelligence tools that misrepresent someone's speech.

“Because the video in this post was not altered using AI and shows President Biden doing something he didn't do (not something he didn't say), it doesn't violate existing policy,” the ruling said.

The board recommended that the company update the policy and label similar videos as manipulated in the future. It argued that, to protect users' right to freedom of expression, Meta should label content as manipulated rather than remove it from the platform if it does not violate any other policies.

The board held that some manipulated media is made for humor, parody or satire and should be protected. Instead of focusing on how a distorted image, video or audio clip was created, the board said, the policy should focus on the harm manipulated posts can cause, such as disrupting the election process.

Meta said on its website that it welcomes the oversight board's ruling on the Biden post and will update its statement after reviewing the board's recommendations.

Meta is required to comply with the Oversight Board's rulings on specific content decisions, but is not obligated to follow the board's broader recommendations. Still, the board has persuaded the company to make some changes over the years, including giving users who violate its policies more specific messages explaining what they did wrong.

Jen Golbeck, a professor at the University of Maryland's College of Information Studies, said Meta could be a leader in labeling manipulated content, but that follow-through is just as important as changing the policy.

“Will they implement those changes and enforce them in the face of political pressure from people who want to do bad things? That is the real question,” she said. “If they make those changes and don't enforce them, it's going to further contribute to this erosion of trust that comes with misinformation.”


