How Watermelon Cupcakes Kicked Off an Internal Storm at Meta

Arab and Muslim workers at Meta allege that its response to the crisis in Gaza is one-sided and out of hand. “It makes me sick that I work for this company,” says one employee.

In late May, Meta invited New York staff to what it called a “summery showcase” to learn about clubs across the company. Its promotional poster featured colorful slushies and watermelon desserts.

But when a club for Muslim workers revealed plans to spend $200 in company funds to serve nine dozen cupcakes in watermelon colors at the event, Meta management called the offering disruptive and demanded the group go another route—such as “traditional Muslim sweets,” a staffer overseeing internal community relations wrote in a chat to an organizer. “Watermelon references or imagery should not be included as part of materials or giveaways (e.g. cupcakes).”

The dispute over workplace treats, which two employees described to WIRED and a third posted about publicly on Instagram, is emblematic of the deep ruptures carved across the tech industry by the ongoing war in Gaza between Israel and Hamas.

For decades, watermelon has been a stand-in for Palestinian resistance to Israeli occupation because its colors match those of the Palestinian flag. The fruit’s symbolic usage has grown since the latest fighting broke out last October. Jewish and Israeli tech workers have felt targeted as pro-Palestinian rhetoric and symbols sometimes get interspersed with what they view as antisemitic or anti-Zionist hate.

Meta deemed the planned watermelon cupcakes a violation of its ban on workplace discussions about war or statehood, though its New York cafeteria served fresh watermelon slices the day of the club fair and many times since. In the end, green cupcakes with pink frosting and black pearl topping (which didn’t look much like watermelon) were served.

“I am deeply concerned and tired of the exorbitant internal censorship at Meta, that is now hinging on absurdity,” Saima Akhter, a data scientist at Meta involved in the proposed cupcake offering, wrote on Instagram on May 29 after the company squashed the plan.

She is among 15 Muslim and Arab workers across multiple tech companies who told WIRED or said on social media that incidents like the cupcake dispute have left them feeling unsupported by their employers. They worry this has translated into poor product decisions that harm some users. “How can I trust that we as a company can moderate content on our platforms equitably for our users, when I see how we moderate content internally—in a discriminatory and absurd manner,” Akhter wrote in her post.

Meta fired Akhter two weeks later, making her, according to two sources, one of at least four pro-Palestinian employees the company has let go since October 7 over various internal policy violations. Akhter declined to comment for this story, but in a public post on Instagram she said that Meta fired her for making a personal copy of a 47-page internal document compiled by employees about allegedly biased treatment of Palestine-related content on the company’s services. It described translation errors that went unaddressed, AI-generated WhatsApp stickers that portrayed Palestinians as terrorists, the disabling of users’ pro-Palestinian fundraisers, and discrepancies between how the company responded to the war in Ukraine and the war in Gaza, a former employee says.

Meta also declined to comment for this story. But in a previously unreported company-wide message from June 4 seen by WIRED, Meta’s chief diversity officer, Maxine Williams, wrote that the tech giant “decided to limit discussions around topics that have historically led to disruptions in the workplace, regardless of the importance of those topics—this includes content related to war and statehood. Some topics are, simply put, off-limits.” She wrote that Meta had considered the importance of personal expression and supporting or educating colleagues, but “landed on policies that prioritize conversation … that can be discussed without disturbance or distress.”

Workers are frustrated that the crackdown on conversation about Gaza has suppressed not only potential expression of support for Palestinians, but also what they view as legitimate efforts to investigate and demand fixes to reported issues in Meta’s services affecting pro-Palestinian users. “Our concerns are not being addressed,” one employee says. Most workers spoke to WIRED on the condition of anonymity to avoid retaliation from their employers.

In February, Meta fired Ferras Hamad, a machine learning engineer of Palestinian descent, after he tried to determine whether an algorithm had wrongly labeled Palestinian photojournalist Motaz Azaiza’s content as pornographic, which has cost Azaiza viewership on Instagram. Meta accused Hamad of violating its user data access policy, which bars employees from working on accounts of people they know personally.

Hamad has said he has never met Azaiza, and in June he sued Meta for wrongful termination and other claims. His firing came just before he was to vest a significant amount of Meta shares, according to the lawsuit. Azaiza’s account still doesn’t appear when searching for his name on Instagram; the results are instead fan or spam accounts. Meta has yet to formally respond to the lawsuit.

Jenn Louie, an operations manager at Facebook from 2018 to 2020 who is now at Harvard University researching the interplay between morals and innovation, says managers at tech companies have focused too much on risk mitigation to protect the corporation and too little on facilitating uncomfortable discussions to protect their most vulnerable employees. The days of distancing oneself from global conflict under the premise of being an “employer” are gone, she contends.

“Companies need to stop demonizing and shutting down their employees who air grievances, or dismissing them as extremists or threats, for them wanting to have a difficult conversation,” she says. “Silencing has never been a pathway towards building lasting peace for any organization or community that aspires to be inclusive and diverse.”

Boiling Over

Tensions have been simmering at Meta for years. In 2022, a human rights report that Meta commissioned identified “unintentional bias” and other decisions that impacted “the rights of Palestinian users to freedom of expression.” Last December, at least two Arab engineers quit the company for what they described as its lack of acknowledgement of Palestinian suffering and uneven application of internal discourse rules. And that month, Human Rights Watch accused Meta of “systemic censorship” for suspending and suppressing what the advocacy organization described as “peaceful expression in support of Palestine and public debate about Palestinian human rights” on Facebook and Instagram.

Human Rights Watch’s allegations validated concerns of the Palestinian Working Group (PWG), a years-old intranet forum at Meta with more than 200 staff that discusses issues faced by Palestinian users, according to one of the sources. In December, the group organized an internal letter to top executives with a list of demands aimed at making services such as Facebook and Instagram safe and fair for Palestinian supporters.

The letter drew over 450 signatures in a few hours before workers say the internal community relations team took down the petition and deleted emails soliciting support from workers’ inboxes. Akhter, one of the organizers, had her “system access” disabled for three months, she would later say on Instagram. She said the petition had been flagged for violating Meta’s community engagement expectations, or CEE. Under the CEE, which launched in 2022, discussions of armed conflict, war, or other political matters are banned inside the company; violations can hurt performance reviews and, in some cases, lead to termination.

One former Meta employee believes the CEE was largely a reaction to the Black Lives Matter protests and the 2020 US presidential election. “Before that, you could have a profile photo that says ‘Make America Great Again,’” the source says. But the turmoil such political expressions caused within the company led Meta to crack down.

“Whether it is an opinion or a fact shared, at our size—and again, with the diversity of our population—an opposing or seemingly neutral comment could trigger feelings of contention, sadness, anger, distraction and even grief,” Williams, the diversity chief, wrote in her message last month. Williams reminded workers that they are welcome to speak up externally, including on Meta apps such as Instagram or Threads.

In May, some members of PWG issued a new demand letter, this time on a public website, Metamates4ceasefire.com, which has since been signed by more than 200 current and former Meta workers. A small collective of the current and former workers also launched a public Instagram account called Meta4employees that’s “dedicated to promoting transparency and accountability in our practices and policies.” Its profile picture is Meta’s logo made to look like a watermelon slice. Similar groups have popped up at Apple (Apples4Ceasefire), Microsoft (No Azure for Apartheid), and Amazon.

The current demands from Meta workers are that the company acknowledge “the lives lost in the ongoing humanitarian crisis in Gaza,” commit to a workplace in which all communities “feel seen, safe, and supported,” and call for “an immediate, permanent ceasefire.” Workers involved respect that some discussions may remain banned internally, but they argue that enforcement must be more even. They also doubt anything will change. “I don’t feel any accomplishment,” one Meta worker says. “Nine months into the war, I’m disillusioned. I don’t think there’s flexibility or malleability in this company.”

Meta employees point to several recent internal episodes as evidence of bias. As an experiment, a member of Meta’s Muslim employee resource group posted in the group’s internal forum encouraging members to mark International Nakba Day, the annual May 15 commemoration of the mass displacement of Palestinians. The post used nearly identical language to one about International Holocaust Remembrance Day, which falls on January 27; that post remained up. Meta removed the Nakba post, according to Meta4employees on Instagram and two workers WIRED spoke to.

Three employees have filed Equal Employment Opportunity Commission complaints regarding the removal of their content under the CEE policy, a former employee says. Meta declined to comment on the alleged complaints.

Expressions of sympathy are generally supposed to be allowed under Meta’s CEE. But planned vigils and posts expressing condolences for colleagues whose family members in Gaza have died have been shut down, because war-related content is not allowed under any circumstances. Information about fundraisers and ways to help people in Gaza has also been removed, while some posts from last October seen by WIRED about prayer events and rallies to support Israelis have not been. Posts containing verses from the Quran encouraging others to “pray for the oppressed” were also removed.

“I just find that very disturbing,” a former employee says. “There’s definitely an imbalance of what gets to be put up there and what's not.”

Williams in her note explained that “‘Prayers for …’ any location where there is a war in process might be taken down, but prayers for those impacted by a natural disaster, for example, might stay up.” She continued, “We know people may not agree with this approach, but it’s one of the trade-offs we made to ensure we maintain a productive place for everyone.”

Pain and Distress

Meanwhile, Arab and Muslim workers expressed disappointment that last month’s World Refugee Week commemorations inside Meta included talks about human rights projects and refugee experiences and lunches featuring Ukrainian and Syrian food but nothing mentioning Palestinians. (WIRED has viewed the internal schedule for the week.)

They were similarly dismayed that Meta’s Oversight Board, which advises on content policies, wrote in Hebrew, but not Arabic, to solicit public comments about the Palestinian human rights expression “from the river to the sea,” including whether it’s antisemitic. An Oversight Board spokesperson did not respond to a request for comment.

The workers also remain frustrated that Meta hasn’t met their demands from December to remove the Instagram accounts of anti-hate watchdog groups such as Canary Mission and StopAntisemitism that have been shaming Palestinian supporters in alleged violation of platform rules against bullying. Leaders of the PWG met with Meta executives including Nick Clegg, the president of global affairs, who vowed to keep dialogue with workers open. But the accounts remain up, and Canary Mission and StopAntisemitism each have added about 15,000 followers since the demands were drafted.

The employees recently seized on an Instagram photograph showing Nicola Mendelsohn, head of Meta’s Global Business Group, posing beside Liora Rez, founder and executive director of StopAntisemitism, taking it as a sign of the uphill battle they face. Rez tells WIRED that her group does not hesitate to call individuals out for antisemitic views and alert their employers, but declined further comment. Canary Mission says in an unsigned statement that “there needs to be accountability” for antisemitism.

The disputes over Meta’s response to Gaza discussions have had cascading effects. In May, Meta’s internal community team shut down some planned Memorial Day commemorations to honor military veterans at the company. An employee asked for explanation in an internal forum with over 11,000 members, drawing a reply from Meta’s chief technology officer, Andrew Bosworth, who wrote that polarizing discussions about “regions or territories that are unrecognized” had in part required revisiting planning and oversight of all sorts of activities.

While honoring veterans was “apolitical,” Bosworth wrote in the post seen by WIRED, the CEE rules needed to be applied consistently to survive under labor laws. “There are groups that are zealously looking for an excuse to undermine our company policies,” he wrote.

Some Arab and Muslim workers felt Bosworth’s comments alluded to them. “I don’t want to work anywhere that is actively discriminating against my community,” says one Meta worker who’s nearly ready to leave. “It makes me sick that I work for this company.”

Meta hasn’t let up on CEE enforcement in recent weeks. Workers remain barred from holding vigils internally. As a result, they planned to gather near the company’s New York and San Francisco offices this evening to recognize colleagues who have lost family members in Gaza to the war, according to the Meta4employees Instagram account and two of the sources. They are curious to see whether, and how, the company tries to stop the memorial, which the public is invited to attend.

Ashraf Zeitoon, who was Facebook’s head of Middle East and North Africa policy from 2014 to 2017 and still mentors many Arab employees at Meta, says discontent among those workers has soared. He used to push long-timers to quit when they were frustrated; now he has to convince recent hires to stay long enough to give the company a chance to evolve.

“Unprecedented levels” of restrictions and enforcement have been “extremely painful and distressing for them,” Zeitoon says. It seems that the emotions Meta had wanted to avoid by keeping talk of war out of the workplace cannot be so easily suppressed.