
LifeFuel Elon Musk is suing “Centre for countering digital hate” headed by uncle tom currycel Imran Ahmed

ilieknothing
كاريسل
★★★★★
Joined: Nov 8, 2017
Posts: 12,031
On the day he learnt Elon Musk was suing him, Imran Ahmed contacted a lawyer, called his board, and arranged for a crisis publicist. Then, his initial shock having faded into a sense of manageability, he went to watch Barbie.

The lawsuit could yet be the downfall of Ahmed’s young campaign group, the Center for Countering Digital Hate. But so far it has mainly served to vindicate its tactics. Musk’s social media site X, formerly Twitter, claims CCDH cost it “at least tens of millions of dollars” in advertising, by highlighting hateful messages on its platform.

Ahmed’s organisation, which he launched four years ago after leaving British politics, has developed an outsized ability to get under the skin of social media companies. “I have annoyed personally the CEOs of [X, Meta, YouTube and TikTok],” he says, speaking from his home in Washington DC. “We’ve made them all sweat at some point.”

The White House cited CCDH’s research about anti-vaccine content online, with Joe Biden himself briefly going as far as to accuse Meta’s Facebook of “killing people”. A later report showed how new accounts on TikTok were shown information about suicide and eating disorders within seconds of signing up.

CCDH, which aims to change platforms’ behaviour through advertiser pressure, specialises in clear statistics, emotive terms and linking online abuse to corporate ads. “We don’t talk in the language of technology, we talk in the language of morality . . . Advertisers are also human beings. Some of them just don’t want to fund the primary vector of hate and disinformation in our society.”

The clashes with Musk began a year ago, when CCDH identified an increase in hate speech after the entrepreneur bought X. Ahmed accused Musk of having put up “a Bat Signal” to racists, homophobes and conspiracists. Musk called him a “rat”.

In its legal claim, filed in July, X accuses CCDH of scraping the network’s data unlawfully. It says its findings are cherry-picked and do not reflect billions of other posts. It adds that CCDH is funded by rival companies and foreign governments, something Ahmed flatly denies.

The charity, whose budget is about $3mn this year, does not disclose its funders. So far it has raised only $120,000 of the $250,000 it estimates it needs for an initial legal defence. But Ahmed has refused to back down.

Musk is “the most extreme member of a group of companies that feel they are beyond the criticism of normal people . . . We have to show that we will not be cowed by them.”

In follow-up research this month, CCDH said it had reported 300 abusive posts to X, but the network had failed to remove 86 per cent of them a week later.

The tweets not removed included: “A non-White will never be British”. X says all posts that breached its standards were removed. Musk responded to the research by labelling CCDH “bronze tier psy ops”.

Since then Musk has nodded to an antisemitic conspiracy theory, posting: “The Soros organization appears to want nothing less than the destruction of western civilization” (a post viewed 2.4mn times). He also commented: “I support Russell Brand,” in response to a user denying that rape allegations against the British comedian could be true.

“The Soros attack is essentially showing a bit of leg to antisemites, to conspiracists, saying, ‘You’re welcome here, and I get you guys,’” says Ahmed, likening it to Donald Trump’s flirtation with anti-vaxxers. “With Russell Brand, I think what he’s doing is different. He’s saying: ‘Hey Russell, screw YouTube, come to my platform’, because he’s desperately trying to accrue eyeballs.”

Many observers of Musk struggle to relate to his traumatic childhood. Ahmed is one who can empathise. Now 45, he grew up as the oldest of seven children in Old Trafford, Greater Manchester.

“We were very, very poor. My mum was very young when I was born. There was extreme need, there was violence around me.” His stairway out started when a teacher “took a fondness” to him and allowed him to attend a fee-paying school for free.

Ahmed empathises with the pain of being hurt by those who should love you and the fear that might instil. “I understand why the accrual of wealth might be a shield against the fear.”

After dropping out of medical school, Ahmed worked in corporate strategy for Merrill Lynch, but the 9/11 attacks shook his world view. Merrill Lynch’s New York office was damaged. What’s more, Ahmed is ethnically Pashtun, from the same tribe as the Taliban. He felt heightened responsibility. “I quit banking, I said screw that. I want to make the world safe.” He considered joining the army, but instead went to Cambridge to study social and political sciences. “That’s how I deal with fear: do something about it.”

He worked as an adviser to Labour politicians, including to the then shadow foreign secretary Hilary Benn. But in 2016, he was shaken again by the rise of antisemitism in the Labour party, the extremism around Brexit, and the murder of Labour MP Jo Cox. He found a common thread linking the three: toxicity on social media. “I couldn’t fight back against this great fear in my life when I was a child, but as an adult I can.”

To investigate social media, CCDH sets up fake accounts and examines what information is recommended to them. It seeks out posts that break a network’s community standards, reports them using the network’s reporting tools, and sees whether action is taken.
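
For illustration only, here is a minimal Python sketch of that report-and-recheck loop. The helpers `report_post` and `is_post_removed` are hypothetical stand-ins for whatever reporting tools and post lookup a given platform exposes; this is not CCDH's actual tooling and not a real X API.

```python
# Sketch of a report-and-recheck audit, assuming hypothetical platform helpers.
from datetime import datetime, timezone


def report_all(flagged_posts, report_post):
    """Flag each rule-breaking post via the platform's reporting tool,
    recording when each report was filed."""
    reported_at = {}
    for post_id in flagged_posts:
        report_post(post_id)                      # hypothetical reporting call
        reported_at[post_id] = datetime.now(timezone.utc)
    return reported_at


def non_removal_rate(reported_at, is_post_removed):
    """Re-check the reported posts (e.g. a week later) and return the
    share that are still live."""
    if not reported_at:
        return 0.0
    still_up = [pid for pid in reported_at if not is_post_removed(pid)]
    return len(still_up) / len(reported_at)
```

Plugged into the figures quoted earlier, 258 of 300 reported posts still live a week later gives a non-removal rate of 0.86, the 86 per cent in CCDH's follow-up research.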

X has criticised CCDH for not engaging more privately, saying it works more constructively with other campaign groups. Initially Ahmed was happy to engage privately with companies. “They said, how marvellous that you’ve found this — why don’t you come and speak to our team here, our team there. I realised that essentially it was a scheme to contain our criticism so that only they heard it and so that it wasn’t going public where it would be really damaging. So we don’t privately brief people on research anymore.”

Researching social media platforms has obstacles. CCDH has just 20-odd staff. The platforms say they block most unacceptable content before it appears. “I don’t know if that’s true, because no one can see inside those companies. They are essentially marking their own homework,” says Ahmed.

This year X started charging more for access to its data, in an effort to lessen its financial losses. Ahmed sees that move as part of a squeeze on scrutiny. “Researchers I know are being told [by data providers]: ‘Sorry, we’re not allowed to service you.’ Twitter is deliberately becoming more opaque.” He believes the platform’s analysis may be: “If we have plausible deniability because nobody is able to research [hate] at scale, then we can gaslight our advertisers”.

X accuses CCDH of promoting censorship, a theme taken up by Republican congressman Jim Jordan, whose judiciary committee has subpoenaed the charity. Another of X’s critics, the Anti-Defamation League, concedes that free speech does protect hate speech. Ahmed is less tolerant. “You absolutely have the right to say whatever you want. But I absolutely have the right . . . to say you are spouting hate speech and it’s disgusting that anyone’s funding you by putting their ads on your platform.”

The UK’s online safety bill, passed by the House of Lords last week, provides welcome oversight, but it does not go far enough in terms of demanding transparency of algorithms, says Ahmed.

How different would social media be, in CCDH’s blueprint? Would kids still spend five hours a day on social media, just without seeing hateful content?

“You shouldn’t be spending five hours in a chaotic information ecosystem, that’s just bananas,” says Ahmed. “The reason why young people are more prone to disinformation and more conspiracist is that they’ve got lower [previous] information but also because they are spending 90 minutes a day on TikTok.”

Ahmed highlights the rise of climate disinformation. “Fourteen to 17-year-olds, our data shows, are less likely to believe in anthropogenic climate change than adults. That’s really scary.” This undermines the idea that “the kids will sort [climate change] out. They won’t.”

Ahmed sees little evidence that things are improving. “I’m looking at the fundamentals,” including “increasing usage of these platforms [and] these platforms shedding trust and safety staff.” Upcoming CCDH reports range from how social media is being used to influence the debate over reproductive rights in Africa to how it pushes young men into unsafe steroid use.

So is it ethical to stay on X? Ahmed declines to say, preferring to note that “it can be hazardous to people to be in an information environment where [hateful] ideas are being normalised”. He and CCDH continue to tweet and post, unable, like the rest of us, to kick the habit.

Previous enemies of Musk have found their safety at risk from trolls. Has the lawsuit led to threats? “We don’t ever talk about threats, because if you talk about threats, you get more.” Some trolls “get off on seeing others in pain . . . Showing that you are scared or in pain makes them feel that they’ve succeeded. So why would I do that?”


Imagine it being your life goal to stop others from saying things on the internet.
 
:worryfeels: Musk is @Master

At least we know we are definitely financially secure. Elon has our back
 
