This is artfully researched. A great read and a valuable take on events that we'll need as a society to learn from, as the algorithm-powered platforms are not going anywhere.
Very strange to hear any "free speech" arguments in this thread. I can only assume those commenters haven't read the article, which enumerates multiple examples of speech that are equivalent to shouting fire in a crowded theater.
I not only read the article (which was great), but I was in the same space at this time: one of the civil society groups constantly trying to raise issues with Facebook.
It all rings very true to me. One thing to note is that not only did Facebook lack Burmese speakers internally, but so did the human rights groups with the best access to Facebook. As Kissane notes, Myanmar came online very quickly after the reforms, and -- unlike many countries in the Middle East or even China, because the story there was not one of authoritarian oppression but a /release/ from authoritarian pressure -- it did not fit easily into the template that tech companies were slowly learning to respond to. A few years before this, Facebook and Twitter had been shaken into some kind of recognition of their responsibilities by the Arab Spring and the Iranian protests, but the result of that had been a very, shall we say, US-centric view of how repression plays out globally. Bluntly, the violent genocide of Muslims by Buddhists, in a country led by Nobel Peace Prize winner and defender of democracy Aung San Suu Kyi, was a story that had to cut across many political and cultural presumptions of the US and Europe before it would be listened to.
I guess that brings me to your comment about "free speech". It seems a little petty to point you to the usual put-downs about how the phrase "shouting fire in a crowded theater" has historically been deployed to /silence/ people trying to /stop/ mass violence (because they are seen to be trouble-makers).
I'll note though that the failures of Facebook and others at this time came about because Facebook claimed to be able to moderate -- to provide a forum where only civil discussion and "the truth" would appear -- and claimed it could be swayed to stop abuses, if you could just get through to the right people. Many of us, both in "free speech" organizations and in on-the-ground humanitarian groups, argued that this was a role that Facebook was not playing, and could not play: and the more it claimed it could take on this responsibility, the more terrible the consequences would be.
Thank you very much for your thoughtful reply. I appreciate your note about the history of "shouting fire in a crowded theater" as well.
As someone who was trying to open Facebook's eyes to their shortcomings at the time, what do you see as the larger lesson we can glean? It seems likely that blindness (for any reason) to the specific cultural dangers of new tech will become harder and not easier to spot as time goes on and these large organizations become further convinced of the completeness of their own understanding. I'd be curious to hear what you've learned generally that we can apply next time.
I believed, and still believe, that speech and content moderation simply isn't possible at the scale and staffing levels at which Facebook and the other tech companies of the 2010s want to operate. Civil society organizations struggled at the time to match and warn Facebook, and I don't think they can scale up either. I was at a human rights-related event the other day and somebody talked about leaving behind the "trashfire" of the Facebook Oversight Board. I can believe it -- it's the sort of solution you end up having to knit together at those scales. If we seriously believed we could create some sort of global speech government, why haven't we?
Conversations are intimate, contextual, and should be far more directly under the local control of the speakers. This feels counter-intuitive to many when we see modern genocides like Myanmar and Facebook (and before it, Rwanda and short-wave radio) where mass media played its part in fanning the flames. But censorship and control are temporary fixes to those deeper problems -- and they are fixes that feel more comfortable the further you get from the details of each disaster, or the closer and more familiar you are with those who hold the power to censor.
My instincts (and my work) assume that a lot of the problems come, as you say, from the centralization of these large tech organizations, but of course there are also plenty of challenges at the more human level. It's significant for me that Western traditions of free speech emerged from decades of vicious religious wars, and appear to be more stable than the cycles of repression and counter-repression that preceded them.
> Conversations are intimate, contextual, and should be far more directly under the local control of the speakers.
Fair enough… but what about people who use Facebook as a way of broadcasting information? (Such as Ashin Wirathu in the linked article.) To me, that case feels very different to ‘intimate, contextual conversations’. And it is fundamental to Facebook’s design that it blurs the distinction between these two cases.
Is the answer then to restrict such things more severely than ordinary conversations? I really have no idea. But our current way of doing things doesn’t seem to be working very well at all.
>I guess that brings me to your comment about "free speech". It seems a little petty to point you to the usual put-downs about how the phrase "shouting fire in a crowded theater" has historically been deployed to /silence/ people trying to /stop/ mass violence (because they are seen to be trouble-makers).
The speech regulation problem feels fairly different depending on whether the team doing the regulation lives in the society they're trying to regulate.
If you live in the society you're trying to regulate the speech of:
* People in your society will attack or praise your speech regulation actions, as moves in the local political chess game.
* As a member of society, you are likely to have a "dog in the fight" for a heated discussion where someone calls for speech regulation.
* As a member of society, your thinking (including your thinking about what to censor) is affected by what you read, which is itself affected by what speech gets censored. There can be a feedback loop.
In the US, "freedom of speech" used to be a left-wing talking point, back when the right had more cultural power. Nowadays it is a right-wing talking point, at a time when the left has more cultural power.
We generally can't expect censorship mechanisms to be used in a principled way. Censorship mechanisms are powerful political tools that powerful people will fight to obtain, and they will be disproportionately wielded by the powerful. See, for example, Elon Musk's purchase of Twitter.
Contrast all that with the Facebook/Myanmar situation, which I suspect is more a case of criminal apathy and/or greed.
I'll admit I don't know anything about the situation, but the phrase "violent genocide of muslims by buddhists" was NOT something I was expecting to hear anywhere.
The attack on Meta here isn't really justified for the reasons Kissane seems to be providing. Take this paragraph, for example:
> By that point in 2018, Myanmar’s military had murdered thousands of Rohingya people, including babies and children, and beaten, raped, tortured, starved, and imprisoned thousands more. About three-quarters of a million Rohingya had fled Myanmar to live in huge, disease-infested refugee camps in Bangladesh.
That is the situation on the ground. The country is a powder keg, and there is evil afoot. Maybe there is more context here to come in part II, but it seems unreasonable to lay this at Facebook's feet from an external perspective. Facebook does not call for beatings, rape or killings.
I can certainly see Facebook's leadership caring about the PR and doing something. But that raises alternative problems - what exactly is the standard Facebook is meant to enforce? Should it ban everyone who is responsible for pointless killing? That includes the US leadership apparatus - should it ban everyone who voted Aye for the Iraq war? Afghanistan? Wars like Vietnam? (yes, the people who voted for it knew better). They won't do that. We'll have a large, powerful company applying uneven and subjective standards.
There is an issue there, but I think the specifics of why she is calling out Facebook are actually dangerously missing the point. The problem is the same no matter what influence Facebook tries to have - the problem is that the will of the people and the will of Facebook's leadership are radically different. In this case, the will of the people is unusually evil - but the problem is the clash and Facebook's position of influence, not which side is wrong on any particular day. Eventually, Facebook will be on the evil side.
Facebook didn't remove posts promoting ethnic violence in a timely manner. In fact, it did the opposite: its algorithms promoted these kinds of inflammatory posts to increase user engagement. It's not hard to understand why Meta is responsible unless you're being disingenuous; you didn't touch on that aspect. Facebook is not some harmless message transceiver. And not to mention, Facebook didn't have sufficient native speakers to moderate the content. So at best, they're negligent.
There are responsibilities to allowing people on your platform and having them speak to the internet; free speech is not absolute.
The article mentions a murderous Myanmarese monk [0]. I think he is responsible. And people like him.
There is a real question here about whether Facebook's algorithms created the zeitgeist or just reflected it. Since most countries have Facebook and most countries are avoiding Facebook-fueled mass murder sprees, I suspect it is more that their algorithms accurately reflected what a large number of people were thinking.
It is an interesting question, and maybe there will be compelling arguments made in Part II. But that hasn't happened in Part I.
[0] The gap that can emerge between the priests of a religion and the religious ideals would stun the naive. If this article is accurate then the man might be an anti-Buddha.
The article suggested the opposite: the zeitgeist was that the communities lived in peaceful indifference, and it was political and religious leaders peddling hate who, only through the mass-communication reach of Facebook, were able to leverage their authority to radicalize the common man into believing a hateful man on the Internet over his own local customs and mores.
The ideas people are exposed to can shape their own ideas. That’s how ideas work.
I think it's a fine line between "people shouldn't be exposed to bad ideas" and "people shouldn't be exposed to good ideas"
Leaders calling for racial genocide isn't really at that "fine end". Meta should be able to distinguish between that and "corporations should pay less taxes".
should your ISP be able to distinguish between these? Should WhatsApp or your encrypted messengers? Should your mobile phone providers?
I have seen discussion of this problem by anti-tech writers become "Facebook caused these atrocities". I agree with you that it's overstating Facebook's role.
However, I don't think that places Facebook's conduct beyond criticism. I'm very interested to see part 2, as it sounds like there are more positive acts by Facebook at that time. Rather than just the passive inaction which characterises part 1.
But, "refusing to enforce your own policies" is action, and it sounds like Facebook is very guilty of this.
I stress that I'm waiting for Part II, because I don't think she's really gotten to the meat of it. But so far we've got
Case 1: Facebook doesn't enforce their policies. There is a genocide.
Plausible Case 2: Facebook enforces their policies. There is a genocide.
Plausible Case 3: Facebook enforces their policies, shapes the future of a country and learns they are bigger than the people. They start actively influencing elections worldwide to shape a pro-Facebook political consensus.
Case 2 is better than 1, but given 3 there is a strong argument that the conversation is being misframed focusing on Facebook's policies. The case where they act and succeed is just as worrying as when they do nothing. Possibly more so, given that there is no evidence at all that Facebook pushing political opinions will result in good outcomes. The mass media have traditionally been cheerleaders for any number of atrocities and help shield corrupt people.
Why even consider these counterfactuals? Is asking Facebook to not amplify bald-faced calls for ethnic cleansing too much here? Can we agree that Facebook should make a conscious effort to NOT do that, and failing that withdraw from the market?
> Is asking Facebook to not amplify bald-faced calls for ethnic cleansing too much here? Can we agree that Facebook should make a conscious effort to NOT do that, and failing that withdraw from the market?
Sounds good to me. If you meet anyone who disagrees with any of that, let me know. But I don't think Kissane is aiming for that with this article. We don't have the full story yet, but it looks like it might be an attack on Facebook, using the crimes of the Myanmarese as emotional fuel.
> If you meet anyone who disagrees with any of that, let me know
Facebook appears to have disagreed with that; they amplified calls for ethnic cleansing and did not respond to concerns about it, so they must have believed that asking them not to was too much. That's the point.
That point hasn't been made. That is why I'm putting some time into this comment chain - we've got the journalism thing happening where people did terrible things, and people used Facebook, and then the journalist is painting Facebook as guilty by association without saying much specific that provides a link. And then letting low-empathy readers join the dots without considering what the people involved were likely thinking.
Bad people use Facebook. We don't need evidence to know that. This article is strong evidence that very bad people use Facebook, but it isn't at all clear that Facebook should be considered morally involved based on what has been presented so far.
Maybe the killing blow is yet to come. But I'm pretty sure any objective standard that gets Facebook in trouble here will get them in just as much trouble for letting Victoria Nuland or US 4-star generals post publicly. There are a lot of brutes in public office.
Furthermore getting involved in matters of war and peace is not a role that Facebook will get praise for, it'll do some really terrible things if it goes down that path. They should be biased towards inaction. Even and especially if they care.
Yes, this may have happened anyway. Yes, Facebook is not fully responsible. But I disagree with you. The lines are clear.
Facebook de facto became the internet in a country of ~50 million people through subsidising their platform through free data access.
Their platform was developed in order to further their own goals - through maximising engagement and monetisation.
The second order effects of their own personal ambition was enabling people like Wirathu to reach hundreds of thousands of people with hate speech and calls for genocide.
Facebook were informed of this multiple times and, allegedly, did nothing about it. During this time they had one Burmese-speaking moderator.
Stating that they have no moral responsibility for the consequences of their actions is in my opinion horseshit. But it does align with certain aspects of the current American zeitgeist of entrepreneurship, free speech and platform "safe harbour" regulations.
This is not a view shared everywhere and should not be assumed when American tech companies scale out of the US. Thankfully this dogmatic approach is being regulated by the likes of the EU and other countries so these platforms are more aligned with their own moral frameworks.
Personally, I find Facebook absolutely morally responsible for parts of this. Just through the simple fact that they provided a platform for tens of millions of people -- with severely lacking moderation -- all in the chase of growth and profits.
This isn't exporting "freedom and democracy" to the world like the good old days. This is abhorrent profit maximisation with no regards for the consequences of their actions, hidden behind a thin veneer of moral rationalization.
I upvoted several posts of yours in this thread but man, are you implying FB was not actively promoting one side? I don't even believe it was "the algorithm".
Stuff like this does not happen by accident, nor in a vacuum. The only reason we don't hear more about this is that important people don't want us to.
Don't play the naivety card in topics like this.
Edit: downvote all you want, you horrible apologists. FB is a weapon and you know it.
> what exactly is the standard Facebook is meant to enforce? Should it ban everyone who is responsible for pointless killing? That includes the US leadership apparatus - should it ban everyone who voted Aye for the Iraq war? Afghanistan? Wars like Vietnam? (yes, the people who voted for it knew better). They won't do that. We'll have a large, powerful company applying uneven and subjective standards.
I completely agree that Facebook isn't well placed to make these decisions, however the architects of the Iraq War faced almost no consequences. I wouldn't be upset if Facebook banned Bush. I agree with you in principle but the example you chose is exactly one that I would be comfortable with. Incidentally Facebook did ban a different former president, so I guess your point about Facebook's position of influence is true.
Several NGOs tried at the time to raise awareness that a possible genocide was being coordinated on Facebook. Facebook did nothing to interfere with this, despite the fact that their platform is not only a passive message board but actively promotes posts with high "engagement" to as many users as possible.
We have to be able to value free speech and a free exchange of ideas, and accept that this means we will have shady web sites inciting war and violence, while at the same time holding mass media responsible for their actions.
Whoever runs Facebook should be just as responsible as said shady web site. We should be talking personal responsibility in a literal legal manner here, not having philosophical arguments.
Keep in mind that Facebook at the time still tried to win over web forums, had the Arab spring in their back and did not miss a chance to talk about how they could change the world and had the political clout to do it.
Basically, Facebook was the equivalent of the radio stations in Rwanda. The difference being that if Facebook had assigned adequate resources, it could have removed egregious content.
Would that have stopped the killings? It might have slowed down the spread of rage.
Or it could have furthered censorship and propaganda. It's easy to sit on your hands and think the status quo is fine in Rwanda if every bit of footage revealing how bad it was is suppressed.
You are missing the point.
Radio is a tool, used to terrible effect by several groups to further their aims. Rwanda was and is _horrific_: more than three-quarters of a million people died. So were Bosnia, Timor, and Lebanon.
You can't go from 0-100% genocide without a concerted effort.
Should Facebook have done more? You fucking betcha. Are they guilty of fanning the flames? Totally.
Are they responsible for the genocide? No. That is very much down to the military junta, aided by Aung San Suu Kyi.
> there is evil afoot
> the will of the people is unusually evil
I'm intrigued by this language. What do you mean by it? Do you believe in evil, are you religious? Do you believe people can be evil, or are you talking more about evil acts? I'm just curious because it stood out for me.
It is fair to say I'm irreligious. And, arguably, that I do not believe in evil. But we have a word, 'evil', and I feel it is the most appropriate word to describe people who coordinate genocides and/or mass violence.
To me, the key part of this article comes about 100 paragraphs in. Describing the situation in 2014:
> Facebook has a single Burmese-speaking moderator—a contractor based in Dublin—to review everything that comes in.
Burmese is the 43rd most popular language in the world, so perhaps it isn't surprising that Facebook was having trouble recruiting moderators.
http://www2.harpercollege.edu/mhealy/g101ilec/intro/clt/cltc...
It's a sad story. As much as I like "free speech" to be the answer, this is a pretty chilling downstream result of a platform not being able to do heavy human moderation.
I don't get this though, even in the US how can you perform moderation at such a large scale? Should we moderate the whole internet as well? Internet was always moderated within each communities. If you have a forum, or a page, or a group chat, or whatever, then it's on you to moderate your community. The same is true for a church, or a volleyball club, or a school, etc.
To those who hate free speech, let me say this. I live there, and you will never know how valuable and important free speech is until you and your family could get killed, detained, or tortured for:
- just doing a Hunger Games `3 finger salute`.
- just posting a photo of an empty city, to share the truth with the world.
- just answering a BBC reporter.
I don't know anything about Myanmar. But I would like to add that we should not confuse the role of the medium, the message, and the participants. In India, I see the largest number of toxic messages flowing through WhatsApp groups. These are personal groups with no ranking or recommendation algorithms -- just people forwarding messages -- which is primarily because of the increased radicalization of the underlying society.
The article claims that "Arturo Bejar" was "head of engineering at Facebook", which is simply false. He appears to have been a Director, which is a manager title overseeing (typically) less than 100 people. That isn't remotely close to "head of engineering".
I point this out because I think it calls into question some of the accuracy of how clearly the problem was communicated to relevant people at Facebook.
It isn't enough for someone to tell random engineers or Communications VPs about a complex social problem. Those people are not trained in identifying or responding to a genocide, nor do they have the organizational power or professional experience to initiate a serious response.
It is saddening but not surprising to me that there was a communication breakdown.
Read this. Once I started, I couldn't stop. It's a far quicker read than it looks (though this is just part 1) and utterly essential.
Kissane has put together a concise, readable, and horrifying account of both the background of the Rohingya genocide and how Meta actively fanned the flames.
Her argument for Meta's culpability is simple and effective. Meta faced two problems in Myanmar: (1) not enough Facebook users, and (2) increasing incitement and hate speech that was clearly and repeatedly stoking anti-Rohingya violence. The company repeatedly showed that it was only interested in growing the user base, even though they were clearly aware of how that growth fed a genocide.
Is there another side to the story? Seems like revenue on (at the time) a couple million users in Myanmar would not be worth the PR hassle. What’s Facebook’s incentive here? What content are they supposed to ban, and why is that content so popular with Burmese people?
> What’s Facebook’s incentive here
Corporate inertia. There wasn't clear ownership within FB to manage these kinds of political minefields back then. You have to remember that this was the 2010-2015 timeframe and social media was riding high due to the impact it had during the Arab Spring only a couple years prior.
> why is that content so popular with Burmese people
Because society in Myanmar is extremely communalized, with both ethnic strife among the 70 different ethnic groups and also religious strife between the 4 main religions.
Myanmar has been in a state of civil war since independence, and this is further exacerbated by regional powers fueling the flames by supporting militias along with the army.
I would say corporate inertia but also a mix of corporate immaturity and single minded incentives.
Meta’s corporate culture is single mindedly focused on engagement. They don’t care about side effects or other ambitions. They don’t have a corporate structure that encourages anything else that may be beneficial to their consumer base.
It means that anything that detracts from that engagement is counter to any personal goals an employee might have. And if you can’t personally relate to the problem, you’re far less likely to put your career on the line to push for something that runs contrary to the singular goal of engagement.
> Meta’s corporate culture is single mindedly focused on engagement. They don’t care about side effects or other ambitions. They don’t have a corporate structure that encourages anything else that may be beneficial to their consumer base.
I don't really see how that is unique to Meta, though. The mandate of any corporation is essentially sociopathic. Unless the leaders really go out of their way to make the company behave in ways that are good for people (customers or not) and bad for the company, the company is just going to be as selfish as it can be.
[flagged]
Facebook isn’t a dumb pipe. It is an algorithm which preferentially selects content which will increase engagement, in this case apparently sparking a genocide. Your analogy falls flat.
And the algorithm that maximizes engagement is the algorithm that gives the user exactly the content he most wants to engage with. In other words, with iteration, it will converge on an algorithm whose output is entirely driven by the user, not the developer. In this way it is basically a dumb pipe just like a phone that lets the user say exactly what he wants when he wants.
You are assuming users visit Facebook to engage with content and that Facebook's definition of engagement is consistent with the users' understanding of it. (One example: merely viewing my brother's photo is not considered engagement, but sharing or liking it is.)
I think many people would agree that a dumb pipe in this context would mean FB not ranking and filtering content. I am not saying all users want unfiltered content. I am just arguing against calling FB a dumb pipe. If anything, it is a smart pipe, or a pipe bomb.
> the algorithm that maximizes engagement is the algorithm that gives the user exactly the content he most wants to engage with.
Nothing about any system that opaquely ranks and filters content resembles a dumb pipe. Moreover, maximizing the attention content captures from viewers IS NOT the same as maximising the value users derive from content.
The whole situation is awful. I do wonder what the internal discussions in Facebook at the time were like. How did all the human-rights groups find themselves talking with the department of fobbing people off instead of the one that would do something. Maybe the department of fobbing people off were trying to do something but had insufficient political power within Facebook. Though I guess those discussions/emails would be very unlikely to come out except in a court case and I’m not sure what case that would be.
If you look at the things Facebook work on, it does seem that they do care a lot about not wanting various kinds of bad behaviour on their platform. I don’t know if the scope was materially narrower in the past or if this was a problem of being too disconnected from the users in Myanmar (ie language barriers or applying policies designed for the US).
I have a hard time thinking through the counterfactuals. Could this have happened with Twitter, for example? It feels to me like the biggest advantage Facebook had was being subsidised, and if it hadn't had that advantage (and maybe if prices had come down a bit over a few years) the same thing could have happened over Twitter. It's not clear to me how much the government wanted the genocide to happen. Presumably the big advantage to them of it happening over Facebook is some kind of deniability, and if they didn't care about that and were sufficiently competent they could have put the hate preachers on the radio instead of letting them find their audience on the internet.
> Could this have happened with Twitter for example?
It did - https://www.theguardian.com/technology/2018/dec/09/twitter-c...
FB was much more destructive simply because it was part of FB's Internet.org initiative [1] which drove FB adoption.
> It’s not clear to me how much the government wanted the genocide to happen.
The Arakan National Party, the primary party in Rakhine/Arakan, is anti-Rohingya and its leadership participated in the 2012 riots along with the 969 movement. They entered a coalition with the National League for Democracy (Aung San Suu Kyi's party) and even got an Ethnic Affairs Minister [2] -- one that they held on to during the genocide.
That said, the issues in Rakhine were further exacerbated by the India-China rivalry, as both countries turned a blind eye to the Tatmadaw and armed ethnic militias within Rakhine and across Myanmar in general. Myanmar is a buffer between the two countries, ethnic issues in Myanmar blow back across Southwest China (eg. Kokang Chinese * ) and Northeast India (eg. Manipur ethnic violence ** ), and both India and China have competing defense and infrastructure projects within Myanmar [3].
A lot of this could have been avoided if Jinnah annexed the Rohingya majority regions of Rakhine into Pakistan like Rohingya asked in 1946 [4]. It's sadly another forgotten chapter of Partition and decolonization of the British Raj
* If you live in San Francisco, most ethnic Chinese in Chinatown are now Kokang. They own most of those trendy Burmese restaurants like Burma Love and Mandalay.
** The Myanmar subgroup of the Kukis (Chin/Zo) have a significant presence in SF and Daly City as well.
[1] - https://en.m.wikipedia.org/wiki/Internet.org
[2] - https://en.m.wikipedia.org/wiki/Arakan_National_Party
[3] - https://www.lowyinstitute.org/the-interpreter/how-china-indi...
[4] - https://thediplomat.com/2018/01/rohingyas-and-the-unfinished...
>The whole situation is awful. I do wonder what the internal discussions in Facebook at the time were like.
What's the bare minimum we can do while making it look like we're doing something?
If people can understand this article, why can't they understand China banning Western social media? In this case Facebook's intention was to drive engagement; who's to say it won't be used by adversaries to selectively drive narratives?
As bad as it is in Myanmar -- replace s/Meta/The Internet/g and then check if that makes any difference.
Control your population, not the tools they use. Stop the use of one, they will find an alternative that serves the same purpose.
Unfortunately, this sort of glibness misses the fact that Meta’s algorithms push topic engagement and in doing so, amplify high-emotion content.
A passive user of the general internet is not as likely to encounter the same concentration of singular topics as they would on Facebook. Your comment would largely apply only to active seekers of said content.
I don't think it's that simple. "The algorithm" historically favoured engaging posts, which basically means "what people care about".
This same algorithm pushes all the good things that someone like Wirathu was doing (for the Buddhist population) as well as the horrific things (promoting genocide).
So let's talk about the obvious -- disabling "the algorithm". With no algorithm the problem might be even worse! If someone like Wirathu runs 20 accounts which each post 10+ times per day, his genocidal speech might be even more over-represented to the general populace. Timeline order rewards spammers, which rewards people with resources to simply scale output to more of the same.
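To make that concrete, here's a toy sketch (my own illustration, not Facebook's actual system) of how a purely chronological feed over-represents a high-volume poster:

```python
from collections import Counter

# Hypothetical toy model: 100 ordinary users each post once over ten
# days, while one prolific account posts four times every day.
posts = []
for day in range(10):
    for u in range(10):                       # ten ordinary users post today
        posts.append((f"user{day * 10 + u}", day))
    for _ in range(4):                        # the prolific account posts 4x
        posts.append(("prolific", day))

# A chronological feed showing the 70 most recent posts (the last 5 days)
chronological = sorted(posts, key=lambda p: p[1], reverse=True)[:70]
share = Counter(author for author, _ in chronological)["prolific"] / len(chronological)

# One account out of 101 supplies well over a quarter of the feed,
# purely by posting more often.
print(f"prolific account's share of the feed: {share:.0%}")
```

With no ranking at all, the feed composition is driven entirely by posting volume, which is exactly what a motivated actor with sock-puppet accounts can scale.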
Well ok, why not improve the algorithm then? It's easy to say "just promote the good stuff", but it's very difficult to do in practice at scale. Let's examine why.
Since it's all happening in Burmese, any ML models for sentiment or radicalization or hate speech will hugely lag behind English in effectiveness. A poorly trained auto-moderator is often worse than none at all, because it acts at random.
It is also a cat and mouse game -- simple models are stymied by basic (for humans) techniques like misspelling, swapping letters for symbols, inventing new slang words, dogwhistles, appropriating benign words, etc.
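To illustrate the cat-and-mouse point, here is a toy keyword filter of the kind a simple moderation system might start with (the blocklist word is a placeholder, not a real term), and two trivial evasions that defeat it:

```python
# A naive blocklist filter. "badword" is a stand-in for an actual
# hate-speech term; this is purely an illustration of evasion.
BLOCKLIST = {"badword"}

def flags(text: str) -> bool:
    """Return True if any whitespace-separated token is blocklisted."""
    return any(word in BLOCKLIST for word in text.lower().split())

print(flags("this badword post"))   # True: exact match is caught
print(flags("this b@dword post"))   # False: a symbol swap evades it
print(flags("this bаdword post"))   # False: a Cyrillic 'а' homoglyph evades it
```

Closing these holes requires normalization, homoglyph maps, and constant retraining, which is exactly the lag the parent comment describes for a low-resource language.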
Ok, so why not scale up moderation? Well, they need to find sufficient Burmese speakers who can be in person in their offices. (In person because of data protection and resources like mental health support.)
Burmese is not a widely spoken language outside Myanmar, so finding any candidates at all is difficult, let alone people who are willing to wade through the horrors of humanity day after day for a paycheck.
Sure, you can replace Facebook with whatever you want and come up with some scenario that leads to a genocide. I bet this genocide would have happened at some point regardless of Facebook's involvement; there have been plenty before it. But the fact of the matter is that in this case it _was_ Facebook.
I live in Myanmar. I started my software company in 2010, as soon as we gained democracy, and here is my take as a citizen and a founder who has gone through the country's various stages.
The article is mostly about bashing Facebook and its algorithms, but the situation on the ground is not really due to Facebook at all. The main problem is that political players are using racism, nationalism, brainwashing, and multimedia as tools to induce instability, so they can take the country back to non-democratic rule and take back our freedom.
The violence wasn't really caused by hate speech on Facebook. Sure, Facebook increased racism by a lot, but it also helped the truth come out fast. The actual genocide was carried out by the junta military and junta-assigned thugs infiltrated into the riots, a group known as Ma Ba Tha, previously known as Swan Arr Shin (meaning "Super Heroes"), which ironically was used to kill peacefully protesting monks back in 2007 (the Saffron Revolution). They are now known as Pyu Saw Htee, and they are killing innocent people today, regardless of race or ethnicity, on suspicion of supporting the Spring Revolution.
They are ex-junta members, prisoners, and low-ranking members of the rival hardline military party led by extremist nationalists. They are the ones who razed villages and displaced millions of Rohingya.
The start of the Rohingya crisis:
- At first, when the news broke that an innocent girl had been raped and killed, uncensored images of the underage girl's mutilated body, along with the captured perpetrators, who were identified as Rohingya, were posted online. That was the first time many people had seen violence on Facebook; it spread like wildfire, and there was a lot of hate online.
- Then the Ma Ba Tha movement (the group I mentioned above) went on the ground, spreading the images from that post. They organized and recruited other nationalists, and then they started using monks, most of them military spies who had been robed and planted since the Saffron Revolution, as tools for political play.
- As soon as they started using monks, many real Buddhists shunned them and stopped following, because using Buddhism as a tool for violence is totally against the Buddha's ways, and it became apparent that this was a political play. But the riots were organized by the military: in the Meiktila case, police guarded the people who burned the whole town to ashes, people who were later identified as Ma Ba Tha.
So here are the takeaways:
- The crisis was entirely fueled by the junta and organized by the junta's Swan Arr Shin group (Ma Ba Tha, and now Pyu Saw Htee), an organized criminal group whose members have been around since 1988 and which has been used again and again: in 1988, in the Depayin massacre, and in the Saffron Revolution.
- If there were no Facebook, the junta would have used state-owned media and journal outlets to the same effect; but because of Facebook, we have a chance to speak out, to find out the truth, and to report massacres.
- Free speech is very important for us, a country that lost its freedom for 70+ years.
- Free speech and Facebook had very little to do with the Rohingya crisis, since it was mostly carried out by organized criminals on the ground.
- Facebook's algorithms boost controversial topics; that is undeniable.
- The Rohingya genocide was military-sponsored terrorism, fueled by the military junta and hardliners and carried out by the organized criminal group founded by the junta.
- When the crisis was brought under control by the democratic government, the military started to lose power, so they staged a coup in 2021.
Now the situation is a lot worse.
The junta that organized the Rohingya crisis staged a coup and has killed over 5,000 innocent civilians of all races and genders, and that has led people to arm themselves and fight against the junta. I can't say much more here, for safety reasons. Please contact me if you want to know more.
I feel awful saying this about an article about a genocide, but can someone tldr it? What is the substance of the issue?
There was pre-existing prejudice towards Muslims/Rohingya in Burma/Myanmar among the dominant Buddhist population. As technology developed, specifically mobile phones, it became easier for anti-Muslim voices in Myanmar to spread their message. Specific violent acts against Buddhists, such as rape and murder, were attributed, without proper evidence, to the Muslim minority, further stoking prejudice and hate toward the Muslim population. Facebook was warned on many occasions that users attributing this violence to Muslims was leading to calls for violence against the Muslim minority, with multiple meetings and discussions held by Myanmar and Western sources at Facebook HQ itself, only for Facebook to take no action and instead focus on increasing engagement in Myanmar regardless of the situation on the ground. A slippery slope ensued, contributing to the death/genocide of potentially hundreds of thousands of Rohingya/Muslims in Myanmar. More evidence is in part 2, which it seems hasn't been posted yet.
- There was pre-existing prejudice towards Muslims/Rohingya in Burma/Myanmar among the dominant Buddhist population.
No. I am 40 years old, and before the Rohingya crisis we had no hate towards Muslims, though I have to admit we didn't even know the Rohingya ethnicity existed at all. The general populace is very peaceful towards Muslims.
- It became easier for anti-Muslim voices in Myanmar to spread their message.
The anti-Muslim voices are not from civilians; they are from the junta and the hardline party called the USDP, which lost power in the democratic transition. So it is political, as is evident now from their failure, which led to the coup.
- Facebook was warned on many occasions that users attributing the violence against Buddhists to Muslims was leading to calls for violence against the Muslim minority, with multiple meetings/discussions by Myanmar and Western sources at Facebook HQ itself, only for Facebook to take no action but instead focus on increasing engagement.
The deaths and killings were done by the military junta, with and without uniform so as to look like civilians, and by their sponsored group called Ma Ba Tha.
My apologies. I was just trying to summarize the op’s contention about what’s transpired with Facebook there. I have no reason to doubt what you say is mostly true, but do you think that some of the general Buddhist population, beyond the Junta/leadership/dictatorship, had anti-Muslim/Rohingya sentiment before Facebook was widely available and so it was those voices who were amplified when Facebook’s growth accelerated in the country? The primary person the article cites was a Buddhist monk from what I understand.
Buddhism can be mainly categorized as follows:
- Theravada Buddhism
- Mahayana Buddhism
Those are the original branches, but in Myanmar there are:
- Real Buddhists, who actually practice meditation and dharma
- Buddhists by ancestry
- Ma Ba Tha nationalist Buddhists, which is really all about nationalism, brainwashed and organized by the junta
- Nat Koe Buddhists, who are not after dharma but follow rites and traditional spirit worship
The latter two sects are the ones recruited into Ma Ba Tha.
That monk is a military-planted spy, placed especially for the Rohingya crisis. He was pictured together with the ex-spy chief after the coup.
> we don't even know Rohingya race exist at all
It has been long-standing government policy to deny the existence of Rohingya as an ethnic group, claiming that they're immigrants from Bangladesh, in order to deny them citizenship; this being a manifestation of pre-existing prejudice towards Muslims/Rohingya by the dominant political groups, both military junta and democratically elected civilians. (Though I think attributing this to Buddhists in general is just as much a mistake as attributing a crime to Muslims in general.)
So it's unsurprising if you hadn't heard anything negative about Rohingya, but maybe you heard something negative about ကုလား or some other moniker meant to deny them recognition.
Aung San Suu Kyi herself famously refused to even utter the word "Rohingya" while defending her dear generals against the genocide accusations, so I don't think you can put the blame only on the junta/USDP and absolve the NLD of all responsibility.
Recently, the NUG has started making noise about finally reforming the citizenship law, so maybe the situation will improve in the future, but I could also see them renege on their promises in the event they win the civil war and are no longer as reliant on international support.
thanks
It sounds bad but it does still sound primarily like neglect and not some kind of genuine active involvement, as I hear frequently suggested.
I might be the most naive person on Earth for thinking this, but I can’t imagine any circumstance where this was anything but neglect on FB’s part. Or maybe neglect isn’t the right word. Maybe FB, like so many tech companies, optimized for the wrong metrics. So it’s more proactive than neglect? If your audience is a vampire and you want to get engagement you show a lot of bulging veins. I don’t mean to suggest that the broader Buddhist population (ie most of Myanmar) is bloodthirsty, but if a tiny (tiny!) slice of it had hate on the mind it’s not hard to understand the argument that FB amplified the bloodlust (wow, look at the spike in engagement on the internal dashboard!). Chart goes up and to the right and some PM gets paid. And it’s super bad when it’s the ruling government (the Junta in this case) that’s the vampire, as a sibling comment contends. All so brutal.
Free speech is important. People don't get mad at air because it allows people to transmit their thoughts. People don't get mad at pencils because they don't censor what people write. Internet companies transmitting people's ideas is no different, and wanting to censor people is against the principles of a free society.
This is juvenile.
Facebook is not free speech, nor does it merely "transmit" speech. It is a machine for algorithmically amplifying speech.
Facebook is not a pencil. Notice the article does not criticise mobile phone vendors for including keyboards in their phones.
Who do you think was pushing to subsidise data usage for Facebook? And to what end? Air doesn't do that.
I completely agree that free speech is important. I'm sure the people of Myanmar agree with that too after decades of regime rule. But it's irrelevant to this discussion.
I am on the ground, and let me say this: I am Burmese, I am from Myanmar, I was born there. The genocide was done by the organized criminal group sponsored by the military junta. I totally disagree that free speech caused the crisis. The junta is the one responsible for the mass killing and genocide. Free speech is helping us fight the junta. I am already risking too much myself by saying this.
Free speech is what they are most afraid of. Even with the algorithm, Facebook is a platform to deliver our words. I wish there were a non-algorithmic, unmanipulable, non-moderated platform that could reach out to the world.
My heart goes out to you. I can't imagine what you and your country are going through. I pray that your words will win out over those of hate and ignorance, over force and violence.
While air is a passive requirement, I would suggest that writing, the printing press, speakers, radio, television, and the web are all more similar. They are intentionally used to increase engagement of content intentionally created.
I respect and agree with your belief that freedom of speech is important, but I disagree with your conclusion that "Internet companies transmitting people's ideas" is no different. The key idea here is that freedom of speech is not the same as freedom of reach: Facebook might give everyone the same text box to write posts or upload content, but they need to answer for HOW and WHY they recommend certain content over other content.
The comparison to pencils or air isn't fair because they aren't backed by a recommender system powering the application layer of the content delivery platform for 90% of a country's people. If I write a message on paper containing a call to action which incites violence (e.g. "We should round up all people with six toes on each foot and shoot them") and hand it to one person, the message has reached one person. I would say the digital equivalent of "pencil and paper" or "talking in a public square" is more like writing a message in an HTML file and putting that HTML on a web server you own. I think it would be much more difficult for the average user in Myanmar to reach millions, even if they were given a web server and the means to write HTML.
Comparing it to "getting mad at air because it allows people to transmit their thoughts" equates Facebook with the Internet, when one is an organization of people who provide technology and the other is technology. There are data scientists and engineers who created the models that learned to optimize for engagement, there are product managers who help set the KPIs for those engineers (i.e. encouraged them to solve for engagement), and there are execs who benefit from all of the above. While no single person is solely to blame for what happened in Myanmar, it's wrong to claim that we can't at least reason about and discuss the products created, maintained, and owned by a company (made of people).
My opinion on this is heavily informed by the training I received when I wanted to DJ for my university's radio station. Even in the US, where freedom of speech is protected in the Amendments to the Constitution, there were very specific things you could and could not say on the air (while being classified as a non-profit radio station). You could receive massive fines for profanity, obscenity, inciting violence, or hate speech, with a large general emphasis on calls to action. Generally, I think this training was good, and it encouraged everyone who learned to DJ to be more intentional with their language.
I really think that if Facebook is gonna enable users to reach millions, there absolutely should be some caveats to that. People should be able to say whatever they want if they are capable of providing the means by which their own speech is amplified--and even then there are clear cases where the megaphone needs to be taken away.
Facebook is giving megaphones to people, and some people use them to dispense hate speech. While I might've been willing to give Facebook a pass in their early years, it is clear after Cambridge Analytica and this that governments around the world need to do better when it comes to working with Facebook to identify content on their platform which incites violence or makes calls to action that present a clear and present danger to other people if acted upon. Facebook isn't some amorphous, inanimate entity that we can't negotiate with, regulate, or punish. If someone uploads a hateful message which has a 0.00001% chance of encouraging a hate crime in each person who sees it, and tens of millions of people in Burma see it, Burma can expect around 5 hate crimes as a result of this message. With great power comes great responsibility.
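That expected-value arithmetic can be sketched directly (both the probability and the reach figure are illustrative assumptions, not measured quantities):

```python
# Expected incidents = reach * per-viewer probability.
# Both numbers below are illustrative assumptions, not measured data.
reach = 50_000_000        # hypothetical number of people who see the post
p_incite = 1e-7           # 0.00001% expressed as a probability
expected_incidents = reach * p_incite
print(round(expected_incidents))  # 5
```

A tiny per-viewer probability multiplied by massive reach still yields real expected harm, which is the whole argument for treating amplification differently from mere speech.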
>Facebook might give everyone the same text box to write posts or upload content, but they need to answer for HOW and WHY they recommend certain content over other types of content.
It seems that Facebook's "crime" is not censoring people enough, not that its recommendation algorithm was configured to boost an opinion the article's author did not agree with. Since the problem is with the content, I see it as relating to free speech.
>Even in the US, where freedom of speech is protected in the Amendments to the Consitution
While the constitution makes it hard to write laws that limit speech the US does not have full freedom of speech.
"not boosting" is not "censoring"
Well someone has to have free speech since someone has to make the censorship decisions. It can be:
1. The government
2. Internet company executives
3. Everyone
Sounds like you don't like 3. So please explain why only the government or internet company execs should have freedom of speech.
Liberty is not an absolute. Positive liberty is one thing; negative liberty can't be accepted. By positive liberty we mean the ability to act within the boundaries of the liberty possessed by the rest of society: the freedom of libertarians to be idiots affects only themselves, right up until their idiocy becomes negative liberty. By negative liberty we mean the set of liberties that encroach on the liberties of the rest of society: saying that we should put all libertarians in front of a firing squad is negative liberty, because it spreads hate and puts someone else in society in danger, and so my liberty directly limits the liberty of someone else.
>and put someone else in the society in danger
Words can't put someone in danger. Sharing people's words doesn't limit others' liberty.
So the safest thing we can do is let violent "ideas"/"ideals" spread, and act only when someone acts on them? We saw how that went with the Nazis and Africans, gays, and Jews (alphabetically ordered); it ended up so well.
I'd say freedom of speech can't cover concepts that put third parties in danger, or that could limit the liberties of others, because bad ideas have always turned into suffering.
>It's the safest thing
I am not arguing about what is the safest thing. A free society should be free to make themselves unsafe if they so wish.
>the freedom of speech can't contain concept that put third party in danger, or that could limit the liberties of others
If a free society is not capable of relinquishing its freedom, then is it truly free?
...and I guess those in the minority who would instead like to live in safety and health can just go somewhere else, right? Because how free is a society that hasn't come out of A Clockwork Orange?
I think our conversation can't go anywhere, partly because you've ignored the part about "putting someone else in danger".
How about we blame genocides on, I don’t know, the people committing the genocide? Meta didn’t burn down villages or shoot anyone. People did that. Are we supposed to believe those people have no agency and aren’t responsible for their actions? Like if you see a Facebook post you just have to mindlessly do whatever it says?
People have free agency but are also susceptible to outside influence and propaganda. Nobody is a truly rational agent -- otherwise they wouldn't be committing genocide.
> How about we blame genocides on, I don’t know, the people committing the genocide
We are, yet FB's failure to moderate as well as their failure to provide a legitimate method to escalate to HQ during an active genocide show a severe form of negligence.
Facebook is nothing. It's the roughly 300k-strong military and the criminals organized by the military who are killing, burning, and scorching the earth around the country.
I 100% agree, but FB needs to face some punishment for not moderating sooner, simply so they can make sure not to enable stuff like this again.
Also, not all cultures and societies are at the same level and educated about these things. Letting something like FB loose in a world where people don't understand algorithms and targeted information campaigns has consequences that should have been obvious to those building these platforms.
I have friends who live in Myanmar and can confirm the ground situation was a lot worse. When all you see are hate speech posts (about beheading, or at best driving away these "low lives"), it normalizes the hate and dehumanizes the victims. History is full of atrocities like this. Free speech is great to have, but violence propaganda is a real problem. You see some of that in America too.
Edit: to give people who don't bother to read the article some idea: women were raped and killed, entire villages were looted and their people killed, and then the villages were burned down. The goal was to create fear so that the victims would have no choice but to flee their own country. It worked.
You have to ask yourself why are we obsessed with free speech but you never hear anyone preaching free act? Why the arbitrary restriction to speech/expression? Ah, acts can cause harm to others; people can't just be allowed to do whatever they want, though they should be allowed to say whatever they want, because you can't harm other people by what you say, according to ... the "sticks and stones" principle.
The entire idea of free speech really rests on something as shaky as the sticks and stones principle!
Free speech lets us as a society determine which acts should be restricted.
My objection to speech restrictionists is that they rarely give a robust mechanism for deciding which speech should be restricted, a mechanism that's hardened against people abusing it to further their narrow self-interest.
Speech restrictionists also tend to ignore the "circular dependency problem": without anyone to defend a position, how do we know if that position is defensible? For example, suppose you live in a theocracy. You're an atheist. You start making your case for atheism. Just as you're about to make your case, the theocrats interrupt: "This speech is killing people. It's preventing them from reaching the blissful afterlife by converting them to atheism. This person is attempting eternal murder." And then throw you in jail.
I'm in favor of a well-designed terms and conditions for a platform like Facebook. But I also think it's too easy for people to say: "I don't like what you said! You shouldn't be allowed to say it!" The de facto impact of that rhetoric essentially amounts to mob rule.
A lot of the problem is definitions. I'm not a free speech 'absolutist', so I guess I'm a 'restrictionist'. But what defines 'speech'? It can cover the spoken word, the written word, and communication through art, photographs, and video.
If a King said "off with his head", and someone carried out that order in a country where this was illegal, who did wrong? Is the King simply exercising his rights to free speech? What about paedophiles sharing images?
I suspect that it comes down to power imbalances. And that is something difficult to measure. A dictator and an influencer both have power to cause a lot of harm in what they say, as does an adult who shouldn't be near a child.
I agree with your example of the atheist, and in fact that has been the case for hundreds of years. The 'harm' in that case is something that people disagree on. An atheist and most scientists would disagree in the harm caused in that situation.
Look no further than Denmark’s attempt to ban burning of the quran.
https://apnews.com/article/denmark-quran-burning-law-proposa...
Doch[0], never limit yourself to a single example.
I grew up in the UK, where teachers at the time were banned by law from saying it was OK to be gay:
> "shall not intentionally promote homosexuality or publish material with the intention of promoting homosexuality" or "promote the teaching in any maintained school of the acceptability of homosexuality as a pretended family relationship"
- https://en.wikipedia.org/wiki/Section_28
And there are socially taboo topics today. I suspect even mentioning some of them here might create a flame war and annoy dang, so I won't.
Instead I'll point to drug references in films and TV, where weed was for a long time as taboo as LSD and heroin; and that in Anglosphere media, sex is more taboo than violence ("Straight up murder? Put that in a kid's film. Nipples, on mammary glands, the defining characteristic of mammals and a thing that infants have a biological imperative to stick into their mouths in order to not starve to death before the invention of fake ones on milk bottles? Banned for being too sexual.")
Despite these examples of mistakes when restricting speech, I am not a free speech absolutist. This is because I'm not an anything absolutist: there are limits to all things, and finding the true boundaries isn't as trivial as pointing out the first two examples that come to mind, then, if they're on opposite sides, declaring the standard halfway between them, or, if they're on the same side, rejecting the possibility of the other side entirely.
[0] a German word that should exist in English: to be used to deny a negative, where "yes" or "no" might be ambiguous.
I don't think "doch" is particularly fitting, because the GP did not make a negative statement (rather a positive one about not looking further than some limit). Its usage felt weird to me as a response to the GP.
> [0] a German word that should exist in English: to be used to deny a negative, where "yes" or "no" might be ambiguous.
Do "ugh" or "uhmm" fit the bill?
Danyet.
Indeed, you do have to ask yourself. You do have to learn something about this, rather than making up your own headcanon, and a good place to start is John Stuart Mill's Of the Liberty of Thought and Discussion: https://en.wikisource.org/wiki/On_Liberty/Chapter_2
> The entire idea of free speech really rests on something as shaky as the sticks and stones principle!
The "Free Speech means I can say anything I want on any platform" stance is actually very new in the United States.
This only started after the counterculture of the late 1960s-70s, when libertarian individualism (yes, the hippies had a very socially libertarian stance) in the New Left and the New Right led to the revocation of laws and rules such as the Fairness Doctrine, the Comstock Act, etc.
Before the 1970s-80s, American Free Speech doctrine was much closer to what you see in Canada, the UK, or Germany today - you are free to criticize and protest against the US Govt, but that doesn't mean you are free from moderation or censorship on a private or public platform.
Jurisprudence in the 70s-80s changed that by essentially removing the need for moderation, and that's how we have the irresponsible form of free speech that we have in the US today.
I blame Boomers and Gen X.
Because, since we live in a post-Enlightenment society, we believe that we should discuss everything, especially before doing anything. We should be free to argue and discuss whatever, whenever, but not to act on a whim.
I don’t see that distinction as arbitrary at all. It seems like, among no other good or obvious options, exactly the place to put the line between free speech and restricted action.
There is much “bad” free speech, but “bad” action is intolerable. The distinction between the results of bad speech and bad action are clear - even if it is unfortunately true that speech can inspire action.
> among no other good or obvious options
It seems to me that there's an obvious alternative option. After all, if the reason we find the idea of free speech intuitively appealing is because we subscribe to some version of "speech can do no harm", why not just base everything upon the more fundamental harm principle ("one is free to V as long as V-ing causes no harm to others") which covers both the cases of speech and of act?
Because language is software, mind is hardware, and acts are behavior; we don't have the tooling necessary to manipulate the software, and it's dangerous to do things without proper tooling. Anyone focused specifically on freedom of speech rather than freedom in general isn't a real anarchist or anything of that sort, if that's what you are after. But I'm not sure what your stance even is in the first place...
> The entire idea of free speech really rests on something as shaky as the sticks and stones principle!
I think that, rather, it rests on observation that restricting speech is basically jamming the communication in society, which creates ossification and prevents development.
This isn't the least bit shaky to well-adjusted adults.
For that to be true, you would have to deny the well-adjusted-ness of significant fractions of the population of Myanmar in recent years; of the persecutors of the Uyghur, the Yazidis, the Darfur genocide, the Effacer le tableau, the Hutus, the Rwandan genocide, Bosnian genocide, Isaaq genocide, Anfal genocide, …
…, the forces responsible for, and senior to, the My Lai massacre, …
…, the general civilian population voting for the explicitly racist Nazi party, …
Why? Because these things only happened as a result of the fact that speech is convincing.
Best you can do here is say "those people are not well-adjusted", which is fine except for where the mal-adjustments come from: speech.
Think about it in reverse: if speech had no power to change us, it would not even matter if it was free or not.
All of the things you mentioned are happening again after the coup, and the world won't even notice, again.
It was happening before the coup, and has been happening since independence. Kachin, Sagaing, Rakhine, Shan, and Chin States have all faced violent insurgencies since independence.
And by the way, the person you're replying to lives in Myanmar.
Oh dang! I didn't realize that!
This is not about free speech. The problem is algorithmically boosting hate.
What part of the Myanmar justice system failed? How do people get away with raping and killing people on camera with their identities known by the wider public without going to jail?
It's ethnic cleansing, and we know that it works and that you can get away with it. That's what the early founders of Israel did: hundreds of thousands of Palestinians fled and remain refugees to this day. Some of those who implemented these strategies went on to become leaders in Israeli society.
1. https://en.m.wikipedia.org/wiki/Deir_Yassin_massacre
2. https://en.m.wikipedia.org/wiki/List_of_Irgun_members