Views expressed in opinion columns are the author’s own.

Facebook has recently been criticized for allowing hate speech and fake news on the platform, most visibly during the 2016 presidential election. But being a U.S.-based company doesn't excuse Facebook from confronting its contribution to violence around the world. It needs to develop stricter guidelines that account for the contexts of other countries and protect vulnerable communities from violence.

Last month, Sri Lanka blocked access to Facebook throughout the country after the platform became a vehicle for spreading hate against Muslims and calling for violence against them. Mosques had been attacked, along with the homes and businesses of people in the Muslim community.

The Sri Lankan government said Facebook was too slow to take down posts flagged by users. Across the world, complaints about posts calling for violence against minority groups, advertising rape videos and spreading lies about communities have been ignored or addressed far too late.

Myanmar offers another example: An ultranationalist, anti-Rohingya Buddhist monk turned to Facebook to disseminate his hate speech after the government barred him from preaching in public. Facebook is big in the United States, but it is even bigger in countries like Myanmar, where many people consider it “the internet itself.”

Free speech arguments don't hold up in the debate over Facebook's role in political violence in Sri Lanka and Myanmar. Hate speech in those countries goes beyond expressing bigoted opinions; it explicitly calls for attacks on groups of people and encourages genocide. Even in the U.S., speech that incites imminent lawless action is not protected, so Facebook has no reason to shield it.

Facebook can't lose sight of its international base of users. Moderating hate speech on the platform is the morally right thing to do, but it may also be good for business. Sri Lanka's temporary ban was certainly not good for the company: It cut Facebook off from more than 6 million users. So, if the platform has no moral motivation to regulate hate speech, perhaps the market can encourage it to do the right thing.

Ultimately, Facebook has a responsibility to do what it can to prevent these instances of violence. As the platform has grown and reached into most parts of the world, it has lost sight of national and local contexts. Hate speech and fake news in the U.S. haven’t had the same violent impacts as they’ve had in other parts of the globe, but Facebook has to think globally and focus more of its attention on its role outside this country.

Liyanga de Silva is a sophomore English major. She can be reached at liyanga.a.ds@gmail.com.