When Microsoft acquired GitHub in 2018, there were concerns that the company would struggle to moderate controversial code on the world’s largest repository of open-source software. Chief among them was whether GitHub would continue to host deepfake code used, among other things, to transpose the faces of celebrities onto pornographic videos without their consent. GitHub power users also worried that Microsoft would remove code that undermined its business interests. The worst predictions never came true, but four years on, the lack of moderation on GitHub has drawn Microsoft into the eye of a completely different storm.

On January 1, hundreds of Muslim women in India found their names and Twitter profile pictures displayed on a fake auction site named “Bulli Bai” — a slur against Muslim women — without their consent. The code for the app was hosted on GitHub.

While GitHub quickly took down the app following a massive social media backlash, this is the second time in seven months that the platform has been used to target Muslim women in India. In mid-2021, a similar web application called “Sulli Deals” was hosted on GitHub, putting Muslim women up for a fake “auction” without their consent. That app was online for weeks before it was taken down.