While a number of scholars have studied online communities, research on games has mostly focused on the business, experience, and content of gameplay. Interactions between players within games have received less attention, and toxic behavior is a newer area of academic investigation. Inquiry into toxicity in gaming is part of a larger body of literature and public interest emerging around disruptive and malicious social interactions online, including cyberbullying, child grooming, and extremist recruiting. Through our research we reaffirmed that toxicity in gaming is a problem at a global scale, but we also discovered that, on a micro scale, what behavior gamers perceive as toxic, and how toxicity is enacted in gaming, differ depending on cultural context, among other things. The generalized problem at scale and its particular manifestations at the micro level raise philosophical and technology design questions, which we address through examples from our own research...
Supporting communities on its platforms has been part of Facebook's core mission since 2017. Early understandings of the needs of groups and organizers largely centered on groups that began on Facebook itself. This paper is the result of ethnographic research conducted in 2019 to better understand the needs of different types of groups and the corresponding ways that technology platforms do and could support them. The initial orientation toward online groups led to the recognition of the difficulty of managing fast-growing groups, but it failed to consider whether groups might want to avoid growth in members altogether. We found in our research that many groups did in fact want to avoid or limit their growth in numbers. For these groups, growing as a community meant different things: offering more to existing members; raising awareness or promoting the group to an outside audience; or simply maintaining over time. Our research was able to connect the dots of why organizers...