
Promoting a safe and inclusive environment by tackling online toxicity and misogyny

[Image: collage of the author in front of Big Ben and posing with a Votes for Women statue at an event.]

I had the honour of participating in the Parliamentary Internet, Communications and Technology Forum’s (PICTFOR) thought-provoking discussion on emerging technologies and women's safety online, titled Tackling Online Toxicity and Misogyny. As we approach Saturday 25 November, which marks the International Day for the Elimination of Violence against Women, it is crucial that we prioritise the creation of a safe and inclusive online environment.

The prevalence of online toxicity and misogyny not only creates a hostile atmosphere but also poses significant risks to individuals, particularly women and marginalised groups. By addressing these issues head-on, we can cultivate a space where people feel secure in expressing themselves, engaging in meaningful discussions, and participating in online communities without fear of harassment, abuse or discrimination.

One crucial topic that emerged during the discussion was gender bias in AI data. This bias arises when the datasets used to train and develop artificial intelligence systems reflect or perpetuate existing societal biases and stereotypes about gender, leading those systems to treat people unfairly on the basis of gender.
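To make the idea concrete, here is a toy Python sketch using an invented six-sentence corpus that counts how often gendered pronouns co-occur with job titles. The corpus and word lists are made up purely for illustration, but the skew it surfaces (for instance, 'nurse' appearing almost exclusively alongside 'she') is the kind of stereotype that real training data can encode.

```python
from collections import Counter

# A tiny invented corpus standing in for real training text (assumption:
# a real audit would run over millions of documents, not six sentences).
corpus = [
    "she worked as a nurse at the hospital",
    "he worked as an engineer at the plant",
    "she trained as a nurse before moving abroad",
    "he was promoted to senior engineer last year",
    "he joined the team as a doctor",
    "she helped the doctor as a nurse",
]

jobs = {"nurse", "engineer", "doctor"}
pronouns = {"she": "female", "he": "male"}

# Count how often each job title appears in a sentence with each pronoun.
cooccurrence = Counter()
for sentence in corpus:
    words = set(sentence.split())
    for pronoun, gender in pronouns.items():
        if pronoun in words:
            for job in jobs & words:
                cooccurrence[(job, gender)] += 1

for (job, gender), count in sorted(cooccurrence.items()):
    print(f"{job:>8} + {gender:<6}: {count}")
```

A model trained on text like this has every statistical reason to associate nursing with women and engineering with men, which is how societal stereotypes pass silently into AI systems.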

The impact of gender gaps in data should not be underestimated: the consequences can be severe, even life-threatening. Car safety systems tested on male-proportioned crash-test dummies and clinical research built on male-default data, for example, have historically served women less well. When artificial intelligence systems are built on biased or predominantly male-centric data, they produce similarly detrimental outcomes: biased algorithms perpetuate and amplify existing gender disparities, reinforcing inequality and hindering progress.

To mitigate these risks and work towards a fair and equitable society, it is important that we address gender gaps in data collection. We must ensure that the data used to train AI systems is representative, inclusive and free from biases. This involves actively seeking diverse perspectives and experiences, as well as implementing rigorous evaluation processes to identify and rectify any inherent biases in the data.
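As a minimal sketch of what such an evaluation step might look like, the Python below audits a small, invented tabular dataset for gaps in representation and outcome rates between genders before it is used for training. The field names, the toy records and the ten per cent disparity threshold are all illustrative assumptions, not a standard method.

```python
# Minimal sketch of a pre-training data audit, assuming a tabular dataset
# with a 'gender' field and a binary 'outcome' label. Field names, records
# and the 10% disparity threshold are illustrative assumptions.

dataset = [
    {"gender": "female", "outcome": 1},
    {"gender": "female", "outcome": 0},
    {"gender": "female", "outcome": 0},
    {"gender": "male", "outcome": 1},
    {"gender": "male", "outcome": 1},
    {"gender": "male", "outcome": 1},
    {"gender": "male", "outcome": 0},
    {"gender": "male", "outcome": 0},
    {"gender": "male", "outcome": 0},
    {"gender": "male", "outcome": 1},
]

def audit(records):
    # Tally how many records each gender has and how many are positive.
    groups = {}
    for row in records:
        stats = groups.setdefault(row["gender"], {"n": 0, "positive": 0})
        stats["n"] += 1
        stats["positive"] += row["outcome"]

    total = len(records)
    rates = {}
    for gender, stats in groups.items():
        share = stats["n"] / total
        rate = stats["positive"] / stats["n"]
        rates[gender] = rate
        print(f"{gender:>6}: {share:.0%} of records, positive-outcome rate {rate:.0%}")

    # Flag large outcome-rate gaps for human review before training.
    if max(rates.values()) - min(rates.values()) > 0.10:
        print("Warning: outcome-rate gap exceeds 10% - investigate before training.")

audit(dataset)
```

In practice an audit like this would run over the real training data, and flagged gaps would feed back into the data-collection process rather than simply printing a warning.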

By taking steps to address gender gaps in data collection, we can promote fair outcomes for everyone. This not only fosters a safer society but also supports economic progress. It enables us to leverage the full potential of emerging technologies while minimising the potential harm caused by biased algorithms.

As we commemorate the International Day for the Elimination of Violence against Women this weekend, it is essential to recognise the interconnectedness between online toxicity, misogyny and violence against women.

Creating a safe and inclusive online environment requires us to actively tackle these issues.

Additionally, we must address gender bias in AI data to ensure fair and equitable outcomes. By prioritising diverse and representative data collection, we can mitigate the risks posed by biased algorithms and foster a society where everyone can thrive. Let's fix our systems, not the women, and work together to build a digital landscape that is truly inclusive and supportive for all.

Nicky Danino is Head of School of Computer Science at Leeds Trinity University.

Follow Nicky on Twitter/X @NickyDanino
