Columbia’s Orfanidou says training AI using biased past risks excluding women from shipping
Summary
Christina Orfanidou, head of AI at Columbia Group, warns that using biased historical data to train AI could perpetuate gender exclusion in the shipping industry.
Why It Matters
This article highlights the critical issue of bias in AI training, particularly in industries like shipping where historical data may marginalize women. Understanding these biases is essential for developing fair and inclusive AI systems that reflect diverse perspectives and promote equality.
Key Takeaways
- Bias in historical data can lead to exclusionary AI practices.
- The shipping industry faces legacy challenges that impact gender representation.
- Addressing these biases is crucial for fostering inclusivity in AI.
Cypriot tech expert argues that legacy problems in the industry could be viewed as the norm in machine learning
Christina Orfanidou is the new head of AI at Columbia Group. Photo: Columbia
Gary Dixon, TradeWinds correspondent, London | Technology