“… UNESCO argues that if people do, in fact, prefer the female-sounding helper, it’s mainly because we’ve been conditioned to feel more comfortable making requests of women. But by not questioning this conditioning, says Miriam Vogel, executive director of Equal AI, a nonprofit formed to address and reduce unconscious bias in the field of artificial intelligence, ‘we’re teaching our next generation to double down on those stereotypes.’

“…Vogel likes Q but thinks the fix isn’t quite that simple, because un-gendering AI is about perfecting the persona as much as the voice. ‘Once you’re asking consumers to view an assistant as a replacement for a human, as part of the family, you have a responsibility to be thoughtful not only about how it sounds but how it interacts,’ she says. ‘Electronic assistance is not going to destroy humanity, and using it as it is now doesn’t mean all our kids will treat others with less respect. But I do think a piece of the puzzle is looking under the hood at who’s making the tech, the intentions we’re programming into it, and how it will affect later AI systems. The long-term implications of gendered technology,’ says Vogel, ‘are far-reaching…’”