Microsoft claims that developing AI without women is likely to lead to biased results

As Artificial Intelligence (AI) becomes the talk of the town, building AI-based solutions without the inclusion of women would produce technology that is inherently biased, a top Microsoft executive said on Friday.

According to the World Economic Forum's 2018 report, only 22 percent of AI professionals globally are female, while almost a third (32 percent) of women believe that gender bias is still a major hurdle in the industry's recruitment process.

“If AI systems are built only by one representative group such as all male, all Asian or all Caucasian, then they are more likely to create biased results,” Mythreyee Ganapathy, Director, Programme Management, Cloud and Enterprise, Microsoft, told IANS.

Data sets that will be used to train AI models need to be assembled by a diverse group of data engineers.

“A simple example is data sets used to train speech AI models: if they focus primarily on adult speech samples, they unintentionally exclude children, and hence the models are unable to recognize children’s voices,” Ganapathy added.

India ranks 108th on the gender gap index, according to the World Economic Forum's 2018 report. It also has one of the lowest rates of female participation in the labor market, at 27 percent.

A broader range of people should be included to increase the diversity of AI teams, as more than half (52 percent) of women globally perceive the tech sector to be a “male” industry, the report adds.

To help narrow the gender gap in the country, the tech giant promotes the study of computer science at traditionally female colleges and other universities.

“We believe that attracting, developing and helping women in STEM fields is vital to ensuring a well-rounded, inclusive society without which we risk having hundreds of thousands of jobs left unfilled and decades of innovation absent of female perspectives,” the Microsoft executive noted.

Corporate and academic AI teams have inadvertently made systems biased against women.

For example, tech giant Amazon’s machine learning experts scrapped a “sexist” AI recruiting tool in October 2018 after discovering that the recruiting engine “did not like women”.

Members of the team working on the system said it effectively taught itself that male candidates were preferable.
