Is Artificial Intelligence the New Gender Discriminator?


It seems that the human mind's capacity to form prejudices and negative opinions about a person's sexual orientation was not enough. Now we have Artificial Intelligence (AI) (not so intelligent in this case) claiming to determine sexual orientation just by analysing a person's face. Yes, you read that right! A study at Stanford University revealed that a computer algorithm could correctly distinguish between straight and homosexual individuals in a fair share of cases: 81% for men and 74% for women.

The software analyses photographs of people, taking only facial features into account to draw conclusions about their sexual preferences. The research suggests that homosexual individuals exhibit gender-atypical features: according to the study, gay men tend to appear more feminine, and gay women more masculine.

However, the algorithm has its own share of flaws. It covers only two preferences, gay and straight, with no inclusion of bisexual preferences. Nor did it study non-white communities, as only photographs of white people were part of the examination.

The study, conducted by researchers Michal Kosinski and Yilun Wang, has garnered global attention for the influence it could exert on a host of issues. Does this software not violate an individual's right to privacy? Could the technology not be used as a weapon against the LGBT community? What about the ethics of face-detection technology, and its limits here? Beyond these questions, scientists have also voiced concerns about the biological origins of sexual orientation: whether it is a choice or an inborn characteristic.

Such a technology has dangerous implications. AI has supported many positive causes in the past, such as problem solving, social interaction and crime resolution, but this particular advancement is alarming. The software has the potential to become a tool for cybercrime. Millions of photographs on Facebook, Instagram and other social media platforms could serve as data for it, allowing people's sexual orientation to be identified without their knowledge or consent, and later used as bait for any wrong purpose.

In countries where gay rights are not yet recognized, a technique that claims to determine sexual orientation could snatch away people's right to education, employment and fundamental living as a whole. The algorithm could also be turned against individuals by communities and groups hostile to LGBT interests in order to persecute them. Not only does it pose a threat to individuals because of their sexual preference, it could also develop into software for larger crimes and ill purposes, driven by animosity towards and disapproval of this section of society. AI can and has made our lives far better, undoubtedly, but when it can be misused against people's basic rights, is it any good?