Reference no: EM133243306
Business Problem: Racial, Gender and Disability Bias in the Workplace
"Natural language processing (NLP), the branch of AI that helps computers understand and interpret human language, has been found to demonstrate racial, gender and disability bias. Inherent biases such as a low sentiment attached to certain races, higher-paying professions associated with men and negative labelling of disabilities then get propagated into various applications, from language translators to resume filtering.
For example, researchers from the University of Melbourne published a report demonstrating how algorithms can amplify human gender biases against women. Researchers created an experimental hiring algorithm that mimicked the gender biases of human recruiters, showing how AI models can encode and propagate at scale any biases already existing in our world"
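One way to see how such associations arise (a minimal sketch, not the Melbourne study's actual method) is to probe the pre-trained word embeddings that underlie many NLP pipelines. The Python sketch below uses gensim's downloader and cosine similarity; the model name, gendered word pair, and profession list are illustrative assumptions, and exact scores depend on the embedding corpus.

```python
# Illustrative sketch: probe pre-trained word embeddings for gendered
# profession associations. The model and word lists are assumptions
# chosen for demonstration, not the Melbourne study's method.
import gensim.downloader as api

# Small pre-trained GloVe model (downloads ~66 MB on first use).
model = api.load("glove-wiki-gigaword-50")

professions = ["engineer", "programmer", "doctor", "nurse", "receptionist"]

for word in professions:
    # Cosine similarity between the profession and each gendered word;
    # a consistent gap in one direction signals an encoded association.
    sim_man = model.similarity("man", word)
    sim_woman = model.similarity("woman", word)
    print(f"{word:>12}: man={sim_man:.3f}  woman={sim_woman:.3f}  "
          f"gap={sim_man - sim_woman:+.3f}")
```

In published analyses of similar embeddings (e.g., Bolukbasi et al., 2016), professions such as "nurse" tend to sit closer to "woman" and "engineer" closer to "man"; a resume filter built on such vectors can silently inherit exactly those gaps.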
In general, this application of artificial intelligence has great potential, since it could help organizations manage their resources. On the other hand, automated, non-human selection of data raises several ethical, privacy and security-related questions. For example, is the system compliant with anti-discrimination and data-protection rules? Is it still favouring some groups of people over others? What data does it use to arrive at its output, and how secure is that data? What data do you think would provide the best predictions?
Question
As an HR business lead, I will explain how the sports industry will benefit from AI and the opportunities the technology offers in hiring employees.