06/14/2021
Artificial intelligence is ubiquitous today. Most of us do not know where AI is being used and are unaware of the biased decisions that some of these algorithms produce. Some AI tools claim to infer "criminality" from face images, race from facial expressions and emotions from eye movements. Many of these technologies are increasingly used in applications that affect credit card checks, fraud detection, criminal justice decisions, hiring practices, healthcare outcomes, the spread of misinformation, education, lifestyle decisions and more.
Addressing this issue with robust tools for evaluating transparency, bias and fairness, along with the ethical evaluation of algorithms, is a good first step. Expanding the scope to include the teams and environments in which these products are built would help create fair products that not only reduce unfair outcomes but also ensure that AI systems do not disadvantage some segments of the population.
Who Shapes AI Today?
To address the complexities of who is shaping AI today, we need to understand who builds AI systems, and how race, gender and other protected classes are represented within AI products.
Studies show that only 12% of machine learning researchers are women, 15% of AI research staff at Facebook are women and just 10% at Google. According to one report (via Fortune), 2.5% of Google's workforce is Black, whereas it is 4% at Microsoft and Facebook.
The diversity problem is not just about women, gender or race — it is most fundamentally about what AI products get built, who they are built for and who benefits from their development.
Technical tools already exist to identify bias in datasets, make algorithms more transparent and improve the interpretability of AI products through explainability methods. In addition, we need to increase the diversity of the AI engineers, stakeholders and decision-makers who can watch for negative social, economic, health or legal outcomes.
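As a concrete illustration of the kind of check such tools perform, the sketch below computes a demographic parity gap, i.e., the difference in positive-prediction rates between groups, on a toy hiring dataset. It is a minimal example using plain pandas rather than any specific fairness library, and the column names ("group", "predicted_hire") and the warning threshold are illustrative assumptions, not a prescribed standard.

```python
# Minimal sketch of a dataset bias check: demographic parity gap.
# Column names and threshold are illustrative, not from any specific tool.
import pandas as pd

def demographic_parity_gap(df: pd.DataFrame,
                           group_col: str = "group",
                           outcome_col: str = "predicted_hire") -> float:
    """Return the largest gap in positive-outcome rates across groups."""
    rates = df.groupby(group_col)[outcome_col].mean()
    return float(rates.max() - rates.min())

if __name__ == "__main__":
    # Toy data: model predictions for two demographic groups.
    toy = pd.DataFrame({
        "group":          ["A", "A", "A", "A", "B", "B", "B", "B"],
        "predicted_hire": [1,   1,   1,   0,   1,   0,   0,   0],
    })
    gap = demographic_parity_gap(toy)
    print(f"Demographic parity gap: {gap:.2f}")  # 0.75 vs. 0.25 -> 0.50
    if gap > 0.2:  # the threshold is a policy choice, not a universal rule
        print("Warning: positive-prediction rates differ noticeably across groups.")
```

A check like this only surfaces a symptom; deciding why the gap exists and what to do about it is exactly where diverse teams and stakeholders come in.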
What can CEOs and their top management teams do to lead the way in building diverse AI teams and communities of practice? Among others, we see five essential steps:
1. Restructuring Talent Acquisition
If you are a startup, you have the advantage of building a diverse talent pipeline from the outset. Hiring from historically Black colleges and universities (HBCUs) and Hispanic-serving institutions (HSIs) is a good first step. When startups hire candidates from outside their existing networks, they build a strong foundation of diverse leadership. For larger and older organizations, widening the pool of candidates who come in for interviews, looking beyond their regular talent-sourcing strategies and paying attention to how job descriptions are written would go a long way toward bringing in diverse talent.
2. Sustainable Inclusivity
Looking at talent pools and increasing the number of candidates is not sufficient on its own; it is equally important for business leaders to create an inclusive culture and retain diverse talent. Data scientists working on AI products are part of larger engineering departments that traditionally have not been diverse. Making members of diverse teams feel included means having open communication, addressing microaggressions and establishing concrete feedback mechanisms.
One example is working with employee resource groups (informal communities for like-minded team members) to gather feedback about what changes the organization can make to build a sustainably inclusive culture. Asking for regular feedback via email or Slack about how employees experience the organizational culture can help build a culture that embraces diversity on an ongoing basis.
3. Pay Parity
Fair compensation is crucial to retaining data science talent, advancing AI research, building diverse AI products and shaping the way AI impacts society.
Compensation policies that provide guidelines for negotiation, narrow compensation bands and build in review of hiring managers' pay decisions can go a long way toward attracting and retaining the right talent. Tracking compensation patterns across underrepresented groups and ensuring there are few outliers can reduce turnover.
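A lightweight way to start the tracking described above is to compare pay distributions across groups within each role and flag roles where the gap is large. The sketch below does this with pandas; the column names ("role", "group", "salary") and the 5% threshold are assumptions for illustration, not a prescribed methodology.

```python
# Minimal sketch of a pay-equity check: compare median salaries by group
# within each role and flag roles where the relative gap exceeds a threshold.
# Column names and the 5% threshold are illustrative assumptions.
import pandas as pd

def flag_pay_gaps(df: pd.DataFrame, threshold: float = 0.05) -> pd.DataFrame:
    """Return roles whose cross-group median-salary gap exceeds the threshold."""
    medians = df.groupby(["role", "group"])["salary"].median().unstack("group")
    gap = (medians.max(axis=1) - medians.min(axis=1)) / medians.max(axis=1)
    return medians.assign(relative_gap=gap).loc[gap > threshold]

if __name__ == "__main__":
    # Toy data: salaries for two groups across two roles.
    toy = pd.DataFrame({
        "role":   ["Data Scientist"] * 4 + ["ML Engineer"] * 4,
        "group":  ["A", "A", "B", "B"] * 2,
        "salary": [120_000, 125_000, 110_000, 112_000,
                   140_000, 141_000, 139_000, 142_000],
    })
    print(flag_pay_gaps(toy))  # flags "Data Scientist" (~9% median gap)
```

In practice this kind of report would run on HR data with proper access controls and be reviewed alongside factors such as level and tenure before drawing conclusions.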