Platinum Business Magazine - issue 99



Whilst women have made strides in workplace equality, they have left the workforce at disproportionately higher rates during the pandemic (part of the phenomenon known as the Great Resignation), and wage inequality has persisted, writes Dr Margrét Vilborg Bjarnadóttir, Founder of PayAnalytics

USING TECHNOLOGY TO ACHIEVE PAY EQUITY

Last year marked the 50th anniversary of the UK Equal Pay Act, and these issues are shining a renewed spotlight on pay equity. Furthermore, in the post-Covid era, organisations are reimagining how and where work gets done. Legal codes worldwide are also changing, requiring increased transparency around pay equity. Altogether, people analytics, the field that applies quantitative tools to the employee value chain, needs some thoughtful reimagining. That requires the right tools to identify biases and gaps in the data we use, to recommend corrections, and ultimately to remove those gaps. With the right tools, supported by the right data, we can bring about workplace transformation and pay equity.


AI is currently being used to support human resources (HR) professionals in their recruiting, retention, promotion, and compensation decisions. Its ability to be fair, robust, and secure depends on the quality of the data it uses, the assumptions that underpin its systems, and the AI model itself. However, this retrospective data often reflects both old and current biases, and so will not fully support complex people-management environments, diversity initiatives or reimagined workplaces. AI should help us broaden our talent pipelines, not narrow them down to candidates like those we have hired in the past.

In this context, most AI models perform best with individuals in majority demographic groups and worse with those in less well-represented groups. Typically, algorithms maximise overall accuracy, so performance for the majority population carries more weight than performance for the minority population in determining algorithm parameters. For instance, say that an algorithm learns from data on employees who are either married or single and childless. The algorithm might then predict that an employee who takes more personal days is more likely to quit. Yet this prediction may be very wrong for a single parent needing time off to care for their child. This means that when we develop AI for people analytics, we need to understand the nuances of different groups within an organisation. In the context of pay equity, PayAnalytics has extensive subgroup analysis abilities that can help companies measure and close the pay gap separately for each group of employees. It is tempting for companies to simply assume that dropping demographic information from their AI applications will make their models ‘demographically neutral’,
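The mechanism described above can be sketched in a few lines of code. This is a toy illustration with invented data, not any vendor's actual model: a quit-prediction rule tuned to maximise overall accuracy ends up fitting the majority group perfectly while failing the minority group entirely.

```python
# Hypothetical (personal_days_taken, quit) pairs. In the majority group,
# taking many personal days really does precede quitting.
majority = [(2, 0), (3, 0), (1, 0), (8, 1), (9, 1), (7, 1),
            (2, 0), (3, 0), (1, 0), (8, 1), (9, 1), (7, 1)]
# In the minority group (e.g. single parents), high personal-day counts
# reflect routine caregiving, not an intention to quit.
minority = [(8, 0), (9, 0), (7, 0), (2, 1)]

def accuracy(data, threshold):
    """Share of employees correctly classified by the rule
    'predict quit if personal_days > threshold'."""
    return sum((days > threshold) == bool(quit) for days, quit in data) / len(data)

everyone = majority + minority

# Choose the threshold that maximises overall accuracy, as a
# typical training procedure would.
best = max(range(10), key=lambda t: accuracy(everyone, t))

print(f"overall  accuracy: {accuracy(everyone, best):.2f}")   # 0.75
print(f"majority accuracy: {accuracy(majority, best):.2f}")   # 1.00
print(f"minority accuracy: {accuracy(minority, best):.2f}")   # 0.00
```

Because the majority group contributes three times as many data points, the chosen rule classifies every majority employee correctly and every minority employee incorrectly, even though overall accuracy looks respectable.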
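Measuring the pay gap separately for each subgroup, as described above, can also be illustrated with a toy calculation. The figures and the simple mean-based gap measure below are invented for illustration only; real pay-equity analyses (including PayAnalytics') typically use regression to control for role, experience, and other legitimate pay drivers.

```python
from statistics import mean

# Hypothetical (department, gender, salary) rows.
employees = [
    ("Engineering", "F", 62000), ("Engineering", "M", 68000),
    ("Engineering", "F", 64000), ("Engineering", "M", 70000),
    ("Sales", "F", 41000), ("Sales", "M", 42000),
    ("Sales", "F", 40000), ("Sales", "M", 41500),
]

def gap(rows):
    """Raw mean pay gap: (mean male pay - mean female pay) / mean male pay."""
    m = mean(s for _, g, s in rows if g == "M")
    f = mean(s for _, g, s in rows if g == "F")
    return (m - f) / m

# One workforce-wide number hides where the problem sits ...
overall = gap(employees)

# ... whereas a per-subgroup breakdown localises it.
by_dept = {d: gap([r for r in employees if r[0] == d])
           for d in {r[0] for r in employees}}

print(f"overall gap: {overall:.1%}")
for dept, g in sorted(by_dept.items()):
    print(f"{dept:11s} gap: {g:.1%}")
```

Here the workforce-wide gap of roughly 6.5% conceals an 8.7% gap in Engineering alongside a 3.0% gap in Sales, so a single company-wide adjustment would miss where remediation is most needed.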