Disinterested employers were 20% less likely than engaged employers to have narrowed their UK gender pay gap between 2017 and 2021. I draw this conclusion from a statistical model of ~6,000 employers which removed the effect of confounders such as size, sector, furlough and gender ratio. Once these were accounted for, I found 61% of employers reporting their pay gap for 2019 had narrowed it by 2021, compared to 55% of those not reporting 2019 data.
A shorter version of this article can be found on my LinkedIn feed where you can also leave comments.
This article was edited on 17th May 2022 to correct typos and to add the link above.
History of UK Gender Pay Gap Reporting
The 2021 reporting deadline has passed and marks the 5th anniversary of GPGR (Gender Pay Gap Reporting). Here is a brief timeline.
- April 2017 – The 2017 GPGR regulations come into force as authorised by the 2010 Equality Act. Employers have up to 12 months to report a variety of pay gap statistics for their snapshot date of either 31st March 2017 (public sector) or 5th April 2017 (other employers).
- April 2018 – The first round of reporting is complete and media interest spikes with numerous articles.
- March 2020 – The government suspends enforcement of the deadline for employers to submit their 2019 snapshot data due to the COVID-19 pandemic. Approximately 1/3 of employers fail to report 2019 data.
- April 2020 – The furlough scheme comes into effect enabling employees to be furloughed on 80% pay. This counts as reduced pay leave under GPGR rules and means such employees are excluded from gender pay gap calculations for the 2020 & 2021 snapshot dates.
- February 2021 – The government confirms employers will have to submit their 2020 snapshot data but extends the deadline for reporting to 5th October 2021.
- April 2022 – The deadline for a government review of the GPGR regulations passes without anything being published.
5 Years of GPGR in a nutshell
As of 1st May 2022, 10,292 employers have reported their 2021 snapshot data. The median employer of 2021 based on their median gender pay gap paid their median woman 90.2p for every £1 paid to their median man.
When trying to see if this has changed since 2017, the incorrect approach is to calculate the equivalent figure for each of the years 2017 to 2020 as shown in the NAIVE row in the above table. This appears to show that the median woman’s pay has fallen since 2017, because the median employer back then paid their median woman 90.6p for every £1 paid to their median man. However, the 10,250 employers reporting in 2017 are not the same as the 10,292 reporting in 2021: many will have gone bust, merged, split, dropped out of GPGR or moved into it as a result of their headcount moving above or below 250 employees.
The correct way to identify the underlying trend is to focus on like for like employer comparisons. A simple option is to only look at the 5,561 employers who reported data for all 5 years between 2017 & 2021. Of these, the median employer of 2021 paid their median woman 90.0p which is 1.3p higher than what the median employer of 2017 paid.
Whilst simple, this approach excludes ~6,000 employers who did not report data for all 5 years but who can still provide insight into the underlying trend provided they reported data for at least 2 of the 5 years. Out of the 12,715 employers who have ever reported pay gaps, 1,345 have reported for only a single year. Such employers provide no insight into trends and can be excluded. This leaves an additional 5,809 employers who can provide insight into trends between at least 2 years. When I merge the like for like trends from these 5,809 employers with the 5,561 5-year employers (using a method known as imputation), I end up with the bottom row of the above table (Like for Like Simple). This shows that the median woman was paid 89.7p in 2017, which is 0.5p lower than in 2021.
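To make the like for like idea concrete, here is a minimal sketch in Python with made-up figures (the real analysis uses a more refined imputation; the employers and numbers below are hypothetical). For each pair of consecutive years, it takes the median change among employers reporting in both years, then chains those changes backwards from the final year’s median:

```python
from statistics import median

# Hypothetical reports: employer -> {year: median woman's pay per £1 to median man}
reports = {
    "A": {2017: 0.88, 2018: 0.89, 2019: 0.90, 2020: 0.90, 2021: 0.91},
    "B": {2017: 0.93, 2018: 0.92, 2020: 0.94, 2021: 0.95},   # missed 2019
    "C": {2019: 0.85, 2020: 0.86, 2021: 0.87},               # joined GPGR in 2019
    "D": {2017: 0.91},                                        # single year: no trend info
}

def imputed_series(reports, years):
    """Chain median like-for-like year-on-year changes back from the final year."""
    final_year = years[-1]
    level = median(r[final_year] for r in reports.values() if final_year in r)
    series = {final_year: level}
    for y0, y1 in zip(reversed(years[:-1]), reversed(years[1:])):
        # Changes among employers reporting in BOTH y0 and y1 (like for like)
        changes = [r[y1] - r[y0] for r in reports.values() if y0 in r and y1 in r]
        level -= median(changes)
        series[y0] = round(level, 3)
    return series

print(imputed_series(reports, [2017, 2018, 2019, 2020, 2021]))
```

Note that employer D, having reported only once, contributes nothing to any year-on-year change, which is why single-year reporters can be safely excluded.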
There are in fact a number of ways of combining multi-year employers to arrive at an underlying trend, and a slight variation is used in the table below. The important point is that the imputed trend method (based on like for like employers) can be used for all types of pay gap statistics. I have done this to end up with the table below, which shows what I consider to be the true underlying trend between 2017 and 2021 for GPGR employers across all the major statistics.
In all cases, the 2021 figure is the median employer for that statistic and the 2017 to 2020 figures are imputed from like for like trends as just described.
Whilst this article will focus on the median gender pay gap, I find the increase in the number of women in the upper pay quarter from 37.6% to 39% to be the most notable change since GPGR became mandatory.
The significance of 2019
The median woman’s pay from the table above is shown in this chart as a pay gap i.e. the difference between the median woman’s hourly pay and the median man’s hourly pay, which for 2021 is -9.8p.
The chart shows the underlying trend as the bars along with lines for the two largest groups of like for like employers. The black squares represent the 5,561 employers from group A who reported data in all 5 years. The next largest group, E, comprises 2,597 employers who reported data in 2017, 2018, 2020 & 2021 but not in 2019 when enforcement was suspended. It is these two groups, which I will denote as 1718192021 and 17182021, that I want to focus on here.
How do they differ? From the chart, I see the following two points.
- The 1718192021 group started with a larger pay gap in 2017 of -11.3p than the 17182021 group who started at -6.7p.
- The 1718192021 group narrowed its pay gap by 2021 to -10p whereas the 17182021 group widened slightly to -7p.
Does that mean employers who did not report their pay gaps for 2019 were less likely to have narrowed their pay gap since 2017? 55% of the 1718192021 group narrowed their pay gap over the 5 years compared to 49% of the 17182021 group. This difference is statistically significant and suggests that non-reporters in 2019 were 20% less likely than 2019 reporters to narrow their pay gap by 2021. But can we be certain this difference really is explained by non-reporting in 2019, or could other factors be responsible? It turns out we can conclude the difference is explained by non-reporting in 2019, and the rest of this article explains why.
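The significance of the 55% vs 49% difference can be checked with a standard two-proportion z-test. This is a rough check using the rounded headline percentages and group sizes quoted above, not the exact test from my model:

```python
from math import sqrt

# Narrowing rates and group sizes from the text (rounded headline figures)
p1, n1 = 0.55, 5561   # 1718192021 group (reported 2019)
p2, n2 = 0.49, 2597   # 17182021 group (did not report 2019)

# Pooled two-proportion z-test
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
print(round(z, 2))  # comfortably above the 1.96 threshold for 5% significance
```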
Note – I make no distinction between employers who pay their median woman more than their median man and those who pay their median woman less. Both are considered undesirable outcomes and I want to know if employers are making progress on reducing the gap between their median man and median woman regardless of what direction the gap was to begin with.
Factors that could explain the 2019 effect
There are a number of factors (also known as confounders) I need to rule out before I can confirm the significance of 2019 non-reporting.
- Reversion to the mean – a phenomenon that often occurs in time series whereby if a value deviates significantly from an expected value, then in the next time period, it can be expected to “revert” towards the expected value. In the pay gap world, this means an employer with a large pay gap is more likely to narrow their pay gap than an employer with a small pay gap, even if no activity to promote gender diversity is undertaken. You can see examples of this effect in my articles “What is the trend in 2020” and “Close your pay gap by playing Blackjack“. From the chart above, we already know that the 1718192021 group started in 2017 with a larger pay gap than the 17182021 group, so one would expect the former to be more likely to narrow their gap than the latter.
- Gender dominance – if at least 80% of a workforce is of the same gender then I say such employers are gender dominant. I showed in my article “Life on Mars” that such employers are more likely to have wider pay gaps through random chance alone than gender balanced employers. That creates a variant of the “reversion to the mean” effect whereby if only a few of the minority gender leave and are replaced by the other gender, the effect on the pay gap can be dramatic. If 1718192021 employers are more likely to be gender dominant than 17182021 employers, this could explain the 2019 effect seen.
- Employer size – If you put together the two effects above, it should not be a surprise to learn that it is easier for a small employer to close a large pay gap through chance alone than a large employer. Again this is an effect I have written about before in my article “The good, the bad and the Unilever“. If employers in the 1718192021 group tend to be smaller than the 17182021 group then that could explain why the former was more likely to close their pay gap.
- Furlough likelihood – The 2020 & 2021 data could be distorted by employees being on furlough, thus on reduced pay leave and left out of the pay gap calculations. We know from HMRC data that men were slightly more likely to be furloughed than women in 2020, but in 2021, men were more likely to have come off furlough than women. If high paid employees are furloughed and more likely to be of one gender, then that gender’s pay will fall by more relative to the other gender. Conversely, if low paid employees are more likely to be furloughed with a bias to one gender, that gender’s median pay will rise relative to the other gender. We know certain sectors such as Hospitality & Travel were heavily affected by furlough, and if they are more likely to be in either of the two groups here, that could explain the 2019 effect.
- Public or Private sector – This matters in 2 ways. First, public sector payscales tend to be fixed at a national level and are harder to change than private sector payscales (see recommendation 5 of this link for an example of a private sector employer who had a larger pay gap than an equivalent public sector employer but could make faster progress on closing it). Second, private sector employers are more likely to merge or split with another employer. If the employer’s name remains unchanged in such circumstances, it can be hard to tell straightaway whether an employer is now twice or half the size due to a merger. Such large scale changes can cause large changes in the pay gap which are not the result of activities to promote gender diversity.
- Reporting Errors – I first got into the pay gap field because I noticed so many errors in the 2017 data. I estimated between 5% and 15% of employers had made errors back then, and if so, any apparent closing (or widening) of a gap might simply be due to this effect rather than gender diversity efforts. One method of detecting errors is to compare my gender swap number concept with the median gender pay gap; my article about how Cleveland Police got its pay gap wrong was based on an earlier version of this concept. Clearly, if errors are more common in one of the two groups of employers, they may be the reason for the 2019 effect.
To test whether these effects were occurring, I pulled together a dataset of 6,175 employers as follows.
- Only the 8,773 employers who reported data in 2018, 2020 & 2021 were included.
  - This equates to the 1718192021, 17182021, 18192021 and 182021 groups of employers.
  - These 4 groups account for 69% of all 12,715 employers who have reported pay gaps in at least 1 year.
  - The reason I wanted 2018 & 2020 reporters is to see if being late reporting in those years was correlated with not reporting in 2019.
  - The years 2018, 2019 & 2020 can therefore be used as a measurement of employer disinterest with GPGR.
- 1,986 employers with suspected or known errors in their calculations were excluded.
  - The main reason for exclusion was a conflict between the gender swap number and the median gender pay gap. See this article for an explanation.
  - In some cases, the median gender pay gap is probably correct, but unless I look at their data and report in detail, I can’t verify this.
  - I am erring on the side of caution here, but errors are a fact of life in GPGR and I continue to see them regularly.
  - By excluding these employers, I minimise the likelihood of errors in the remainder distorting the underlying trends.
- 612 employers with small pay gaps in 2017/2018 were excluded.
  - I defined small to be less than 2.5p in the pound.
  - By definition, an employer with a small pay gap to begin with is much less likely to be able to narrow it by 2021 because their pay gap can fluctuate due to chance alone. See this article for an explanation.
  - I felt it was unfair and perhaps distorting to include these employers.
  - I should point out that if I allowed more employers through step 2 above, many would still have been excluded at this stage. This is because the conflict between the gender swap number and the median pay gap tends to be more common when the pay gap is small in the first place.
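The selection steps above amount to a simple filter pipeline. Here is a sketch with hypothetical employer records and simplified field names (the real error-detection logic based on the gender swap number is far more involved):

```python
# Hypothetical records: (name, years reported, error flag, first pay gap in pence)
employers = [
    ("Acme Ltd",  {2017, 2018, 2019, 2020, 2021}, False, -11.3),
    ("Beta plc",  {2017, 2018, 2020, 2021},       False,  -6.7),
    ("Gamma Co",  {2018, 2020, 2021},             True,   -9.0),  # suspected error
    ("Delta LLP", {2018, 2019, 2020, 2021},       False,  -1.5),  # small initial gap
    ("Echo Inc",  {2017, 2021},                   False,  -8.0),  # missed 2018 & 2020
]

def build_dataset(employers):
    kept = []
    for name, years, error, first_gap in employers:
        if not {2018, 2020, 2021} <= years:
            continue  # step 1: must have reported 2018, 2020 & 2021
        if error:
            continue  # step 2: exclude suspected/known calculation errors
        if abs(first_gap) < 2.5:
            continue  # step 3: exclude small initial pay gaps (< 2.5p in the pound)
        kept.append(name)
    return kept

print(build_dataset(employers))  # ['Acme Ltd', 'Beta plc']
```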
Verifying these effects with Logistic Regression
I built a statistical model to correlate the probability of an employer narrowing their pay gap by 2021 with the factors I’ve listed above. The output variable I used was a 0/1 binary variable where zero means the employer did not narrow their median gender pay gap between 2017 (or 2018 if that was their 1st year of reporting) and 2021 and one means they did narrow their median pay gap by then. I was not concerned with the magnitude of the change i.e. I treat an employer who narrows their pay gap from 50% to 49% to be the same as one who narrows it from 50% to 5%.
Statistical modelling when the output variable is binary requires the use of Logistic Regression which comes under a class of models known as Generalised Linear Modelling or GLM. Such models require the use of statistical software so I won’t be describing the details of the modelling here. Please contact me to receive a copy of the R script and the data file used if you’re interested.
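For the curious, here is what a minimal version of such a model looks like in plain Python rather than R. This is a toy illustration with made-up counts, not my actual model or script: with a single binary predictor (did the employer report 2019 data?), the fitted exp(coefficient) recovers the sample odds ratio straight from the 2x2 table, which is a useful sanity check.

```python
from math import exp, log

# Toy 2x2 data: 100 reporters (61 narrowed), 100 non-reporters (55 narrowed)
# x = 1 if the employer did NOT report 2019 data, y = 1 if the pay gap narrowed
data = [(0, 1)] * 61 + [(0, 0)] * 39 + [(1, 1)] * 55 + [(1, 0)] * 45

def fit_logistic(data, lr=1.0, iters=5000):
    """Maximise the log-likelihood of logit(p) = b0 + b1*x by gradient ascent."""
    b0 = b1 = 0.0
    n = len(data)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logistic(data)
fitted_or = exp(b1)               # fitted odds ratio for 2019 non-reporters
table_or = (55 / 45) / (61 / 39)  # odds ratio read straight off the 2x2 table
print(round(fitted_or, 3), round(table_or, 3))
```

With one binary predictor this is equivalent to reading the odds ratio off the table; the value of the regression is that further predictors (size, sector, furlough exposure, etc.) can be added so the 2019 effect is estimated with confounders held constant.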
The first effect I fitted was the size of the original pay gap. In the table shown here, I’ve grouped employers into 4 groups based on the size of the first reported median pay gap in 2017 (or 2018) and then calculated what percentage of employers had narrowed their pay gap by 2021. You may notice that the overall % of employers narrowing their pay gap is higher than what I mentioned earlier. That is because I’ve excluded employers who started with small or no pay gaps in 2017/18 and who, by definition, could only widen rather than narrow.
This table confirms that employers with large pay gaps to begin with are more likely to have narrowed their gap than employers who started out with small gaps due to the reversion to the mean effect. Within each of 4 pay gap groups, those who did not report 2019 data were ~20% less likely to have narrowed their pay gap compared to those that did report 2019 data.
Digression – probability, odds and odds ratios
My statement that non-reporters are 20% less likely to have narrowed their pay gap may puzzle some people. On the face of it, if 61% of 2019 reporters narrowed their pay gap and 55% of non-reporters did so, surely non-reporters are only 10% less likely? If you are willing to accept my statement then by all means skip this section; otherwise I will explain the different ways we can measure likelihood.
The confusion arises if you think likelihood is a synonym for probability. Likelihood is in fact a general statistical concept which can be measured in a number of ways. When I observe that 61% of 2019 reporters narrowed their pay gap, that is functionally equivalent to saying that the Probability of a 2019 reporter narrowing their pay gap is 0.61. However this is not the only way of measuring likelihood.
If you place bets with a bookmaker, you will be quoted Odds instead of probabilities. The odds of a 2019 reporter narrowing their pay gap is the number (or percentage) of employers who did narrow their pay gap divided by the number (or percentage) of employers who did not narrow their pay gap. Using the table above, the odds are 1.58 (=61%/39%) for 2019 reporters and 1.24 (=55%/45%) for 2019 non-reporters.
When a logistic regression model is built to test the significance of the 2019 effect, it calculates the Odds Ratio. This is the ratio of the odds for a non-reporter divided by the odds for a reporter which turns out to be 1.24/1.58 = 0.78 (or 78%). In other words, the odds of a 2019 non-reporter narrowing their median gender pay gap is 22% lower than the odds of the 2019 reporter narrowing their gap. It is this measure I’ve used to make my statement above.
When comparing the relative likelihood of reporters and non-reporters narrowing their pay gap, the odds ratio is a standard metric but it’s not the only one. An alternative is to say 2019 reporters are twice as likely as non-reporters to narrow their pay gap. This can be worked out by first observing that if an employer has a notable pay gap and makes no effort to close it, the laws of chance mean their pay gap will not be static going forward; it will instead fluctuate around an expected value. Such an employer would then have a 50% probability of narrowing their pay gap without any effort. On this basis, the 61% and 55% from the table above should be compared with 50%, and it is the increased probability that interests us. So we should compare 11% (=61%-50%) with 5% (=55%-50%), which allows us to say 2019 reporters are twice as likely as non-reporters to have narrowed their pay gap.
The latter is not a standard metric since I could say the same thing if the figures were 52% & 51% but it goes to show that likelihood can be measured in a number of ways. The bottom line is that if someone claims that something is more likely than expected, you need to check how they measured likelihood in the first place.
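To summarise the digression, here are the three measures of likelihood side by side, computed from the rounded headline percentages (61% and 55%); they therefore differ in the last decimal place from the table, which uses unrounded figures:

```python
p_reporter, p_nonreporter = 0.61, 0.55

# 1. Ratio of probabilities: non-reporters ~10% less likely
risk_ratio = p_nonreporter / p_reporter

# 2. Ratio of odds: non-reporters ~22% less likely (the measure used in this article)
def odds(p):
    return p / (1 - p)

odds_ratio = odds(p_nonreporter) / odds(p_reporter)

# 3. Excess over the 50% chance baseline: reporters roughly twice as likely
excess_ratio = (p_reporter - 0.5) / (p_nonreporter - 0.5)

print(round(risk_ratio, 2), round(odds_ratio, 2), round(excess_ratio, 1))
```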
What else drives the likelihood of narrowing a pay gap?
I identified 3 other effects that change the likelihood of an employer narrowing their pay gap.
- Large employers with 5,000 or more employees are 50% more likely (odds ratio measure) than smaller employers to have narrowed their pay gap if all other factors are equal. This doesn’t surprise me given that smaller employers are more exposed to small sample size effects that can cause a pay gap to fluctuate more even if the underlying trend is in the right direction.
- Employers in sectors with high rates of furlough (Hospitality, Travel, High St Retail, Culture) are 30% less likely to have narrowed their pay gap if all else is equal. I am not surprised that furlough had an effect but it was not clear to me if the effect was to make narrowing a pay gap more likely or less likely.
- Female dominated public sector employers (workforce >80% women) are 66% less likely to have narrowed their pay gap if all else is equal. Only 43% of such employers narrowed their pay gap by 2021 compared to 60% of all other employers.
The female dominated public sector effect was not something I expected but it doesn’t surprise me if I think about it. My data set included 244 such employers of varying sizes; 75% of them were in the Education sector with the remainder in the NHS. I have stated before that the two sectors with the largest pay gaps and the largest struggle to close them are airlines and primary schools. The former is due to the dire shortage of female pilots (see my article about Ryanair) and the latter is due to the massive shortage of men in lower paid roles such as cleaners, dinner ladies, teaching assistants, etc. (see this article which includes Rayleigh Schools Trust from my list of 244 employers). You can see this effect clearly in the two learning trusts below, which are typical of the 189 education employers in my data set. One of these saw their pay gap narrow, the other saw it widen, but both face the same fundamental issue in that 95% of their lower paid roles are held by women compared to 75% of higher paid roles.
Once I take these additional effects into account, my final model shows 2019 non-reporters were 16% less likely to have narrowed their pay gap by 2021. However, due to some residual confounding I briefly touch on in the next section, I am satisfied that one can round this up to 20% when drawing a final conclusion which is what I did at the very start of this article.
Are 2019 non-reporters disinterested in GPGR?
Does the last paragraph mean we can say any employer who did not report their 2019 data is not engaged with the pay gap reporting process? Do we need more evidence before we can say this?
To find out, I looked at the relationship between the probability of reporting 2019 data and whether an employer was early or late in reporting their 2018 & 2020 data.
- For 2018, an employer was deemed Early if they submitted their data at least 2 weeks before their deadline, On Time if they submitted within the last 2 weeks and Late if submitted after their deadline.
- For 2020, when the deadline was pushed back to 5th October, an employer was deemed Early if they submitted their data by their normal deadline of either 31st March/5th April, On Time if they submitted by 5th October 2021 and Late if submitted after this date.
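These classification rules can be sketched as a pair of small functions. The deadline dates below are illustrative assumptions based on the rules just described, not the exact dates used for every employer:

```python
from datetime import date, timedelta

def classify(submitted, deadline, early_window=timedelta(weeks=2)):
    """2018 rule: Early means submitted at least 2 weeks before the deadline."""
    if submitted <= deadline - early_window:
        return "Early"
    if submitted <= deadline:
        return "On Time"
    return "Late"

# 2018 round: private sector deadline of 4th April 2018 (illustrative)
deadline_2018 = date(2018, 4, 4)
print(classify(date(2018, 3, 1), deadline_2018))   # Early
print(classify(date(2018, 4, 2), deadline_2018))   # On Time
print(classify(date(2018, 5, 1), deadline_2018))   # Late

def classify_2020(submitted, normal_deadline=date(2021, 4, 5),
                  extended_deadline=date(2021, 10, 5)):
    """2020 rule: Early = by the normal deadline, On Time = by the extended one."""
    if submitted <= normal_deadline:
        return "Early"
    if submitted <= extended_deadline:
        return "On Time"
    return "Late"
```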
The tables show a clear picture. Of the 669 who were early in both 2018 & 2020, only 1% failed to submit 2019 data and over 2/3 had narrowed their pay gap by 2021. Of the 65 employers who were late in both 2018 & 2020, 42% failed to submit 2019 data and only 52% narrowed their pay gap. You may recall from my digression earlier that an employer who has no interest in their pay gap can be expected to close their gap half the time due to random fluctuations in their work force so what we see for these 65 is more or less this. On this basis, I am comfortable with describing the 27 employers (the 42% of 65 who did not publish 2019 data) as being disinterested in pay gap reporting since they have 3 strikes against them.
I am sure a number of other employers could be added to the list. My colleagues at Spktral have drawn a similar conclusion by looking at whether an employer was compliant with the GPGR regulations in terms of whose signature is on the report, whether a narrative has been published and made available on their website and other things. We intend to combine our results to derive an employer disinterest score.
I am satisfied that the top line picture of 2019 non-reporters being up to 20% less likely than 2019 reporters to have narrowed their median gender pay gap by 2021 is a true picture. I identified a number of potential confounders, which include the original pay gap when reporting started, the size of the employer, the likelihood of employees being furloughed and female dominated public sector employers. Once these were taken into account, my model showed the odds of a 2019 non-reporter narrowing their pay gap was 16% lower than the odds of a 2019 reporter narrowing their pay gap.
— Subscribe to my newsletter to receive more articles like this one! —
If you would like to receive notifications from me of news, articles and offers relating to diversity & pay gaps, please click here to go to my Newsletter Subscription page and tick the Diversity category and any other categories that may be of interest to you. You will be able to unsubscribe at any time.
— Want to know more about pay gaps? —
You will find a full list of my pay gap & diversity related articles here which are grouped by theme.
— Need help with understanding your pay gap? —
I offer the following services. Please click on the headings for more details.
- Analysis – I can dig deep into your data to identify the key drivers of your pay gaps. I can build a model using a large number of variables such as pay band, seniority, job function, location, etc and use this to identify the priority areas for closing your gaps.
- Training – I run training courses in basic statistics which are designed for non-statisticians such as people working in HR. The courses will show you how to perform the relevant calculations in Microsoft Excel, how to interpret what they mean for you and how to incorporate these in an action plan to close your gaps.
- Expert Witness – Has your gender pay gap data uncovered an issue resulting in legal action? Need an expert independent statistician who can testify whether the data supports or contradicts a claim of discrimination? I have experience of acting as an expert witness for either plaintiff or defendant and I know how to testify and explain complex data in simple language that can be easily understood by non-statisticians.
If you would like to have a no-obligation discussion about how I can help you, please do contact me.