Can You Trust AI With Your Next Pay Raise?
What once took Monica Seiter hours of manual reviews can now be resolved with the press of a button and a few prompts.
As director of payroll at Lindenwood University in St. Louis, Seiter uses Payroll Agent, an AI-powered assistant in the management software Workday, to automate the payroll process. (Workday lists The Washington Post as a client.) Some of her favorite features include automated scans to find missing data ahead of payday, and notifications to managers about minimum wage increases that could impact Lindenwood’s budget.
Payroll Agent is just one of many AI tools released last year, joining a wave of AI-powered products designed to automate HR processes. These tools, commonly referred to as AI agents, promise managers efficiency and precise information when conducting performance reviews or deciding who is eligible for a raise or promotion, their developers say. But by partially automating HR functions, organizations are calling on AI to help steer one of the most consequential relationships between employers and employees, one where a deft human touch was long considered a requirement.
“People don’t want to be judged by a black box,” said John McCarthy, a professor researching workplaces and emerging technologies at Cornell University. “Sometimes, even the people using or deploying these systems don’t know what’s in the black box.”
While companies like Workday are rolling out narrow AI agents designed for specific tasks, managers are already using general-purpose models such as ChatGPT and Google’s Gemini to inform high-stakes calls. More than 60 percent of managers say they use these tools to inform decisions about their employees, including drafting performance reviews, according to a June survey by ResumeBuilder. Of these, over half said they use AI to help determine raises, promotions and even layoffs.
When used correctly, automating certain HR tasks can save managers significant time, up to 25 percent, according to an analysis by consulting firm Bain & Company. But as companies race to integrate AI, critics worry that untrained managers could use it irresponsibly. The ResumeBuilder survey found that only one-third of managers who used AI to manage people had received formal instruction on how to do so, and around 20 percent often allowed AI to make decisions without human input.
“It’s a wild, wild west out there,” said Stacie Haller, chief career adviser at ResumeBuilder, adding that AI-assisted decision-making at work could expose companies to legal action, including wrongful dismissal cases.
“If you are let go and it was based on some AI evaluation, I guarantee you there are going to be lawsuits, because today people bring up lawsuits when they feel they’re unfairly fired anyway,” she said.
For Workday, the answer is for a person to have a final say and remain accountable, even if they tapped an AI agent for help, explained Aashna Kircher, a group general manager for HR products at the company.
“AI can’t make decisions around people’s performance,” she said. “We are very much anchored on having a human in the loop and amplifying potential, not replacing human judgment.”
When used responsibly, AI agents could even help improve transparency, says Maria Colacurcio, CEO of Syndio, a company developing workplace equity solutions. In October, Syndio released its own AI agent called Syndi, which provides hiring managers with salary offer recommendations for individual job candidates based on internal pay policies, market rates and company targets. Agents like Syndi are designed to explain each recommendation, a step Colacurcio says is essential for maintaining trust.
“The real value of AI is helping leaders make good decisions with better confidence. And when you’ve got that clear and consistent reasoning, the people who are on the receiving end feel respected,” Colacurcio said.
Even when AI agents are transparent about how a recommendation was formulated, a human arbiter will likely still be needed for most decisions. In processes like performance feedback, soft skills that are harder to quantify might slip past what algorithms are built to reward, said John Hausknecht, a human resources professor at Cornell. Recognizing qualities like congeniality and willingness to train colleagues, or context from workers’ personal lives, is one area where human managers still have an edge over machines.
“There’s a ‘what’ and a ‘how’,” Hausknecht said. Evaluating what an employee has produced can be relatively easy for automation to capture, “but how they got there, and did they take the right steps and build the right relationships along the way, I still think has that judgmental quality that’s hard to get away from.”
The Washington Post · Tristan Bove
