Robots Reject Human Job Candidates

Robots may take over human jobs because they are more precise, more consistent, more accurate, cheaper, faster, and less demanding of benefits and privileges. But there’s another reason: robots are making hiring decisions.

No, you’re not likely to sit down with a robot for a job interview, but increasing numbers of companies are using software to screen applications and resumes. The software’s algorithms can quickly sort through lots of applications and filter out the ones with no relevant experience, or with characteristics that are deal breakers.

If you’ve done an initial sort on applications for a popular job, you know how tedious and time consuming it can be. A robot can do it fast, pulling forward only the resumes that include applicable experience and the desired qualifications. Then the human hiring committee can work through a dozen applications rather than a hundred.
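
To make that screening step concrete, here is a minimal sketch in Python of how a keyword-based filter might work. The required keywords, deal-breaker phrases, and sample resumes are invented for illustration; real screening products use their own criteria.

```python
# Hypothetical screening criteria -- not any vendor's actual rules.
REQUIRED_KEYWORDS = {"forklift", "osha", "inventory"}   # must-have experience
DEAL_BREAKERS = {"no work authorization"}               # automatic rejections

def passes_screen(resume_text: str) -> bool:
    """Keep a resume only if it mentions every required keyword
    and contains none of the deal-breaker phrases."""
    text = resume_text.lower()
    has_experience = all(keyword in text for keyword in REQUIRED_KEYWORDS)
    has_deal_breaker = any(phrase in text for phrase in DEAL_BREAKERS)
    return has_experience and not has_deal_breaker

# Made-up applications for illustration.
applications = {
    "candidate_a": "Forklift operator, OSHA certified, managed inventory.",
    "candidate_b": "Warehouse associate with pallet-jack experience and OSHA training.",
}

shortlist = [name for name, text in applications.items() if passes_screen(text)]
print(shortlist)  # ['candidate_a'] -- candidate_b never used the word "forklift"
```

Note that the second candidate is dropped simply for not using the exact word the filter expects, which hints at the rigidity problem discussed below.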

Are robots more fair?

Bots are theoretically more likely to be fair in their judgments. We humans famously tend to favor people like ourselves. In one famous experiment, repeated last year with similar results, hypothetical job candidates whose randomly assigned names were perceived as typical of Black people got significantly fewer callbacks than those whose names were perceived as typical of white people. Robots wouldn’t do things like that, right?

Unfortunately, that’s not the case. When machine learning is trained on records of past successful hires, the algorithm tends to favor candidates who resemble the people already hired. That’s not much different from a company’s recruiters choosing candidates like themselves.
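
To see how that happens, here is a minimal sketch using scikit-learn and made-up data, not any real recruiting system: a model trained on past hiring decisions picks up whatever pattern those decisions contain, in this case a preference for candidates who attended the recruiter’s school.

```python
# Illustration only: invented data showing how a model inherits past bias.
from sklearn.linear_model import LogisticRegression

# One row per past applicant: [years_of_experience, attended_recruiter_school]
X_train = [
    [5, 1], [3, 1], [4, 1], [6, 1],   # most past hires shared the recruiter's school
    [5, 0], [6, 0], [4, 0], [7, 0],   # equally experienced outsiders, mostly passed over
]
y_train = [1, 1, 1, 1, 0, 0, 1, 0]    # 1 = hired, 0 = rejected

model = LogisticRegression().fit(X_train, y_train)

# Two new candidates with identical experience; only the school differs.
same_school, different_school = model.predict_proba([[5, 1], [5, 0]])[:, 1]
print(round(same_school, 2), round(different_school, 2))
# The "same school" candidate scores noticeably higher, because past hires did.
```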

Amazon’s recruitment engine taught itself that men were preferred over women. It downgraded resumes that included the word “women” in, for example, the names of organizations the applicant belonged to, such as the Association of University Women.

Other such tools may reject any candidate with a gap between jobs, even though that gap may reflect pregnancy or family leave rather than unemployment or incarceration. They may reject a candidate who describes floor-buffing experience with a synonym instead of the exact term “floor-buffing,” or favor candidates who played particular sports in high school. Using AI clearly doesn’t eliminate bias.

Robotic hiring decisions

There’s widespread agreement that AI doesn’t make hiring decisions as well as human beings. There is also widespread agreement that machines can’t bake bread as well as human artisan bakers.

In both cases, though, the speed and consistency of the automated systems, not to mention the price difference, make it likely that we’ll keep letting machines do most of that work.

When you have Indramat motion control systems in place, you can expect your automation to be reliable and precise. If there are problems, contact us for expert support.