Advances in artificial intelligence (AI) and the use of big data are changing the way many large companies recruit for entry-level and junior management positions. Today, a graduate's CV will most likely have to impress an algorithm rather than a human resources executive.
“There’s been a dramatic increase in the use of automation in [high] volume selection processes over the past two years,” says Sophie Meaney, managing director, client solutions and strategic development at Amberjack, which provides and advises on automated recruitment processes.
While algorithms supposedly treat each application equally, experts are divided about whether so-called robo-recruitment promises an end to human bias in the selection process — or whether it may in fact reinforce it.
“AI systems are not all equal,” says Loren Larsen, chief technology officer for HireVue, which has developed an automated video interview analysis system. It has been used by companies including Unilever, the consumer goods group, Vodafone, the telecoms company, and Urban Outfitters, the retailer. “I think you have to look [at] the science team behind the work,” says Mr Larsen.
The problem, experts say, is that to find the best candidates an algorithm has first to be told what “good” looks like in any given organisation. Even if it is not fed criteria that seem discriminatory, an efficient machine-learning system will quickly be able to replicate the characteristics of existing workers. If an organisation has favoured white male graduates from prestigious universities, the algorithm will learn to select more of the same.
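The dynamic is straightforward to reproduce. Below is a minimal sketch using synthetic data and a deliberately skewed historical "hired" label (an illustration, not any vendor's system): a standard classifier trained on those past decisions ends up scoring two equally skilled candidates very differently, based only on where they studied.

```python
# Minimal sketch with synthetic data (not any vendor's model) showing how a
# classifier trained on biased historical hiring decisions replicates the bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

prestigious = rng.integers(0, 2, n)    # 1 = degree from a prestigious university
skill = rng.normal(0.0, 1.0, n)        # actual job-relevant ability
# Skewed historical labels: past recruiters over-weighted the university brand.
hired = (0.5 * skill + 2.0 * prestigious + rng.normal(0.0, 1.0, n)) > 1.5

model = LogisticRegression().fit(np.column_stack([prestigious, skill]), hired)

# Two candidates with identical skill who differ only in university:
probs = model.predict_proba([[1, 0.0], [0, 0.0]])[:, 1]
print(f"shortlist probability, prestigious degree: {probs[0]:.2f}")
print(f"shortlist probability, other degree:       {probs[1]:.2f}")
```

Nothing in the training criteria names the university as desirable; the model infers it from the pattern of past decisions.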
The growing reliance on automation to judge suitability for everything from a loan to a job, or even probation in the criminal justice system, worries Yuriy Brun, an associate professor specialising in software engineering at the University of Massachusetts.
“A lot of the time a company will put out software but they don’t know if it is discriminatory,” he says. He points to the Compas tool in use in several US states to help assess a person’s likelihood to reoffend, which was reported to have discriminated against African Americans.
Prof Brun explains that, given the use of big data, algorithms will inevitably learn to discriminate. “People see that this is a really important problem. There’s a real danger of making things worse than they already are,” he says. His concern led him to co-develop a tool that tests systems for signs of bias.
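One widely used technique for such testing is what researchers call causal fairness testing: hold a candidate profile fixed, flip only the protected attribute, and count how often the decision changes. The sketch below illustrates the general idea with hypothetical inputs; it is not Prof Brun's actual tool.

```python
# Illustrative sketch of "causal" fairness testing (the general idea, not
# Prof Brun's actual tool): flip only the protected attribute and count how
# often the decision changes across randomly sampled candidate profiles.
import random

def causal_bias_rate(decide, sample_profile, protected_key, values, trials=10_000):
    """Fraction of profiles whose outcome flips when only the protected
    attribute is changed -- anything much above zero deserves investigation."""
    flips = 0
    for _ in range(trials):
        profile = sample_profile()
        outcomes = {decide({**profile, protected_key: v}) for v in values}
        if len(outcomes) > 1:   # the decision depended on the protected attribute
            flips += 1
    return flips / trials

# Hypothetical screening rule with a subtle gender-dependent threshold:
def toy_decide(p):
    cutoff = 60 if p["gender"] == "m" else 70
    return p["test_score"] > cutoff

def sample_profile():
    return {"test_score": random.uniform(0, 100), "gender": random.choice("mf")}

print(causal_bias_rate(toy_decide, sample_profile, "gender", ["m", "f"]))
# ~0.10: roughly 10% of candidates would be treated differently by gender alone.
```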
Many of those working with robo-recruiters are more optimistic. Kate Glazebrook, chief executive of Applied, a hiring platform, says her mission is to encourage hiring managers to move away from what she calls “proxies for quality” — indicators such as schools or universities — and move to more evidence-based methods.
“In general, the more you can make the hiring process relevant, the more likely that you will get the right person for the job,” she says.
Applied anonymises tests that candidates complete online and feeds them, question by question, to human assessors. Every stage of the process has been designed to strip out bias.
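The mechanics are simple to sketch. The following is a hypothetical illustration of that kind of pipeline (Applied's internal implementation is not public): identities and proxy fields are dropped, and answers are regrouped and shuffled so assessors score one question at a time.

```python
# Hypothetical sketch (not Applied's implementation) of anonymised,
# question-by-question review: strip identifying details, then regroup answers
# per question so assessors never see whose answer they are scoring.
import random
import uuid
from collections import defaultdict

applications = [
    {"name": "A. Candidate", "university": "X", "answers": ["...", "..."]},
    {"name": "B. Candidate", "university": "Y", "answers": ["...", "..."]},
]

# 1. Replace identities with opaque tokens; drop proxy fields such as
#    name and university so reviewers never see them.
anonymised = {uuid.uuid4().hex: app["answers"] for app in applications}

# 2. Regroup answers question by question and shuffle, so an assessor scores
#    one question at a time with no ordering cues linking answers to candidates.
by_question = defaultdict(list)
for token, answers in anonymised.items():
    for q_index, answer in enumerate(answers):
        by_question[q_index].append((token, answer))

for q_index, batch in by_question.items():
    random.shuffle(batch)
    print(f"question {q_index}: {len(batch)} anonymised answers ready to score")
```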
With the same aim, Unilever decided in 2016 to switch to a more automated process for its graduate-level entry programme, which has about 300,000 applicants a year for 800 positions.
Unilever worked with Amberjack, HireVue and Pymetrics, another high-volume recruitment company, which developed a game-based test in which candidates are scored on their ability to take risks and learn from mistakes, as well as on emotional intelligence.
Unilever says the process has increased the ethnic diversity of its shortlisted candidates and has been more successful at selecting candidates who will eventually be hired.
“The things that we can do right now are stunning, but not as stunning as we’re going to be able to do next year or the year after,” says Mr Larsen.
Still, robo-recruiters must be regularly tested in case bias has crept in, says Frida Polli, chief executive of Pymetrics. “The majority of algorithmic tools are most likely perpetuating bias. The good ones should have auditing.”
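One common form such an audit takes is the "four-fifths rule" from US employment-selection guidelines: if any group's selection rate falls below 80 per cent of the best-performing group's, the tool is flagged for review. A minimal sketch, with hypothetical numbers:

```python
# Minimal sketch of an adverse-impact audit using the "four-fifths rule" from
# US employment-selection guidelines (hypothetical numbers, not Pymetrics data):
# flag any group whose selection rate is below 80% of the highest group's rate.
def adverse_impact_report(selected, applied):
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

report = adverse_impact_report(
    selected={"group_a": 120, "group_b": 45},
    applied={"group_a": 400, "group_b": 250},
)
for group, ratio in report.items():
    flag = "  <-- below four-fifths threshold, review" if ratio < 0.8 else ""
    print(f"{group}: impact ratio {ratio:.2f}{flag}")
```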