    In 2021, AI-Powered Recruitment Needs A Different Type Of Data

    AI alone is not sufficient to eliminate unintentional discrimination from processes

    Posted on 08-17-2021

    The Covid-19 pandemic has brought some of society’s greatest inequalities to light. It’s also provided the space for both individuals and organizations to reassess currently accepted processes. In the recruitment sphere, research has shown that levels of discrimination haven’t changed since the 1960s, while over a third of workers believe Covid-19 has delayed diversity efforts in their company.



    Businesses know that they need to take a more active role in shaping fairer, more inclusive environments, but what if they’re searching for solutions in the wrong place?

    Artificial intelligence (AI) has long been positioned as a resource that can drive diversity forward because it has the potential to remove unconscious human bias and provide insights about a company’s makeup that might otherwise go unnoticed. Still, AI is only as neutral as the data it learns from and the logic it is built on.

    In October 2018, Amazon had to abandon its AI recruiting tool after it was discovered that the model had inadvertently learned to favor male applicants because its training data was so heavily skewed toward men, who make up a disproportionate share of the tech industry’s workforce. Rather than neutralize the skewed demographics, Amazon’s tool gave male candidates greater visibility.

    AI in recruitment therefore needs to be powered by a different kind of data than is currently available. Organizations trying to be fairer (skills- and results-oriented, providing equal opportunities regardless of gender, socio-economic background, or location) will struggle with existing recruiting solutions and the data points they rely on.

    The Current Application of AI is Falling Short

    AI alone is not sufficient to eliminate unintentional discrimination from processes. If businesses don’t learn how to operationalize fairness in AI, they leave themselves exposed to biases that can be replicated and generate undesirable outcomes. Generally speaking, there are two types of bias to be aware of: intentional and unintentional. The first involves direct use of protected attributes such as race, age, and gender; the second arises when a seemingly neutral variable serves as a proxy for a protected class. For example, data about home ownership can convey disparities linked to race.
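
    A minimal sketch of how a team might surface this second, proxy-style bias: before a field ever reaches a model, check whether it is statistically associated with a protected attribute. The Python example below is a generic illustration; the column names, threshold, and the chi-squared test are assumptions, not any vendor’s actual pipeline.

        # Sketch: flag seemingly neutral fields that are strongly associated with a
        # protected attribute (a crude signal of possible proxy bias).
        # Column names ("home_ownership", "race", ...) are illustrative assumptions.
        import pandas as pd
        from scipy.stats import chi2_contingency

        def flag_proxy_features(df: pd.DataFrame, protected: str,
                                candidates: list[str], alpha: float = 0.01) -> list[str]:
            """Return candidate columns whose distribution differs significantly
            across groups of the protected attribute."""
            flagged = []
            for col in candidates:
                table = pd.crosstab(df[col], df[protected])      # contingency table
                chi2, p_value, dof, _ = chi2_contingency(table)  # independence test
                if p_value < alpha:                              # strong association -> possible proxy
                    flagged.append(col)
            return flagged

        # Example usage with a hypothetical applicant table:
        # proxies = flag_proxy_features(applicants, protected="race",
        #                               candidates=["home_ownership", "zip_code", "alma_mater"])

    A flagged field is not automatically discriminatory, but it tells the team that keeping it in a screening model deserves scrutiny.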

    Even the Natural Language Processing component of AI has been shown to preserve biases. Researchers at Microsoft found that word embeddings - numerical mappings of words and their learned meanings - reproduced negative stereotypes. For instance, the analogy ‘man : woman :: doctor : ?’ returned the term ‘nurse’. In a recruitment scenario, these associations could mean that candidates are recommended positions in a discriminatory way based on the language they use on their resumes. Word embeddings are trained on billions of words and contexts shared freely online, so the input data is far from neutral or representative; in places, it is simply wrong.
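
    For readers who want to see this behavior firsthand, the snippet below reproduces the style of analogy probe described above, using the open-source gensim library and a publicly available pre-trained embedding. The specific model name is an illustrative choice, not the one used in the Microsoft research.

        import gensim.downloader as api

        # Download a small, publicly available set of pre-trained word embeddings.
        vectors = api.load("glove-wiki-gigaword-100")

        # "man is to woman as doctor is to ?" - the top results typically include
        # 'nurse', showing that the embedding space has absorbed stereotypes
        # present in its training text.
        print(vectors.most_similar(positive=["woman", "doctor"], negative=["man"], topn=3))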

    Another obstacle in AI application is the distinct lack of diversity among the teams building the tech. The field is overwhelmingly white and male - more than 80% of AI professors are male, only 10% of AI researchers at Google are female, and there is little data on trans workers or other gender minorities in AI. As technology plays an increasingly important role in society and across businesses, companies have to acknowledge that AI’s ability to support diversity is limited if the pool of people building it is not itself diverse.

    Structured Data That Matches Real-World Selection Criteria, Combined with Intentional Bias Avoidance, Will Make the Recruitment Cycle Fairer

    For a recruitment sphere that’s genuinely fairer, teams have a responsibility to use data-driven systems throughout the full talent lifecycle. This may sound counterintuitive, but it’s possible with the right data - meaning data that isn’t personal. By anonymizing recruitment data, organizations can remove unintentional proxy indicators like location or education, and pave the way for more equal access to available positions.
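
    As a rough illustration of what that anonymization step can look like in practice, the sketch below strips direct identifiers and common proxy fields from a candidate record before it reaches any ranking or matching model. The field names are hypothetical and would differ from one applicant-tracking system to another.

        from typing import Any

        # Fields that identify a person directly, and fields that often act as
        # proxies for protected classes (location, education). Both sets are
        # illustrative assumptions.
        DIRECT_IDENTIFIERS = {"name", "email", "photo_url", "date_of_birth"}
        PROXY_FIELDS = {"address", "city", "country", "university", "graduation_year"}

        def anonymize_candidate(record: dict[str, Any]) -> dict[str, Any]:
            """Keep only skills- and results-oriented fields for downstream scoring."""
            blocked = DIRECT_IDENTIFIERS | PROXY_FIELDS
            return {key: value for key, value in record.items() if key not in blocked}

        # Example: only structured, job-relevant signals survive.
        candidate = {
            "name": "Jane Doe",
            "city": "Bogotá",
            "university": "Example University",
            "skills": ["python", "sql"],
            "assessment_score": 87,
        }
        print(anonymize_candidate(candidate))  # {'skills': ['python', 'sql'], 'assessment_score': 87}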

    Ultimately, recruiters need to know how a person works, how they think, and how they fit within a team. Models have been developed for years to identify precisely these traits. Using and developing those models will supply a more telling and unbiased depiction of a candidate than conventional resume details. What’s more, there are tools that harness AI to provide an objective rendition of those characteristics.

    Ideal’s platform is trying to use AI to filter applicant databases and remove variables that commonly lead to biased screening. Meanwhile, Joonko recreates talent pipelines by applying AI to a qualified, diverse, and skills-sourced pool of applicants.

    If hiring is built around structured, equally assessed data points, the cycle will be more balanced from the very beginning and carry that equality through the later phases. For example, data can help make the interview stage more impartial: AI can use quality data to generate prompts that focus only on the needs of the job and don’t stray into personal questions that could provoke discrimination in the selection process.

    Homing in on skills data can give businesses a noticeable advantage in the long run, too. A report from the World Economic Forum states that 54% of all employees will require reskilling and upskilling by 2022 - a figure that is now likely higher because of the Covid-19 pandemic. In anticipation of the Great Reset, people are looking to companies with an environment of constant learning and professional development. Businesses that don’t provide this risk losing interest from prospective hires, as well as existing employees.

    The next generation of structured data will offer a deeper understanding of the strengths in a workforce, as well as the gaps that can be targeted internally and through recruitment - not just from a skill or ability perspective, but through a 360-degree view of a candidate’s personality, professional culture, leadership style, and team match. Being aware of these gaps encourages businesses to develop more in-house opportunities to reskill and upskill, and subsequently to find and retain fulfilled talent for longer.

    Author Bio

    Andrés Cajiao is Co-Founder & Chief Growth Officer at Torre.

    This article was published in the August 2021 issue of Talent Acquisition Excellence.

