How To Avoid The Pitfalls Of AI
AI allows companies to process data faster and more efficiently
Posted on 06-17-2019
The HR industry is increasingly turning to AI to solve talent management problems. From automating administrative tasks to interviewing applicants to streamlining hiring workflows, AI is playing an expanded role in every area of recruitment.
There are good reasons for this shift. For example, companies have more applicants and data than ever before. AI allows companies to process these data faster and more efficiently, which makes for a more streamlined process for candidates and organizations.
And when developed in a scientifically driven and valid manner, AI in assessment can lead to better predictions and outcomes. AI can create online assessments, for example, that adapt to the answers given by a candidate, leading to a more individualized assessment for each person, which wouldn’t be possible without AI’s computing capabilities.
So, there are real benefits to AI when applied to assessment, but there are also limitations, as well as potential ethical and legal snares to avoid. Here's how companies can navigate using AI in recruitment successfully.
Leverage Data Appropriately
The quality of your data informs the quality of your hiring decisions (i.e., the garbage in, garbage out phenomenon). To maximize the benefits of AI for decision making, employers need to ensure their data are being used appropriately for the task at hand and that they're structured and cleaned properly. In a more traditional assessment process, hiring managers can be confident that assessments accurately and effectively measure knowledge, skills, abilities, and other characteristics that relate to job performance, and they understand how these data fit into the decision-making process.
But when leveraging AI processes in recruitment, employers can be less aware of how candidates are measured and how data affects hiring decisions. Considering the thousands of data points machines must analyze, a black-box approach that uses data analysis techniques to allow machines to learn independently can result in a less than clear understanding of how AI is processing data for decision-making. To ensure that you’re using data appropriately, you should consider these questions:
- How does your data compare with more traditional assessment data?
- Are the relationships you observe between the data and outcomes expected?
- Are the variables (your inputs and outputs) appropriate for making hiring decisions?
Looking closely at these factors will help ensure you are using the data appropriately and determine whether your recruitment processes are successfully identifying quality candidates.
Embrace Transparency and Fairness
An AI system that’s built into your talent assessment must be transparent and open to challenge, and you should be transparent with applicants about what is being measured and why it is important for the job. For some applications it may not be necessary to understand how an outcome is achieved, but when the outcome can influence a candidate’s opportunity to gain employment, it’s critical that both employers and candidates fully understand the role that assessment plays.
Legal issues can also affect the framework in which employers make hiring decisions. Simply being able to use an automated decision process doesn’t remove the company’s responsibility for ensuring that it fairly assesses job-relevant skills. The complex algorithms used in AI can make selection decisions difficult to justify when their reasoning isn’t explained. And if your selection decisions can’t be easily explained, they are more easily challenged by applicants in court.
To create a transparent, “glass box” approach, you should consider:
- Is our predictive model predicated on variables and relationships that could create bias in the hiring process against a certain group?
- Were the data obtained and used in a way the candidate understands?
- Do any of the methods used lack full transparency, such as scraping social media data?
Understanding how the algorithm collects data, ensuring that it’s sourced in a transparent way and understanding how it’s used to support decisions are all part of promoting fair and transparent hiring practices.
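One widely used screen for the kind of group-level bias described above is the "four-fifths rule" from U.S. selection guidelines: the selection rate for any group should be at least 80 percent of the rate for the most-selected group. This technique isn't named in the article itself, but a minimal illustrative check in Python, with made-up applicant counts, might look like:

```python
# Hypothetical selection counts per group from an AI-screened applicant pool.
applicants = {"group_a": 200, "group_b": 150}
selected = {"group_a": 60, "group_b": 30}

# Selection rate for each group.
rates = {g: selected[g] / applicants[g] for g in applicants}

# Four-fifths (80%) rule: flag any group whose selection rate falls below
# 80% of the highest group's rate -- a common screen for adverse impact.
highest = max(rates.values())
flagged = [g for g, r in rates.items() if r < 0.8 * highest]

print(rates)    # group_a selected at 0.30, group_b at 0.20
print(flagged)  # group_b falls below 0.8 * 0.30 = 0.24
```

Passing this screen does not by itself establish fairness, and failing it does not prove discrimination; it simply signals where a deeper audit of the model's variables and training data is needed.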
Manage Candidate Perceptions
Our research indicates that candidates are still skeptical about what an AI-driven assessment process entails and how it factors in hiring decisions, which in turn affects their reactions to an organization’s process.
This perception largely stems from a lack of knowledge about the various forms of AI, how they’re applicable to an assessment process and how AI can benefit or negatively affect one’s chances to perform their best.
Candidates who are more familiar with AI are generally more accepting of its use. Aon's research has found that participants who were highly familiar with AI were as trusting of an AI-augmented process as of an assessment process without AI.
Organizations that use AI in hiring will likely find they also need to increase the amount of interpersonal contact they have with applicants during the selection process. Even if an AI system is used to automate the decision-making process, applicants may find comfort in having open lines of communication with a designated person (e.g., recruiter) during the hiring process.
Additional Resources
Webinar: Spotlight on AI - How to Avoid the Most Common Pitfalls
Taming the Rage Against AI in Hiring
Exploring New Frontiers in Talent Assessment
Applied AI in Talent Handbook
Why a Glass Box Rather than a Black Box Approach is Important When Using AI in Talent Assessment
Author Bio
Evan Theys leads Aon's Assessment Solutions Product Development team in North America. Prior to joining Aon, Evan worked at Google as a Selection and Assessment Specialist, where he was responsible for end-to-end development and maintenance of the "science-related" aspects of the online assessment program. Visit www.assessment.aon.com/en-us/