To Benchmark or Not To Benchmark? A Benchmark Tutorial
Created by
David Youssefnia
Content
<p><font size="2" color="#000000" face="Verdana,Geneva,Arial,Helvetica,sans-serif">When examining the results of your survey, you may be asked how your results compare with those of other organizations. Some consulting firms and survey vendors provide benchmarks. But before deciding to examine these external points of comparison, it is important to understand what goes into a benchmark and with whom you are actually comparing your results.</font></p>
<p>The following tutorial is designed to give you an overview of various approaches to survey benchmarking.</p>
<p><b>1. Client Additive Benchmarks:</b> This approach is based on the compilation of survey results from the clients of a given consulting firm. These benchmarks may contain responses from many thousands of employees across hundreds of companies. The sheer number of responses can be impressive but also potentially misleading. Here's why: unless each client uses a core set of survey questions furnished by the vendor, the database likely will vary in the number of companies contributing to each benchmark question. So, whereas your responses to one question may be compared with results from 13 companies and 34,000 respondents, responses to another question may be compared with results from only 3 companies and 800 respondents. If your results differ from the benchmark, it is therefore difficult to determine whether the discrepancy reflects a difference between your company and the external environment or merely a difference between one benchmark question and another. Additionally, the order and method in which survey questions are presented, and the response scales used, need to be standardized and consistent. Slight differences in these factors also can affect the integrity of the benchmark and the accuracy of the comparison (i.e., is the gap between your score and the benchmark a real difference, or does it relate to how the questions were asked?). Client additive benchmarks have a further limitation: your results are being compared only with those of a single consulting firm's client list, which excludes the many companies that run their own surveys or have hired a different firm.</p>
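The coverage problem described above can be sketched in a few lines. Everything here is illustrative: the questions, benchmark means, counts, and the ten-company threshold are all hypothetical assumptions, not figures from any real benchmark database.

```python
# Hypothetical sketch: in a client-additive benchmark, each question may
# draw on a different number of contributing companies, so some external
# comparisons rest on far thinner evidence than others.

MIN_COMPANIES = 10  # assumed threshold for a trustworthy comparison

# (question, benchmark mean, contributing companies, respondents)
benchmark = [
    ("I am satisfied with my pay", 3.4, 13, 34_000),
    ("I trust senior leadership",  3.8,  3,    800),
]

def flag_thin_comparisons(rows, min_companies=MIN_COMPANIES):
    """Return questions whose benchmark draws on too few companies."""
    return [q for q, _, n_companies, _ in rows if n_companies < min_companies]

thin = flag_thin_comparisons(benchmark)
# Only the second question is flagged: its "benchmark" is really just
# three companies, so a gap against it may not mean much.
```

A check like this is worth running before presenting any external comparison, so that thinly supported benchmark questions can be caveated or dropped.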
<p><b>2. Workforce Studies:</b> This approach to survey benchmarking evolved as survey vendors recognized the quality drawbacks and high expense of developing and maintaining client additive benchmarks. Workforce studies are similar to public opinion polls conducted by newspapers (e.g., the <i>USA Today</i> poll) and to research conducted by market research firms. A set of questions is administered to a sample of the workforce, typically segmented by industry and/or geography. Unlike the respondents in client additive benchmarks, these respondents are individuals drawn from many different organizations: the database does not reflect groups of employees who all work for a single participating company, but rather individual employees, each from a different employer. In other words, a workforce study database could have 1,000 respondents from 1,000 different companies. Although claims of statistical representativeness can be made, there is less control over which types of companies enter the database. Moreover, there is the question of who is actually completing the survey: respondents usually are offered an incentive (such as a monetary reward, a gift certificate, or entry into a raffle) to encourage participation in multiple surveys. Finally, comparing the results of your survey, which is sponsored by an employer, with those of a workforce study, which is sponsored by a polling or market research firm, also may be questionable. Employees typically want to help their employers by sharing important feedback; respondents to workforce surveys are not participating under the same assumptions, and thus the comparison may be tenuous.</p>
<p><b>3. Consortium Benchmarks:</b> The third type of benchmark is the consortium benchmark. In our opinion, this is the most valuable, accurate, and worthwhile external benchmark for comparing your survey results. These benchmarks are based on results from a collection of companies that share survey results and best practices in survey research. Membership is by application, and member companies must commit to the terms of membership, which include asking a minimum set of questions, attending meetings, and contributing best practices.</p>
<p>The Mayflower Group (<a href="http://www.mayflowergroup.org/">www.mayflowergroup.org</a>) is one of the oldest survey consortia; it comprises large, multinational organizations with established survey programs. Other groups also have been formed, such as the Information Technology Survey Group (ITSG; <a href="http://www.itsg.org/">www.itsg.org</a>) and MIDAS, a similar group for the financial services industry.</p>
<p><b>One Final Note:</b> We hope this primer has helped you understand the different types of survey benchmarks available today. Of course, there's one survey benchmark we didn't discuss: your own survey data. Looking at a specific group within your organization relative to other groups, as well as in relation to your previous survey results, often can provide the most useful and relevant comparisons.</p>
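The internal comparisons suggested above can also be sketched briefly. The group names and scores below are hypothetical illustrations on a 1-5 scale, not real survey data.

```python
# Internal benchmarking sketch: compare each group's score with the
# current organization-wide mean and with that group's prior-year score.
# All figures are invented for illustration.

scores_2006 = {"Engineering": 3.9, "Sales": 3.2, "Support": 3.6}
scores_2007 = {"Engineering": 3.7, "Sales": 3.5, "Support": 3.6}

def internal_deltas(current, previous):
    """Per-group gap to the org-wide mean and year-over-year change."""
    org_mean = sum(current.values()) / len(current)
    return {
        group: {
            "vs_org": round(score - org_mean, 2),
            "vs_prior": round(score - previous[group], 2),
        }
        for group, score in current.items()
    }

deltas = internal_deltas(scores_2007, scores_2006)
# Engineering is above the org mean but has declined year over year,
# while Sales is below the mean but improving - a pattern external
# benchmarks alone would not reveal.
```

Because both comparison points come from your own survey program, questions, scales, and administration are identical by construction, sidestepping the comparability problems of external benchmarks.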
<p>For more information about Critical Metrics, LLC, please visit <a href="http://www.critical-metrics.com/">www.critical-metrics.com</a>, or contact us by phone at 212-675-9211 or by email at <a href="mailto:info@critical-metrics.com">info@critical-metrics.com</a>.</p>
Copyright © 1999-2025 by
HR.com - Maximizing Human Potential
. All rights reserved.