10 Key Questions to Ask Organizational Climate Survey Purveyors Who Want Your Dough

HR professionals have sometimes asked me what they should know when scrutinizing the many purveyors of organizational climate surveys (e.g., satisfaction, engagement, alignment, etc.). There’s no one right answer to the question, but I understand why people ask. There are lots of survey options out there, ranging from established organizations with a proprietary brand of survey to sell, to less knowledgeable and/or less ethical firms that don’t have a solid understanding of what they’re doing. So, here’s what I personally consider the top 10 questions to ask when doing due diligence:

1 – Is your survey statistically reliable? That is, if you and others were to take the survey several times, would you get essentially the same, consistent answers? If the answer is yes, then ask how this reliability has been verified. What methods were used, both in terms of how the survey was administered and how it was statistically analyzed?
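
For what it’s worth, the internal-consistency statistic vendors most often cite is Cronbach’s alpha. Here’s a minimal sketch of how it’s computed, assuming responses sit in a table with one row per respondent and one column per item; the item names and ratings below are invented for illustration:

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical example: five respondents answering three 1-5 rating items.
responses = pd.DataFrame({
    "q1_satisfaction": [4, 5, 3, 4, 2],
    "q2_engagement":   [4, 4, 3, 5, 2],
    "q3_alignment":    [5, 5, 2, 4, 3],
})
print(round(cronbach_alpha(responses), 2))

Values closer to 1.0 suggest the items hang together as a consistent scale; a vendor should be able to report figures like this for each scale in its survey.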

2 – Has your survey tool been validated? That is, has it been proven to measure what it says it will measure? (If not, then I think the discussion should be over, at least if you’re talking about doing a major organizational survey.) If so, in what ways has it been validated?

  • If they say the survey has already established “face validity,” then you need to investigate what they mean by this. In itself, it has little meaning. Do they just mean, “Well, it just looks like it does what we say it does,” or do they mean, “We’ve had experts look at it and they agreed our tool looks valid”? The latter is better. Better yet is “content validity,” which means that subject matter experts have been systematically questioned about specific survey items. There are specific methods for doing this, such as those created by Lawshe and Cohen (a quick illustration of Lawshe’s approach follows this list).
  • If they say the survey has “construct validity,” dig more deeply because this is an overarching term that includes other types of validity. Ask about specific types of validity analysis, such as concurrent validity, discriminant validity, and predictive validity.
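
To make Lawshe’s approach a bit more concrete: each subject matter expert rates an item as “essential,” “useful but not essential,” or “not necessary,” and the content validity ratio (CVR) compares the number of “essential” votes to half the panel. A minimal sketch, with an invented panel for illustration:

def content_validity_ratio(essential_votes: int, panel_size: int) -> float:
    # Lawshe's CVR = (n_e - N/2) / (N/2); ranges from -1 to +1.
    # Values near +1 mean nearly all experts judged the item essential.
    half_panel = panel_size / 2
    return (essential_votes - half_panel) / half_panel

# Hypothetical panel of 10 experts, 8 of whom rated the item "essential".
print(content_validity_ratio(essential_votes=8, panel_size=10))  # 0.6

Items with low or negative CVRs are candidates for revision or removal, which is exactly the kind of evidence you want a vendor to be able to show you.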

3 – Do you have a deep reservoir of reliable benchmarks from comparable organizations? In some cases, companies trying to get into this business by the seat of their pants simply won’t have them. The established players will have good benchmarks drawn from large samples of people in various organizations. But keep in mind that there won’t be as many benchmarks for more innovative or customized questions because these haven’t been in circulation as long.

4 – Will we be able to see statistically significant differences and, if so, how? In a survey that’s been run several times in the same company, it’s always nice to be able to see where there have been areas of progress or decline. But spreadsheets and bar charts can be misleading because numbers that look meaningfully different may not be statistically different once sample size and other variables are taken into consideration. Therefore, I suggest working with firms that will, as a matter of course, show whether gains or losses are statistically significant.
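
To give a feel for what that involves, here’s a rough sketch of one common approach, a two-proportion z-test comparing favorable-response rates across two survey waves; the counts are invented, and a vendor’s actual method may well differ:

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: favorable responses out of total respondents, this year vs. last year.
favorable = [412, 380]
respondents = [600, 610]

z_stat, p_value = proportions_ztest(count=favorable, nobs=respondents)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 is the conventional cutoff for calling the change statistically significant.

The point is not that you should run these tests yourself, but that the vendor’s reporting should flag which year-over-year movements clear this kind of bar and which are just noise.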

5 – How will the various levels of management be able to access the data? Survey data can be dangerous stuff and needs to be managed carefully. The last thing you want is for a supervisor to be able to determine which employee responded to specific questions in specific ways. That would be a breach of confidentiality, which is crucial for these surveys. Nor do you necessarily want all mid-level managers to have access to the data of other mid-level managers. But you probably do want them to have access to aggregate data for their own work groups. A good system will give different managers different levels of data access, with high-level execs getting overall access and other managers getting access only to their own subgroups.
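
One common safeguard worth asking about is a minimum group size below which aggregates are suppressed, so individual answers can’t be backed out. A minimal sketch of the idea, with assumed column names and an assumed threshold:

import pandas as pd

MIN_GROUP_SIZE = 5  # assumed threshold; real systems vary

def aggregate_for_manager(responses: pd.DataFrame, manager_groups: list) -> pd.DataFrame:
    # Mean scores per work group, limited to the manager's own groups
    # and to groups large enough to preserve confidentiality.
    subset = responses[responses["work_group"].isin(manager_groups)]
    summary = subset.groupby("work_group").agg(
        respondents=("score", "size"),
        mean_score=("score", "mean"),
    )
    return summary[summary["respondents"] >= MIN_GROUP_SIZE]

# Hypothetical data: one row per respondent, with a numeric item score.
responses = pd.DataFrame({
    "work_group": ["sales", "sales", "sales", "sales", "sales", "support", "support"],
    "score":      [4, 5, 3, 4, 4, 2, 5],
})
print(aggregate_for_manager(responses, manager_groups=["sales", "support"]))
# "support" is suppressed because it has only 2 respondents.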

6 – Will we be able to filter and visualize the data in the ways we want? These days, a good system will allow users to filter the data to gain insights. For example, executives might want to see how people in specific job categories compare to overall data. The trick is to make sure data is filtered in useful and legal ways that do not breach confidentiality. There are a number of good data visualization tools available and, with any luck, your team will be able to “slice and dice” much of the survey information as needed. Sometimes indicators such as the statistical significance of numerical differences get lost in this process, but I think it’s still a great way of “playing” with information to look for patterns. Ask about this in advance so you know what you will and won’t be getting.
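
As an illustration of the kind of slicing I mean, here’s a small sketch comparing job-category averages to the company-wide average for each item; the column names and data are invented:

import pandas as pd

# Hypothetical long-format extract: one row per respondent per item.
df = pd.DataFrame({
    "job_category": ["engineering", "engineering", "sales", "sales", "sales", "engineering"],
    "item":         ["engagement",  "alignment",   "engagement", "alignment", "engagement", "engagement"],
    "score":        [4, 3, 5, 4, 4, 2],
})

# Average score per item within each job category, compared to the company-wide average.
by_category = df.pivot_table(index="item", columns="job_category", values="score", aggfunc="mean")
overall = df.groupby("item")["score"].mean()
gaps = by_category.sub(overall, axis=0)  # positive = category scores above the overall mean
print(gaps.round(2))

Whether the vendor’s dashboard can produce views like this, on demand and without exposing small groups, is exactly what to pin down in advance.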

7 – How will your company and ours communicate during the process of building, implementing and analyzing the survey? There will almost certainly be a central contact person who communicates with the client company, but you might also want to ensure that, if needed, your team has access to research analysts (the people crunching and interpreting the numbers) and IT professionals during this process.

8 – Will we be able to see best practices in areas where we need to improve? That is, if your company clearly needs help in a given area, can the survey company provide you with best-practice information about how to improve? Some survey companies will not have this capability to any high degree, but larger consulting companies that do surveying probably will. The question is, will those companies use the survey findings as a “Trojan horse” to offer expensive, add-on consulting services? This can get tricky and, again, there’s no one right answer. Still, this should be part of the initial conversation, I think.

9 – Will you provide us with a well-written report that presents top-line findings to our executive staff? This should be a given.

10 – Do you include and analyze qualitative data? Qualitative data usually comes from open-ended questions, which can be very time-consuming to analyze. If you want that kind of analysis, I suggest asking about it well in advance so you understand the techniques, costs and logistics. And you’ll want to know whether such data will be included in the top-line findings of the report.
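
If you want a feel for what even a crude first pass looks like, here’s a minimal sketch that counts frequent words in open-ended comments; real vendors will (or should) use far more sophisticated text analytics, and the comments and stop-word list here are invented:

import re
from collections import Counter

comments = [
    "More communication from leadership would help",
    "Leadership is great but workload is too high",
    "Workload and communication are my main concerns",
]  # hypothetical open-ended responses

STOPWORDS = {"more", "from", "would", "help", "is", "but", "too", "and", "are", "my", "the"}

words = [w for c in comments for w in re.findall(r"[a-z]+", c.lower()) if w not in STOPWORDS]
print(Counter(words).most_common(5))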

Okay, so those are my own top 10, but I’m sure the opinions of others would differ. Moreover, the majority of my professional research work has been in the area of cross-organizational rather than internal climate surveys, so I don’t want to hold myself up as a guru. In fact, I’d love to get ideas and feedback from other researchers who’ve been deeply involved in climate surveys, especially if they come from a non-affiliated perspective (that is, they don’t currently work for a climate survey company).

