This guide shows HR professionals how to use surveys throughout the employee lifecycle, with detailed best practices, tips, and HR-specific survey templates.
So you’ve defined your goals, put your plan together, and sat down to write your survey. Even if it’s not your first survey rodeo, the process can be a bit intimidating, especially when you know the audience (which, in the case of HR initiatives, is probably your coworkers).
To put your mind at ease and enhance your survey skills, we spoke to our in-house experts: our very own survey research team. With a combined 95 years of research experience, they gifted us with valuable tips on:
- Drafting surveys
- Sending surveys
- Analyzing surveys
- Common mistakes to avoid
One of the great things about surveys is that just about anyone can write one. However, how well a survey is crafted makes the difference between reliable, actionable data and potentially misleading data. When drafting a survey, consider the following:
- Identify the primary goals of your survey and stick to them when drafting questions. Cramming too many topics into one survey is usually a mistake.
- Be thoughtful and inclusive when including demographic questions.
- Don’t forget to ask which department and office the employee works in, so you can slice and compare the data by those factors. However, only ask your employees to enter their names when it’s absolutely necessary.
- Brand your survey with your logo and theme so that employees know it’s coming from your company.
Setting up your survey properly and communicating with employees clearly will improve both your response rate and the quality of your data. Follow these practices:
- Upload your employees’ email addresses and relevant attributes to the email collector to reduce the number of questions you have to ask.
- If possible, announce the survey at a company gathering before sending out the survey invitations.
- In the survey invitation, state clearly why you are asking your employees to take the surveys and how the data will be used.
- Explicitly tell your employees when the survey is going to be closed and whether participation is mandatory.
- Specify whether responses will be anonymous. Use survey settings to collect data anonymously if you have promised your employees anonymity.
- Send a reminder halfway to three-quarters of the way through the data collection period (e.g., around day 4 if the survey is open for a week, or days 7–10 if it’s open for two weeks).
- If you have a low response rate, try to get executives and managers to encourage responses.
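To make the reminder-timing rule above concrete, here’s a small Python sketch (the function name and the default 60% mark are our own choices, not a prescribed formula) that picks a reminder date between the halfway and three-quarter point of the collection window:

```python
from datetime import date, timedelta

def reminder_date(open_date: date, close_date: date, fraction: float = 0.6) -> date:
    """Pick a reminder day between halfway (0.5) and three-quarters (0.75)
    of the way through the survey's data-collection window."""
    if not 0.5 <= fraction <= 0.75:
        raise ValueError("fraction should fall between 0.5 and 0.75")
    window_days = (close_date - open_date).days
    return open_date + timedelta(days=round(window_days * fraction))

# A one-week survey opening Monday, June 3:
print(reminder_date(date(2024, 6, 3), date(2024, 6, 10)))  # 2024-06-07 (day 4)
```

For a two-week window the same call lands around day 8, comfortably inside the halfway-to-three-quarters range.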
Analyzing your data is the payoff for all your hard work and should yield valuable insights about your workplace. Sharing this information with the appropriate members of your organization helps to empower your team with data to shape their decision-making. Some tips to keep in mind:
- Look at the results by demographics (race, age, gender, department, location, tenure), but don’t filter down to just a handful of responses (say, fewer than 5); small sample sizes produce unreliable data.
- Small sample sizes can also expose individual employees. Never report findings that could be traced back to identifiable people (for example: “all the African American employees on the accounting team think XYZ”).
- Share the relevant level of insight with your employees, managers, and leadership team to increase transparency and trust between employees and the organization.
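To make the small-sample caution above concrete, here’s a hedged Python sketch (the field names, scores, and the threshold of 5 are illustrative assumptions) that averages survey scores by a demographic field and suppresses any group below a minimum size:

```python
from collections import defaultdict

MIN_GROUP_SIZE = 5  # below this, averages are unreliable and may identify individuals

def scores_by_group(responses, group_key, score_key="score"):
    """Average survey scores per group, suppressing groups that are too small."""
    groups = defaultdict(list)
    for r in responses:
        groups[r[group_key]].append(r[score_key])
    return {
        name: round(sum(vals) / len(vals), 2)
        for name, vals in groups.items()
        if len(vals) >= MIN_GROUP_SIZE  # drop small groups entirely
    }

responses = (
    [{"department": "Sales", "score": s} for s in (4, 5, 3, 4, 5, 4)]
    + [{"department": "Legal", "score": s} for s in (2, 3)]  # only 2 responses
)
print(scores_by_group(responses, "department"))  # {'Sales': 4.17} -- Legal suppressed
```

Suppressing the whole group, rather than reporting it with a caveat, is the safer default: it protects both the reliability of your conclusions and the anonymity of respondents in small teams.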
Have you written a bad survey that didn’t provide the insights you needed? You’re not alone! Here are a few of the most common survey mistakes:
- Using too much industry-specific language, like “D&I,” or benefit-related language (PPO versus HMO). Don’t assume folks know your acronyms—take the time to spell them out.
- Making the survey too long (we recommend keeping it under 20 questions).
- Having too many topics covered in one survey.
- Using non-inclusive demographic questions.
- Including too many open-ended questions. Respondents may start skipping them, and free-text responses take more effort to analyze.