NAIT 2014 PD day presentation: A step-by-step guide to developing better, more useful surveys


Hi all,

Apologies for the long delay since our last post! We’ve been busy working on other projects for the institution, including launching the Student Engagement Survey and preparing the Entering Student Survey for launch in Fall 2014.

We also spent some time preparing a presentation for NAIT Professional Development days (June 4th and August 28th). Our presentation focused on key steps involved in developing a survey, including:

  • Set your objectives (What do you need to know?)
  • Identify your sample (Who knows what you don’t know?)
  • Choose a methodology (How are we going to learn what they know?)
  • Develop strong survey questions
  • Launch the survey and monitor results
  • Clean the data and explore findings
  • Interpret results (through the lens of your objective)

The five key takeaways from our session were:

(1)    Define your survey objectives. A clear objective prevents scope creep (the “wouldn’t it be interesting to know X…” questions) and keeps the survey to a reasonable length. Objectives are also an important focusing tool during the analysis phase.

(2)    Survey development is an iterative process. Despite the step-by-step format in which we delivered the presentation, developing a survey is rarely a linear exercise. Throughout the development process (and into the analysis phase), we consider how a decision at one step will affect decisions made at other steps. For example, you may refine or change your objectives based on a discussion about sampling or methodology.

(3)    Question wording is important. Developing survey questions is as much a science (avoid double-barrelled and leading questions) as an art (know your audience, and consider emotional triggers that may bias responses). It’s always a good idea to pilot your survey with a few members of your sample population to ensure your questions are clearly written and easy to understand.

(4)    Be careful when interpreting results (and when reading charts from other researchers or media publications!). Remember that correlation does not equal causation: just because two factors are correlated doesn’t mean that one is causing the other to occur (see the example below).

[Chart: correlation does not equal causation]
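To make this point concrete, here is a small illustrative sketch in Python (every figure below is invented for this example; this is not NAIT data). Two unrelated quantities that both happen to trend upward over the same period can produce a near-perfect correlation even though neither causes the other.

```python
# Two unrelated series that both happen to rise year over year
# (all figures are made up purely for illustration).
ice_cream_sales = [102, 110, 118, 121, 130, 138, 142, 150, 157]      # thousands of cones
enrolment = [18.2, 18.9, 19.5, 19.9, 20.6, 21.3, 21.7, 22.4, 23.0]   # thousands of students

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

print(round(pearson(ice_cream_sales, enrolment), 3))  # prints a value very close to 1
```

A correlation close to 1 here says nothing about causation; both series are simply growing over the same stretch of years.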

Also, always consider whether the author is purposely presenting data in a way that makes a point. See Media Matters’ terrific post covering “A History of Dishonest Fox [News] charts” for examples of misleading charts.

(5)    The NAIT Institutional Research department is here to help as you develop surveys or interpret results!

We’ll be running another session on August 28th. We encourage all NAIT staff members to attend!

We put together a guide for best practices in survey design that may be helpful as you develop your own surveys. You can access it on our website here.

Survey best practices – (2) choose an audience and delivery method

In the last post, we discussed the importance of setting concrete objectives as a first step in developing a survey. Once you have clearly articulated your objective, the next step is to identify who you want to fill out the survey. After going through the process of developing your objective, you should already have some idea of what your sample population will look like.

In this step, you want to make explicit who will be included in (and excluded from) your survey sample, and decide whether the information collected will be confidential (names collected, but not shared with anyone) or anonymous (names not asked of respondents).

Your survey sample should reflect your survey objective. Who you decide to survey will change depending on what your research objectives are. For example, consider the required survey sample for the following objective:

To review students’ food preferences in the school cafeteria in order to better understand what students would like to eat while on campus so we can make informed decisions about food services at X institution going forward.

We obviously want to sample X institution’s students, but we need to be more specific:

  • Should we ask all enrolled students (on-campus, at satellite campuses, online learners)?
  • Should we ask only those students who actually eat on campus, or all students so we can examine why some choose not to eat on campus?
  • Should we ask both credit and non-credit students?

My point is not to be exhaustive with the above questions but rather to highlight the importance of considering who you want to include in (and exclude from) your sample.
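Once those inclusion and exclusion rules are settled, drawing the actual sample is straightforward. As a loose sketch (the records, field names, and sample size below are hypothetical, not an actual registrar extract), applying the rules and selecting a simple random sample in Python might look like this:

```python
import random

# Hypothetical enrolment records; in practice these would come from the registrar's office.
students = [
    {"id": 1001, "campus": "main",      "credit": True,  "email": "a@example.ca"},
    {"id": 1002, "campus": "satellite", "credit": True,  "email": "b@example.ca"},
    {"id": 1003, "campus": "main",      "credit": False, "email": "c@example.ca"},
    # ... thousands more records in a real extract ...
]

# Inclusion/exclusion rules decided above: main-campus, credit students only.
frame = [s for s in students if s["campus"] == "main" and s["credit"]]

# Draw a simple random sample from the survey frame (sample size chosen for illustration).
sample_size = min(500, len(frame))
invitees = random.sample(frame, k=sample_size)

print(f"{len(frame)} students in the frame, {len(invitees)} invited")
```

In practice, the sample size would be chosen based on the size of the population and the level of precision you need, but the logic of filtering to a frame and then sampling from it stays the same.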

At this point you should also consider how much information about the respondents you want to collect, and whether survey responses will be confidential or fully anonymous. Unless you plan to follow up with respondents (and therefore need to know their names and contact details) or link their responses to another data set (for example, linking Student Engagement Survey responses to responses on an Entering Student Survey to track changes in views/attitudes over time), anonymous responses are likely sufficient.
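To make the linkage scenario concrete, here is a minimal sketch (with hypothetical student IDs, field names, and scores, not our actual survey files) of the kind of join that is only possible when responses are collected confidentially under a student identifier:

```python
# Hypothetical responses keyed by student ID; anonymous data would have no such key,
# which makes this kind of join impossible.
entering = {101: {"sense_of_belonging": 3}, 102: {"sense_of_belonging": 4}}
engagement = {101: {"sense_of_belonging": 4}, 103: {"sense_of_belonging": 2}}

# Join on student ID to track change over time for students who answered both surveys.
linked = {
    sid: {
        "entering": entering[sid]["sense_of_belonging"],
        "engagement": engagement[sid]["sense_of_belonging"],
    }
    for sid in entering.keys() & engagement.keys()
}
print(linked)  # {101: {'entering': 3, 'engagement': 4}}
```

With anonymous data there is no identifier to join on, so this kind of longitudinal comparison is off the table.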

Once you have decided who your sample will be and whether their responses will be treated anonymously or confidentially, the next step is to decide on a data collection method. There are a number of options, including mail-outs, telephone interviews, face-to-face interviews, and web-based (email) surveys. Each option has associated pros and cons, such as cost, human resource (i.e. time) requirements, and expected response rate. The method you choose should align with the objective of the survey and suit the population being surveyed. Other realities, such as cost and delivery deadlines, may also influence this decision.

In our example, the cost and time commitment required to complete a telephone, face-to-face, or mail-out survey would be far too great. Moreover, most students have access to email, so our chosen data collection method is fairly clear: email invitations to participate in a short survey.

At this point, you should have:

  • Developed a clear and detailed survey objective
  • Developed a description of your sample, including who is included and who is NOT included
  • Decided whether your sample’s responses will be treated confidentially or anonymously
  • Chosen a data collection method based on your objectives and survey sample

In the next post, we will get into the creation of survey items, including deciding on a measurement scale and the proper wording of questions.

Cheers,

David

Follow us on Twitter @nait_ir

Survey best practices – (1) develop and follow an objective

Creating and administering a survey is relatively easy – draft a dozen or so questions, load these questions into one of the many online survey tools available, send invitations to a group asking them to complete the survey, then review the responses.

Following these steps, however, does not guarantee reliable and useful survey results. Achieving reliable and useful results requires rigorous planning and testing. The first key step is to identify all of the objectives you have in administering the survey. For example:

Objective

To review students’ food preferences in the school cafeteria in order to better understand what students would like to eat while on campus so we can make informed decisions about food services at X institution going forward.

Notice how the first part of the objective focuses on the subject of the survey, while the end of the objective identifies how the data will be used. If, at the outset of developing a survey, you are forced to consider how you will use the data once it is collected, you will save yourself (and your respondents) time. Students and staff members are busy; make it a rule not to waste their (or your) time asking questions if you have no plan for how to use their responses.

An important consideration is whether a survey is the right tool to get at the information you seek. In developing your objective, you might realize that an alternative method would serve you better, e.g. mining existing data (previous surveys, enrolment stats, etc.) to get your answer or hosting a focus group. Some of these alternative information-gathering methods will be reviewed in future posts.

Once you’ve decided to use the survey methodology and developed an objective, commit the objective to memory. Alternatively, steal my strategy: write your objective in a large font on a piece of paper and refer to it as you add questions to your survey. With each question you consider adding, ask yourself: “Does this question help me reach my survey objective?” Obviously, if the answer is no, don’t include it in the survey (even if it’s a really, really interesting unrelated question that you just have to know the answer to).

How do you know if a survey question fulfills the objective? It has to pass both parts of your objective. For example:

Objective: To review students’ food preferences in the school cafeteria in order to better understand what students would like to eat while on campus so we can make informed decisions about food services at X institution going forward.

(1) Does your question relate to students’ food preferences in the school cafeteria? (Is it related to the topic at hand?)

(2) Will the responses inform decisions about food services at X institution? (Are the results from this question actionable?)

In other words, can we feasibly make a change to an existing policy or service offering if the responses indicate we should?

For example, say you had the following question related to food services: “Would you want X institution to have a late night food option?”  (responses: yes, no, don’t care)

Students may want to have food services available until midnight every night, but it probably doesn’t make sense financially to offer this level of service considering the small number of students who are on campus after 10 pm. Moreover, even if enough students would use this service to make it worthwhile, staffing and safety considerations may prevent hours from being extended.

In short, if a survey item is not actionable then don’t include it on the survey.

Over the next few posts, we’ll continue to outline what we have found to be best practices in survey design and implementation. If you have any other tips on survey design please feel free to leave them in the comments section! I may pull a few tips into future posts (with credit, of course).

Cheers,

David (on behalf of the @NAIT_IR team)

Conference Board of Canada: highly skilled population linked to overall social well-being and health of individuals

The Conference Board of Canada’s Centre for Skills and Post-Secondary Education recently posted an interesting article exploring the evidence linking a highly skilled population to the overall social well-being and health of individuals within that population. In their analysis, the Conference Board broke the relationship down into a number of key findings related to skills and education:

  1. Skills and education are key determinants of economic productivity and growth.
  2. Individuals with advanced skills and education do better in the labour market than those without.
  3. Highly educated Canadians are more active in their communities and politics.
  4. Advanced skills and higher education are associated with better physical and mental health.

Overall, the Conference Board reports that,

“…evidence shows that skills are critically important and that efforts to ensure that Canada’s PSE system continues to produce highly skilled graduates are well-founded.”

More specifically,

“Canadians with less than a high school diploma have an employment rate of only 55 per cent, while those with university degrees or college diplomas have employment rates of 82 and 81 per cent, respectively” (Ministry of Training, Colleges, and Universities, Ontario Labour Market Statistics for January 2012, 2.)

“…although there are differences across disciplines, higher education credential holders aged 25 to 64 earn, on average, 39 per cent more than high school graduates.” (OECD Education at a Glance, 2014)

The full post, which includes linked citations to Statistics Canada data and other independent research, is available on the Conference Board’s website here. While these findings may seem intuitive, it is interesting to see our assumptions regarding the link between PSE participation rates and overall societal well-being quantified.

Welcome to NAIT IR’s “By the Numbers”

Our vision for the NAIT Institutional Research “By the Numbers” blog is to create a space to highlight interesting trends in post-secondary education. Often, these trends will relate specifically to NAIT students and staff, since much of IR’s incoming data comes from our own registrar’s office as well as surveys we have conducted on campus.

Our goals with NAIT By the Numbers:

  1. To promote the use of Institutional Research resources internally at NAIT, including reports, data sets, and survey tools, and, where possible, to provide support for colleagues who need to conduct surveys or analyse data as part of their own work at NAIT.
  2. To engage our colleagues at NAIT and outside of the institution in a discussion about Canadian post-secondary education, focusing on the unique perspective of Canadian polytechnics.
  3. Above all, we plan to tell interesting data-driven stories about NAIT, hopefully providing some useful insight into the latest policies and trends impacting post-secondary education in Canada.

Find us on Twitter @NAIT_IR

Thanks for reading!