The Office of Institutional Research and Decision Support (IRDS) leads efforts to design, collect, and analyze survey research related to key university stakeholders, including students, faculty, and staff. Data collected via survey research is used to inform planning, policy, and reporting obligations.

IRDS Survey Process

When planning a survey, an important first step is to clearly define the question to be answered by the survey results. Once a question or topic has been identified, consider if a survey is the best way to answer your question. IRDS suggests reviewing existing information to see if it is possible to answer the question using data that has already been collected. IRDS is happy to assist with this step.

The following guide outlines the process of administering a survey with IRDS or hosting a survey yourself:

IRDS conducts survey research on behalf of the institution. Surveys conducted by IRDS are intended to be comprehensive analyses of full institutional populations. While program reviews and targeted surveys are important, these are best left for individual schools or programs. When considering your survey population, ensure you are thinking holistically about the information you need. Do you need information on participants? What about non-participants? Are you going to need information about sub-populations such as by major or by demographic? When you are developing your roster, ensure you will have enough responses from each population and sub-population to find meaningful relationships in the data.

Before administering a survey to a population, and therefore emailing a large segment of that population, it is essential to notify and get permission from the stakeholder(s) responsible for that population. Ensure that your targeted survey population and institutional sponsor(s) are well aligned.

IRDS does not conduct surveys independently but serves as a neutral party conducting research on behalf of institutional stakeholders. IRDS will always designate the survey sponsor transparently to respondents by indicating that the survey is being conducted by IRDS on behalf of a sponsored party. When surveys are identified as affecting multiple offices or units across campus, IRDS will help to facilitate conversations among key stakeholders to avoid duplication of efforts and encourage data transparency.

In some cases, the survey sponsor may need to ask for permission to send the survey invitation on behalf of a recognizable Emory stakeholder in order to improve response rates.

IRDS may assist sponsors in various aspects of survey research, as requested. This assistance may include designing survey instruments, setting up survey collection using specialized software, data collection, and analyzing results.

In instances where third-party survey designers are hired as consultants or as administrators, IRDS will provide them with rosters and contact information in a format that protects the privacy of the respondents according to the agreed upon design.

When designing survey instruments, best practices emphasize that the length and breadth of the instrument are of utmost importance. Surveys should be designed as efficiently as possible in order to maximize response and completion rates. More specifically, questions should directly address the stated intent of the survey; a prohibitively lengthy or complex instrument may encourage respondents to abandon the survey partway or select arbitrary answers to speed to the end. Only ask questions that you need and intend to report and/or address. Questions not directly related to the topic or population of interest may be better included as a supplement or in a separate survey.

Typical survey design takes approximately one month to develop and test before launch. The design process includes:

  • Meet with the client to review scope and intent (varies based on scheduling)
  • Revise or design the survey instrument: questions, scales, and responses (2 weeks, depending on the size of the survey)
  • Review and revise the instrument (1 week, depending on the number of stakeholders)
  • Load and test the instrument in survey software (1 week, depending on the complexity of skip logic)

Where necessary, additional time may be required for executive review and sponsorship.

IRDS has a number of resources to assist with survey design. In particular, we are happy to share our question bank of demographic items and best practices for using various scales for satisfaction, agreement, frequency, or quality.

All surveys administered by IRDS will designate how the data will be used and disclose the data management roles and responsibilities.

Surveys listed as confidential will include unique identifiers that may be matched to other administrative records across campus, including data found in the student information system, HR system, and others. If the survey promises that the responses will be treated as confidential, that means the results are confidential to the research team; i.e., the results will not be reported at a level of granularity that would allow someone outside the research team to infer the identity of an individual.
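As an illustration of this confidential model, a minimal sketch is shown below. All identifiers, field names, and records here are hypothetical, not IRDS's actual schema: responses are keyed by a research ID so that analysts work with matched administrative data while names and contact details stay in a separately held roster.

```python
# Hypothetical sketch of confidential survey data handling: the roster
# mapping a research ID to a person is held separately, while the
# analysis file carries only the research ID plus matched admin fields.

roster = {  # held separately by the research team; never joined into analysis
    "R001": {"name": "Student A", "email": "a@example.edu"},
    "R002": {"name": "Student B", "email": "b@example.edu"},
}

admin_records = {  # e.g., fields drawn from the student information system
    "R001": {"major": "Biology", "class_year": 2026},
    "R002": {"major": "History", "class_year": 2025},
}

responses = {  # survey responses, keyed by research ID only
    "R001": {"satisfaction": 4},
    "R002": {"satisfaction": 5},
}

def build_analysis_file(responses, admin_records):
    """Merge responses with administrative fields; identity never joins in."""
    return [
        {"research_id": rid, **resp, **admin_records.get(rid, {})}
        for rid, resp in responses.items()
    ]

analysis = build_analysis_file(responses, admin_records)
```

The key design point is that the identity roster is never an input to `build_analysis_file`, so a leaked analysis file cannot by itself reveal who responded.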

Surveys indicating anonymity will not include any identifying information and will have no mechanism for including additional data. A downside to an anonymous survey is that it is difficult to restrict the responses to one per individual. However, for sensitive subject matters, anonymity may yield more honest responses.

For all surveys, a disclaimer will be included specifying that any response provided to open-ended questions will be given back to the survey sponsor as-is with no redaction. Surveys wishing to use the information collected to contact respondents must present respondents with the option to opt-in to be contacted.

Unless otherwise specified and clearly included on the survey, IRDS will only provide aggregated results to the sponsor.

Any survey sponsor requesting unit record responses will need to disclose this designation to respondents in the survey email invitation and instructions. In addition, any survey sponsor requesting unit record responses must agree to the principles listed in the IRDS Survey Data Sharing Agreement.

Where in question, standard research protocols as specified by the Institutional Review Board (IRB) and the Collaborative Institutional Training Initiative (CITI) on human subjects research will be applied to surveys. While most of these surveys are for administrative purposes and are not considered traditional research, the same principles of ethical conduct will apply.

Survey software licenses and rosters managed by IRDS will not be shared with school or program units unless otherwise authorized. However, IRDS will consult with anyone interested in conducting survey research – we can help identify populations, design survey instruments, and develop timelines that are best for the institution and the project.

IRDS manages the requests to conduct surveys to ensure there is not significant overlap of instruments at the same time. This approach ensures each survey has the best opportunity to maximize response rates.
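One simple way to operationalize this kind of scheduling check is sketched below. The survey names, populations, and dates are invented for illustration, not drawn from the actual IRDS calendar: the check flags any two surveys whose fielding windows overlap for the same population.

```python
from datetime import date

# Hypothetical calendar entries: (survey name, population, start, end)
calendar = [
    ("Dining Survey", "undergraduates", date(2024, 2, 1), date(2024, 2, 14)),
    ("Housing Survey", "undergraduates", date(2024, 2, 10), date(2024, 2, 24)),
    ("Benefits Survey", "staff", date(2024, 2, 5), date(2024, 2, 19)),
]

def overlapping_surveys(calendar):
    """Return pairs of surveys that target the same population
    during overlapping fielding windows."""
    conflicts = []
    for i, (name_a, pop_a, start_a, end_a) in enumerate(calendar):
        for name_b, pop_b, start_b, end_b in calendar[i + 1:]:
            # Two date ranges overlap when each starts before the other ends.
            if pop_a == pop_b and start_a <= end_b and start_b <= end_a:
                conflicts.append((name_a, name_b))
    return conflicts
```

In this example the two undergraduate surveys would be flagged, while the staff survey could run concurrently because it targets a different population.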

IRDS publicly posts a survey research calendar listing all scheduled surveys for the semester with associated topics and populations. Where possible, we encourage others conducting research on campus to contribute to this calendar and include their scheduled surveys. This calendar is intended to transparently provide information to students, faculty, and staff on campus about current or upcoming surveys they may receive. This transparency promotes a better understanding of the survey research being conducted on campus and the topics of interest, and lends legitimacy to the request when respondents receive a notification to participate.

Depending upon the population and needs of your survey, it may be desirable to offer incentives to survey respondents. Two guidelines to consider:

  • Incentives must be positive (i.e. no negative consequences for non-respondents) and
  • Incentives should be relevant to the topic of the survey (e.g., if you are conducting a transportation survey with the purpose of encouraging public transit, free parking passes would not be an appropriate incentive).

The standard language below has been vetted by Emory's Office of General Counsel to ensure compliance with Georgia State Law, and must be included in all surveys offering incentives for participation:

Students/Faculty/Staff who participate in the survey by DATE will automatically be entered in a drawing OR choose to participate in the drawing by providing their email on a separate form for INCENTIVE (worth $X). Your chances depend on how many students/faculty/staff participate; X students/faculty/staff were invited to participate. Recipients will be notified by DATE.

You do not have to complete the survey to enter the drawing. If you would like to be considered for the drawing but do not wish to complete the survey, please send a self-addressed letter to Institutional Research and Decision Support, 201 Dowman Dr, Suite 313, Atlanta, GA 30322 and include your contact information including your name, email address, and phone number.

IRDS recommends sending reminders only when necessary, with a limit of two rounds. Given the high number of surveys on campus, we monitor response rates and send additional reminders only as needed.

We have created a Survey Invitation Template for you to use.

IRDS employs best practices for managing, analyzing, and reporting survey data. To protect the privacy of Emory student, faculty, and staff records, IRDS only reports summary results and uses the standard practice of only reporting cell sizes of 5 or greater.
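The cell-size rule can be sketched as follows. The threshold of 5 comes from the text above; the group names and counts are invented for illustration: any aggregated cell with fewer than 5 respondents is suppressed before results are reported.

```python
# Suppress any aggregated cell with fewer than 5 respondents before reporting.
MIN_CELL_SIZE = 5

counts = {  # hypothetical respondent counts per group
    "Biology": 42,
    "History": 17,
    "Classics": 3,  # below threshold: reporting this could identify individuals
}

def suppress_small_cells(counts, threshold=MIN_CELL_SIZE):
    """Replace counts below the threshold with a suppression marker."""
    return {
        group: (n if n >= threshold else "<5")
        for group, n in counts.items()
    }

reported = suppress_small_cells(counts)
```

Only the suppressed table (`reported`) would be shared with the sponsor; the raw `counts` stay with the research team.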

Analysis of survey results can take many weeks to complete, depending on the complexity of the data and the requested deliverable. Executive summaries and descriptive statistics may take only two weeks, while interactive dashboards or advanced statistical inference may take six weeks or more. If additional data is merged into the survey results, the analyses may take longer. For recurring surveys, templates can be developed to pre-populate results and update the findings. Designing the templates and dashboards requires a larger up-front time investment, but they pay dividends when results are replicated over multiple terms or years.

Whenever possible, we encourage you to share the results in a way that is accessible to the community surveyed. We have found that being transparent with survey results shows the Emory community the value in answering our surveys.

University Survey Calendar

Consult the schedule below to view the current list of surveys administered to members of the Emory University community.

Please email our office at to suggest additions to this list, which features survey projects conducted to inform policy and planning discussions on campus and reporting obligations.