2016 Suggested Field Topics

Share the challenges you’ve encountered and solutions you’ve developed from your organization’s work in the following areas:


  • The Lone Interviewer: With the increasing use of technology, how do we keep interviewing staff from "feeling alone" in their work? Specifically, with a heavy emphasis on cost reduction and the ongoing expansion of technological solutions for case selection, interviewer strategies, training (online videos), and remote supervision, what are organizations doing in this environment to build a sense of camaraderie, teamwork, connection, and morale among staff who work from home? Having remote staff meet in person is costly and usually not within a survey budget.
  • Project Management Approach: The philosophy and consequences of moving survey centers to a more formalized project management approach (e.g., PMI), changes in the approach to survey management (e.g., technology expert vs. survey expert for some projects), and dealing with day-to-day issues.
  • Managing and sustaining your survey center: Effective approaches to developing management skills in your staff; tools and strategies such as the use of social media for marketing your center; changing your center's image; building and sustaining new client relationships; and standardizing procedures across programs.
  • Working within your organization or university: Tips for working within your university or larger organization to educate and adapt policies to meet the needs of survey research, working with Human Resources on unusual or unique staffing needs.
  • Human Subjects Issues: Educating accounting and IRB staff about incentives, the sensitivity of collecting biomarkers, and educating IRBs about best practices in survey research.
  • New Issues with Interviewer Hiring: Background check requirements, federal background checks and investigations, and other new(er) requirements that impact the hiring process.
  • Federally Funded Survey Issues: The OMB role, timing, research design concerns.
  • Interviewer Quality Control: Barriers, concerns, and applications of recording interviews. What impact, if any, does the move to Voice over Internet Protocol (VoIP) have on your center? How are shops obtaining respondent consent for recording and providing feedback to interviewers? What do you do with low performers: retrain, terminate, provide shadow observations, develop group-building experiences, etc.?
  • Managing Clients and Client Satisfaction: How to reshape current client-center relationships, approaches to successfully convincing clients of your expertise, how to evaluate client satisfaction, and how to act on satisfaction feedback.


  • 50th Anniversary of IFD&TC: What has changed? What has stayed the same? Trends or patterns over the last 50 years.
  • Interviewers: How do you hire the "perfect" interviewer? What traits or capabilities make an interviewer successful? How much of this can be "trained" into new and existing interviewer staff? What do interviewers need to know prior to training (e.g., familiarity with health care terms)? What skills do they need (e.g., biomarker measurement)? Are different qualities needed for CATI vs. field interviewers?
  • Respondent Refusals: When is a “no” a “NO!”? How do we convince respondents to participate without harassing them? Are there lines that should not be crossed with respondents regarding persuasion, cajoling, urging, and convincing participation? How does respect for the respondent play into getting acceptable response rates? How can you avoid getting a “no”? How can you turn a “no” into a “yes”?
  • Data Falsification: What counts as falsification? Why does it happen? What are the methods to detect it? Investigate it? How do you repair the damage? How do you prevent it?
  • Using Social Media: How can social media be used to recruit, retain, and inform our respondents? What are the risks? Benefits?
  • International or Cross-National Research including Multi-cultural and/or multi-language research: Survey translation, cultural adaptation of survey measures, working with multi-cultural data collection teams, gaining community cooperation, identifying appropriate sampling frames.
  • Evaluation research: Bringing best practice in data collection methods to research evaluating programs, interventions, and organizational initiatives.
  • Response Rates, Sample Design, and Weighting: Experience with address-based sampling; targeted mail or telephone lists (for small geographic or demographic groups); mixing random-digit dialing and cell phone lists; how to find statistical support for complex sampling and weighting needs; and tools and tips for managing web-based samples. Do response rates still matter, or have most moved to new(er) measures of data quality? How are declining response rates explained or justified to clients?
  • Operational Issues: Studies with experimental designs that help determine operational changes. Any topic welcome—for example, incentive amounts, call attempts, call windows, etc.
  • Respondent incentives: Use and effectiveness of differential incentives (why and how you use them, implications for communication with respondents, navigating institutional review boards), effective alternatives to cash incentives, use of electronic incentives (e.g., gift card codes delivered electronically), the impact of federal limitations on incentives, and what to do when incentives are not an option.
  • Mining interviewer observation data: Using interviewer observations from interview attempts, refusals, or completed attempts to inform study management (reduce cost, reduce selective non-response, and increase response rates).
  • Community-based participatory research: Collaborating with community-based organizations, building and sustaining community partnerships, engaging the community.
  • Mixed & Multi-Mode: Maximizing participation rates and representativeness of respondents, managing sample and contact attempts, increasing data quality, especially blending of “tried and true” methods and mobile phone samples.


  • Optimization for mobile devices and visual design: Practical advice and solutions for optimizing surveys through high-quality visual design across platforms and mobile devices. How do you technologically manage standardization across platforms and devices?
  • Using Paradata: In what ways are you building paradata collection into your projects? How is this being done in a way that is accessible for field and project management staff? What are the technical considerations?
  • Technology, friend or foe: Communication and interaction with study participants and/or project staff across a range of technology platforms and devices (social media, email, SMS, web, smartphones, handheld devices, etc.). What are the implications of the Telephone Consumer Protection Act (TCPA) for our centers, including specific phone and software setups that are compliant? Advantages and disadvantages of different web and CATI software packages.
  • Embracing new technologies: Use of web panels or crowdsourcing to collect data, new approaches to tracking and locating respondents, web-based focus groups, and using social media and other technology to communicate with longitudinal or panel study participants or to invite study participation.
  • Project Management Software: Available options and pros and cons of different approaches, including a discussion on how events and budgets are handled.